What Is A Search Engine Spider?
A search engine spider, also known as a web crawler or search engine crawler, is an Internet bot that systematically browses the World Wide Web to create an index of data.
Crawlers are called spiders because they crawl the Internet in the same way spiders crawl on their webs. These bots analyze all data they find on a web page, including text, meta tags, images, videos, and hyperlinks.
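To make that concrete, here is a minimal sketch (illustrative only, not any search engine's actual implementation) of how a crawler might pull hyperlinks and meta tags out of a fetched page using Python's standard library. The page HTML and URL are hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class PageParser(HTMLParser):
    """Collects hyperlinks and meta tags from a single HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []  # absolute URLs found in <a href="...">
        self.meta = {}   # meta tag name -> content

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(urljoin(self.base_url, attrs["href"]))
        elif tag == "meta" and attrs.get("name"):
            self.meta[attrs["name"]] = attrs.get("content", "")


# Hypothetical page; a real crawler would fetch the HTML over HTTP first.
html = '<meta name="description" content="Example"><a href="/about">About</a>'
parser = PageParser("https://example.com/")
parser.feed(html)
print(parser.meta)   # {'description': 'Example'}
print(parser.links)  # ['https://example.com/about']
```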
Search engine spiders crawl an estimated 40% to 70% of the pages on the Internet because they are programmed to focus on pages with high authority, and new content is published every day faster than it can be discovered. Search engines commonly program spiders to operate in different modes: one mode queues pages that are pending review, while another adds a reviewed page to a database (the index).
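The split between queuing and indexing can be pictured as a simple crawl loop with two data structures, as in the sketch below. This is a simplified assumption about the architecture, not how any real search engine is built; the seed URL and the fetch step are placeholders.

```python
from collections import deque

# Mode 1: a frontier queue of pages pending review (discovered but not yet crawled).
frontier = deque(["https://example.com/"])  # hypothetical seed URL

# Mode 2: a store of pages already reviewed and added to the index.
index = {}             # url -> extracted page data
seen = set(frontier)   # avoids queuing the same URL twice


def fetch_and_parse(url):
    """Placeholder for the real fetch/parse step; returns page data and outgoing links."""
    return {"title": "stub"}, []  # a real crawler would download and parse the page here


while frontier:
    url = frontier.popleft()
    page_data, outlinks = fetch_and_parse(url)
    index[url] = page_data          # add the reviewed page to the index
    for link in outlinks:
        if link not in seen:        # queue newly discovered pages for later review
            seen.add(link)
            frontier.append(link)

print(f"Indexed {len(index)} page(s)")
```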
Search engine spiders use a variety of factors to assign a crawl budget to each website, which determines how many of the site's pages should be crawled and how often pages that were already crawled should be revisited.
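The toy heuristic below only illustrates the idea of turning site-level factors into a page limit and a revisit interval; the inputs and weights are made-up assumptions, not a published search engine formula.

```python
def crawl_budget(site_authority, update_frequency, server_speed):
    """Toy crawl-budget heuristic (illustrative weights, not a real search engine formula).

    site_authority:   0.0-1.0, how authoritative the site is assumed to be
    update_frequency: average new or changed pages per day
    server_speed:     0.0-1.0, how quickly the server responds to requests
    """
    pages_per_day = int(100 * site_authority * server_speed + 10 * update_frequency)
    revisit_days = max(1, int(30 * (1 - site_authority)))
    return pages_per_day, revisit_days


pages, revisit = crawl_budget(site_authority=0.8, update_frequency=5, server_speed=0.9)
print(f"Crawl up to {pages} pages per day; revisit crawled pages every {revisit} days")
```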
Visit the SEO Glossary
Go to the SEO Glossary to find more terms and definitions that relate to the field of search engine optimization.
