
What is crawling?

New Member

Crawling is the process by which search engines discover new and updated content on the web, such as new sites or pages, changes to existing sites, and dead links. To do this, a search engine uses a program commonly referred to as a ‘crawler’, ‘bot’ or ‘spider’ (each search engine has its own), which follows an algorithmic process to determine which sites to crawl and how often. As a search engine’s crawler moves through your site, it also detects and records any links it finds on those pages and adds them to a list to be crawled later. This is how new content is discovered.
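The discovery loop described above can be sketched as a breadth-first traversal: fetch a page, record the links on it, and queue any links not yet seen. This is only an illustrative sketch; the `PAGES` dictionary below is made-up data standing in for real HTTP fetching and link extraction.

```python
from collections import deque

# Toy "web": page URL -> links found on that page (hypothetical data,
# standing in for actual fetching and HTML parsing).
PAGES = {
    "https://example.com/": ["https://example.com/about", "https://example.com/blog"],
    "https://example.com/about": ["https://example.com/"],
    "https://example.com/blog": ["https://example.com/blog/post-1"],
    "https://example.com/blog/post-1": [],
}

def crawl(seed):
    """Breadth-first crawl: visit a page, then queue any unseen links it contains."""
    seen = {seed}
    queue = deque([seed])
    crawled = []
    while queue:
        url = queue.popleft()
        crawled.append(url)          # "fetch" the page
        for link in PAGES.get(url, []):
            if link not in seen:     # only schedule pages not already discovered
                seen.add(link)
                queue.append(link)
    return crawled

print(crawl("https://example.com/"))
```

Starting from a single seed URL, the crawler reaches every page because each page links (directly or indirectly) to the rest; real crawlers add scheduling, politeness delays, and robots.txt checks on top of this same loop.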

Topic starter Posted : 10/09/2021 10:54 am
New Member

I used to wonder what it had to do with the web.

I like how you have broken down those elements!

Posted : 14/09/2021 3:40 pm