Two weeks ago a client reset its bot-blocker and unintentionally blocked Googlebot. We had SEORadar monitoring the site, so we quickly discovered the problem and alerted the client. Unfortunately, by the time they fixed the bot-blocker settings, they had lost about 100,000 daily visitors from Google. Of course, the first thing they asked was:
How Long Will It Take Our Google Traffic To Recover From Blocking Googlebot?
While your mileage may vary, in this case the answer is about one week.
Here’s my theory on how this process works:
- You block Googlebot from crawling your site (the most common reasons I see are improper bot-blocking settings or adding a “Disallow: /” rule to the robots.txt file).
- Googlebot gets a 403 error when it tries to crawl the site, or simply stops crawling because of the robots.txt rule. After hitting the home page (or robots.txt) a few times, it gets the message and starts demoting the site’s URLs. Traffic drops dramatically within a few hours. In this case, the site saw about a 50% drop within two hours and a 60% drop within 24 hours, which held for most of the time Googlebot was blocked.
- GSC showed that crawl rate dropped from about 400,000 URLs/day (it’s a 5MM URL site) to about 11,000 URLs/day. I haven’t investigated how Googlebot was able to crawl 11,000 blocked URLs yet. That’s for another post.
- When you unblock Googlebot, it starts to crawl again. In this case it immediately went back to its pre-block levels, but if you don’t have a strong domain, you may need to do something to spur crawling (aka “get links”).
- As Google recrawls previously inaccessible URLs, it starts reevaluating their rankings. As best I can tell, these URLs were never excluded from Google’s index (they still showed up in site: queries), but it does appear the content of their Google caches was deleted. So Google needs to “see” each page again and reapply its algorithms.
- On a big site, or a small site with weak backlinks, it may take several days/weeks for Googlebot to recrawl all of the URLs it had demoted. So the recovery pattern can be gradual. Here’s what it looked like for the site in question:
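The most common trigger above, an accidental “Disallow: /”, is also the easiest one to catch before it costs you traffic. As a minimal sketch (the `googlebot_blocked` helper is hypothetical, not part of SEORadar or any tool mentioned here), Python’s standard-library robots.txt parser can check whether a given robots.txt would lock Googlebot out:

```python
from urllib import robotparser

def googlebot_blocked(robots_txt: str, path: str = "/") -> bool:
    """Return True if this robots.txt text blocks Googlebot from the path."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch("Googlebot", path)

# A staging robots.txt accidentally pushed to production:
print(googlebot_blocked("User-agent: *\nDisallow: /"))  # True: everything blocked

# A normal robots.txt that only fences off an admin area:
print(googlebot_blocked("User-agent: *\nDisallow: /admin/"))  # False for "/"
```

Note this only covers the robots.txt case; a bot-blocker returning 403s (the other failure mode described above) would have to be detected by fetching pages with a Googlebot user-agent string and checking the status code.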
On the bright side, when you block Googlebot from your entire site, your avg time downloading a page metrics improve quite a bit pic.twitter.com/CGV3UItX0z
— Andrew Shotland (@localseoguide) August 18, 2018