That’s what a retailer client asked me a few days ago. It was odd because, before last week, their home page traffic was killing it. But a quick look at the home page’s traffic in Google Search Analytics showed a textbook case of “softness”:
Soft Home Page

Besides the double-black-diamond slope, what was really troubling about this graph was that the majority of the traffic to this client’s home page came from brand queries. So either people had stopped searching for the brand, or something else was now outranking the client for those queries.

Sure enough, when I searched their brand name in Google, the first result was a yellowpages.com page called “<BRAND> Locations & Hours Near Pleasanton, CA” – pretty awful (and a pretty good strategy for YP.com, considering the brand doesn’t have a location anywhere near Pleasanton). Here’s an example of such a page, “IKEA Locations & Hours Near Pleasanton” (note: IKEA is not the client, and the nearest IKEA is about 31 miles away):

https://www.yellowpages.com/pleasanton-ca/ikea

I quickly ran a battery of diagnostics on the home page and it seemed fine. It worked in multiple browsers with JS and cookies on or off, and Screaming Frog could crawl it no problem. Then I did a Fetch as Google in Google Search Console and whammo:

Redirected
I can’t show you where the home page was redirecting Googlebot, but let’s just say it went to a URL that looked something like /wtf-lol-zomg-#soft.
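If you want to reproduce that kind of result without waiting on Fetch as Google, a simple differential fetch will often expose a User-Agent-based blocker: request the page once with a browser User-Agent and once with Googlebot’s, skip redirect-following, and compare the responses. Here’s a minimal sketch in Python (the URL and browser UA string are placeholders; a blocker keyed on IP ranges rather than User-Agent won’t show up this way):

```python
import requests

URL = "https://www.example.com/"  # placeholder for the client's home page

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    # Googlebot's published desktop UA string
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False)
    # A UA-keyed bot blocker typically shows up as a 3xx (or a 403) for the
    # bot UA only, while the browser UA gets a normal 200.
    print(f"{name}: {resp.status_code} -> {resp.headers.get('Location', '(no redirect)')}")
```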

I alerted the dev team, and it turned out they had recently moved delivery of the home page to a new “high-performance” system and, in the process, had apparently forgotten that this system ships with bot blocking turned on by default. #Doh!
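To be clear, the problem isn’t blocking bad bots; it’s a blocker that can’t tell real Googlebot from an impostor. Google’s documented way to verify Googlebot is a reverse DNS lookup on the requesting IP followed by a forward lookup to confirm it resolves back. A rough sketch of that allowlist check (the function name and structure are purely illustrative, not what this client’s system does):

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check, per Google's verification docs."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse (PTR) lookup
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname must resolve back to the same IP,
        # since a PTR record alone is trivially spoofable.
        return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except socket.gaierror:
        return False
```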

The moral of the story: Bot blockers don’t kill SEO. People do.

 
