Whoa, Dan! How did you get a time machine back to the late 90’s to show us this rockin’ GeoCities site (personally, I was an AngelFire guy)? Well, easily surprised reader, you are wrong! This slammin’ retro site is so totally tubular that it ranks on the first page of Google for a query bucket with 250k monthly searches!
Yup, you read that right. This site was previously ranking 3rd/4th for all variations of “wish promo code” (though now it’s settled in comfortably at #8). Check out the search volume, per keywordtool.io:
I think 250k monthly queries related to a major brand makes this a pretty competitive query, wouldn’t you say? Not only that, the site came out of nowhere and is rapidly ranking for more and more search queries. Just check out the organic keyword report from SEMrush:
So if Google isn’t ranking this site for its link profile, then I guess it’s ranking because of its ‘quality content’, right? After all, ‘quality content’ without links can rank…
What does this mean for SEOs?
Honestly, why are we as an industry still taking comments like this at face value? And aren’t comments like this detrimental without any context?
How often is this true? Is it true for high-volume or low-volume searches? How often does Google rank low-quality pages through these processes, i.e., what’s the fail rate? Google spokespeople hype up the ability of their machine-learning processes to solve incredibly complicated problems. And they do solve them! But nowhere near 100% of the time. And without the context and data, it feels irresponsible to run your digital business (whether agency-side or in-house) based on statements like these.
Traditional SEO practices still work; Google doesn’t programmatically understand qualitative concepts like content quality nearly as well as it wants you to think it does.
SEO isn’t dying anytime soon, and solid technical SEO, link building, and the rest still work.