Part 3 in the SEO Death series
There are all sorts of things that can kill your search engine traffic, but perhaps none is more humiliating than leaving your development server accessible to the search engine robots. If your test environment gets crawled and the pages get indexed, you can suddenly have a duplicate of your entire site competing with your live site, and your rankings are guaranteed to go any direction except up. To make matters worse, most of the pages on the dev site will probably have bad data or no data at all.
A good way to check whether your dev server is in the index is to search for one of the dev URLs. Start with the homepage. If it's in the index, chances are you are screwed.
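A quick way to run that check is Google's site: operator, which restricts results to a single hostname. The hostname below is a placeholder; swap in your actual dev domain:

```
site:dev.example.com
```

If that query returns any results, your dev pages are in the index.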
Note to engineers: make sure your dev server is inaccessible to robots (and anyone else) by implementing a login. If you want to be super cautious, which is always a good idea, triple bag that sucker by also disallowing robots in the robots.txt file and tagging all dev pages as “noindex” (just remember to remove the noindex tags when you push the pages live).
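The "triple bag" can look like the sketch below. The hostname is hypothetical, the login layer itself is server-dependent (e.g. HTTP basic auth in Apache or nginx), and the exact file layout will vary with your setup:

```
# robots.txt at the root of the dev server — tells compliant
# crawlers to stay out of everything:
User-agent: *
Disallow: /

<!-- and in the <head> of every dev page, as a backstop in case
     a crawler reaches the page anyway: -->
<meta name="robots" content="noindex">
```

Note that robots.txt only blocks crawling, while the noindex tag blocks indexing, which is why using both (behind a login) is the cautious play.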
Additional SEO Death Posts: