A number of my clients have been having site performance issues as of late. It’s easy to see in Google Webmaster Tools’ Crawl Statistics report that if your pages load slowly, your site does not get crawled as much.

But your fast-loading pages could be hurting your SEO efforts as well. I have a client site with two main types of pages: “Good” pages and “Bad” pages. There is an equal number of Good and Bad pages, and they have roughly the same number of internal links. Here are the current index counts Google shows for each, the average download speed, and the number of monthly search engine referrals:

Good: 49,000 pages indexed, 3.5 seconds avg. download, 47,000 SE referrals

Bad: 221,000 pages indexed, 0.2 seconds avg. download, 2,000 SE referrals
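
For anyone who wants to sanity-check numbers like these without access to internal tools, a rough approach is to time the raw HTML fetch for a sample of each page type. Here is a minimal sketch assuming Python and its standard library; the URL lists are placeholders, not the client’s actual pages:

    # Rough spot check: average raw HTML download time for two groups of pages.
    # The URL lists are placeholders -- swap in samples of your own "Good" and "Bad" pages.
    import time
    import urllib.request

    GOOD_URLS = ["http://www.example.com/good/1", "http://www.example.com/good/2"]
    BAD_URLS  = ["http://www.example.com/bad/1", "http://www.example.com/bad/2"]

    def avg_download_seconds(urls):
        """Fetch each URL's HTML and return the mean download time in seconds."""
        timings = []
        for url in urls:
            start = time.time()
            with urllib.request.urlopen(url) as response:
                response.read()  # time the full body download, not just the first byte
            timings.append(time.time() - start)
        return sum(timings) / len(timings)

    print("Good pages avg: %.2fs" % avg_download_seconds(GOOD_URLS))
    print("Bad pages avg:  %.2fs" % avg_download_seconds(BAD_URLS))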

It’s not rocket science to figure that, with a 3.5-second download, Google is not going to crawl and index many of these Good pages. But what’s interesting to me is whether the speedy Bad pages are making matters even worse. If they weren’t there, would we be seeing more slow Good pages in the index and getting more traffic, or would the number of slow pages indexed stay the same?

In the long run the answer won’t matter much, since we are going to noindex these Bad pages and speed up the Good ones, but this data underscores the need to constantly monitor site performance and to keep the bots away from your Bad pages as much as possible.
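
For anyone wondering what the noindex piece looks like in practice, the usual approach is a robots meta tag in the page template (or the equivalent X-Robots-Tag HTTP header when the template can’t easily be edited). A minimal sketch, assuming the Bad pages share a common template:

    <!-- In the <head> of each Bad page: keep it out of the index but let its links be followed -->
    <meta name="robots" content="noindex, follow">

If the pages are generated by code you’d rather not touch, the same directive can be sent as an X-Robots-Tag: noindex response header instead.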


13 Response Comments

  • AhmedF  January 23, 2009 at 9:39 am

    Two thoughts:

    1. How were you determining how long it took for good vs bad pages to load – internal tools?

    2. Even the ‘fastest’ loading time for those pages (which looks like 1-1.25 seconds) is pretty high. Considering it is pure HTML that Google is downloading, it should be at max 750 ms (we aim for 400 ms). They really need to fix up those load times 🙂

  • Andrew Shotland  January 23, 2009 at 9:48 am

    We are using internal tools. The Good pages’ loading time was 200ms.

  • Andrew Shotland  January 23, 2009 at 9:49 am

    But yes, they do need to fix their performance issues.

  • Andrew Shotland  January 23, 2009 at 9:52 am

    If you are referring to the avg. load time displayed in GWT, I think the issue is that the slow-loading Good pages are bringing the average way up. If we fix the Good pages’ issues, that should bring everything down to a good level.

  • AhmedF  January 23, 2009 at 10:18 am

    First you said bad pages were 200 ms, now you say good pages.

    You are gonna make my head explode.

    Still – I am surprised that Google is hitting a site that is so slow so hard. Every time one of my sites goes over 1.5 seconds they basically kick it to the curb.

  • Andrew Shotland  January 23, 2009 at 10:22 am

    I stand corrected, Ahmed. I meant to type “Bad” and “Good” came out. Sorry for the explosion.

    Re the curb, my clients are not exactly off the curb. If you look at the current pages-crawled number, it’s pretty low.

  • Stever  January 23, 2009 at 10:43 am

    The graphs you are showing are for a pretty big site, probably an IYP client, at 100k pages crawled per day at the peaks.

    So for large, database-driven websites you probably have to start addressing the way the scripts and pages are coded to cut down on database queries. On high-traffic, big sites you probably need to hire a database and server wizard to make things more efficient, put images on a separate server, etc., to speed things up.

    At 2 seconds and higher you not only limit the bots’ crawling, but the user experience takes a hit too: instead of waiting for pages to load, the trusty back button starts to get tempting.

    And what do you mean by “good” and “bad”? Is that like “white” and “black”, or “shades of grey” in the hat you’re wearing?

  • Andrew Shotland  January 23, 2009 at 10:58 am

    Sorry to be cryptic about this, but I can’t say much about the nature of the client’s site. Re the “good” and “bad”: the “good” pages have value for search – they target good terms. The “bad” pages have low value for search – they target terms that aren’t searched often.

  • AhmedF  January 23, 2009 at 1:59 pm

    100,000 pages a day is, if you think about it, not that much. 86,400 seconds in a day – so it is just a shade over 1 page per second. A good programmer does not need to be a DB wizard to be able to manage that.

    Plus Google does not download images heavily.

  • How to make wine  February 1, 2009 at 4:48 pm

    Hmm… interesting find, but I think this applies only to super big sites… I have 3 sites hosted on different servers with different loading speeds… I didn’t notice this.

  • Sean  February 2, 2009 at 7:41 pm

    Good post here, Andrew. I think this makes the case for building a search-engine-friendly website in XHTML with optimized images. If a client has “bad” or poorly constructed, non-optimized pages, would you suggest adding them to the robots.txt file to prevent Google from spending time on these bad pages?

  • Andrew Shotland  February 2, 2009 at 8:15 pm

    Absolutely a good tactic, Sean. Why give the image bots “non-content” files to waste their time on?
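
    For example, a robots.txt rule only needs a URL pattern that isolates the low-value pages. Here’s a minimal sketch (the /bad-pages/ path is purely a placeholder, not any client’s actual structure):

        # Keep all crawlers away from the low-value pages
        User-agent: *
        Disallow: /bad-pages/

    One caveat: Disallow stops crawling but doesn’t guarantee removal from the index, so pages that need to disappear entirely are better handled with the noindex tag mentioned in the post.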

  • Ricky C  February 9, 2009 at 9:35 pm

    Google can make wrong calculations about pages indexed. Sometimes they will index more of the good pages if your site has good speed, but other times they will index more of the bad pages even if they contain only a single line of text or don’t have many words in them. Go figure.