Google just published an article on how to “Get Started With Dynamic Rendering.” If you are working on a site with a “modern framework” (e.g. Angular, React, or other tech with a lot of JavaScript features), you’ll want to bookmark that post. If reading is not your thing, a few weeks ago I put together Server Side Rendering For Dummies (& Non-Technical SEO Decision Makers), which boils down a lot of the Google techno-jargon into a single PowerPoint slide.

While that Google post has most of what you’ll need to get started with server side rendering, I’d like to focus on the Troubleshooting section. Talk all you want about answering user questions, relevance, domain authority, etc.; if I had to define 2018 SEO with one word, it would be “troubleshooting.”

Google gives you most of what you need to troubleshoot prerendering problems in the “Verify your configuration” and “Troubleshooting” sections. Here’s what they say to do (edited for brevity):

Verify your configuration

Check a URL with the following tests:

  1. Test your mobile content with the Mobile-Friendly Test to make sure Google can see your content.
  2. Test your desktop content with Fetch as Google to make sure that the desktop content is also visible on the rendered page (the rendered page is how Googlebot sees your page).
  3. If you use structured data, test that your structured data renders properly with the Structured Data Testing Tool.

Troubleshooting

If your content is showing errors in the Mobile-Friendly Test or if it isn’t appearing in Google Search results, try to resolve the most common issues listed below.

Content is incomplete or looks different

What caused the issue: Your renderer might be misconfigured or your web application might be incompatible with your rendering solution. Sometimes timeouts can also cause content to not be rendered correctly.

High response times

What caused the issue: Using a headless browser to render pages on demand often causes high response times, which can cause crawlers to cancel the request and not index your content. High response times can also result in crawlers reducing their crawl-rate when crawling and indexing your content.

Structured data is missing

What caused the issue: Missing the structured data user agent, or not including JSON-LD script tags in the output can cause structured data errors.
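
To make those three failure modes concrete, here is a rough sketch of what an on-demand dynamic rendering endpoint with caching might look like. This is not from Google’s post; it assumes a Node/Express server using Puppeteer, and the origin URL, in-memory cache, and timeout are placeholders you would tune for your own stack. Note the structured data point too: if you only serve prerendered HTML to certain user agents, make sure the structured data testing tools are on that list, or they will be testing the raw, unrendered page.

    // A minimal dynamic rendering sketch (Node + Express + Puppeteer).
    // The package choice, in-memory cache, and timeout values are illustrative only.
    import express from "express";
    import puppeteer from "puppeteer";

    const app = express();
    const cache = new Map<string, string>(); // rendered HTML keyed by URL

    async function renderPage(url: string): Promise<string> {
      const browser = await puppeteer.launch();
      try {
        const page = await browser.newPage();
        // An explicit timeout guards against the "content is incomplete or
        // looks different" failure mode: if the app never settles, fail fast
        // instead of quietly serving a half-rendered page.
        await page.goto(url, { waitUntil: "networkidle0", timeout: 10000 });
        return await page.content(); // serialized DOM, JSON-LD script tags included
      } finally {
        await browser.close();
      }
    }

    app.use(async (req, res) => {
      const url = `https://www.example.com${req.originalUrl}`; // hypothetical origin
      // Serving from cache keeps response times low, so crawlers don't give up
      // on the request or slow their crawl rate while a headless browser works.
      let html = cache.get(url);
      if (!html) {
        html = await renderPage(url);
        cache.set(url, html);
      }
      res.send(html);
    });

    app.listen(3000);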

We call checks like these “Smoke Tests.” Here’s a little more nuance on troubleshooting server side rendering, based on some real-world situations we’ve encountered.

  1. How To Test Server Side Rendering On A New Site Before It’s Launched
    It is often the case that SEOs get brought into the process well after a site has been built, and only a few days before it launches. We need a way to test the new site in Google without having it compete with the old site. For a variety of reasons we don’t want the entire new site to get crawled and indexed, but we do want to know that Googlebot can index the content on a URL, that it can crawl internal links, and that the pages can rank for relevant queries. Here’s how to do this:

    1. Create test URLs on the new site for each template (or use URLs that have already been built) and make sure they are linked from the home page.
    2. Add a robots.txt file that allows only these test URLs to be crawled.
      Here’s an example (the # comments are just explanations and can be left out):
      User-Agent: Googlebot
      Disallow: /                       # don't crawl the entire site...
      Allow: /$                         # ...except the home page, which is allowed even though everything else is blocked above
      Allow: /test-directory/$          # allow crawling of just the /test-directory/ URL
      Allow: /test-directory/test-url   # allow crawling of /test-directory/test-url
      You can add as many test URLs as you want. The more you test, the more certain you can be, but a handful is usually fine. (A stray rule here can quietly knock a test URL out of the experiment, so sanity-check the rules with the robots.txt Tester in Search Console, or with a quick script like the robots.txt checker sketched after this list.)
    3. Once the robots.txt is set up, verify the test site in Google Search Console.
    4. Use the Fetch as Google tool to fetch and render the home page and request crawling of all linked URLs. The goal here is to confirm that Google can index all of the content on the home page and can crawl the links to find the test URLs. You can view how the content on the home page looks in the Fetch tool, but I wouldn’t necessarily trust it; we sometimes see this tool out of sync with what actually appears in Google.
    5. In a few minutes, at least the test home page should be indexed. Do exact match searches for text that appears in the title tag and in the body of the home page. If the text is generic, you may have to include site:domain.com in your query to focus only on the test domain. You are looking for your test URL to show up in the results. This is a signal that Google can at least index and understand the content on your home page. This does not mean the page will rank well, but at least it now has a shot.
    6. If the test links are crawlable, you should soon see the test URLs linked from the home page show up in Google. Do the same tests. If they don’t show up within 24 hours, this doesn’t necessarily mean the links aren’t crawlable, but it’s at least a signal in that direction. You can also look at the text-only cache of the indexed test home page. If the links are crawlable, you should see them there.
    7. If you want to get more data, unblock more URLs in robots.txt and request more indexing.
    8. Once you have finished the test, request removal of the test domain in GSC via the Remove URLs tool.
    9. We can often get this process done in 24 hours, but we recommend clients give it a week in case we run into any issues.
    10. Pro-tip: If you are using Chrome to check SEO content like title tag text on a test URL, SEO extensions and “view source” will often only show the template “hooks” (e.g. {metaservice.metaTitle}) and not the actual text. Open Chrome Developer Tools and look in the Elements panel; the rendered SEO content should be there. (There’s a scripted version of this source-vs-rendered check after this list.)
  2. Do Not Block Googlebot on Your PreRender Server
    Believe it or not, we had a client do this. Someone was afraid that Googlebot was going to eat up a lot of bandwidth and cost them $. I guess they were less afraid of not making money to pay for that bandwidth.
  3. Do Not Throttle Googlebot on Your PreRender Server
    We convinced the same client to unblock Googlebot, but noticed in Google Search Console’s crawl report that pages crawled per day was very low. Again, someone was trying to save money in a way that guaranteed they would lose money. There may be some threshold at which you’d want to limit Googlebot’s crawling, but my sense is that Googlebot is pretty good at figuring that out for you.
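
Here’s a quick sketch to go with the robots.txt step above. It is a rough, simplified approximation of Googlebot-style matching (the longest matching rule wins, “*” is a wildcard, and “$” anchors the end of the path), not Google’s actual parser, so treat it as a gut check only and confirm anything important with the robots.txt Tester in Search Console.

    // A rough approximation of Googlebot-style robots.txt matching, for sanity
    // checks only. It is NOT Google's actual parser.
    type Rule = { allow: boolean; pattern: string };

    function matches(pattern: string, path: string): boolean {
      // Escape regex metacharacters, then translate "*" and a trailing "$".
      const escaped = pattern
        .replace(/[.+?^{}()|[\]\\]/g, "\\$&")
        .replace(/\*/g, ".*");
      return escaped.endsWith("$")
        ? new RegExp("^" + escaped.slice(0, -1) + "$").test(path)
        : new RegExp("^" + escaped).test(path);
    }

    function isAllowed(rules: Rule[], path: string): boolean {
      let best: Rule | undefined;
      for (const rule of rules) {
        if (!matches(rule.pattern, path)) continue;
        if (!best || rule.pattern.length > best.pattern.length) best = rule;
      }
      return best ? best.allow : true; // no matching rule means crawlable
    }

    // The rules from the robots.txt example above:
    const rules: Rule[] = [
      { allow: false, pattern: "/" },
      { allow: true, pattern: "/$" },
      { allow: true, pattern: "/test-directory/$" },
      { allow: true, pattern: "/test-directory/test-url" },
    ];

    console.log(isAllowed(rules, "/"));                        // true: home page is crawlable
    console.log(isAllowed(rules, "/test-directory/test-url")); // true: test URL is crawlable
    console.log(isAllowed(rules, "/some-other-page"));         // false: everything else is blocked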
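
And here is a scripted version of the pro-tip about “view source” vs. the rendered DOM. It assumes a recent Node (for the global fetch) plus Puppeteer, and the test URL is a placeholder. It prints the title from the raw HTML, which on a client-rendered app often still contains the template hook, next to the title from the rendered DOM, which is what you see in the Elements panel and what Google should ultimately index.

    // Compare the <title> in the raw source with the title in the rendered DOM.
    // Requires Node 18+ (global fetch) and the puppeteer package; URL is a placeholder.
    import puppeteer from "puppeteer";

    const url = "https://www.example.com/test-directory/test-url"; // hypothetical test URL

    async function compareTitles(): Promise<void> {
      // 1. The raw HTML, as "view source" (and many SEO extensions) see it.
      const rawHtml = await (await fetch(url)).text();
      const rawTitle = rawHtml.match(/<title[^>]*>([\s\S]*?)<\/title>/i)?.[1] ?? "(none)";

      // 2. The rendered DOM, after the JavaScript has run.
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto(url, { waitUntil: "networkidle0" });
      const renderedTitle = await page.title();
      await browser.close();

      console.log("Raw source title:", rawTitle);      // e.g. "{metaservice.metaTitle}"
      console.log("Rendered title:  ", renderedTitle); // the text users and Google should see
    }

    compareTitles();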
