
Google Search Console

Hey, it’s a free tool (thanks, guys!) and certainly better than nothing, but #IHaveADream

  1. THE NAME
    The name, while arguably more descriptive, cannot escape its history as “Google Webmaster Tools”. Maybe in a few years we will all stop trying to say “Google Search Console” while actually saying “Webmaster Tools”. Maybe.
  2. HTML IMPROVEMENTS
    How about a filter for Dupe Titles/Meta Descriptions that removes URLs with canonical tags, pagination tags, hreflang, etc.? Regex filters would be great here.
  3. LINKS TO YOUR SITE
    How about a lot more granularity in downloading specific links, links from specific domains, etc. (see Ahrefs, Majestic, etc.)? And a very clear step-by-step process for fixing Penguin issues from within GSC would be nice.
  4. INTERNAL LINKS TOOL
    Has anyone ever actually gotten a result for a URL that wasn’t in the navigation?
  5. SEARCH ANALYTICS
    Great tool. Suspect data. Regex filters would make it a lot better. And for us Local SEO types, how about integrating GMB Insights data? Maybe differentiating Local Pack/Maps positions vs. regular old organic?
  6. INTERNATIONAL TARGETING
    I’d like to remove this section from all of our site audits: “Ignore the Hreflang errors in GSC. We have never seen a case where it’s accurate.”
  7. GOOGLE DATA HIGHLIGHTER
    Puh-lease
  8. FETCH AS GOOGLEBOT
    Export the code Googlebot fetched so we can easily search it, or make it searchable directly within GSC.
  9. SITEMAPS
    It would be great if we could see which URLs in an XML sitemap are indexed so we could figure out why the others are not. An “Is It Indexed” tool would be great not just for URLs but for the content on those URLs.
  10. TIME SPENT DOWNLOADING A PAGE
    Explain how this data is different from Average Page Load Time in Google Analytics. I am tired of doing this for you. And how about breaking this out by page or page type? Same thing with Pages Crawled Per Day.
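Nothing in GSC does this today, but the canonical-aware duplicate filter wished for in item 2 is simple to sketch. Everything below — the crawl rows and the exclusion regex — is hypothetical sample data, not anything GSC exports:

```python
import re

# Hypothetical (url, title, canonical) rows, e.g. from a crawl export.
pages = [
    ("https://example.com/widgets?page=2", "Widgets", "https://example.com/widgets"),
    ("https://example.com/widgets",        "Widgets", "https://example.com/widgets"),
    ("https://example.com/gadgets",        "Gadgets", "https://example.com/gadgets"),
    ("https://example.com/widgets-copy",   "Widgets", "https://example.com/widgets-copy"),
]

# User-supplied regex for URLs to ignore (here: paginated URLs).
exclude = re.compile(r"[?&]page=\d+")

def real_duplicate_titles(pages, exclude):
    """Return titles shared by 2+ URLs, ignoring canonicalized/excluded URLs."""
    by_title = {}
    for url, title, canonical in pages:
        if canonical != url:      # canonicalizes elsewhere -> intentional dupe
            continue
        if exclude.search(url):   # matches the exclusion regex
            continue
        by_title.setdefault(title, []).append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

With the sample rows above, only the two self-canonicalized “Widgets” pages survive the filters and get flagged as a genuine duplicate-title pair.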

I am sure there are more, but that’s what I’ve got this morning. Feel free to add your favorites in the comments.

11 Comments

  • Jacques Bouchard  April 12, 2016 at 11:22 am

    I’d like a feature added that shows ALL pages that are indexed – sitemap or not. It’d be extremely useful in identifying orphaned content before a website launch, and I’m sure for countless other purposes as well.
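The orphan check this comment describes boils down to a set difference between what’s indexed and what an internal-link crawl can reach. A minimal sketch with hypothetical URLs:

```python
# Hypothetical: pages in the index vs. pages reachable via internal links.
indexed_urls = {
    "https://example.com/",
    "https://example.com/services",
    "https://example.com/old-promo",   # nothing links here anymore
}
crawlable_urls = {
    "https://example.com/",
    "https://example.com/services",
}

# Indexed but unreachable -> orphaned content.
orphans = indexed_urls - crawlable_urls
print(sorted(orphans))  # ['https://example.com/old-promo']
```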

  • Joy Hawkins  April 12, 2016 at 8:05 pm

    Does Search Console show data from the 3-pack at all? I have always observed the position matches organic (not the 3-pack) but someone using example.com/#123 said that URL specifically shows data in SC.

  • Andrew Shotland  April 12, 2016 at 8:10 pm

    I may have seen some Local Pack data in there once but it was a pretty flaky example.

  • Marcus  April 12, 2016 at 10:48 pm

    The whole thing is an exercise in partial and unreliable data. Useful as a starting point but the links, sitemap data, indexation data – it’s all half a job. Certainly more transparency with these elements would be helpful.

    We have found fetch and render to be really inconsistent as well. Will work for some sites, not for others, will report problems that don’t exist. Works one time, not another.

  • Andy Kuiper  April 13, 2016 at 8:43 am

    I spoke with Mr. Google this morning, Andrew… he told me these recommendations would be implemented this afternoon, or something along those lines 😉
    *if only it were this easy

  • Joe Goldstein  April 13, 2016 at 12:15 pm

    Totally agreed with all of that, and Jacques too. If Google gave us better tools to find and nuke dummy content, old content in the same file directory, and other garbage like it, we’d all have tighter crawls and they’d spend less time crawling crap. Win/win, right?

    But out of all those dreams, I just wish they’d fix Search Analytics. Your URL builder trick for tracking locals helps a lot (seriously, thanks for that), but we shouldn’t have to hack it. Also, the BS impression counts from non-geo keywords with geo intent seriously wreck the average page positions and CTRs.

  • Robert Ramirez  April 13, 2016 at 1:54 pm

    Search Analytics reporting continues to be an absolute mystery. As others have pointed out, do local rankings impact this data? Is average position determined through a weighted average? Or does an avg. position of #100 with 1 impression and an avg. position of #1 with 1,000,000 impressions = an avg. position of 50 for the keyword? I could go on and on…
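For reference, the two computations this comment contrasts land very far apart. A quick sketch using the comment’s hypothetical figures (Google has not documented which, if either, method it uses):

```python
# Average position for one keyword, computed two ways.
# Hypothetical figures: position 100 with 1 impression,
# position 1 with 1,000,000 impressions.
rows = [(100, 1), (1, 1_000_000)]  # (position, impressions)

simple = sum(pos for pos, _ in rows) / len(rows)
weighted = sum(pos * impr for pos, impr in rows) / sum(impr for _, impr in rows)

print(simple)    # 50.5 -- naive mean of the two positions
print(weighted)  # ~1.0001 -- the high-impression ranking dominates
```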

  • Jennifer L Metro  April 13, 2016 at 4:20 pm

    I laughed out loud when I read:

    #7 GOOGLE DATA HIGHLIGHTER
    Puh-lease

    I agree, and recently got plus-oned for calling it wonky.

  • Michael Field  April 18, 2016 at 4:02 am

    More than 90 days worth of data would be nice.

    Filtering by cities would be nice.

    Filtering by more than one keyword would be nice.

  • Patrick Leonard  November 4, 2017 at 2:27 pm

    Andrew is a reliably good resource for SEO information. Search Console is a reliably bad resource for accurate search data.

