Hey it’s a free tool (thanks guys!) and certainly better than nothing, but #IHaveADream…
- THE NAME
The name, while arguably more descriptive, cannot escape its history as “Google Webmaster Tools”. Maybe in a few years we will all stop trying to say “Google Search Console” while actually saying “Webmaster Tools”. Maybe.
- HTML IMPROVEMENTS
How about a filter for Dupe Titles/Meta Descriptions to remove URLs with canonical tags, pagination tags, hreflang, etc.? Regex filters would be great here.
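A filter like that could be as simple as a regex exclusion pass over the exported URL list. A minimal sketch of the idea, assuming you've exported the report to a list of URLs; the URL patterns and the assumption that parameterized URLs are the canonicalized/paginated ones are purely illustrative:

```python
import re

# Hypothetical export from the Dupe Titles/Meta Descriptions report
urls = [
    "https://example.com/widgets",
    "https://example.com/widgets?page=2",      # paginated duplicate
    "https://example.com/widgets?sort=price",  # canonicalized parameter URL
    "https://example.com/gadgets",
]

# Regex patterns for URLs we assume are already handled by canonical or
# pagination tags, and so want filtered out of the duplicates report
exclude = [
    re.compile(r"[?&]page=\d+"),
    re.compile(r"[?&]sort="),
]

filtered = [u for u in urls if not any(p.search(u) for p in exclude)]
for u in filtered:
    print(u)
```

This is exactly the kind of thing a built-in regex filter would make unnecessary.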
- LINKS TO YOUR SITE
How about a lot more granularity in downloading specific links, links from specific domains, etc. (see Ahrefs, Majestic, etc.)? And a very clear step-by-step process for fixing Penguin issues from within GSC would be nice.
- INTERNAL LINKS TOOL
Has anyone ever actually gotten a result for a URL that wasn’t in the navigation?
- SEARCH ANALYTICS
Great tool. Suspect data. Regex filters would make it a lot better. And for us Local SEO types, how about integrating GMB Insights data? Maybe differentiating Local Pack/Maps positions v. regular old organic?
- INTERNATIONAL TARGETING
I’d like to remove this section from all of our site audits: “Ignore the Hreflang errors in GSC. We have never seen a case where it’s accurate.”
- GOOGLE DATA HIGHLIGHTER
- FETCH AS GOOGLEBOT
Export the code Googlebot fetched so we can easily search it, or make it easily searchable within GSC.
It would be great if we could see which URLs in an XML sitemap are indexed, so we could figure out why the others are not. An "Is It Indexed" tool would be great not just for URLs but for the content on those URLs.
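Half of that wish is trivial on our end: pulling the URL list out of a sitemap takes a few lines. It's the index-status half that only Google can expose. A minimal sketch of the easy half, assuming a standard sitemaps.org-format file (inlined here as a string for illustration):

```python
import xml.etree.ElementTree as ET

# A tiny example sitemap in the standard sitemaps.org format
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)

# The list you would then want to check against index coverage in GSC
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(urls)
```

From there, all we can do today is spot-check; a per-URL indexed/not-indexed flag in GSC would close the loop.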
- TIME SPENT DOWNLOADING A PAGE
Explain how this data is different from Average Page Load Time in Google Analytics. I am tired of doing this for you. And how about breaking this out by page or page type? Same thing with Pages Crawled Per Day.
I am sure there are more, but that’s what I’ve got this morning. Feel free to add your favorites in the comments.
11 Response Comments
I’d like a feature added that shows ALL pages that are indexed – sitemap or not. It’d be extremely useful in identifying orphaned content before a website launch, and I’m sure for countless other purposes as well.
Does Search Console show data from the 3-pack at all? I have always observed the position matches organic (not the 3-pack) but someone using example.com/#123 said that URL specifically shows data in SC.
I may have seen some Local Pack data in there once but it was a pretty flaky example.
The whole thing is an exercise in partial and unreliable data. Useful as a starting point but the links, sitemap data, indexation data – it’s all half a job. Certainly more transparency with these elements would be helpful.
We have found fetch and render to be really inconsistent as well. Will work for some sites, not for others, will report problems that don’t exist. Works one time, not another.
I spoke with Mr. Google this morning Andrew… he told me these recommendations would be implemented this afternoon, or something along those lines 😉
*if only it were this easy
Please give my regards next time you speak
Totally agreed with all of that, and Jacques too. If Google gave us better tools to find and nuke dummy content, old content in the same file directory, and other garbage like it, we’d all have tighter crawls and they’d spend less time crawling crap. Win/win, right?
But out of all those dreams, I just wish they’d fix Search Analytics. Your url builder trick for tracking locals helps a lot (seriously, thanks for that) but we shouldn’t have to hack it. Also the BS impression counts from non-geo keywords with geo intent seriously wreck the average page positions and CTRs.
Search Analytics reporting continues to be an absolute mystery. As others have pointed out, do local rankings impact this data? Is average position determined through a weighted average? Or does an avg. position of #100 with 1 impression and an avg. position of #1 with 1,000,000 impressions = an avg. position of 50 for the keyword? I could go on and on…
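For what it's worth, the two models give wildly different answers, which is why the distinction matters. A quick sketch of both calculations using the numbers from the comment above (which of the two GSC actually uses is exactly the unanswered question):

```python
# Two (position, impressions) observations from the example above
observations = [(100, 1), (1, 1_000_000)]

# Simple (unweighted) average of the two positions
simple = sum(pos for pos, _ in observations) / len(observations)

# Impression-weighted average position
total_impressions = sum(imp for _, imp in observations)
weighted = sum(pos * imp for pos, imp in observations) / total_impressions

print(simple)               # 50.5
print(round(weighted, 4))   # 1.0001
```

So an unweighted average lands near 50, while an impression-weighted one is effectively 1. Without documentation, we can't know which curve our reports are drawing.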
I laughed out loud when I read:
#7 GOOGLE DATA HIGHLIGHTER
I agree and recently got plus oned for calling it wonky.
More than 90 days' worth of data would be nice.
Filtering by cities would be nice.
Filtering by more than one keyword would be nice.
Andrew is a reliably good resource for SEO information. Search Console is a reliably bad resource for accurate search data.