LOC@L SEO GUIDE

LOCAL SEARCH ENGINE OPTIMIZATION & ENTERPRISE SEO MADE SIMPLE

 

Google’s Top Heavy Ad Algorithm & The SEO Catch-22

August 6th, 2014

Catch 22 cover

 

The SEO Catch-22:

  1. You’ve done all of the typical technical and content SEO stuff, but your organic traffic keeps trending downward
  2. Your SEO guy suspects the culprits are the above-the-fold, in-content AdSense units designed to look like content
  3. Problem is, they’re your top-performing ad units by a factor of at least 10x
  4. Fix the ad units and maybe your traffic will turn around, but your revenue will certainly dive before it does
  5. Don’t fix the ad units and maybe your traffic will keep tanking, and your revenue will dive anyway

“…a concern for one’s own safety in the face of dangers that were real and immediate was the process of a rational mind. Orr was crazy and could be grounded. All he had to do was ask; and as soon as he did, he would no longer be crazy and would have to fly more missions. Orr would be crazy to fly more missions and sane if he didn’t, but if he was sane, he had to fly them. If he flew them, he was crazy and didn’t have to; but if he didn’t want to, he was sane and had to.”

→ 1 Comment · Tags: Google
Posted by Andrew Shotland

“There Is Really No Way To Optimize For This Algorithm Because The Results Are Random And Make No Sense…”

August 5th, 2014

My initial review of Google’s Pigeon results, Picking Through Google’s Pigeon Droppings…, was posted on SEL yesterday, but it was written last week and a lot has changed since I submitted it. I was going to write an update here, but Linda Buquet’s massive review of pre- and post-Pigeon SERPs does the job well, so let’s leave it to her. Here’s her take on what you can do for now, but read the whole post:

1) Google is in control, so there’s not a whole lot we can directly do to change things.

BUT they do everything for users! So there is something about this algo that they believe offers a better search result. (Which is why I’m calling this collateral damage. I don’t think the innocent businesses that are getting hurt are the target.)

2) I don’t think Google really cares much what a bunch of SEOs think, so our complaints may fall on deaf ears.

They do, however, care very much about what users think!

So for the examples above, and for any other SERPs you find where spam or bad results (dead listings or bogus listings) are ranking, use the "SEND FEEDBACK" link at the bottom of the SERPs.
It lets you explain the problem and select the relevant part of the screen.

Do it from the office. Then do it again from home. Get friends to report bad results too! Someone does read these reports, in order to get an aggregate view of how accurate results are.

3) If you see really spammy listings or bogus listings with parked pages and disconnected numbers, try reporting them and try to get them taken down.

Even if mods won’t deal with the problem, maybe if they see a big uptick in spam reports they’ll realize that this algo is surfacing too many bad listings.

4) If you have a client that suffers due to spam in the SERPs, explain it’s a bad update and reassure them it won’t last forever and will likely be corrected. Repeat #3 if there are bad listings knocking them out of the SERPs. You could also point them to this thread so they realize it’s not just them or not something you did. 

5) Continue to work on all the best practice stuff, just like you always have. When this algo shifts to something more logical and fair, you will benefit.

6) There is really no way to “optimize” for this algo because the results are random and make no sense, and again, it’s changing almost daily.

So turn off your ranking reports and stop looking at SERPs until this thing blows over.
Or, as Mike would say, “Take 2 beers and call me in the morning!”

Read the full post: Google Pigeon Collateral Damage & What You Can Do About It


→ 7 Comments · Tags: Google
Posted by Andrew Shotland

How To Find The Source of An Auto-Generated Google My Business Page in 30 Seconds

August 5th, 2014

A dentist called me a few minutes ago with a request to help get rid of a Google My Business page that had been automatically created by Google. The page was for his nephew, who had considered joining the practice but never did. Somehow Google got hold of his data and created the page. It would be easy enough to get Google to shut the page down via GMB support, and I could quickly check the main data aggregators, but I wanted to make sure we also nipped this problem at all possible sources of the data. But how to find them?

This is where our free Local SEO productivity tool, NAP Hunter, comes in:

STEP 1
I put the nephew’s name and the office location into NAP Hunter and hit “Hunt”:
NAP Hunter Screenshot

 

STEP 2
The app quickly generated browser tabs of Google SERPs for different combinations of the NAP elements.

Here’s one for NAME + ADDRESS:
Name + Address

The first result from ucomparehealth.com had a full listing for Alexander Jubb at the business’ address:
Alexander Jubb Uhealthcompare

I then found this listing, strangely, in the NAME – ADDRESS SERP:

Alexander Jubb Angieslist

Here’s the complete profile on Angieslist:
Alexander Jubb Angieslist

And a CitySearch profile:

Alexander Jubb CitySearch

Knowing how these sites source data quickly led me to Factual, which of course had a profile for Jubb:
Alexander Jubb Factual

And voila, mystery solved, in about 30 seconds, thanks to NAP Hunter.
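
For the curious, the core trick is simple enough to sketch. The snippet below is not NAP Hunter’s actual source, just a minimal TypeScript illustration of the idea: build Google search URLs for combinations of the NAP elements (including exclusion queries like the NAME – ADDRESS one above) and open each in its own browser tab. The name, address and phone number are made-up placeholders.

```typescript
// Minimal sketch of the idea (not NAP Hunter's actual source): build Google
// search URLs for combinations of NAP elements, including exclusion queries
// like NAME -ADDRESS, and open each one in its own browser tab.

interface Nap {
  name: string;
  address: string;
  phone: string;
}

function buildQueries({ name, address, phone }: Nap): string[] {
  return [
    `"${name}" "${address}"`,   // NAME + ADDRESS
    `"${name}" "${phone}"`,     // NAME + PHONE
    `"${address}" "${phone}"`,  // ADDRESS + PHONE
    `"${name}" -"${address}"`,  // NAME - ADDRESS (citations at other addresses)
    `"${name}" -"${phone}"`,    // NAME - PHONE (citations with other numbers)
  ];
}

function openSearchTabs(nap: Nap): void {
  for (const query of buildQueries(nap)) {
    const url = `https://www.google.com/search?q=${encodeURIComponent(query)}`;
    // Inside a Chrome extension this would be chrome.tabs.create({ url });
    // window.open is a stand-in that works from any page.
    window.open(url, "_blank");
  }
}

// Made-up placeholder values for illustration.
openSearchTabs({
  name: "Alexander Jubb",
  address: "123 Main St, Anytown, CA",
  phone: "(555) 555-1234",
});
```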

→ 12 Comments · Tags: Citation Research
Posted by Andrew Shotland

Did Pigeon 2.0 Just Take A Bite Out of Recipe SEO?

August 1st, 2014

Woke up this morning to find what appears to be yet another shuffling of Google’s Pigeon update. In this case I am seeing some food query SERPs display local packs, directories and knowledge graph results – but no recipe results.

Taco
Taco SERP
Click to Enlarge for Analysis

Hamburger
Hamburger SERPs

Granted, I am only seeing this for queries using singular nouns. Recipes still come up for “hamburgers” and “tacos”. But my hunch is that Google is now overweighting local results in its new algo. That’s good for all of us local folk, but not so great for the poor single guy who just wants to make a taco for himself, watch the game and fall asleep in his La-Z-Boy, half-full beer can in his hand…wait, I’m projecting now. End of post.

→ No Comments · Tags: Google
Posted by Andrew Shotland

New Google My Business Hours Requirement?

July 28th, 2014

The other day I was doing the local SEO thing and called Google My Business support to get a client’s duplicate listing removed. After a pretty standard conversation about getting this dupe merged and removed, we hung up. A few minutes later I got a call, and it was the Google rep I had just spoken with, letting me know that their “Places Consult Team” had requested that we add hours of operation for the business to the My Business listing.

Hours of operation

Seemed like a pretty random request, so I asked what that had to do with getting a dupe removed and was told that having hours of operation listed is a requirement for My Business listings, and that unless we add them they won’t merge the listings. Now I am pretty familiar with the local guidelines, and other than stating that you should be able to serve customers during your listed hours, or that you should be available for phone verification during your listed hours, there is nothing about requiring business hours in that document. The rep volunteered to send me some documentation explaining the requirement, but so far that hasn’t happened. There were some other choice nuggets that I am going to assume are because the local support team is undertrained and overworked, but since I got a call back over the business hours, I’m wondering if there isn’t something else going on…

→ 13 Comments · Tags: Google My Business
Posted by Dan Leibson

Schema.org/SPAM – Google Should Get Rid of All of It

July 18th, 2014

I have a client who had implemented schema incorrectly and received a notice from Google that they needed to fix it “if you would like it featured in Google search results”. Of course Google didn’t provide any specifics, and Schema.org documentation is often confusing.

Then I noticed this very helpful review of Wisconsin Health Insurance, because so many people, well, at least 2, are reviewing Wisconsin Health Insurance, which is not actually a thing…except a valuable keyword, of course:
Rich Snippet SPAM

And if you look at the URL in Google’s Rich Snippet Testing Tool, er, I mean the Structured Data Testing Tool, you see that it employs hreview-Aggregate markup:

hReview-Aggregate

And if you visit the URL, you’ll see a sentence about these ratings nicely tucked away on the right-hand side of the page, where it’s sure to be really, really helpful to users:

Schema.org/SPAM
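
To be clear about what this markup is actually claiming: an aggregate-rating block declares an item, an average rating and a review count, which Google can then render as stars in the snippet. The sketch below is a rough schema.org-style JSON-LD equivalent of that claim (the page above uses the older hReview-Aggregate microformat, and the values here are invented for illustration), and it makes the problem obvious: “Wisconsin Health Insurance” is a keyword, not a reviewable product or service.

```typescript
// Rough schema.org-style equivalent of what an aggregate-rating block declares.
// The page above uses the older hReview-Aggregate microformat; the structure is
// the same in spirit. All values here are invented for illustration.
const aggregateRatingExample = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Wisconsin Health Insurance", // the "item" being rated: a keyword, not a real product
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: 4.5, // average of the collected ratings
    reviewCount: 2,   // the "so many people" reviewing it
    bestRating: 5,
  },
};

console.log(JSON.stringify(aggregateRatingExample, null, 2));
```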

 

A few weeks ago Google did away with author images in the SERPs “to make the experience better” or some such thing. And while the message Google sent to my client shows it is clearly trying to clean up the schema SPAM, I have seen way too much of this crap in the SERPs.

I vote for getting rid of all of it.

→ 5 Comments · Tags: Uncategorized
Posted by Andrew Shotland

The Night Google Took Pity On A Client…

July 8th, 2014

Yesterday a client pushed out a new homepage without telling us. I found out because late last night I got this email in my inbox from an actual living, breathing member of Google’s Search Quality team:

Google Notification 2

 

Say what you will about Google (and clients who double-noindex their homepage), but this is pretty awesome.

→ 12 Comments · Tags: Google
Posted by Andrew Shotland

NAP Hunter! Lite now in the Chrome store

June 26th, 2014

Hey local SEOers! NAP Hunter! Lite, our Chrome extension for citation research and audits, is now available in the Chrome store. This should streamline installations, so you can get to automating. For more info on how to run the extension check out the “How to Run NAP Hunter! Lite” section at the end of this page.

→ 2 Comments · Tags: Citation Research
Posted by Dan Leibson

Yext Duplicate Listings Suppression Launches (aka Powerlistings Über)

June 23rd, 2014

Yext launched a duplicate business listing suppression service today that they are calling Powerlistings Über. That’s fancy talk for saying you can now kill duplicate business listings on publishers in the Yext Powerlistings Network with extreme prejudice. If you have tried to squash duplicate citations and found it to be an endless and expensive game of whack-a-mole, you may want to check out Yext.

Here’s how it works:

  • Yext automatically identifies duplicate listings on a particular local search site in Yext’s dashboard. In the case below, the “Suppress Duplicates” option would appear next to a listing that has been flagged as having dupes.

Publisher Level Suppression

  • Clicking “Suppress Duplicate” brings up a screen that shows the listings for a specific NAP on the publisher site. Select the listings you want to torch and click “Suppress this Listing.”
    Unlimited Suppression
  • The request then gets reviewed by both Yext and the publisher to make sure the user hasn’t made a serious mistake (e.g. deleting all their listings). Once it passes the review, it’s toast.
    Suppression Status

Some key points about the service:

  1. You must be a Yext Powerlistings subscriber to get access to the duplicate suppression service.
  2. Yext is pricing Powerlistings Über at a 25% mark-up to the retail rate of Powerlistings (a similar mark-up applies to resellers).
  3. The price is for an annual subscription and covers unlimited dupe suppressions. So even if a dupe reappears, you don’t have to pay to kill it again.
  4. When a dupe gets suppressed, the publishers have the option to 301 redirect the dupe to the canonical listing on their site and can potentially merge reviews and other data from the dupe. Not all publishers will do this, but for those who do, this could have some SEO benefits (see the sketch after this list).
  5. You can un-suppress a suppressed listing at any time.
  6. If you cancel your Yext subscription, the dupe suppression is unlocked and it is likely that over time the dupes will reappear.
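
To make #4 a bit more concrete, here is a minimal sketch of what a publisher-side 301 from a suppressed duplicate listing URL to its canonical listing might look like. It assumes a Node/Express app and a simple dupe-to-canonical lookup table; the paths are invented, and no Yext publisher necessarily implements it this way.

```typescript
// Minimal sketch of a publisher-side 301 from a suppressed duplicate listing
// to its canonical listing. Assumes Node/Express; paths are invented examples.
import express from "express";

const app = express();

// Hypothetical map of suppressed duplicate listing paths to canonical ones.
const dupeToCanonical: Record<string, string> = {
  "/biz/acme-dental-2": "/biz/acme-dental",
  "/biz/acme-dental-downtown": "/biz/acme-dental",
};

app.get("/biz/:slug", (req, res) => {
  const canonical = dupeToCanonical[req.path];
  if (canonical) {
    // Permanent redirect, so links, bookmarks and (ideally) reviews
    // consolidate on the canonical listing.
    res.redirect(301, canonical);
    return;
  }
  res.send(`Listing page for ${req.params.slug}`);
});

app.listen(3000);
```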

Last month Yext published a white paper I wrote on how to deal with duplicates, in which I suggested that, given the problems with how local search services and the big data aggregators work together, the best solution for duplicate business listings is to lock the dupe at the publisher level. This is how Yext’s service works, and in my view it is the only way you can reliably squash dupes and keep them squashed. And the fact that Yext can do this automatically for potentially hundreds of thousands of listings at once makes it an incredible solution for both large and small multi-location businesses. If you consider how much it costs to manually clean up duplicate listings, this service starts to look pretty cost-effective. Check it out and let me know what you think.

Full disclosure: I do some consulting work for Yext and have been on their asses for a year to build this :) Oh, and check out their Whack a Dupe game too.

→ 12 Comments · Tags: Yext.com
Posted by Andrew Shotland

Introducing N.A.P. Hunter Lite!

June 10th, 2014

Hey local SEO geniuses! We just released a free Google Chrome extension to help you speed through citation research. It’s called N.A.P. Hunter Lite. It basically runs a set of standard N.A.P. queries through Google so you can quickly find various citations, their URLs and their Google position.

N.A.P. Hunter Lite

Check it out here and let me know what you think.

Big kudos to Dan Leibson for spearheading this effort! It’s a great little tool.

 

→ 11 Comments · Tags: Citation Research
Posted by Andrew Shotland