Shameless traffic driver. I know. Sorry.
July 9th, 2013
July 2nd, 2013
“Nostalgia means the pain from an old wound…” – Don Draper
When Google’s new Local Carousel came to a town near you a couple of weeks ago, I immediately envisioned the Don Drapers of local search in conference rooms around the country dramatically presenting their initial impressions. Of course at the end, instead of Duck saying “Good luck at your next meeting,” he said “Good luck at your next job.”
My initial impression was that the imagery and prominence of the Local Carousel was just so…juicy…that it would immediately attract clicks away from the SERPs where the local directories vied for your attention (can anyone come up with a catchy rubric for that section?).
So now that we are a little more than two weeks into the PC era (“Post-Carousel”), I checked to see how some of the sites I work on that target searches that bring up the Carousel were doing. Based on about 10 million queries worth of data, it looks like the Carousel has virtually no effect on local directory traffic. As an example, here’s the Google referral data for June for a site that primarily targets local restaurant queries:
In retrospect, this is not too surprising. A click to the Carousel merely returns a brand search result that typically contains plenty of local directories. I think the technical term is that these SERPs are “lousy” with local directories:
So have no fear, you Dons of the local search world. You can put that bottle of Scotch away…for now.
Some excellent observations from the Local SEO peanut gallery:
Local Carousel Reporting From Around the Web – Mike Blumenthal
Google’s Local Carousel – Trapped in Google’s World – Mike Blumenthal
54 Keywords Triggering Google’s Local Carousel – Adam Dorfman
10 Random People’s Reactions To Google Local Carousel – Mike Ramsey
New Local Carousel – Aaron Wall
June 21st, 2013
UPDATE: This just in from Chris Andrews, one of the top contributors to the Google News Help Forum
“Google News is working on a system-wide fix for the ‘not in the Google News database‘ error-message error.
Nothing is being done on an individual site basis, so publishers do not need to post their sites in the Google News Help Forum. It is already a known issue that is being worked on.
Also, this error is not having an effect on the crawling of news-sitemaps that are actually in the database. The crawling and indexing is continuing as normal.
This issue is expected to be resolved by Tuesday next week (6/25/2013). Publishers that continue to see this error after that point may ask for assistance or seek updated information in the forum or by using the Google News Publisher report an issue form.”
Over the past few days I have been contacted by several Google News publishers who have received the following message in their Google Webmaster Tools News Sitemaps Report and have noticed a problem with Google News not indexing their site:
“Your Sitemap is on a site that is not in the Google News database. Google News can only accept Sitemaps from sites that we crawl. If your site is crawled by Google News, please check that the URL of your Sitemap agrees with the URLs of your articles as they appear on Google News, including any leading “www”. If you would like to request inclusion of your site in Google News, please contact the Google News support team.”
This message has typically coincided with the publisher’s site either not showing up in Google News’ index or with much slower indexing of newly published articles. Not fun.
The good news is that this appears to be a glitch on Google News’ part. If you have received this notice and Google News is not indexing your Google News XML Sitemap, I recommend you head over to the Google News Publishers Help Forum and post there about your issue. Thus far the Google News team has responded pretty quickly to these issues and fixed the problem. It’s not clear why they haven’t done this yet for all sites, but the sooner you alert them to your site’s issue, the sooner it may get taken care of.
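If you want to rule out a problem on your end while you wait, a minimal Google News sitemap looks something like this – the domain, publication name and article details below are placeholders, and note that the `<loc>` URLs must match your article URLs as they appear in Google News exactly, including any leading “www”:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <!-- Must match the article URL Google News has indexed, www and all -->
    <loc>http://www.example.com/news/article-slug.html</loc>
    <news:news>
      <news:publication>
        <!-- Publication name as it appears in Google News -->
        <news:name>Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2013-06-21</news:publication_date>
      <news:title>Article Headline Goes Here</news:title>
    </news:news>
  </url>
</urlset>
```

If your sitemap already looks like this and the domain matches, the error message is almost certainly Google’s glitch, not yours.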
Note to Christian Bale’s agent: No more musicals
June 18th, 2013
Aaron Bradley dropping some Local knowledge, along with a lot of other knowledge, in his fantastic post about the Semantic Web:
“…search is no longer about words, but about the things to which the words on a web page describe and make reference…”
“Why is this important? It’s important because when Google receives a user query it’s increasingly not trying to provide a match for the query keywords, but (informed, whenever it’s possible for them to do, by the context of the query) to understand the meaning underlying the query, and then return information about the entities it has identified.”
“The process of navigating to web pages, and moving back and forth between search results and the web pages they reference, is trivial on desktop computers but a royal pain on a hand-held mobile device.
This situation provides a compelling incentive for the search engines to circumvent additional web page visits altogether, and instead present answers to queries – especially straightforward informational queries – directly in the search results.
While many in the search marketing field have suggested that the search engines have increasingly introduced direct answers in the search results to rob publishers of clicks, there’s more than a trivial case to be made that this is in the best interest of mobile users. Is it really a good thing to compel an iPhone user to browse to a web page – which may or may not be optimized for mobile – and wait for it to load in order to learn the height of the Eiffel Tower?
This also sheds considerable light on the usability impetus behind Google+ Local. A well-formed Google+ Local Page enables Google to display things like business hours and an interactive map in a mobile-friendly fashion in response to a query like “Jones Aquarium Supplies” (which is, of course, an entity).”
June 18th, 2013
I tend to get a lot of calls from start-ups trying to go the small business SEO agency reseller route to get their services in the hands of SMBs. Often they are looking for feedback on their service, introductions to potential resellers, or an SEO strategy to get in front of potential customers. I have had this conversation so often in the past month that I figured I would save everyone some time and give you my initial feedback on your business model here.
Before calling a potential reseller, ask yourself the following questions:
- How does your service make the agency more money than their current system? (I can basically stop right here, but you called me for help, so I guess I owe you a bit more than that. #Etiquette)
- How does your service make the agency’s job more effective/efficient, etc.? Can you prove it?
- Why is this worth the agency’s time v. the fifty other things they could be doing? See #’s 1 & 2.
- How long can you hold your breath? Give yourself a long time before you see meaningful traction. Big agencies, yellow pages companies, etc. are notoriously slow to make decisions on adding new SKUs to their menus. They already have a hard time getting salespeople educated on the stuff they already sell, and new services – particularly those with low or no financial incentives for the salespeople and the agency – tend not to get mentioned when talking to the client. And once they decide to do a deal, they might roll out a small test and then take six months to really get behind it. Smaller agencies, who might be faster to take on your service, will sell a few packages and then probably forget about it because they are too busy and it wasn’t enough revenue, or it was too much effort to sell. Remember, small search marketing agencies are basically SMBs, and trying to do business with them is no different than trying to do business with a kitchen counter guy.
For an SEO agency reseller model to succeed, I think you need to hit one of the following points, in no particular order:
- Your service is a better substitute for something the agency is already doing. For example, if you can produce crappy infographics cheaper/faster/better than the agency’s current solution, perhaps they should give you a shot. Showing data that proves results from actual case studies always helps.
- Your service does something the agency doesn’t, but needs to do. Emphasis on the “needs”. For example, if Google announced last week that it’s going to demote all non-smartphone-optimized sites and you have a smartphone-optimized site builder, then perhaps an agency that doesn’t have a smartphone SEO solution should give you a shot.
- Your service does not require the agency to ask their clients for more money. I am not saying an upsell model can’t work, but it’s definitely going to have a harder time getting traction unless it’s the kind of thing that clients are already asking for, like a smartphone-optimized website. Most small business SEO clients are not spending a lot per month and if your amazing new thing is going to cost 25% of the client’s billings, the agency is not likely to want to eat the costs, and the client that already feels like they are spending way too much money on this stuff (even at $100/month!) isn’t going to want to spend more – unless perhaps it’s for a smartphone-optimized website or something.
There are plenty of great businesses that rely on agency resellers – SEO tools, data management, smart-phone optimized websites, etc. - so I don’t want to be a wet blanket. But when you enter the Matrix of small business SEO agencies, I always recommend you take the red pill.
June 7th, 2013
WASHINGTON – Following revelations that the U.S. government’s PRISM program has been secretly collecting information on foreigners overseas from the nation’s largest Internet companies like Google, Facebook and, most recently, Apple, in search of national security threats, President Obama admitted that Google’s policy of hiding keywords in Google Analytics was the deciding factor in moving ahead with the controversial program.
“Not being able to see our keywords created a grave threat to our national security,” President Obama said today in a hastily-called press conference to address the growing scandal.
About two years ago, Google started to protect the privacy of users logged into its system by not passing their keyword data along to publishers’ websites – unless the publishers also advertised with Google. The search giant claimed the move would affect less than ten percent of searches.
“When Google started hiding keywords in 2011, Matt Cutts said it would affect only about 10% of searches,” said Obama. “But over the past two years, we are now unable to see over 60% of our referring keywords. And thanks to the Sequester, our AdWords budget has been cut, so we had to do something.”
Frequent critic of the White House Sen. Lindsey Graham (R-S.C.) is defending Google’s actions, saying he’s more concerned about the President’s strategy, or lack thereof.
“Everybody knows the Panda update was about quality,” said Graham, “If the President would stop focusing on “keywords” and “rankings” and start creating compelling content that users care about, this wouldn’t be an issue.”
When asked for comment, the President responded “I know that Americans are concerned about the balance between safety and liberty. With that in mind, I have appointed a blue ribbon commission to work on a content marketing strategy that combines guest posting on relevant sites, natural-looking anchor text and a weekly infographic.”
UPDATED 6.10.13 9:35am PDT - Edward Snowden, a former SEO consultant for the Obama Administration, has identified himself as the source of leaked information about the National Security Agency’s PRISM surveillance program. Snowden claims his actions were forced by a chief executive “obsessed with ranking reports.”
“I kept telling him that trying to outrank Wikipedia for “best president ever” searches was a waste of tax-payer money,” said Snowden, “Obama may play the long game, but what about the long tail? I don’t want to live in a society that doesn’t understand the basics of search engine optimization.”
June 3rd, 2013
Since Penguin 2.0 hit a couple of weeks ago, a number of clients have received a large volume of requests to remove links from their sites that go something like this:
“I recently received notice from Google that my website has been assessed a penalty after they “detected unnatural links” pointing to my website http://www.spamdawber.com/. Can you please remove the links to my site from the following URLs:
If the link doesn’t get removed, we are going to have to file a “Disavow Link” report with Google. If we do this, it may affect your site’s Google rankings…Thanks!”
I got one of these last week for someone who had somehow gotten through my hi-tech security system and comment-spammed this site a few years ago.
Of course I am always happy to help a screwed website in need, but in most of these cases, it appears that the spammers were also building spammy links to the client sites in an attempt to drive PageRank to the /spammy-profile URL to either get that URL ranked or flow it back to the spammer site linked from the profile.
And of course, Google doesn’t know that the spammer built the link to your site, it just knows your site has a spammy link.
So Spammers, when you are requesting link removal, how about also providing a list of all of the spammy links you have built to the site in question so we can clean this crap up and protect our own rankings?
“Nothing is less important than which fork you use. Etiquette is the science of living. It embraces everything. It is ethics. It is honor.” – Emily Post
May 22nd, 2013
Yesterday I talked about my initial thoughts on Google’s new Local Business Data Highlighter. Tyler Bell of Factual was quick to point out an important point:
If you want to have the best chance of ranking for local queries in Google, you should definitely use the tools they give you – provided they don’t break something (jury’s still out on that). But Tyler’s point is spot-on. This tool has two purposes:
- Make it easier for Google to figure out what your website is about
- Make it harder for anyone else to
So while using a tool like the Data Highlighter is a quick, easy way to give Google what it wants, that doesn’t mean you shouldn’t add Schema.org or hCard markup to your site. That markup makes it easier for other services such as Factual, Bing, etc. to figure out what your website is about. When it comes to metadata, you should give away the milk for free, ’cause you never know where the next cow buyer is going to come from.
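For example, a NAP block marked up with Schema.org microdata might look like the sketch below. The business details are made up (I’ve borrowed the hypothetical “Jones Aquarium Supplies”), but the itemprop names are standard LocalBusiness properties that Google, Bing, Factual and anyone else who parses microdata can all read:

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Jones Aquarium Supplies</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(555) 555-1234</span>
</div>
```

Unlike Data Highlighter tags, which live only inside Google’s walls, this markup ships in your HTML, so every crawler gets it.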
May 21st, 2013
Google’s new Data Highlighter for Local Businesses is definitely worth checking out for anyone trying to goose their local rankings. I have found it to be a bit buggy and it’s a new tool, so proceed with caution. Here are some basics on how to use it along with some tricks I have figured out thus far.
- Start Highlighting
Go to the Optimization section of GWT and click on “Data Highlighter”.
Don’t watch the video, just proceed to the blue “Start Highlighting” button.
- Add Your URL & Select the “Local Business” Information Type
- Select “Tag Just This Page” or “Tag This Page and Others Like It”
If you have similar data types (e.g. Name, Address & Phone Number) across multiple pages, you may want to try the “Tag this page and others like it” option. This will allow you to tag URLs in batches selected by Google.
- Tag The First Page
The tool will show a framed “bot-view” version of the page you have selected as your first page. The top shows where you are in each step of the process, and the right sidebar shows the data you have tagged. When you highlight data on the page, you get a drop-down that allows you to select what type of data it is. Current Local Business data types include name, address, phone number, opening hours, category, department, image, URL, average rating and review:
- How To Add Missing Tags
If you can’t find the right words on the page to highlight, such as a particular category keyword, click the gear icon in the top right of the UI and you’ll see an option for adding a missing tag, as well as for clearing all tags you have set. (Hat tip to Darren Shaw for spotting this one – I found the category tagging to be quite buggy, btw.)
- Create a Page Set
If you have only selected one page, you can hit publish and you are good to go. But if you are tagging multiple pages, you are then offered the option to create your own “Page Set” or to use the set of pages the tool chooses for you. As you can see from the screenshot, the tool uses wildcards to select entire directories to apply the tags to:
If you select “Create your own page set” you can add custom URLs to tag. Once you have selected the URLs, the tool then gives you the chance to verify the tags for a sample of URLs before you publish. As you’ll see in a sec, it is important that you check these tags thoroughly. Once you’ve checked them, hit publish and wait for the local SEO magic to begin. Now for a few insights:
- Use the Bot View of the Tool for a Quick Keyword Targeting Review
The example above is from a BMW dealer site I just started working on. When I looked for words on the page to tag, I quickly realized there were few, if any, keywords that they should want to rank for, such as “new cars”, “used cars”, etc. If you can’t find the right text to tag, there’s a good chance you need to improve the keyword targeting of the page’s content.
- The Tag Verification Process May Show You Where Google Is Having Problems With Your Site
With the BMW dealer, I let the tool select multiple URLs for tagging the dealer’s NAP. During verification, several of the tags were flagged as possible errors. For example, on one URL the zip code showed up tagged as the phone number. When I looked at the NAP text in the code, I noticed that the text was missing a lot of spaces. It had a format like <Biz NameStreet AddressCity, STZipPhone>. This is likely making it hard for Googlebot to get the NAP right, which is one of the reasons why Google created this tool and, even better, one of the reasons why this site may have problems with local rankings. So tagging the NAP correctly could be a quick win, as would just fixing the text so it’s properly spaced.

If this tool really is giving a bot view of the site, in some ways this is a more powerful tool than the Fetch as Googlebot feature. Now, not only can we see how the page looks to Googlebot, we are also told what some of the problems are.
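As a sketch (the dealer’s actual details are hypothetical here, as are the class names), the fix is as simple as giving each NAP element its own node with real whitespace between the fields, instead of letting them run together:

```html
<!-- Before: the fields run together, so a bot sees "Springfield BMW123 Auto Row..." -->
<div>Springfield BMW123 Auto RowSpringfield, IL 62701(555) 555-0199</div>

<!-- After: each NAP element is separated, so name, address and phone parse cleanly -->
<div class="nap">
  <span class="name">Springfield BMW</span><br>
  <span class="street">123 Auto Row</span><br>
  <span class="locality">Springfield, IL 62701</span><br>
  <span class="phone">(555) 555-0199</span>
</div>
```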
- Danger – This Is a New Tool – Proceed With Caution
For now I am only testing the tool on sites that have nowhere to go but up. Google has a habit of rolling out new features, particularly with Local, that sound great but have unintended consequences for the early adopters. For example, I have no idea what will happen if your category tags are different from those in the Google+ Local Places profile for a site.

I’ll update this as I figure more stuff out. If you have any insights, please add them in the comments or tweet me.
May 10th, 2013
It links to this article from 2007.
For all of you marketers recommending that your clients use Google Alerts to monitor what people are saying, be sure to include that it works better if you use it in conjunction with a time machine, or better yet…