Ok, here is a poorly formatted liveblogging of this session:
Maile Ohye of Google sez:
Talking about ecommerce issues where we have a site with 158 products but because of filtering there are 380,000 URLs, so Google doesn’t know what to crawl.
- Maintain a consistent URL structure
- Directories and filenames are case sensitive. http://apple.com/itunes/ & http://apple.com/ITUNES/ are considered to be different URLs
- Keeping consistent reduces duplication, facilitates more accurate indexing and simplifies your robots.txt configuration
- 301s & rel=canonical are crawled much less frequently than 200s
- 404/410 URLs are crawled less frequently
- 500 errors are treated as a transient error. Pages not removed from index. We will retry in the near future.
- Use standard encodings & key=value pairs (e.g. /product.php?item=nexus-one&category=mobile) rather than non-standard schemes
- Crawlers interpret standard keys & values.
- Use the URL parameter tool with Yahoo & Google Webmaster Tools. Tells bots which parameters are relevant and which they can ignore
- Indexing priorities: Googlebot looks for what users will find relevant: URLs with updated content, URLs with unique/important content (as determined by linking signals); Sitemap info and bandwidth considerations
- How to increase Googlebot visits: Strengthen indexing signals via uniqueness & freshness, and via how well the page is linked from your site and other pages on the Web.
- Use proper response codes
- Serve content reliably
- Prevent crawling of unnecessary pages
- Optimize performance: Shopzilla improved conversions by 7-12% just by increasing site speed
- Improve long-tail content: unique & fresh content, get links to these URLs
- Reduce duplicate content: Choose canonical URLs and be consistent. Include the canonical URL in internal links and sitemap. Use 301 & rel=canonical.
- Include microformats & RDFa: Enhances results with rich snippets – ability to include reviews, recipes, people & events.
- Use Video Sitemaps
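Several of the bullets above (consistent URL structure, case sensitivity, standard key=value parameters, picking one canonical URL) boil down to one habit: collapse every variant of a URL into a single consistent form, and use that form everywhere. A minimal Python sketch of that idea; the function name and normalization rules are my assumptions, not Google's algorithm:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url):
    """Collapse case variants and parameter-order variants of a URL into
    one consistent form (illustrative rules only). Use this one form in
    internal links, sitemaps, and rel=canonical, and 301 everything else to it."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()  # hosts are case-insensitive
    path = path.lower()      # paths are NOT, so pick one casing site-wide
    # Standard key=value pairs, sorted so parameter order never forks a "new" URL.
    params = sorted(parse_qsl(query, keep_blank_values=True))
    return urlunsplit((scheme, netloc, path, urlencode(params), ""))

# Both case variants from the talk collapse to one URL:
print(canonicalize("http://apple.com/ITUNES/"))  # http://apple.com/itunes/
print(canonicalize("http://apple.com/itunes/"))  # http://apple.com/itunes/
```

Keeping one normalization routine like this shared between your link builder, sitemap generator, and redirect layer is what makes the "be consistent" advice actually enforceable.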
Adam Audette of Audette Media
It’s all about user experience. Users come first and then the SEO.
4 Big Issues with SEO & IA Right Now
- Categorization, Search & Browse: Amazon provides key categories on the homepage but as you click into categories you get relevant sub categories and links to important product URLs in the category.
- Make Use of Link Relationships
- Know Your Internal Link Profile
- Content is more important than ever
Great user experience but bad for bots
- Rewrite facets to pretty URLs based on priority
- Place faceted experience in a folder for more control
- Append “overhead” attributes (e.g. extra parameters) to the pretty URLs; rel=canonical back to the pretty URL
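The three steps above can be sketched as a small routine: priority facets become path segments under a dedicated folder, overhead facets stay as query parameters, and the parameterized URL declares the pretty URL as its canonical. The folder name /shop/ and the facet split are my assumptions for illustration:

```python
from urllib.parse import urlencode

# Assumed facet priorities (not from the talk):
PRETTY_FACETS = ["category", "brand"]    # rewritten into the pretty path
OVERHEAD_FACETS = ["sort", "page_size"]  # appended as params, canonicalized away

def faceted_url(facets):
    """Return (url_served, canonical_url) for a dict of facet selections.
    Priority facets go into a pretty path under /shop/ (a folder gives you
    control via robots.txt); overhead facets are appended as parameters,
    and the page should rel=canonical back to the pretty path."""
    path = "/shop/" + "/".join(facets[k] for k in PRETTY_FACETS if k in facets)
    overhead = {k: facets[k] for k in OVERHEAD_FACETS if k in facets}
    full = path + ("?" + urlencode(overhead) if overhead else "")
    return full, path

full, canonical = faceted_url({"category": "mobile", "brand": "nexus", "sort": "price"})
# full → /shop/mobile/nexus?sort=price ; canonical → /shop/mobile/nexus
```

The served page at the parameterized URL would carry `<link rel="canonical" href="/shop/mobile/nexus">`, so sort orders and page sizes never dilute the indexable facet pages.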
Image Search Signals:
Content signals (color, facial recognition, etc.)
Attribute signals (ALT text)
Textual signals (captions)
Quality signals (pixels, etc.)
Images inside of flash/js are hard for the bots to access.
Provide the dimensions of each image in the markup whenever possible
Use JPEG for photos (strip meta when appropriate)
Use PNG for graphics
Use GIFs for small and animated images
If you register an image with Creative Commons you can add more data to it.
Provide as much info as possible along with your image:
Putting images in keyword relevant directories will help (e.g. images/lady-gaga/)
According to Maile, the major signal for site speed is client-side rendering
14 Responses
Use video sitemaps, as in sitemaps of your video catalog?
Yes, use video XML sitemaps
Are there differences with local search when using a mobile browser? if not, should there be?
There are differences when using a mobile browser: the search engines seem to assume that more searches have local intent, so more local results show up.
Agreed, nice recap.
Creating an XML video sitemap is different from a ‘regular’ XML sitemap. It contains different parameters. Basically you want to tell Google (ahem, and other search engines) where your video file is (e.g. .flv), what page it is embedded on (e.g. .html, .php…), where they can find the thumbnail image (e.g. .jpg), and some other metadata for the video, such as the video title.
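A video sitemap like the one described above can be generated with Python's standard library. This is a sketch: the URLs and titles are placeholders, and the tag names follow Google's video sitemap schema (video:content_loc, video:thumbnail_loc, video:title, video:description):

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
VID = "http://www.google.com/schemas/sitemap-video/1.1"

def video_sitemap(entries):
    """Build video sitemap XML: for each video, the page it is embedded on
    (<loc>), the media file, the thumbnail, and a title/description."""
    ET.register_namespace("", SM)
    ET.register_namespace("video", VID)
    urlset = ET.Element("{%s}urlset" % SM)
    for e in entries:
        url = ET.SubElement(urlset, "{%s}url" % SM)
        ET.SubElement(url, "{%s}loc" % SM).text = e["page"]  # embed page (.html, .php…)
        vid = ET.SubElement(url, "{%s}video" % VID)
        ET.SubElement(vid, "{%s}content_loc" % VID).text = e["file"]     # e.g. .flv
        ET.SubElement(vid, "{%s}thumbnail_loc" % VID).text = e["thumb"]  # e.g. .jpg
        ET.SubElement(vid, "{%s}title" % VID).text = e["title"]
        ET.SubElement(vid, "{%s}description" % VID).text = e["desc"]
    return ET.tostring(urlset, encoding="unicode")

xml = video_sitemap([{
    "page": "http://example.com/watch.html",   # placeholder URLs
    "file": "http://example.com/video.flv",
    "thumb": "http://example.com/thumb.jpg",
    "title": "Example video",
    "desc": "A placeholder description.",
}])
```

Wrapping this in a loop over your video catalog database is the whole "generator tool", which is why the hand-rolled version is less painful than it first looks.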
They are a pain in the ass to make. Does anyone know of tools that will create them for you, like all the tools that can make your XML sitemap file?
I dug around for a Video Sitemap Generator Tool and found http://www.microsystools.com/products/sitemap-generator/tutorials/video-xml-sitemaps.php
I tried its demo, and it may be useful for your needs.
Thanks Everett and Hiren for the info and insight.
Andrew thanks for bouncing Ahole. Didn’t know initially what to think about that.
Also – is there some science to this typing comments in 2 pt font that I am unaware of?
If not, is there any way you could make typing in the comments box AARP-ready? Either I am way older than everybody else here and really do need to get glasses, or this 2pt font size is designed to discourage commenting.
Thanks for the recap! Poorly formatted or not, it was great info. As a copywriter, I’m always looking to stay on top of ways to use more SEO techniques in my writing, since more and more local people are coming my way for SEO copy help.
Nice recap, we are going to start using video sitemaps also!
Andrew, I’m catching up on all the goodness of your blogging I missed… but I can’t figure this out:
Great user experience but bad for bots
1. Rewrite facets to pretty URLs based on priority
2. Place faceted experience in a folder for more control
3. Append “overhead” attributes (e.g. extra parameters) to the pretty URLs; rel=canonical back
What’s faceted nav? Why is it bad? (Js?) What do 2 and 3 mean?
Yo Gab, this post might be informative:
I am sure site speed will improve if you take care of the many contributing factors, mainly images.
Andrew, thanks a lot for the review. Definitely a very useful list for those who run SEO for larger websites.
Thanks for the info. I hadn’t heard of 304 Not Modified signals before.