Core Web Vitals are everywhere in the SEO news these days. We’ve known for years that slower website loading results in conversion loss, and that it impacts ranking, but now we have three metrics to focus on: Cumulative Layout Shift, Largest Contentful Paint, and First Input Delay.
For businesses with physical locations, location pages tend to be some of the most important on the website. As such, we want location pages to have a fast and stable loading experience, which we can now measure with Core Web Vitals (CWV) metrics.
But for businesses with hundreds or thousands of locations, location page creation and maintenance is frequently done by a third party. Companies like Uberall, Rio SEO, Yext, and BirdEye all provide location pages for multi-location brands.
So how do these third party companies stack up when it comes to Core Web Vitals? I checked!
But Why Though?
Often when working with multi-location brands I get questions about which location provider offers the best services. Now that Core Web Vitals are a ranking factor, it’s important to keep performance scores in mind when choosing a location page provider. Of course, CWVs are only one of many ranking factors, so this shouldn’t be the only SEO consideration when choosing a provider. The intention of my research is to help brands understand what they should be looking for from a performance standpoint when choosing how to manage their location pages.
I looked at over 100 multi-location business websites and noted the location provider whenever the pages were managed by a third party. I then took up to 100 location pages (unique URLs) from each site and ran them through Lighthouse, which simulates a slow 4G mobile connection. (By "I," I mean our fantastic TechOps team created a script that lets me run URLs through Lighthouse. This team is seriously the best.) I then analyzed each provider's average CLS, LCP, and TBT (using Lighthouse means I have to use Total Blocking Time instead of FID). I ran the whole exercise twice to ensure relative consistency in results, since Lighthouse returns lab data (a one-off simulated run rather than aggregated real-user field data).
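To give a sense of what that kind of automation looks like, here's a minimal Python sketch of batching URLs through the Lighthouse CLI and averaging the three metrics. This is not LSG's actual script; the audit IDs are the real ones from Lighthouse's JSON report, but the function names and structure are simplified assumptions.

```python
import json
import statistics
import subprocess

# Lighthouse JSON report audit IDs for the three metrics discussed here.
AUDITS = {
    "cls": "cumulative-layout-shift",   # unitless score
    "lcp": "largest-contentful-paint",  # milliseconds
    "tbt": "total-blocking-time",       # milliseconds
}

def extract_metrics(report: dict) -> dict:
    """Pull the CLS/LCP/TBT numeric values out of a parsed Lighthouse JSON report."""
    audits = report["audits"]
    return {name: audits[audit_id]["numericValue"] for name, audit_id in AUDITS.items()}

def run_lighthouse(url: str) -> dict:
    """Run the Lighthouse CLI (installed via npm) against one URL and parse its output."""
    result = subprocess.run(
        ["lighthouse", url, "--output=json", "--quiet",
         "--only-categories=performance", "--chrome-flags=--headless"],
        capture_output=True, text=True, check=True,
    )
    return extract_metrics(json.loads(result.stdout))

def average_metrics(per_page: list[dict]) -> dict:
    """Average each metric across a provider's sampled location pages."""
    return {name: statistics.mean(page[name] for page in per_page) for name in AUDITS}
```

Running `run_lighthouse` over up to 100 location URLs per site and feeding the results into `average_metrics` is the general shape of the analysis below.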
A few caveats before we dig in:
- It’s harder than you might expect to figure out which company is providing location pages for a particular website. Brands like Yext and ChatMeter state it clearly in the code (which you can find by inspecting the page), but other companies don’t. I did my best to find several domains for each provider and to match them correctly, but there’s a chance I’m wrong on a few of these. Call me out on it and I’ll update the data!
- Most of these examples are for brands in the United States. If your website is in another country and managed by these location providers, I’d recommend asking for examples of domains in your country managed by these brands to perform manual checks yourself.
- For some of these providers it’s not clear who is responsible for hosting the location pages. If the hosting differs, page speed likely differs too. And some providers offer highly customized pages, which may impact CWV.
The location page providers that I encountered most frequently and used in this analysis are:
- Rio SEO
- Yext
- Uberall
- ChatMeter
- SOCi
- BirdEye
- Brandify
Cumulative Layout Shift
CLS scores measure visual stability of the page. The size of an element and the amount that it shifts impact the score. CLS is the only score not based on speed. Good CLS scores are .1 and under. The “needs improvement” range is between .1 and .25, and anything over .25 is considered bad.
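For intuition, each individual layout shift is scored as the product of two fractions: how much of the viewport the unstable element affected (impact fraction) and how far it moved relative to the viewport (distance fraction); CLS sums those shifts over the page load. A hypothetical sketch of the arithmetic, with the thresholds above:

```python
def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """One layout shift's contribution: impact fraction times distance fraction."""
    return impact_fraction * distance_fraction

def classify_cls(cls: float) -> str:
    """Bucket a summed CLS value using the .1 / .25 thresholds."""
    if cls <= 0.1:
        return "good"
    if cls <= 0.25:
        return "needs improvement"
    return "bad"

# Hypothetical example: a late-loading banner affects half the viewport
# and pushes content down by 20% of the viewport height.
shift = layout_shift_score(0.5, 0.2)  # 0.1 - alone, right at the "good" cutoff
```

The numbers here are made up for illustration; the point is that big elements making big moves get punished multiplicatively.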
Overall, the location page providers all have acceptable cumulative layout shift scores. Yext and SOCi have the worst, but those are still in the “needs improvement” range. In fact, only three of the 21 domains I analyzed have CLS scores in the “bad” range (over .25).
For SOCi, I only had two domain examples – anytimefitness.com and nekterjuicebar.com. Anytime Fitness has terrible CLS scores due to an above-the-fold image lazy-loading into place, but Nekter Juice Bar has minimal shifting (.02). I’m hesitant to make any final calls on SOCi with only two data points, but I’m happy to run these tests again if I encounter additional location pages created by SOCi.
Yext has poor CLS scores because of how much content shifts on its pages. The Loft and Hollywood Feed received the worst scores (Hollywood Feed also uses a mobile interstitial on the page). From my analysis, it appears that websites with more on-page content receive worse CLS scores because there are bigger blocks of content to shift around the page. That’s right: if your location pages use Yext and you include a lot of SEO-friendly content, you’re actually hurting your CLS score. Oops!
But the poor CLS blame doesn’t all belong to the location page providers. It’s entirely possible for brands to hurt CLS on their own with images, pop-ups, and map placements. From what I’ve seen, above-the-fold maps and top-of-page pop-ups are the worst offenders, so it’s worth checking your performance scores even if you use a provider with great CLS by default.
Overall, these third party location page providers are doing okay when it comes to cumulative layout shift. But Yext is super disappointing. Causing worse CLS scores if you add additional page content is unacceptable for SEOs. A fast and empty page is not the way to rank. BirdEye and SOCi may need to make some CLS changes too.
Largest Contentful Paint
Wow, these largest contentful paint scores are bad. So bad. Really bad. In fact, I went back and ran everything again and did some manual checks because I just didn’t believe how bad the scores are! Turns out, they’re right… but they’re also so so wrong.
To be considered ‘good’, Largest Contentful Paint scores should be at or under 2.5 seconds. The ‘needs improvement’ range is between 2.5 and 4 seconds. Anything over 4 seconds is considered bad. No location page provider had an average load time below 5.2 seconds.
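Those thresholds are easy to encode. Here's a small helper (hypothetical function name, using the published 2.5-second and 4-second cutoffs) showing how the averages in this section get bucketed:

```python
def classify_lcp(lcp_seconds: float) -> str:
    """Bucket an LCP time (in seconds) using Google's published thresholds."""
    if lcp_seconds <= 2.5:
        return "good"
    if lcp_seconds <= 4.0:
        return "needs improvement"
    return "bad"

# The best provider *average* in this dataset still lands outside "good":
classify_lcp(5.2)  # "bad" under the 4-second cutoff
classify_lcp(3.1)  # "needs improvement"
```

Which is the whole problem: when 5.2 seconds is the best average on offer, every provider's average is in the "bad" bucket.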
There are no good options when it comes to choosing a location page provider based on LCP. The very best LCP score, at 3.1 seconds, belonged to stores.petco.com (managed by Rio SEO), placing it firmly in the “needs improvement” range. The worst LCP score was 22.3 seconds, for sherwin-williams.com, which is managed by ChatMeter.
Rio SEO, ChatMeter, and Yext each had one managed domain that fell into the “needs improvement” category. That doesn’t tell us much, but it does indicate that acceptable scores are possible with these providers.
So, we’ve established that every one of the location page providers could use a lot of work on LCP. But are there any providers you should absolutely avoid?
Based on these scores, if you’re concerned about LCP, I’d recommend avoiding ChatMeter and Uberall for location pages in the United States. One Uberall domain actually had the fourth-best LCP, but it was McDonald’s Germany; Uberall’s US- and Japan-based domains had the third and fourth longest LCP times (averaging 17.8 and 16.2 seconds). One ChatMeter domain fell into the “needs improvement” category, but two of its other three domains had the slowest LCP times of all (averaging 18.8 and 22.3 seconds).
Wow, these LCP scores are abysmal. How can every single location page provider have an average LCP score in the bad range? You have to hope these providers (and brands!) are working on solutions now, because the Page Experience Update is rolling out and these providers are falling behind.
Total Blocking Time
With lab metrics, we use Total Blocking Time as a stand-in for First Input Delay, which is only available as a field metric. Good TBT scores are under 2 seconds, “needs improvement” scores are between 2 and 6 seconds, and anything over 6 seconds is considered bad.
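Under the hood, TBT sums the "blocking" portion of every long main-thread task between First Contentful Paint and Time to Interactive: any task over 50 ms contributes everything beyond that 50 ms budget. A minimal sketch of that calculation (the task durations in the example are hypothetical):

```python
LONG_TASK_BUDGET_MS = 50  # the portion of a task beyond this blocks input handling

def total_blocking_time_ms(task_durations_ms: list[float]) -> float:
    """Sum the over-budget portion of each main-thread task between FCP and TTI."""
    return sum(
        duration - LONG_TASK_BUDGET_MS
        for duration in task_durations_ms
        if duration > LONG_TASK_BUDGET_MS
    )

# Three tasks: 30 ms (no blocking), 250 ms (200 ms blocking), 90 ms (40 ms blocking)
total_blocking_time_ms([30, 250, 90])  # 240 ms
```

So a page can have plenty of JavaScript work and still score well on TBT, as long as that work is broken into tasks short enough to leave the main thread responsive.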
Every location page provider other than Uberall had an average Total Blocking Time in the good range, and only Uberall and Yext had any individual domains outside of it.
Three of the four Yext domains were under .6 seconds, and the two best TBT scores belonged to domains managed by Yext. However, Yext also had the second slowest TBT score, at 3.6 seconds, so using Yext doesn’t necessarily mean you’ll have passing TBT/FID scores. One strange thing I noticed was that the pages for Mexico locations of The Loft had higher (worse) TBT than their US counterparts. All were still in the good range, but US locations had a TBT of around .06 seconds while Mexico locations were over .2 seconds and as high as 1.2 seconds. If you’re using Yext for a multi-country business, check your non-US location pages first, as they may have higher total blocking times.
Uberall is a bit complicated. The German and Japanese sites scored in the good range (.4 and .5 seconds), but ulta.com (based in the US) scored the very worst: nearly 7 seconds the first time and over 23 seconds the second. Because of these TBT scores, I’m hesitant to recommend Uberall to a US-based business.
With LCP scores as bad as they were, I expected some nasty TBT scores too. But ChatMeter, Rio SEO, SOCi, and BirdEye all had fast total blocking times. Brandify and Yext seemed okay too. The only provider with inconsistent and really terrible TBT scores is Uberall.
So Who’s The Best (and Worst) Location Page Provider for CWVs?
For this analysis, I used a pretty small sample of domains. I’d love to complete this analysis again with a much larger dataset and restrict it to domains and location pages for United States based businesses, since the country did have an impact on Core Web Vital metrics. But there were still insightful takeaways from the research for both brands and providers.
Let’s chat about which providers are best and worst prepared for the Page Experience Update (Spoilers: It’s none of them!).
Which location page provider has the best overall Core Web Vital scores?
- The providers with the lowest (best) CLS scores were: Uberall, Brandify, ChatMeter, and Rio SEO
- The providers with the lowest LCP scores were: Rio SEO and ChatMeter, though ChatMeter also had two of the highest LCP scores
- The providers with the lowest TBT scores were: ChatMeter, Rio SEO, SOCi, and BirdEye
Which location page provider has the worst overall Core Web Vitals scores?
- The providers with the highest (worst) CLS scores were: Yext and SOCi
- The providers with the highest LCP scores were: Uberall and SOCi
- The provider with the highest TBT scores was: Uberall
Rio SEO seems to be best prepared for the Page Experience Update, but that doesn’t mean it’s actually prepared: CLS scores were low and TBT was fast, but LCP was still terrible. stores.guess.com has an LCP of over 14 seconds – that’s 10 seconds past the cutoff for “bad” LCP. The brands that own the domains (in this case, Guess) may share some blame for the LCP problems, but even Rio SEO’s best-performing site for LCP was in the “needs improvement” range.
ChatMeter domains performed really well for CLS and TBT too, but two of the domains I checked had absurdly high LCP times – 18 and 22 seconds. ChatMeter managed to have the two worst LCP scores out of 21 websites with poor LCP scores.
None of these providers are standout “winners,” but I know who I’ll be actively avoiding. SOCi and Uberall scored very poorly for two of three CWV metrics and that is unacceptable.
The Page Experience Update is already rolling out and they’ve had over a year to prepare – why are the location provider scores still so bad?
So Now What?
So what options do multi-location brands have if every location page provider has CWV issues? Well, we did see a few examples of brands who are successfully using a location page provider and are in the good range for CLS and TBT and in the “needs improvement” range for LCP. So it is possible for brands to use a location page provider and have not-terrible CWV scores.
But I’m aiming for higher than not-terrible for my clients. This research highlights why, when brands can afford it, we recommend owned technology. Being dependent on third party providers to navigate SEO updates for important page types is risky. You need to be able to trust them to address changing technology and, in the case of Core Web Vitals, it’s hard to place much trust in any provider.
Have thoughts, feelings, or want to send me some domains you know are managed by these providers? Feel free to reach out to me on Twitter.
Core Web Vital Scores by Domain
*Additional Methodology Information
If you’re interested in the way LSG runs Lighthouse reports, I’d encourage you to check out the GitHub documentation. Here’s some additional information from Sam Capeheart, who built our tool:
“The Lighthouse Reporter runs on an AWS EC2 instance (t2.medium) with “low to moderate” network performance. That translates to anywhere from 50-300+ Mbps. However, Lighthouse applies network throttling by default to simulate a specific tier of lower-level networking capabilities. See this official doc for more info.
A certain level of variability also occurs across different device types. By default, Lighthouse applies a 4x CPU slowdown to simulate a mid-tier mobile device when run on a high-end desktop computer. However, given the different hardware capabilities of the Lighthouse reporter’s machine, we apply a 2x slowdown instead. More info on CPU throttling values here.”