How important is Lighthouse page speed measurement?
-
Hi,
Many experts cite the Lighthouse performance score as an important factor for search ranking. It's confusing because several top sites have Lighthouse scores of 30-40, yet they rank well. Also, some sites that load quickly still get a low Lighthouse score (when I test on mobile and desktop, they load much faster than Lighthouse suggests).
When we look at other image-rich sites (such as Airbnb, John Deere, etc.), the Lighthouse score can be 30-40.
Our site https://www.equipmentradar.com/ loads quickly on desktop and mobile, but the Lighthouse score is similar to Airbnb's and so forth. We have many photos similar to the photo below, probably 30-40, many of which load asynchronously (roughly as sketched below).
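A simplified sketch of the deferred loading pattern I mean; the selectors and attribute names here are illustrative, not our exact code:

```typescript
// Simplified sketch of deferred ("async") image loading via IntersectionObserver.
// The real image URL sits in a data-src attribute until the image nears the viewport.
const lazyImages = document.querySelectorAll<HTMLImageElement>('img[data-src]');

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src!;   // swap in the real source
    img.removeAttribute('data-src');
    obs.unobserve(img);           // stop watching once it has loaded
  }
}, { rootMargin: '200px' });      // start loading a bit before the image scrolls into view

lazyImages.forEach((img) => observer.observe(img));
```

Modern browsers also support the native loading="lazy" attribute on img tags, which gets a similar effect without any script.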
Should we spend more time optimizing Lighthouse or is it ok? Are large images fine to load async?
Thank you,
Dave
-
It is important to distinguish between PageSpeed Insights and Lighthouse. For your website, it is probably more useful to follow PageSpeed Insights. The differences between the two are explained in an accessible way in this article: https://rush-analytics.com/blog/google-pagespeed-insights-vs-lighthouse-how-do-they-differ.
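To see the two side by side for a given URL, you can query the public PageSpeed Insights API. This is a minimal sketch; the field names come from the v5 API but should be verified against the actual JSON response, and the target URL is just the one from the question:

```typescript
// Minimal sketch: fetch lab (Lighthouse) and field (CrUX) data from the
// PageSpeed Insights v5 API for one URL. Verify the response shape for your own pages.
const target = 'https://www.equipmentradar.com/';
const api = `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(target)}&strategy=mobile`;

async function main(): Promise<void> {
  const res = await fetch(api);
  const data = await res.json();

  // Lab data: the familiar Lighthouse performance score (0-1 in the API, 0-100 in the UI).
  const labScore = data.lighthouseResult?.categories?.performance?.score;
  console.log('Lighthouse performance (lab):', labScore != null ? labScore * 100 : 'n/a');

  // Field data: real-user metrics from the Chrome UX Report, when available for the URL.
  const field = data.loadingExperience?.metrics;
  console.log('LCP (field, ms):', field?.LARGEST_CONTENTFUL_PAINT_MS?.percentile ?? 'n/a');
  console.log('CLS (field):', field?.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile ?? 'n/a');
}

main().catch(console.error);
```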
-
My understanding is that "Page Experience" signals (including the new "Core Web Vitals") will be combined with existing signals like mobile-friendliness and HTTPS security in May 2021, according to Google's announcements.
https://developers.google.com/search/blog/2020/05/evaluating-page-experience
https://developers.google.com/search/blog/2020/11/timing-for-page-experience
So, these will be search signals, but there are lots of other very important search signals which can outweigh them. Even if a page on John Deere doesn't pass the Core Web Vitals criteria, it is still likely to rank highly for "garden tractors".
If you are looking at Lighthouse, I would point out a few things:
- The Lighthouse audits on your own local machine are going to differ from those run on hosted servers like PageSpeed Insights, and those will differ from the "field data" in the Chrome UX Report.
- In the end, it's the "field data" that will be used for the Page Experience validation, according to Google. But, lab-based tools are very helpful to get immediate feedback, rather than waiting 28 days or more for field data.
- If your concern is solely about the impact on search rankings, then it makes sense to pay attention specifically to the three metrics being considered as part of Core Web Vitals: LCP, FID, and CLS (a minimal field-measurement sketch follows this list).
- But also realize that while you are improving scores for criteria that will be used as search signals, you're also likely improving the user experience. Taking CLS as an example: users are certainly frustrated when they attempt to click a button and end up clicking something else because of a layout shift, and frustrated users generally mean lower conversion rates. So, by focusing on improvements in measures like these (I realize your question about large images doesn't necessarily pertain specifically to CLS), you are optimizing both for search ranking and for conversions.
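To make the lab vs. field distinction concrete, here is a minimal sketch of field measurement using Google's web-vitals library. The '/analytics' endpoint is a placeholder, and the import names vary by library version (newer releases expose onCLS/onLCP instead of getCLS/getLCP):

```typescript
// Minimal sketch: report Core Web Vitals (CLS, FID, LCP) from real user visits.
// Assumes the 'web-vitals' npm package; check the exports of the version you install.
import { getCLS, getFID, getLCP, Metric } from 'web-vitals';

// '/analytics' is a made-up collection endpoint, not a real Moz or Google URL.
function sendToAnalytics(metric: Metric): void {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon survives page unloads better than fetch for this use case.
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/analytics', body);
  } else {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```

Collecting these yourself gives you day-to-day feedback on the same metrics that the Chrome UX Report will eventually reflect.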
Related Questions
-
How does Google measure page position in Webmasters?
Does anyone know exactly how Google measures page position in Webmaster Tools? For example: in Google Webmaster Tools, we had a product which on 22/12/15 was at position 7 and then dropped to position 112 on 30/12/15. It then rose back up to position 7 on 06/01/16 and then fell to position 25 on 16/01/16. What does this mean, and why?
Reporting & Analytics | CostumeD0
-
I want to take down some pages; how do I inform Google?
Hey guys, I'm hoping someone can help. I'm in the midst of a site redesign. One of our biggest reasons for the redesign was to create more space for valuable, unique content, and I have been reading other posts on Moz about content auditing. I have come across a few articles on my own blog that are 250-300 words, seem similar to each other, and get low traffic. My plan is to consolidate these articles and create a fresh, more in-depth article for each topic. When I consolidate or delete these pages, do I need to inform Google that they have been deleted? If so, what is the best way to do this in WordPress? Cheers, and I would appreciate some advice. Thanks
Reporting & Analytics | edward-may0
-
Does Analytics track an order two times on a refresh of the confirmation page?
Hi there,
I have a quick question: does Google Analytics track an order two times if the user buys a product, sees the confirmation page, and then refreshes or clicks back and forward again?
The order/tracking data would be the same, but I guess the tracking code runs on every refresh and therefore tracks the order two times in Analytics, or does Analytics know that it is the same order? Can someone clarify this? Thanks!
Regards,
Kasper
Reporting & Analytics | Webdannmark
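One common way to guard against the duplicate hit on a refresh is to key the purchase event to the order ID. A minimal sketch, assuming gtag.js and placeholder order details (not any specific site's setup):

```typescript
// Sketch: only send the purchase event once per order, even if the
// confirmation page is refreshed or revisited via back/forward.
// 'ORDER-12345' and the value/currency are placeholders for illustration.
declare function gtag(...args: unknown[]): void; // provided by the gtag.js snippet

function trackPurchaseOnce(orderId: string, value: number): void {
  const key = `purchase_sent_${orderId}`;
  if (sessionStorage.getItem(key)) {
    return; // this order was already reported in this browser session
  }
  gtag('event', 'purchase', {
    transaction_id: orderId,
    value,
    currency: 'USD',
  });
  sessionStorage.setItem(key, '1');
}

trackPurchaseOnce('ORDER-12345', 99.0);
```
-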
It appears there's a problem with our connection to your Google Analytics account. Please go to your Settings page to update your connection.
I keep getting this error though I have confirmed I have the correct information. Any recommendations?
Reporting & Analytics | x3oadmin0
-
2 days in the past week Google has crawled 10x the average pages crawled per day. What does this mean?
For the past 3 months my site www.dlawlesshardware.com has had an average of about 400 pages crawled per day by Google. We have just over 6,000 indexed pages. However, twice in the last week, Google crawled an enormous percentage of my site. After averaging 400 pages crawled per day for the last 3 months, the crawl stats for the last 4 days say the following:
2/1 - 4,373 pages crawled
2/2 - 367 pages crawled
2/3 - 4,777 pages crawled
2/4 - 437 pages crawled
What is the deal with these enormous spikes in pages crawled per day? Of course, there are also corresponding spikes in kilobytes downloaded per day. Essentially, Google averages crawling about 6% of my site a day, but twice in the last week it crawled just under 80% of my site. Has this happened to anyone else? Any ideas? I have literally no idea what this means, and I haven't found anyone else with the same problem; only people complaining about massive DROPS in pages crawled per day. Here is a screenshot from Webmaster Tools: http://imgur.com/kpnQ8EP. The drop in time spent downloading a page corresponded exactly to an improvement in our CSS, so that probably doesn't need to be considered, although I'm open to any theories about anything.
Reporting & Analytics | dellcos0
-
Google Analytics Organic search queries aren't being updated, even though I'm still seeing results in all our typical results pages.
We pushed some new changes to the site, and Google Analytics is no longer updating the organic search query listing, even though traffic is consistent and we're still landing results in all our typical keyword searches. Any ideas?
Reporting & Analytics | unclekaos0
-
What does "on first page" mean in seomoz ranking reports?
Hi - When reports here show numbers of keywords appearing "on first page", there must be some implicit assumption made about the number of results listed per page.
1. Can anyone tell me what that assumption is? Is it 10? 20?
2. What about universal results Local links? If the answer to number one is, for instance, 20 results per page, then are there any assumptions made about the number of universal results Local links included?
I'm just trying to understand what the reports mean. Thanks, Tim
Reporting & Analytics | tcolling0
-
GA custom reports involving pages and goals - what are the metrics saying?
Hi, all! I would like to create a custom report that will enable me to see which of my pages are contributing to goal completions on my site (so I can then optimize the pages that contribute the most, with maximal ROI for the optimization investment).
If I make the dimension "page/page title" and the metric "goal X completions", which would make sense, what exactly are the numbers telling me? Is it how many times a person started the goal funnel from that page (meaning every goal would appear only once and there would be no overlap)? That doesn't appear to be the case, because the headline in the main "Goals" section tells me I have, say, 30 completions for that goal, while the headline in the custom report (which adds up all the numbers) says, say, 100. Or does it mean the number of times that page was anywhere in the navigation path of someone who ended up completing a goal? In that case the same goal would be counted multiple times, once for each page in the path.
Additionally, I see a strange thing on some of my reports where the actual funnel pages appear as contributing towards goals, which I guess makes sense, but again the numbers don't match up. If the goal was to get to page B, the funnel was A->B, and there were supposedly 30 goal completions, my custom report says that A gave 28 goal completions and B gave 25.
Does anyone know for sure, or through testing, what is going on with these numbers? Any explanations will be much appreciated!
Reporting & Analytics | debi_zyx0