Google’s User Experience: Most Websites Fail To Meet Requirements


by Laurie Sullivan, Staff Writer @lauriesullivan, April 20, 2021

Research by Searchmetrics released Tuesday suggests that 96% of sites tested in U.S. desktop searches — and more than 90% of those in mobile searches — fail to meet Google’s three Core Web Vitals thresholds for good website performance and usability.


Sites that fail to meet these minimum requirements risk having their rankings negatively impacted starting in June.


One of the sites that doesn’t meet the requirements is YouTube, which ranks high despite showing poor Core Web Vitals scores for loading speed (Largest Contentful Paint, or LCP) and responsiveness (First Input Delay, or FID).


“If other websites performed as badly, they would be judged to be offering a low-quality user experience,” according to the study. “But YouTube is ranking high despite this, most likely because of the platform’s overwhelming popularity, which helps it deliver positive user signals.”


Most sites, per Searchmetrics, could not afford to score so low on individual user-experience metrics because they do not have the luxury of YouTube’s extreme brand recognition.


It’s all about experience, according to Google. In fact, 2021 is the year of experience for Google. Google’s “Core Web Vitals” update is intended to improve the way it evaluates the overall user experience of a website. The three Core Web Vitals measure loading time, interactivity, and visual stability.
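As a rough sketch of how those three metrics map to pass/fail checks, the hypothetical function below (the function and field names are illustrative, not an official Google API) rates a page’s measured values against the “good” cutoffs Google publishes: LCP within 2.5 seconds, FID within 100 milliseconds, and CLS below 0.1.

```javascript
// Illustrative only: rate measured Core Web Vitals against Google's
// published "good" thresholds. Names here are made up for this sketch.
function classifyWebVitals({ lcpSeconds, fidMs, cls }) {
  return {
    loading: lcpSeconds <= 2.5 ? "good" : "needs work",      // LCP
    interactivity: fidMs <= 100 ? "good" : "needs work",     // FID
    visualStability: cls < 0.1 ? "good" : "needs work",      // CLS
  };
}

// Using the study's average top-20 figures (LCP 3 s, CLS 0.38),
// a typical ranking page fails two of the three checks:
classifyWebVitals({ lcpSeconds: 3.0, fidMs: 80, cls: 0.38 });
```

A page must pass all three thresholds to meet Google’s bar for a good experience, which is why the study counts 96% of sites as failing even when some clear one or two of the metrics.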


The Searchmetrics study analyzed more than 2 million web pages that appear in the top 20 Google results in the U.S., UK, and Germany; YouTube stood out as an outlier that doesn’t meet the same standards.


Google will include these signals in its search algorithm, aiming to deliver a ranking boost to web pages that provide a good experience, according to Searchmetrics.


The reasons highlighted in the study for poor user experience include the rise of unnecessary code on webpages built using templates in website builders such as WordPress and Wix, as well as additional code in web plugins, all of which slows pages down and creates optimization challenges.


The study also cites dynamic content as another issue: ads and newsletter opt-in boxes can cause page layouts to shift if they are not implemented properly.


Here’s a snapshot from Searchmetrics of how websites perform ahead of the Core Web Vitals update. Key findings, based on the U.S. results of the research, include:



  • Sites that ranked No. 1 to No. 3 deliver a good user experience for loading important page content quickly. Largest Contentful Paint measures the time it takes for the largest image or block of text to become visible when a user clicks through to a page. For a good user experience, Google suggests this should happen within the first 2.5 seconds. But of the top 20 ranking websites in search results, only the first 3 positions are below this threshold. The average time for pages listed in the top 20 positions is 3 seconds (21.3% slower than Google’s benchmark). 


  • Most sites can’t pass the test for controlling shifting elements on their pages. Cumulative Layout Shift tracks how much the elements on a page jump around or shift, creating a negative user experience. Google specifies a score below 0.1 as “good,” a score below 0.25 as “needs improvement,” and everything else as “poor.” Only the position-zero results (featured snippets), which Google places above the traditional organic results to provide quick answers to factual queries, achieve a “good” score. Position one is close, but all other search results fall into the “poor” bracket (above 0.25). The average score of the top 20 results is 0.38 (275.6% worse than Google’s required “good” rating).


  • The majority of sites fall short of Google’s benchmark for good responsiveness. First Input Delay measures the time it takes for a page to respond to a visitor interaction, such as someone clicking on a button or a link. The research suggests the top 5 ranking results have an average total blocking time of 554 milliseconds, 84.6% slower than Google’s “good” benchmark of 300 milliseconds. The average for the top 20 search results is 136.7% slower than the benchmark.


  • Wikipedia does well under the new Core Web Vitals update. It meets or surpasses Google’s Core Web Vitals performance thresholds across almost all metrics, meaning that it could serve as a good example to the web community. The encyclopedia site’s lightweight approach to web design, using mainly text and optimized images, means it tends to have a fast Largest Contentful Paint on many web pages. It also has good responsiveness, with a Total Blocking Time of 0 on a lot of pages, which it achieves by avoiding long loading tasks created by plugins, excessive JavaScript or large video files. And because it uses a relatively fixed layout for the vast majority of its pages and limits dynamic content with no ads, it does not suffer from content or layouts jumping around.
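The blocking-time figures in the study can be made concrete with a small sketch. Total Blocking Time is commonly computed by taking every main-thread task longer than 50 milliseconds and summing the portion of each that exceeds 50 ms. The function below is a simplified illustration of that calculation (the task durations are invented for the example, and this is not the Searchmetrics measurement tool).

```javascript
// Simplified Total Blocking Time: each main-thread task over 50 ms
// contributes its excess beyond 50 ms to the total.
function totalBlockingTime(taskDurationsMs) {
  return taskDurationsMs
    .filter((duration) => duration > 50)
    .reduce((sum, duration) => sum + (duration - 50), 0);
}

// Two long tasks of 120 ms and 400 ms contribute 70 + 350 = 420 ms;
// the 30 ms task is under the 50 ms cutoff and adds nothing.
totalBlockingTime([30, 120, 400]); // 420
```

This is why Wikipedia’s avoidance of heavy plugins and excessive JavaScript can drive its blocking time to 0: with no single task exceeding 50 ms, nothing accumulates.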
MediaPost.com: Search & Performance Marketing Daily
