In the dynamic world of the web, where every detail can make a difference to ranking and user experience, Google has made a significant qualitative leap in its evaluation methodology. Since Google introduced PageSpeed Insights – the tool that for years has been the reference for developers and SEOs evaluating site performance – its evolution has led to the advent of Core Web Vitals. This new suite of metrics not only measures a website's efficiency and speed, but goes further by focusing on the actual user experience.
Core Web Vitals: The Transition from Vanity Metrics to Crucial Tools
The days when boasting a high score on PageSpeed Insights was all you needed are over. Now, Google requires a deeper understanding of a site's actual performance. Core Web Vitals have become the emblem of this evolution, marking a clear distinction between what is purely cosmetic and what is fundamental to the user experience.
- Core Web Vitals LAB: This is a set of tests conducted in the laboratory by Google. These tests, while rigorous and detailed, are simulations of a site's performance. They don't necessarily reflect the end-user experience, but they are valuable tools for developers, working like a compass that indicates which direction to take during site design and optimization. However, it is crucial to understand that, while indicative, they do not represent how a site is actually perceived by users.
- Core Web Vitals CRUX (Chrome User Experience Report): Here we enter the heart of the user experience. These metrics are based on real data collected by Chromium-based browsers, which include giants like Google Chrome as well as Microsoft Edge, Opera, Brave and many more. Every time a user opens a web page in these browsers, a series of detailed measurements about page loading, interactivity and visual stability is sent to Google. By analyzing this data, Google extracts an average of the performance over the last 28 days and establishes whether or not the site meets the Core Web Vitals thresholds, for both desktop and mobile versions.
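This 28-day field data is publicly queryable through the Chrome UX Report (CrUX) API. The sketch below builds such a query without sending it; the endpoint and metric names follow the public API as we understand it, and `CRUX_API_KEY` is a placeholder for your own Google API key.

```python
# Hedged sketch: building a CrUX API queryRecord request for an origin.
# Endpoint and metric names are taken from the public CrUX API docs;
# CRUX_API_KEY is a placeholder, not a real credential.
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(origin: str, form_factor: str = "PHONE") -> urllib.request.Request:
    """Build (but do not send) a CrUX queryRecord request for an origin."""
    body = {
        "origin": origin,            # e.g. "https://example.com"
        "formFactor": form_factor,   # PHONE, DESKTOP or TABLET
        "metrics": [
            "largest_contentful_paint",
            "cumulative_layout_shift",
            "interaction_to_next_paint",
        ],
    }
    url = f"{CRUX_ENDPOINT}?key=CRUX_API_KEY"  # substitute your API key
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen` returns a JSON record containing, for each metric, the 75th-percentile value and a histogram of good/needs-improvement/poor densities over the trailing 28 days.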
While LABS tests offer a "theoretical" view of a site's performance, CRUX data provides a "practical" representation, based on real experiences. The latter have become of vital importance in determining the visibility of a site in the Google SERP. In other words, a site may score excellent on LABS tests, but if it falls short of CRUX metrics, its position in search results could suffer severely.
We covered this in depth in this post: Core Web Vitals and CRUX data.
Put simply, a PageSpeed score of 100 does not guarantee passing the Core Web Vitals, nor does a score of 60 rule it out. Take this client of ours: with an average PageSpeed score of just 50 on mobile devices, it still passes the Core Web Vitals assessment brilliantly, ranking as the 375th site in Italy with at least 9.2 million unique visitors per month and over 15 million page views (source: Similarweb.com with connected Analytics).
The evolution in the pursuit of optimal web performance
In the current digital context, where the speed and efficiency of a website can make the difference between a customer won and a customer lost, attention to performance optimization has become crucial. The renewed importance attributed to the Core Web Vitals has boosted the optimization solutions industry, bringing to light a number of tools designed to help websites reach their peak performance.
Kleecks CDN and iSmartFrame: The Seeming Magic in Performance Optimization
Among the many options available to developers and website owners, Kleecks CDN and iSmartFrame stand out as two recognized leaders in performance-oriented solutions.
- The philosophy behind CDNs: CDNs, or Content Delivery Networks, represent a network of servers distributed in various geographical points, with the aim of serving content to visitors more quickly and efficiently. The main objective of these networks is to minimize the distance between the visitor and the source of web content, ensuring a reduced loading time and a smooth user experience.
- Kleecks CDN and iSmartFrame at work: Both of these solutions, while each having its own specific characteristics, exploit the potential of CDNs and operate as a reverse proxy. In this function, they act as intermediaries between the end user and the site's original server.
The magic happens when they take charge of the source code of a website and optimize it, performing advanced technical operations such as:
- Minification: Strips whitespace, comments and redundant characters from JS and CSS, reducing file size and making loading faster.
- Image conversion: Converts heavy image formats to lighter, faster ones such as WebP, without compromising quality.
- Cache and latency reduction: Thanks to caching mechanisms, frequently requested content is stored and served more quickly, minimizing user waiting times.
- Much more.
- A gift to developers: The beauty of these solutions lies in their PaaS (Platform as a Service) nature. Instead of manually handling optimization complexities, developers can rely on these platforms to do the heavy lifting, allowing them to focus on other project challenges without digging into the application code to troubleshoot performance issues.
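To give a concrete feel for the minification step listed above, here is a deliberately naive CSS minifier. Real optimizers use full parsers; this regex sketch merely illustrates the idea of stripping comments and collapsing whitespace.

```python
# Naive CSS minification sketch: strips comments, collapses whitespace,
# and tightens punctuation. Illustrative only; production minifiers
# parse the stylesheet properly instead of using regexes.
import re

def minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace runs
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # tighten around punctuation
    css = css.replace(";}", "}")                     # drop trailing semicolons
    return css.strip()
```

For example, `minify_css("body {\n  color: red;  /* note */\n}")` yields `"body{color:red}"`, a fraction of the original bytes on the wire.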
In-depth analysis of the discrepancy between LABS and CRUX
In our ongoing effort to understand the dynamics of website performance, we've come across a particular dilemma regarding the use of services like Kleecks, iSmartFrame, and other similar optimization tools. While these services promise optimal performance, the reality may be slightly different.
It is rather amusing (to put it mildly) that companies positioning themselves as enterprise-grade leaders in Core Web Vitals and web performance optimization have corporate sites, their public face to the world, that fail the Core Web Vitals assessment, as we can see in the following images:
In both cases they exhibit very serious TTFB problems, exceeding one second where Google recommends a TTFB below 200 ms.
In the course of our investigation, we decided to go beyond theory and guesswork and examine in detail some clients using these emerging technologies. The goal was to better understand how these technology stacks handle requests and serve content, particularly in response to the specific user agent of Google's PageSpeed Insights.
For example, a particularly worrying figure was the very high Time to First Byte (TTFB), or latency, a key indicator of server responsiveness that Google recommends keeping below 200 milliseconds.
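TTFB can be measured with nothing but the standard library. The sketch below times the interval between sending a GET request and receiving the response status line; the bundled local server, which adds an artificial 50 ms delay, is just a stand-in so the example is self-contained, and you would point `measure_ttfb` at a real host in practice.

```python
# Self-contained TTFB measurement sketch using only the standard library.
# The throwaway local server simulates 50 ms of server think time.
import http.client
import http.server
import threading
import time

def measure_ttfb(host: str, port: int, path: str = "/") -> float:
    """Seconds between sending the request and receiving the status line."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # returns once the status line and headers arrive
    ttfb = time.perf_counter() - start
    resp.read()
    conn.close()
    return ttfb

class _SlowHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.05)  # simulate 50 ms of server-side work
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):  # silence request logging
        pass

def run_demo() -> float:
    """Start a local server on an ephemeral port and measure its TTFB."""
    server = http.server.HTTPServer(("127.0.0.1", 0), _SlowHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
    server.shutdown()
    return ttfb
```

Against the delayed local server, the measured TTFB comes out at a little over 50 ms; against the sites discussed above it would exceed a full second.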
In short, it seems absurd to see a TTFB of more than a second on the sites of companies that offer to optimize web performance; it makes them look less than credible from the outset.
To make our findings more accessible and understandable, we've condensed them into an analysis video. We invite all interested parties to view it for a detailed overview and in-depth understanding of what we discovered.
Misalignment between synthetic metrics and real data
We have observed that many websites, after integrating these solutions, show exceptional scores when analyzed with the Google PageSpeed Insights LABS tests. However, these scores do not seem consistent with the results provided by Core Web Vitals (CRUX), which represent site performance for real users.
In this regard, we took as examples some sites, shown in the video above, which illustrate both the discrepancy and the methodology we used to verify the modus operandi of these "miraculous CDNs".
The apparent disconnect between these two metrics raises some concerns:
Synthetic LABS Tests: Reliability and Limits in the Real World
Synthetic tests, such as those offered by LABS, are a type of analysis that simulates user behavior on a website in a controlled environment. While they are extremely useful for identifying performance issues in development or optimization, they have some inherent limitations that could make their results less representative of actual user experiences.
How do synthetic tests work?
Such tests are performed in the laboratory, or in virtual environments, where variables such as bandwidth, latency and device resources are standardized or simulated. This allows developers to obtain performance metrics under "ideal" conditions, eliminating the fluctuations that occur under real-world browsing conditions.
Limitations of Synthetic Tests
- Standardized environments: Because these tests are performed under controlled conditions, they may not account for different combinations of hardware, software, and connectivity that end users may have. A site might work well on a high-end device with a fast connection, but perform poorly on an older device or with a slow connection.
- External interference: Real users might have many tabs open, applications running in the background, or even security software that could affect the performance of a website. These factors are not typically simulated in synthetic tests.
- Caching and User Interactions: While synthetic tests may simulate some of these factors, they may not fully capture real user behavior, such as scrolling a page, clicking on various items, or how browsers cache a site across subsequent visits.
- Deceptive Strategies: Techniques such as cloaking (discussed below) could allow a site to "cheat" synthetic tests by presenting an optimized version when it detects a test in progress, resulting in artificially high performance metrics.
Cloaking: A Deceptive Strategy For Manipulating Google Tests?
The term "cloaking" refers to a search engine optimization (SEO) practice that has raised a lot of controversy over the years. This tactic is based on presenting different versions of a web page to search engines and real users. The main goal behind this maneuver is to manipulate and improve a site's ranking in the search engine results pages (SERPs), by showing engines content that could be seen as more relevant or optimized.
How does cloaking work?
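The mechanism is simple: the server (or an intermediary proxy) inspects each request and serves a different page variant depending on who appears to be asking. The sketch below illustrates the pattern, not a recommendation; the `"Chrome-Lighthouse"` string is the signature Lighthouse includes in its User-Agent, and the page labels are hypothetical.

```python
# Illustrative sketch of the cloaking mechanism: route requests to a
# different page variant based on the User-Agent header. Shown only to
# explain the deceptive pattern discussed in the article.
def choose_page(user_agent: str) -> str:
    """Return which page variant a cloaking server would serve."""
    testing_signatures = ("Chrome-Lighthouse",)  # Lighthouse's UA marker
    if any(sig in user_agent for sig in testing_signatures):
        return "lite"   # scripts stripped, assets trimmed: inflates LABS scores
    return "full"       # what real visitors actually download
```

The test tool thus measures the "lite" page, while every real visitor receives the heavier "full" one; the LABS score and the field data diverge accordingly.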
CDNs and Cloaking: A New Paradigm for Core Web Vitals Optimization?
In light of the growing emphasis placed on metrics such as the Core Web Vitals, one might suspect that some CDNs, in their mission to optimize performance, resort to tactics similar to cloaking. This would mean that when such a CDN detects a LABS test from Google PageSpeed Insights, it could serve up a "lightened" or "optimized" version of the site, dropping or tweaking some elements to obtain higher scores.
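One way to probe for this, and a sketch of the methodology we used, is to fetch the same URL twice, once with a normal browser User-Agent and once with Lighthouse's, then compare what comes back. The fetching itself is omitted here; the comparison is a pure function over the two HTML payloads, using external script count as a crude but telling signal.

```python
# Sketch of cloaking detection: compare the page served to a browser
# with the page served to Lighthouse and flag large script-count gaps.
# Fetching is left to the reader; these functions only compare payloads.
import re

def count_scripts(html: str) -> int:
    """Count external <script src=...> tags in an HTML document."""
    return len(re.findall(r"<script[^>]*\bsrc\s*=", html, flags=re.I))

def looks_cloaked(html_browser: str, html_lighthouse: str) -> bool:
    """Flag when the page served to Lighthouse carries fewer external
    scripts than the page served to a regular browser."""
    return count_scripts(html_lighthouse) < count_scripts(html_browser)
```

Other signals worth diffing in the same way include payload size, image formats, and the presence of third-party tags; a site serving identical bytes to both user agents is unlikely to be cloaking.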
A dangerous precedent: the WP-Optimize plugin for WordPress accused of altering PageSpeed scores
🚨 How "WP Optimize" is cheating PageSpeed and other testing tools 👇
No JS = high scores. But for real users, these JS files are loaded! pic.twitter.com/uuOiAOgvoo
- Gijo Varghese (@GijoVarghese_) August 26
This behavior has generated negative reactions from the WordPress community. Some have compared this tactic to similar scams, such as the Volkswagen emissions scandal. Users have expressed their disappointment and concern about these deceptive practices. The discussion highlighted the importance of focusing on real user experience rather than test scores. However, trust in WP-Optimize has been compromised due to these revelations.
Implications and Ethics
The potential discovery that CDNs are using cloaking techniques is not just a technical detail; it raises deep ethical and technical questions. When an organization engages in cloaking, it is effectively "masking" a site's true performance, giving the illusion of an optimization that doesn't really exist. While the stated intention may be to improve performance, what actually happens is a skewing of test results, a misrepresentation of the site's real capabilities. This can lead developers and website owners to make decisions based on erroneous data, diverting them from the optimal course.
Beyond that, it is crucial to consider the significant financial burden such solutions entail. The fees for some of these CDNs can reach several thousand euros per month, a considerable recurring expense in the long run. If these large sums are spent without achieving a tangible improvement, such as passing the Core Web Vitals, one could legitimately ask whether the resources would have been better spent elsewhere.
Indeed, considering the current landscape of web technologies and the growing emphasis on performance, it would make absolute sense to reinvest these sums in more sustainable and enduring solutions. Committing resources to hiring or consulting experts, such as dedicated web developers and Linux systems engineers who specialize in web performance, could offer a much more significant return on investment. These professionals can address and resolve performance issues at their root, delivering tailored solutions that not only solve immediate challenges, but also prevent future problems. And all for a one-time investment, rather than onerous recurring fees.
Why Heavy Scripts Weigh on Performance
- Large Scripts: Large scripts take longer to download, parse and execute. This can delay the processing of other crucial page elements.
- Intensive Execution: Some scripts, due to their nature or complexity, may be resource-intensive when running, causing a high load on the device's CPU or memory.
- External Dependencies: Scripts that rely on external libraries or call resources from third-party servers can introduce additional latencies, especially if those external resources are slow or unoptimized.
Direct Impacts on the User Experience
- Render blocking: Scripts that run before the page is fully loaded can block the display of content, leaving users waiting.
- Compromised interactivity: If a script takes too long to respond, user interactions, such as scrolling or clicking, may be delayed or interrupted.
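A common mitigation for the render-blocking problem above is to mark external scripts as deferred, so they download in parallel and execute only after parsing. The naive rewrite below sketches what an optimizing proxy might do: add `defer` to external script tags that lack `async` or `defer`. A real implementation would use an HTML parser rather than regexes.

```python
# Naive sketch of a proxy-style HTML rewrite: add the defer attribute
# to external <script src=...> tags that are not already non-blocking.
import re

def defer_external_scripts(html: str) -> str:
    """Add defer to external script tags lacking async/defer."""
    def _rewrite(m: re.Match) -> str:
        tag = m.group(0)
        if re.search(r"\b(async|defer)\b", tag, flags=re.I):
            return tag  # already non-blocking, leave untouched
        return tag[:-1] + " defer>"  # re-open the tag with defer added
    return re.sub(r"<script\b[^>]*\bsrc\s*=[^>]*>", _rewrite, html, flags=re.I)
```

Deferred scripts no longer stall the HTML parser, which directly improves rendering metrics, though scripts that the page needs immediately (for example, ones that document.write) can break if deferred blindly, which is exactly why such automated rewrites deserve scrutiny.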
Deceptive Tactics and LABS Tests
Conclusion: The Fine Line Between Metrics and Reality
In the complicated landscape of the web, it's easy to be seduced by perfect numbers and maximum scores. But, as often happens, all that glitters is not gold. Google PageSpeed Insights, while a vital tool, sometimes offers only a partial view of a website's actual performance.
The Deceptive Charm of Perfect Scores
A LABS score of 100 in Google PageSpeed Insights may seem like unequivocal evidence of an optimized, high-performing website. However, it is vital to understand that such a metric, taken by itself, can be misleading. Some companies, well aware of this, may resort to deceptive tactics to "fix" the LABS tests, exhibiting these high scores especially to end customers who lack the expertise to distinguish a simulation from the real user experience.
The Flip Side: The Deceived End Customer and the Unimproved Business.
The temptation to impress the end customer with perfect scores is understandable. However, website owners and stakeholders are often not fully aware of the technical nuances of web metrics. Presenting a high LABS score without consistently passing the Core Web Vitals over the last 28 days may satisfy the customer in the short term, but it will not bring long-term benefits, especially once visitors start experiencing real navigation problems on the site.
The Heart of the Matter: Real User Experience
Besides the numbers, what really matters is the CRUX – the real user experience. If a site doesn't deliver consistent and reliable performance to its visitors, perfect LABS scores become irrelevant. And over time, the site's reputation may suffer irreparable damage.
While analysis tools like Google PageSpeed Insights are invaluable, they should never replace a comprehensive, authentic assessment of user experience. It's imperative for anyone running a website to look beyond the shining numbers and focus on what really matters: providing a quality browsing experience for all visitors. And always be wary of solutions that seem too good to be true; often, they are.
Regardless of the performance optimization solution you choose to adopt for your website, it is essential not to stop at the first results, but to analyze and monitor the progress of the Core Web Vitals over the medium term. Web technology is constantly evolving, and what seems to work perfectly today may not be as effective tomorrow. Continuous evaluation over time will therefore give you a clear and realistic view of your site's performance. Passing the Core Web Vitals shouldn't be a short-term goal but an ongoing commitment, ensuring a quality browsing experience for your users and a solid reputation for your site in the digital landscape.