August 18 2023

Kleecks, iSmartFrame and CDNs That Optimize Core Web Vitals: Here's How They Cheat Google's Tests and Why They're of Little Use for SEO

Let's look at how certain tricks can boost the Google PageSpeed Insights score while still failing the Core Web Vitals assessment.

In the dynamic world of the web, where every detail can make a difference in ranking and user experience, Google has made a significant qualitative leap in its evaluation methodology. Since Google introduced PageSpeed Insights, the tool that for years has been the reference for developers and SEOs evaluating the performance of their sites, its evolution has led to the advent of the Core Web Vitals. This new suite of metrics not only measures a website's efficiency and speed, but goes further by focusing on the actual user experience.

Core Web Vitals: The Transition from Vanity Metrics to Crucial Tools

The days when boasting a high score on PageSpeed Insights was all you needed are over. Now, Google requires a deeper understanding of a site's actual performance. The Core Web Vitals have become the emblem of this evolution, marking a clear distinction between what is purely cosmetic and what is fundamental to the user experience.

  1. Core Web Vitals LAB: This is a set of tests conducted in the laboratory by Google. These tests, while rigorous and detailed, are in fact simulations of a site's performance. They don't necessarily reflect the end-user experience, but they are valuable tools for developers: they work like a compass indicating which direction to move in during the site design and optimization phases. However, it is crucial to understand that, while indicative, they do not represent the concrete reality of how a site is perceived by users.
  2. Core Web Vitals CRUX (Chrome User Experience Report): Here we enter the heart of the user experience. These metrics are based on real data collected from Chrome users as they browse. Every time such a user opens a web page, the browser sends Google detailed information about page loading, interactivity and visual stability. Google analyzes this data over a rolling 28-day window and establishes whether or not the site meets the Core Web Vitals thresholds, for both the desktop and mobile versions.

While LABS tests offer a "theoretical" view of a site's performance, CRUX data provides a "practical" representation based on real experiences. The latter has become vitally important in determining a site's visibility in the Google SERP. In other words, a site may score excellently in LABS tests, but if it falls short on the CRUX metrics, its position in search results can suffer severely.
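To see exactly which field data Google is looking at, the Chrome UX Report (CrUX) API can be queried directly. The following is a minimal sketch, assuming a hypothetical CRUX_API_KEY environment variable holding a Google API key and the requests library; the endpoint and field names reflect the public CrUX API at the time of writing.

```python
import os
import requests

# Public CrUX API endpoint (rolling 28-day field data).
CRUX_ENDPOINT = "https://chromeuserexperiencereport.googleapis.com/v1/records:queryRecord"

def fetch_crux(origin: str, form_factor: str = "PHONE") -> dict:
    """Query the CrUX API for an origin's real-user metrics."""
    resp = requests.post(
        CRUX_ENDPOINT,
        params={"key": os.environ["CRUX_API_KEY"]},  # hypothetical env var
        json={"origin": origin, "formFactor": form_factor},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["record"]["metrics"]

if __name__ == "__main__":
    metrics = fetch_crux("https://www.example.com")  # placeholder origin
    for name, data in metrics.items():
        # Each metric exposes its 75th percentile, which is the value
        # the Core Web Vitals assessment is based on.
        print(name, data.get("percentiles", {}).get("p75"))
```

If the origin has too little traffic, the API returns no record at all, which is itself a useful signal that only LAB data is available for that site.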

We covered this in depth in a dedicated post: Core Web Vitals and CRUX data.

Put simply, a PageSpeed score of 100 does not necessarily mean a site passes the Core Web Vitals, and a PageSpeed score of 60 does not necessarily mean it fails. Take the analysis of this customer: with an average mobile PageSpeed score of just 50, it still passes the Core Web Vitals brilliantly, ranking as the 375th site in Italy with at least 9.2 million unique visitors per month and over 15 million page views (source: Similarweb.com, cross-checked with the connected Analytics).

PageSpeed Insights report for Il Corriere della Città

Similarweb data for Il Corriere della Città

 

The evolution in the pursuit of optimal web performance

In today's digital context, where the speed and efficiency of a website can make the difference between a customer won and a customer lost, attention to performance optimization has become crucial. The renewed importance given to these factors by the Core Web Vitals has boosted the optimization-solutions industry, bringing to light a number of tools designed to help websites reach peak performance.

Kleecks CDN and iSmartFrame: The Seeming Magic in Performance Optimization

Among the many options available to developers and website owners, Kleecks CDN and iSmartFrame stand out as two recognized leaders in providing performance-oriented solutions.

  1. The philosophy behind CDNs: CDNs, or Content Delivery Networks, represent a network of servers distributed in various geographical points, with the aim of serving content to visitors more quickly and efficiently. The main objective of these networks is to minimize the distance between the visitor and the source of web content, ensuring a reduced loading time and a smooth user experience.
  2. Kleecks CDN and iSmartFrame at work: Both of these solutions, while each having its own specific characteristics, exploit the potential of CDNs and operate as reverse proxies. In this role, they act as intermediaries between the end user and the site's origin server.

    The magic happens when they take charge of the source code of a website and optimize it, performing advanced technical operations such as:

    • Minification: compressing JS and CSS code, reducing payload size and making loading faster.
    • Converting images: swapping heavy image formats for lighter, faster ones like WebP, without compromising quality (a conceptual sketch of this step follows this list).
    • Cache and latency reduction: Thanks to caching mechanisms, frequently requested content is stored and served more quickly, minimizing user waiting times.
    • Much more.
  3. A gift to developers: The beauty of these solutions lies in their PaaS (Platform as a Service) nature. Instead of manually handling the complexities of optimization, developers can rely on these platforms to do the heavy lifting, allowing them to focus on other project challenges without having to dig into the application code and fix performance problems by hand.
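As anticipated in the list above, here is a conceptual sketch of the image-conversion step, using the Pillow library. The file names are hypothetical, and a real CDN performs this re-encoding on the fly at the edge rather than on disk; the point is simply to show what "converting to WebP" amounts to.

```python
from pathlib import Path
from PIL import Image  # pip install Pillow

def to_webp(src: Path, quality: int = 80) -> Path:
    """Re-encode a heavy JPEG/PNG as WebP, usually shrinking it noticeably."""
    dst = src.with_suffix(".webp")
    with Image.open(src) as img:
        img.save(dst, "WEBP", quality=quality)
    return dst

if __name__ == "__main__":
    original = Path("hero-image.jpg")  # hypothetical input file
    converted = to_webp(original)
    print(f"{original.stat().st_size} bytes -> {converted.stat().st_size} bytes")
```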

In-depth analysis of the discrepancy between LABS and CRUX

In our ongoing effort to understand the dynamics of website performance, we've come across a particular dilemma regarding the use of services like Kleecks, iSmartFrame, and other similar optimization tools. While these services promise optimal performance, the reality may be slightly different.

It rather makes you smile (to put it mildly) that companies positioning themselves on the market as enterprise-grade leaders in Core Web Vitals and web performance optimization have corporate sites, their public face to the world, that are unable to pass the Core Web Vitals assessment, as we can see in the following images:

 

In both cases this goes as far as very serious TTFB problems exceeding one second, where Google recommends a TTFB of less than 200 ms.

In the course of our investigations, we decided to go beyond mere theory and guesswork and examine in detail some customers making use of these technologies. The goal was to better understand how these technology stacks handle requests and serve content, particularly in response to the specific user agent of Google PageSpeed Insights.

After a series of painstaking and meticulous tests, we obtained surprisingly enlightening results. We found that several JavaScript files were simply not loaded when the Google PageSpeed Insights user agent was detected. This allows sites to achieve an impressively high LABS score, almost as if they were wearing an optimization mask. However, when it comes to the Core Web Vitals, the results are less flattering: not only do these sites fail this crucial assessment, they also show significant performance shortcomings.

For example, a particularly worrying figure was the very high Time to First Byte (TTFB), or latency, a key indicator of server responsiveness that Google recommends keeping below 200 milliseconds.

TTFB 200ms
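For anyone who wants to check this figure on their own sites, the sketch below approximates TTFB from Python using the requests library. It is a rough client-side measurement (it includes DNS, TCP and TLS setup from the machine running it), so it will not match Google's field data exactly, but values well over a second are a clear warning sign.

```python
import time
import requests

def approximate_ttfb(url: str) -> float:
    """Return seconds from sending the request to receiving the first response bytes."""
    start = time.perf_counter()
    # stream=True makes requests return as soon as the response headers arrive,
    # so the elapsed time is a reasonable proxy for time-to-first-byte.
    with requests.get(url, stream=True, timeout=10) as resp:
        resp.raise_for_status()
        next(resp.iter_content(chunk_size=1), b"")  # wait for the first body byte
    return time.perf_counter() - start

if __name__ == "__main__":
    for site in ("https://www.example.com",):  # placeholder, replace with the sites to test
        print(site, f"{approximate_ttfb(site) * 1000:.0f} ms")
```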

In short, it seems absurd to see TTFBs of more than a second on the sites of companies whose business is optimizing web performance; on its face, it is not very credible.

To make our findings more accessible and understandable, we've condensed them into an analysis video. We invite anyone interested to watch it for a detailed overview and an in-depth understanding of what we discovered.

Misalignment between synthetic metrics and real data

We have observed that many websites, after integrating these solutions, show exceptional scores when analyzed with the Google PageSpeed Insights LABS tests. However, these scores do not seem consistent with the results provided by the Core Web Vitals (CRUX), which represent site performance for real users.
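The discrepancy is easy to observe with the PageSpeed Insights API itself, since a single response contains both the Lighthouse LAB result and the CrUX field assessment. Below is a minimal sketch assuming the public v5 endpoint and its response fields as documented at the time of writing; the URL to test is a placeholder.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def lab_vs_field(url: str, strategy: str = "mobile") -> tuple[float, str]:
    """Return (LAB performance score, CrUX field verdict) for a URL."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    lab_score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
    # 'loadingExperience' holds the CrUX field data; it is absent for low-traffic pages.
    field_verdict = data.get("loadingExperience", {}).get("overall_category", "NO FIELD DATA")
    return lab_score, field_verdict

if __name__ == "__main__":
    score, verdict = lab_vs_field("https://www.example.com")  # placeholder URL
    print(f"LAB score: {score:.0f}/100 - CrUX verdict: {verdict}")
```

A LAB score in the 90s paired with a "SLOW" or "AVERAGE" field verdict is exactly the mismatch discussed in this section.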

In this regard, we took as examples some of the sites shown in the video above, which illustrate both the discrepancy and the methodology we used to verify the modus operandi of these "miraculous CDNs".

The apparent disconnect between these two metrics raises some concerns:

Synthetic LABS Tests: Reliability and Limits in the Real World

Synthetic tests, such as those offered by LABS, are a type of analysis that simulates user behavior on a website in a controlled environment. While they are extremely useful for identifying performance issues in development or optimization, they have some inherent limitations that could make their results less representative of actual user experiences.

How do synthetic tests work?

Such tests are performed in the laboratory, or in virtual environments, where variables such as bandwidth, latency and device resources are standardized or simulated. This allows developers to obtain performance metrics under "ideal" conditions, eliminating the fluctuations that can occur under real-world browsing conditions.

Limitations of Synthetic Tests
  1. Standardized environments: Because these tests are performed under controlled conditions, they may not account for different combinations of hardware, software, and connectivity that end users may have. A site might work well on a high-end device with a fast connection, but perform poorly on an older device or with a slow connection.
  2. External interference: Real users might have many tabs open, applications running in the background, or even security software that could affect the performance of a website. These factors are not typically simulated in synthetic tests.
  3. Caching and User Interactions: While synthetic tests may simulate some of them, they may not fully capture real user behavior, such as scrolling a page, clicking on various items, or how browsers handle caching of a site during subsequent visits.
  4. Deceptive Strategies: As mentioned earlier, techniques such as cloaking could allow a site to "cheat" synthetic tests by presenting an optimized version when it detects a test in progress. This could result in artificially high performance metrics.

Cloaking: A Deceptive Strategy For Manipulating Google Tests?

The term "cloaking" refers to a search engine optimization (SEO) practice that has raised a lot of controversy over the years. This tactic is based on presenting different versions of a web page to search engines and real users. The main goal behind this maneuver is to manipulate and improve a site's ranking in the search engine results pages (SERPs), by showing engines content that could be seen as more relevant or optimized.

How does cloaking work?

Cloaking works by recognizing the User-Agent or the IP address making the request to the server. If the User-Agent is recognized as that of a search engine crawler (such as Googlebot) or a testing tool, different content is served than what a normal visitor would see: for example, HTML that excludes the loading of JavaScript.
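To make the mechanism concrete, here is a deliberately simplified sketch of the pattern just described, shown purely to illustrate what to watch out for, not as something to deploy. It assumes Flask, and the fact that Lighthouse-based testers typically include "Chrome-Lighthouse" in their User-Agent; the HTML snippets are placeholders.

```python
from flask import Flask, request

app = Flask(__name__)

# User-Agent fragments that identify crawlers / testing tools.
BOT_MARKERS = ("Googlebot", "Chrome-Lighthouse")

FULL_PAGE = "<html><body>content<script src='/heavy-bundle.js'></script></body></html>"
STRIPPED_PAGE = "<html><body>content</body></html>"  # no JavaScript at all

@app.route("/")
def home():
    ua = request.headers.get("User-Agent", "")
    if any(marker in ua for marker in BOT_MARKERS):
        # The deceptive part: testers get a lighter page than real visitors.
        return STRIPPED_PAGE
    return FULL_PAGE

if __name__ == "__main__":
    app.run()
```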

CDNs and Cloaking: A New Paradigm for Core Web Vitals Optimization?

In light of the growing emphasis placed on metrics such as the Core Web Vitals, one might suspect that some CDNs, in their mission to optimize performance, resort to tactics similar to cloaking. This would mean that when such a CDN detects a LABS test from Google PageSpeed Insights, it could serve a "lightened" or "optimized" version of the site, dropping or tweaking some elements to obtain higher scores.

During our investigations, we simulated being a Google bot by modifying our browser's User-Agent and noticed that, in some circumstances, external JavaScript scripts, notoriously heavy and a potential source of slowdowns, were not loaded. While this omission may produce seemingly faster load times during testing, it does not reflect the actual user experience.
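The check we performed can be reproduced with a few lines of code: request the same page twice, once with a regular browser User-Agent and once pretending to be the PageSpeed/Lighthouse tester, then compare which external scripts each response references. This is a minimal sketch using requests and a naive regex; the User-Agent strings are representative examples, not the exact ones Google uses.

```python
import re
import requests

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/116.0 Safari/537.36"
TESTER_UA = "Mozilla/5.0 (compatible) Chrome-Lighthouse"  # representative, not exact

SCRIPT_SRC = re.compile(r"<script[^>]+src=[\"']([^\"']+)[\"']", re.IGNORECASE)

def scripts_for(url: str, user_agent: str) -> set[str]:
    """Return the set of external script URLs referenced in the served HTML."""
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=15).text
    return set(SCRIPT_SRC.findall(html))

if __name__ == "__main__":
    url = "https://www.example.com"  # placeholder, replace with the site to inspect
    normal = scripts_for(url, BROWSER_UA)
    tested = scripts_for(url, TESTER_UA)
    print("Scripts hidden from the tester:", sorted(normal - tested))
```

A non-empty difference does not prove malicious intent on its own, but it is exactly the kind of User-Agent-dependent behavior that deserves a closer look.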

A dangerous precedent: the WP-Optimize plugin for WordPress accused of rigging PageSpeed scores

WP-Optimize, a popular WordPress plugin for improving site performance, has been accused of manipulating benchmarks. Gijo Varghese, a developer specializing in web performance and creator of the FlyingPress plugin, pointed out that WP-Optimize disables JavaScript when a page is tested with benchmarking tools. His claim was corroborated by a screenshot showing how the plugin prevents JavaScript files from being loaded during testing.

This behavior has generated negative reactions from the WordPress community. Some have compared this tactic to similar scams, such as the Volkswagen emissions scandal. Users have expressed their disappointment and concern about these deceptive practices. The discussion highlighted the importance of focusing on real user experience rather than test scores. However, trust in WP-Optimize has been compromised due to these revelations.

Implications and Ethics

The potential discovery that CDNs are using cloaking techniques is not just a technical detail; it raises deep ethical and technical questions. When an organization engages in cloaking, it is effectively "masking" a site's true performance, creating the illusion of an optimization that doesn't really exist. While the stated intention may be to improve performance, what actually happens is that test results are skewed, misrepresenting the site's real capabilities. This can lead developers and website owners to make decisions based on erroneous data, diverting them from the optimal course.

Beyond that, it is crucial to consider the significant financial burden such solutions entail. Some of these CDNs charge considerable fees, even several thousand euros per month, which add up to a substantial expense in the long run. If these large sums are spent without achieving a tangible improvement, such as passing the Core Web Vitals, one could legitimately ask whether those resources would have been better spent elsewhere.

Indeed, considering the current landscape of web technologies and the growing emphasis on performance, it would make far more sense to reinvest these sums in sustainable, lasting solutions. Committing resources to hiring or consulting experts, such as dedicated web developers and Linux systems engineers specializing in web performance, could offer a much more significant return on investment. These professionals can address and resolve performance issues at their root, delivering tailored solutions that not only solve immediate challenges but also prevent future problems, all for a one-off investment rather than onerous recurring fees.

The Impact of Javascript on Web Performance: A Double-Edged Sword

Javascript has become one of the fundamental tools in web development, allowing you to create rich, interactive and dynamic web applications. However, like any powerful tool, if not used wisely, it can have unintended consequences on a site's performance.

The Weight of Javascript on the Browser

When a browser loads a web page, it has to parse and execute the Javascript scripts included in the page. This process can be quite onerous, especially when it comes to:

  1. Large Scripts: Large scripts take longer to download, parse and execute. This can delay the processing of other crucial page elements.
  2. Intensive Execution: Some scripts, due to their nature or complexity, may be resource-intensive when running, causing a high load on the device's CPU or memory.
  3. External Dependencies: Scripts that rely on external libraries or call resources from third-party servers can introduce additional latencies, especially if those external resources are slow or unoptimized.

Direct Impacts on the User Experience

Inefficient Javascript execution can lead to various problems, including:

  • Render blocking: Scripts that run before the page is fully loaded can block the display of content, leaving users waiting (see the sketch after this list).
  • Compromised interactivity: If a script takes too long to respond, user interactions, such as scrolling or clicking, may be delayed or interrupted.
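As a quick way to spot the first problem, the sketch below scans a page's <head> for classic render-blocking scripts: external scripts declared without async or defer. It is a naive regex-based check (a real audit would use a proper HTML parser), and the URL is a placeholder.

```python
import re
import requests

HEAD = re.compile(r"<head.*?</head>", re.IGNORECASE | re.DOTALL)
SCRIPT_TAG = re.compile(r"<script\b[^>]*\bsrc=[\"'][^\"']+[\"'][^>]*>", re.IGNORECASE)

def render_blocking_scripts(url: str) -> list[str]:
    """List external <head> scripts loaded without async/defer (potential render blockers)."""
    html = requests.get(url, timeout=15).text
    head = HEAD.search(html)
    if not head:
        return []
    tags = SCRIPT_TAG.findall(head.group(0))
    return [t for t in tags if "async" not in t.lower() and "defer" not in t.lower()]

if __name__ == "__main__":
    for tag in render_blocking_scripts("https://www.example.com"):  # placeholder URL
        print(tag)
```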

Deceptive Tactics and LABS Tests

To score high on synthetic tests like LABS, some CDNs may employ deceptive strategies, such as “skipping” loading problematic Javascript resources. If a website "hides" these scripts during a LABS test, the result will be a page that appears to load much faster, giving the site an artificially high performance score. However, this does not reflect the real experience of the user, who could be exposed to all the problems caused by such scripts in a real browsing context.

Conclusion: The Fine Line Between Metrics and Reality

In the complicated landscape of the web, it's easy to be seduced by perfect numbers and maximum scores. But, as often happens, all that glitters is not gold. Google PageSpeed Insights, while a vital tool, can sometimes offer only a partial view of a website's actual performance.

The Deceptive Charm of Perfect Scores

A LABS score of 100 in Google PageSpeed Insights may seem like unequivocal proof of an optimized, high-performing website. However, it is vital to understand that such a metric, taken by itself, can be misleading. Some companies, well aware of this, may resort to deceptive tactics to game the LABS tests in order to show these high scores, especially to end customers who lack the ability or expertise to distinguish between a simulation and the real user experience.

The Flip Side: The Deceived End Customer and the Unimproved Business

The temptation to impress the end customer with perfect scores is understandable. However, website owners and stakeholders are often not fully aware of the technical nature and nuances of web metrics. Presenting a high LABS score without consistently passing the Core Web Vitals over the last 28 days may satisfy the customer in the short term, but it brings no long-term benefit, especially once visitors start experiencing real navigation problems on the site.

The Heart of the Matter: Real User Experience

Besides the numbers, what really matters is the CRUX – the real user experience. If a site doesn't deliver consistent and reliable performance to its visitors, perfect LABS scores become irrelevant. And over time, the site's reputation may suffer irreparable damage.

 

Ultimately

While analysis tools like Google PageSpeed Insights are invaluable, they should never replace a comprehensive and authentic assessment of the user experience. It's imperative for anyone running a website to look beyond the shining numbers and focus on what really matters: providing a quality browsing experience for all visitors. And always be wary of solutions that seem too good to be true; often, they are.

Regardless of the performance optimization solution you choose to adopt for your website, it is essential not to stop at the first results, but to analyze and monitor the progress of the Core Web Vitals over the medium term. Web technology is constantly evolving, and what seems to work perfectly today may not be as effective tomorrow. A continuous, prolonged evaluation over time will therefore give you a clear and realistic view of your site's performance. Passing the Core Web Vitals shouldn't be a short-term goal but an ongoing commitment, ensuring a quality browsing experience for your users and a solid reputation for your site in the digital landscape.

 
