A few weeks ago we wrote about how, back in October, the Birredamanicomio.com site went offline during a promotional mention in a TV commercial on X Factor. We made a rather clear and objective analysis of the shortcomings that led to the downtime of that site, hosted not by us but by another hosting provider, explaining how the lack of a static cache was decisive in the crash, which we also documented with a smartphone video recording attached to the article titled, precisely, How not to go on TV with your WooCommerce site if not properly configured on the Server and Hosting side.
As if on cue, less than a month later we had the opportunity to demonstrate with facts all the theory we had dispensed with fine words in the previous article, which chronicled the downtime and server crash of one of our competitors.
The specific case: X-Bio on RAI 1 and L'Eredità with Flavio Insinna.
X-Bio is the brand of one of our customers (Buoninfante Medical Group), a maker of high-quality mattresses, which staged a national launch for X-BIO Circulife, the new model in the X-BIO line by Buoninfante, the mattress manufacturer founded in 1975. From Monday 7 to Friday 11 November, X-BIO Circulife was featured in a telepromotion on RAI 1 during L'Eredità, the programme hosted by Flavio Insinna from 18:45 to 20:00.
Innovative in every respect, with a unique design and capable of protecting both health and the environment, X-BIO Circulife is the first mattress in the line produced from recycled plastic bottles: a new product that looks to a more sustainable world, to personal well-being and to the future of the planet and of the coming generations.
With a commercial that explicitly mentioned the website x-bio.it, there was no option but to optimize the servers and the software stack to withstand the traffic peak of a programme airing on the first national channel and watched by millions of people.
Concerns and preparations on the Server and Hosting side, less than two hours before airing.
To be fair, we did not have much time to prepare: it was all rather improvised, since we were not notified well in advance but just two hours before the November 7th airing.
When we spoke with the developer, who was worried about the speed of the site and the airing less than two hours away, our first comment was something like this:
You will inevitably go offline. Without ifs and buts. For these events, we prepare at least a week in advance.
Left that way, though, it sounded bad. Very bad, more for us than for them. We couldn't let the site go offline in the middle of that event, and it wouldn't have looked good either, after we had published, just the month before, an article titled How not to go on TV with your WooCommerce site if not properly configured on the Server and Hosting side.
Granted, in this case the site was an e-commerce built on Prestashop rather than WooCommerce, but that didn't change much.
We had two hours at our disposal. Few, very few, but not too few.
Whatever we did had to be done with certainty, and very, very fast.
Obviously, the Auditel figures had to be taken into account, given that L'Eredità recorded one of the highest shares of the entire Monday, with 4.39 million viewers and a 25.4% share.
What could we have done, considering that the Prestashop hosting was already running on a dedicated AMD Ryzen 3600 server with 6 cores / 12 threads, 64 GB of RAM, two NVMe disks in RAID 1 and a 1 Gbit/s uplink?
We certainly would not have had time to order a machine with more cores and threads, nor to order a 10 Gbit/s uplink and a 10 Gbit network card, since the new machine would have had to be ordered, installed, configured and migrated to: four hours at best, not two.
We therefore "limited" ourselves to quickly switching the nameservers from GoDaddy to CloudFlare, which would let us absorb the traffic peak for media and static assets (JPEG and PNG images, CSS, JS), and to configuring the server with an advanced setup that included a server-side Full Page Cache such as Varnish.
Although it is commonly believed that CloudFlare caches HTML, it is worth remembering that CloudFlare does not cache HTML by default: to get it, you have to move to expensive paid plans and accept a limited, fiddly configuration, compared with what you can do elegantly, in a few minutes and for free, with Varnish.
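As a quick sanity check, you can inspect the response headers CloudFlare returns for an HTML page. The snippet below is an illustrative sketch, not output from the actual site: the hostname is a placeholder and the sample headers are what the free plan typically returns for HTML, where `cf-cache-status: DYNAMIC` means the request was passed through to the origin uncached.

```shell
# Live check (hostname is a placeholder, not the real site):
#   curl -sI https://www.example.com/ | grep -i cf-cache-status

# Sample headers as they typically look for an HTML page behind CloudFlare's
# free plan; static assets would show "HIT" or "MISS" instead:
headers='HTTP/2 200
content-type: text/html; charset=utf-8
cf-cache-status: DYNAMIC'

# DYNAMIC = CloudFlare did not cache the page and hit the origin every time.
if printf '%s\n' "$headers" | grep -qi 'cf-cache-status: DYNAMIC'; then
  echo "HTML passed through uncached"
fi
```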
The site held up very well, with no downtime and no slowdowns, to everyone's amazement, ours above all. Unfortunately, given the urgency and the very tight deadlines, we did not get the chance to document the event as we would have liked, but we promised ourselves we would make up for it with better reporting and documentation of the following day's broadcast.
Preparing better for the week-long telepromotion, from 7 to 11 November.
Aware that the launch would run for another four days, we wanted to improve the infrastructure further, starting with much more powerful hardware. We went from 6 cores / 12 threads to 16 cores / 32 threads, almost tripling the number of available threads, which is certainly useful and potentially decisive for the spawning of PHP-FPM processes and for the threads of Percona Server (a fork of MySQL).
Alongside this, we obviously revised the tuning of the web server, the DBMS and PHP-FPM, looking for the best compromise between speed and multithreading while avoiding excessive load and possible deadlocks at the DB level.
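By way of illustration, PHP-FPM tuning of this kind revolves around a handful of pool directives. The values below are a hypothetical sketch sized for a 16-core / 32-thread, 64 GB machine, not the configuration actually deployed:

```ini
; Hypothetical PHP-FPM pool tuning for a 16c/32t, 64 GB server.
; pm.max_children is bounded by RAM: roughly (RAM budget for PHP) / (average
; resident size of one worker), measured under real traffic.
pm = dynamic
pm.max_children = 64        ; e.g. ~32 GB for PHP at ~500 MB per worker
pm.start_servers = 16
pm.min_spare_servers = 8
pm.max_spare_servers = 32
pm.max_requests = 1000      ; recycle workers periodically to contain leaks
```

The same logic applies to the DBMS side (connection limits, buffer pool size), always keeping the sum of worker memory comfortably below physical RAM to avoid swapping during the traffic peak.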
All of this was done with a commercial approach that is also quite interesting: renting the machine for the scheduled period at no charge. We cannot go into detail, but suffice it to say that for situations like this we can provide, for example, a 500 euro/month machine completely free of charge, without billing the end customer a single cent beyond the cost of setup, migration and systems consultancy.
Not just the server stack: application-side work on Prestashop too.
While the server part had been extremely easy to implement, the application side required a modification by the developer. Because if it is true that, numbers in hand, Prestashop proves to be a better e-commerce-oriented CMS than WooCommerce in many respects, performance included, it is also true that Prestashop does not lend itself well, out of the box, to scaling and caching with Varnish.
You can put Varnish in front of a stock installation, but you then risk users being unable to log in or complete a purchase, which makes no sense from a purely business point of view.
We therefore had to add a routine to the login phase that creates a cookie identifying the logged-in user, allowing them to browse their customer area, with their own data, without the risk of being logged out.
While it is relatively easy to get Prestashop to work with Varnish, it is not as easy to make it work WELL.
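The idea can be sketched in Varnish VCL. This is a minimal illustration of the pattern, not the production configuration: the `logged_in` cookie name is our hypothetical stand-in for whatever the developer's routine sets, and the URL patterns are examples of typical Prestashop controllers.

```vcl
# Sketch (VCL 4.0): cache anonymous traffic, bypass for everything personal.
sub vcl_recv {
    # Never cache cart, checkout or account pages (example URL patterns).
    if (req.url ~ "^/(cart|order|my-account|login)") {
        return (pass);
    }
    # Bypass the cache for users carrying the login cookie
    # ("logged_in" is an illustrative name, not Prestashop's own).
    if (req.http.Cookie ~ "logged_in=") {
        return (pass);
    }
    # Anonymous request: drop cookies so Varnish can serve it from cache.
    unset req.http.Cookie;
}
```

The hard part, alluded to above, is exactly this boundary: deciding which requests are safe to serve from the full page cache and which must always reach PHP, without ever showing one customer's data to another.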
How did the new server go? Judge for yourself.
We could quote numbers and talk about loads, query speeds, cache hit ratios and much else. But what matters most to the final visitor is finding the site available, fast and snappy when Flavio Insinna says its name, not a 500 error from resource exhaustion, a Bad Gateway, or a site that loads its first page in 16 (sixteen) seconds, as we were able to measure and document in the aforementioned "How not to go on TV" case.
So, beyond saying that the server held up very well, it always worked comfortably within its comfort zone. For us the comfort zone is 50% of the number of threads: with 32 threads, where a load average of 32 corresponds to 100%, we consider anything within a load average of 16 to be comfortable.
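That rule of thumb is trivial to turn into a check. A minimal sketch, assuming the same 32-thread machine; the example load value is ours, and in production you would read the live figure from `/proc/loadavg` as shown in the comment:

```shell
# Comfort-zone check: flag when the 1-minute load average exceeds
# 50% of the available threads (16 out of 32 on this machine).
threads=32
limit=$((threads / 2))

# Example value for illustration; a live reading would be:
#   load=$(cut -d ' ' -f1 /proc/loadavg | cut -d '.' -f1)
load=12

if [ "$load" -le "$limit" ]; then
  echo "within comfort zone ($load/$limit)"
else
  echo "ALERT: load $load over comfort limit $limit"
fi
```

Wired into cron or a monitoring agent, a check like this gives early warning well before the machine actually saturates at a load average equal to the thread count.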
We used Camtasia Studio to record the whole event live, watching the broadcast on RaiPlay while simultaneously checking what was happening at load level in the terminal, and using the Microsoft Edge browser to simulate the navigation of an ordinary user just after the commercial ended.
You can watch the video analysis, with server screens, load, pools and speed tests of the site and its TTFB, directly in this video, created to document how you can go on TV without crashes or downtime, and without spending insane amounts on questionable technologies such as cloud, clusters or the like. Always remembering that each case is a case in itself, and the solution for this case is not necessarily suitable for every high-traffic site's needs.