The Super Bowl Effect on Website Performance

Synthetic monitoring of a website

Whether or not you are a fan of U.S. football, it was hard to avoid this huge sports event on February 5. Beyond the game itself, the Super Bowl commercials, besides being very expensive to air, usually drive a lot of load on the websites of the companies that run them. The question is whether the millions of dollars spent really drive consumers to these websites and get them to do business there.

Since the top brands that advertised won't tell us their actual conversion rates, we can instead look at the end-user experience and performance of their websites while these ads were aired. By analyzing data from continuous synthetic monitoring combined with deep-dive browser diagnostics, we can see whether a web application was actually able to handle the load or left too many of these users with a bad user experience.

Use Case: What Was the User Experience at One of the Top Brands on Super Bowl Day?
To avoid any debate about whether the numbers we present are good or bad, or whether this company did a good or a terrible job, we present all of this data without naming the company; the purpose of this exercise is to show how to perform this type of analysis.

Synthetic Monitoring of a Website
To monitor the site's performance, we set up a Gomez Synthetic Monitor that executes on a schedule. The monitor not only measured the performance of the home page but also walked through several main use-case scenarios on that page, e.g., searching for a product. These tests were executed from multiple U.S. locations, over low- and high-bandwidth connections, and with both the Internet Explorer and Firefox browsers. In addition, we captured individual performance sessions using dynaTrace Browser Diagnostics.
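
Gomez runs these tests as a managed service, so no scripting was needed on our side. Purely as an illustration of what a basic scheduled synthetic check does under the hood, here is a minimal sketch using Node.js and Puppeteer (both are assumptions for this example, not part of the setup described above, and the URL is a placeholder):

  // Minimal synthetic check (illustrative sketch, not the Gomez setup):
  // load a page in a headless browser and record its full load time.
  const puppeteer = require('puppeteer');

  async function checkPageLoad(url) {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'load' });

    // Navigation Timing gives the full page load time in milliseconds.
    const loadTimeMs = await page.evaluate(() => {
      const t = performance.timing;
      return t.loadEventEnd - t.navigationStart;
    });

    await browser.close();
    return loadTimeMs;
  }

  // Run the check on a schedule, e.g., every 5 minutes, and log the result.
  setInterval(async () => {
    const ms = await checkPageLoad('https://www.example.com/'); // placeholder URL
    console.log(new Date().toISOString(), 'home page load:', ms, 'ms');
  }, 5 * 60 * 1000);

A real monitor would, like the Gomez tests above, also script the key use cases, rotate through locations, bandwidths and browsers, and alert on outliers.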

The following screenshot shows the response time of this website's home page, measured from our Last Mile platform with a Firefox browser, in the days before the Super Bowl as well as during the game. The average page load time was around 9 seconds, with the exception of the time frame during the Super Bowl, when it jumped up to 30 seconds:

Synthetic monitoring shows a peak of up to 30 seconds to load the home page during the Super Bowl

Analyzing Page Load Time from Monitoring Data
There were two factors that drove this spike:

  1. Higher load on their application due to the commercial aired during the Super Bowl
  2. Additional content on their page that doubled the page size

Factor 1: Higher Load
That is of course intended and a success for the marketing campaign. What you want to make sure is that you can handle the expected additional load: use CDNs (Content Delivery Networks) to deliver static content to your end users, and provision additional capacity on your web and application servers for the extra traffic. To be prepared, it is highly recommended that you do up-front, large-scale load testing. For more on that, you can read my recent blog post To Load Test or Not to Load Test: That Is Not the Question.
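
How you make content CDN-friendly depends on your stack. As one hedged sketch of the idea, the following Node.js/Express snippet (Express, the route and the port are illustrative assumptions, not details of the site analyzed here) marks static campaign assets as cacheable so a CDN edge or the browser cache can serve them instead of the origin servers:

  // Sketch: serve static campaign content with a long cache lifetime so a
  // CDN edge (or the browser cache) can deliver it instead of the origin.
  // Express, the directory and the port are illustrative assumptions.
  const express = require('express');
  const app = express();

  app.use('/campaign-assets', express.static('campaign-assets', {
    maxAge: '1d' // sends Cache-Control: public, max-age=86400
  }));

  app.listen(8080);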

Factor 2: Additional Content
The average size of the home page jumped from 776 KB to 1,497 KB, nearly double the page size. The reason is the additional images and content displayed on the home page when it was accessed during the Super Bowl. Both the Gomez Monitor and the dynaTrace Browser Diagnostics sessions provide detailed resource analysis that immediately highlights the additional images, stylesheets and JavaScript files. The following shows some of the additional Super Bowl-related content, including size and download time:

Additional content downloaded during the Super Bowl resulting in up to 3 times higher page load times

The additional content in this case may have been necessary for the company's marketing campaign. The question is whether it could have been optimized so as not to download 40 additional resources with a total size of more than 700 KB.
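
The numbers above come from the Gomez monitor and the dynaTrace sessions. If you want to take a similar resource inventory yourself, any modern browser exposes the Resource Timing API; the following console snippet is a generic sketch, not the tooling used for this article:

  // Sketch: list every resource the current page downloaded, with its type,
  // transferred size and download time (run in the browser console).
  const resources = performance.getEntriesByType('resource');

  const summary = resources.map(r => ({
    url: r.name,
    type: r.initiatorType,                 // img, script, link (css), xmlhttprequest, ...
    kb: Math.round(r.transferSize / 1024), // bytes over the wire; 0 for cross-origin
                                           // resources without Timing-Allow-Origin
    ms: Math.round(r.duration)             // time to download
  }));

  console.table(summary);
  console.log('total KB:', summary.reduce((sum, r) => sum + r.kb, 0));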

Deep Dive Browser Diagnostics
Looking at the dynaTrace Browser Diagnostics data, we can observe several interesting aspects.

Observation 1: Load Impact of Additional Resources
We can see the actual impact of the additional resources that were downloaded during the game. Two of these resources took a very long time and had a major impact on page load time. The background image was delivered from the company's own web server rather than a CDN, which means more traffic on their web servers and less than optimal performance for end users who are far away from their data centers. Another interesting aspect was an additional Facebook Like button that took 600ms to download and execute.

dynaTrace Timeline View showing the impact of additional resources on the page load time
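
A common way to keep a slow third-party widget such as a Like button from delaying page load is to inject its script asynchronously after the page's own content has finished loading. A generic sketch (the script URL is a placeholder, not the widget actually used on this page):

  // Sketch: load a third-party widget script asynchronously after onload so
  // a slow widget cannot delay the page load. The URL is a placeholder.
  window.addEventListener('load', function () {
    var script = document.createElement('script');
    script.src = 'https://widgets.example.com/like-button.js';
    script.async = true;
    document.body.appendChild(script);
  });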

Observation 2: Page Transition Impact of CDN User Tracking Logic
Some tracking tools send their data from the onbeforeunload event handler. Modern browsers, however, don't let an onbeforeunload handler run for too long or reliably send data. The workaround is an "artificial" JavaScript loop that busy-waits for 750ms, to make sure the browser sends the AJAX request with the tracking data before navigating to the next page. We can see this behavior on this page as well:

JavaScript tracking code from a CDN provider adding a 750ms loop to ensure tracking data gets sent before navigating to the next page
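
Reconstructed from that description, the pattern looks roughly like this (a sketch of the observed behavior, not the provider's actual code; the helper name is hypothetical):

  // Sketch of the pattern described above: fire the tracking request, then
  // busy-wait ~750ms in onbeforeunload so the browser does not cancel the
  // request when it navigates away. sendTrackingRequest() is hypothetical.
  window.onbeforeunload = function () {
    sendTrackingRequest(); // asynchronous XMLHttpRequest with the tracking data

    var start = new Date().getTime();
    while (new Date().getTime() - start < 750) {
      // intentionally block the main thread until the request has left
    }
  };

The cost is visible in the timeline: every page transition pays an extra 750ms before the next page can even start loading.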

Conclusion
When you are responsible for a website that is going to see high user volume for a marketing campaign, you want to:

  1. Make sure the additional content for that marketing campaign is optimized, e.g., serve it from CDNs and follow the Best Practices for Web Performance Optimization
  2. Make sure you test your application with the campaign-specific features under realistic load
  3. Analyze the impact of third-party tracking tools and other widgets you put on your page

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor in the Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi.


