The Super Bowl Effect on Website Performance

Synthetic monitoring of a website

Whether or not you are a fan of U.S. football, it was really hard to avoid this huge sports event on February 5. In addition to the actual game, it's the Super Bowl commercials that - besides being very expensive to air - usually drive a lot of load on the websites of the companies running the ads. The question is whether the millions of dollars spent actually drive consumers to these websites and get them to do business with these companies.

Since the top brands that advertised won't tell us their actual conversion rates, we can instead look at the end-user experience and performance of their websites while the ads were airing. By analyzing data from continuous synthetic monitoring combined with deep-dive browser diagnostics, we can see whether a web application was actually able to handle the load or left too many users with a bad experience.

Use Case: What Was the User Experience at One of the Top Brands on Super Bowl Day?
To avoid any debate about whether the numbers we present are good or bad - and whether this company did a good or a terrible job - we present all this data without naming the company; the purpose of this exercise is to show how to perform this type of analysis.

Synthetic Monitoring of a Web Site
To monitor the performance of the site, we set up a Gomez Synthetic Monitor that executes on a schedule. The monitor not only measured the performance of the initial home page but also walked through several main use-case scenarios on that page, e.g., searching for a product. These tests were executed from multiple U.S. locations, over low- and high-bandwidth connections, and in both the Internet Explorer and Firefox browsers. In addition, we captured individual performance sessions using dynaTrace Browser Diagnostics.
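The core metric such a monitor reports is the full page load time. The Gomez scripting details are proprietary, but the same number can be sketched from the W3C Navigation Timing API (`window.performance.timing`, available in IE9+ and Firefox); the recorded values below are illustrative, not actual measurements:

```javascript
// Sketch: computing page load time the way a synthetic monitor reports it,
// from the (legacy) Navigation Timing interface. Field names are the
// standard window.performance.timing properties.
function pageLoadTimeMs(timing) {
  // Total time from the start of navigation until the load event finished.
  return timing.loadEventEnd - timing.navigationStart;
}

// In a real browser: pageLoadTimeMs(window.performance.timing);
// Here we feed it recorded values, e.g. a 9-second load as seen before the game:
const recorded = { navigationStart: 1328450000000, loadEventEnd: 1328450009000 };
console.log(pageLoadTimeMs(recorded)); // 9000
```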

The following screenshot shows the response time of the home page of this website, measured through our Last Mile platform in a Firefox browser, in the days before the Super Bowl as well as during the game. The average page load time was around 9 seconds, with the exception of the timeframe during the Super Bowl - that's when it jumped up to 30 seconds:

Synthetic Monitoring shows peak during the Super Bowl of up to 30s to load the Homepage

Analyzing Page Load Time from Monitoring Data
There were two factors that drove this spike:

  1. Higher load on their application due to the commercial aired during the Super Bowl
  2. Additional content on their page that doubled the page size

Factor 1: Higher Load
Higher load is, of course, intended - it means the marketing campaign worked. What you need to ensure is that you can handle the additional expected load: use CDNs (Content Delivery Networks) to deliver static content to your end users, and provision additional resources on your web and application servers for the extra load. To be prepared, it's highly recommended that you do up-front, large-scale load testing. For more on that, read my recent blog post To Load Test or Not to Load Test: That Is Not the Question.
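Offloading static content to a CDN is often just a matter of rewriting asset URLs at render time so that images, stylesheets, and scripts are fetched from the CDN's edge servers instead of the origin. A minimal sketch, with hypothetical host names:

```javascript
// Sketch (hypothetical host names): rewrite static asset URLs so they are
// served from a CDN edge instead of the origin web server.
const ORIGIN = 'https://www.example-brand.com';
const CDN = 'https://cdn.example-brand.com';

function toCdnUrl(url) {
  // Only static content is offloaded; dynamic pages stay on the origin.
  const isStatic = /\.(png|jpe?g|gif|css|js)$/.test(url);
  return isStatic ? url.replace(ORIGIN, CDN) : url;
}

console.log(toCdnUrl(ORIGIN + '/img/superbowl-banner.jpg'));
// -> https://cdn.example-brand.com/img/superbowl-banner.jpg
```

In practice this rewriting is usually done by the templating layer or the CDN's own origin-pull configuration rather than hand-written code, but the principle is the same.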

Factor 2: Additional Content
The average size of the home page jumped from 776 KB to 1,497 KB - nearly double the page size. The reason is additional images and content displayed on the home page when it was accessed during the Super Bowl. The Gomez Monitor as well as the dynaTrace Browser Diagnostics sessions provide a detailed resource analysis that immediately highlights the additional images, stylesheets, and JavaScript files. The following shows some of the additional Super Bowl-related content, including size and download time:

Additional content downloaded during the Super Bowl resulting in up to 3 times higher page load times

The additional content in this case may have been necessary to fulfill the company's marketing campaign. The question is whether it could have been optimized to avoid downloading 40 additional resources with a total size of more than 700 KB.
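Tallying the weight of extra campaign resources is the kind of check a resource waterfall (or, in modern browsers, the Resource Timing API) makes straightforward. The entries below are illustrative, not the actual measured resources:

```javascript
// Sketch: summing the count and total size of additional campaign resources,
// the same arithmetic behind the "40 resources, > 700 KB" observation above.
function tallyResources(entries) {
  return {
    count: entries.length,
    totalKb: entries.reduce((sum, e) => sum + e.sizeKb, 0),
  };
}

// Hypothetical campaign extras for illustration:
const campaignExtras = [
  { url: '/img/superbowl-bg.jpg', sizeKb: 450 },
  { url: '/css/campaign.css', sizeKb: 60 },
  { url: '/js/campaign-widgets.js', sizeKb: 220 },
];
console.log(tallyResources(campaignExtras)); // { count: 3, totalKb: 730 }
```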

Deep Dive Browser Diagnostics
Looking at the dynaTrace Browser Diagnostics Data we can observe several interesting aspects.

Observation 1: Load Impact of additional resources
We can see the actual impact of the additional resources downloaded during the game. Two of these resources took a very long time and had a major impact on page load time. The background image was delivered by their own web server rather than a CDN, which meant more traffic on their web servers and suboptimal performance for end users far away from their data centers. Another interesting aspect was an additional Facebook Like button that took 600ms to download and execute.
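Third-party widgets such as the Like button are best loaded asynchronously so a slow third-party server cannot block page rendering. A sketch of the standard async script-injection pattern (the element id and SDK URL follow Facebook's documented snippet of the time; the function wrapper is ours for illustration):

```javascript
// Sketch: inject a third-party SDK script asynchronously so it does not
// block HTML parsing or delay the page load event's critical path.
function loadFacebookSdk(d) {
  if (d.getElementById('facebook-jssdk')) return false; // already loaded
  const js = d.createElement('script');
  js.id = 'facebook-jssdk';
  js.async = true; // download in parallel, execute when ready
  js.src = '//connect.facebook.net/en_US/all.js#xfbml=1';
  d.getElementsByTagName('head')[0].appendChild(js);
  return true;
}

// In the browser: loadFacebookSdk(document);
```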

dynaTrace Timeline View showing the impact of additional resources on the page load time

Observation 2: Page Transition Impact of CDN User Tracking Logic
Some tracking tools send their data in the onbeforeunload event handler. Modern browsers, however, don't allow onbeforeunload handlers to run for long or to reliably send data. The workaround is an "artificial" JavaScript loop that spins for 750ms to ensure the browser sends the AJAX request with the tracking data before navigating to the next page. We can see this behavior on this page as well:

JavaScript tracking code from a CDN provider adding a 750ms loop to ensure tracking data gets sent before navigating to the next page
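The workaround looks roughly like the sketch below (the helper names are hypothetical; the 750ms figure matches what was observed on this page). Note that the busy-wait burns CPU and delays every page transition by 750ms for every visitor:

```javascript
// Sketch of the workaround described above: a busy-wait loop that stalls
// navigation long enough for an asynchronous tracking request to leave
// the browser before the page is torn down.
function busyWait(ms) {
  const start = Date.now();
  while (Date.now() - start < ms) {
    // intentionally empty: burn CPU until the time is up
  }
  return Date.now() - start; // elapsed milliseconds, always >= ms
}

// In the page (hypothetical handler and helper names):
// window.onbeforeunload = function () {
//   sendTrackingBeacon(); // fire-and-forget AJAX request with tracking data
//   busyWait(750);        // stall so the request actually gets sent
// };
```

Modern browsers later added navigator.sendBeacon for exactly this use case, which queues the data for delivery without any artificial delay.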

Conclusion
When you are responsible for a website that is going to see high user volume for a marketing campaign you want to:

  1. Make sure the additional content for that marketing campaign is optimized, e.g., make sure this content is on CDNs and follow the Best Practices for Web Performance Optimization
  2. Make sure you test your application with the campaign-specific features under realistic load
  3. Analyze the impact of third-party tracking tools and other widgets you put on your page

More Stories By Andreas Grabner

Andreas Grabner has more than a decade of experience as an architect and developer in the Java and .NET space. In his current role, Andi works as a Technology Strategist for Compuware and leads the Compuware APM Center of Excellence team. In his role he influences the Compuware APM product strategy and works closely with customers in implementing performance management solutions across the entire application lifecycle. He is a frequent speaker at technology conferences on performance and architecture-related topics, and regularly authors articles offering business and technology advice for Compuware’s About:Performance blog.

