
Top Seven Website Performance Indicators to Monitor

Whatever the reason for a website crashing or slowing down, it’s bad for business and for your online reputation

Poorly performing websites, like Twitter during its recent Ellen-selfie fiasco, are a constant source of irritation for users. At first you think it's your computer, or maybe someone on your block is downloading the entire "Game of Thrones" series. But, when nothing changes after refreshing the page once or twice, you give up, mutter under your breath, and move on.

Whatever the reason for a website crashing or slowing down, it's bad for business and for your online reputation. According to a survey conducted by Consumer Affairs, a dissatisfied customer will tell between 9 and 15 people about their experience. And, if your website doesn't load fast enough (within 400 milliseconds), most of your customers will go looking for another website.

Understanding how your website performs under pressure is extremely important for any company. But, it can be daunting trying to figure out what website performance indicators you should monitor.

We have compiled a list of the top seven website performance indicators we believe to be important. Make sure to track each of these to guarantee a great customer experience.

Top Seven Website Performance Indicators

1. Uptime
Monitoring the availability of your website is without a doubt the single most important part of website monitoring. Ideally, you should constantly check the uptime of your key pages from different locations around the world. Measure how many minutes your site is down over a period of two weeks or a month, and then express that as a percentage.
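As a rough illustration, here is a minimal Python sketch of such a check. It assumes the third-party `requests` package is installed, and the URL, check interval, and number of checks are placeholders rather than recommendations; a real monitor would run continuously from multiple locations.

```python
# Minimal uptime-check sketch (assumes the `requests` package is installed).
import time
import requests

URL = "https://www.example.com/"   # hypothetical key page
CHECK_INTERVAL = 60                # seconds between checks
NUM_CHECKS = 10                    # in practice, run continuously for weeks

failures = 0
for _ in range(NUM_CHECKS):
    try:
        resp = requests.get(URL, timeout=10)
        if resp.status_code >= 500:   # treat server errors as downtime
            failures += 1
    except requests.RequestException:  # timeouts, DNS failures, refused connections
        failures += 1
    time.sleep(CHECK_INTERVAL)

uptime_pct = 100.0 * (NUM_CHECKS - failures) / NUM_CHECKS
print(f"Uptime over {NUM_CHECKS} checks: {uptime_pct:.2f}%")
```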

2. Initial Page Speed
Consumers' behavior and tolerance thresholds have changed. People who browse a website now expect it to load in the blink of an eye. If it doesn't load quickly, they will leave and turn to a competitor's site. You can check your website's speed with ping requests (measuring the time from your location until the website starts responding) and with loading-time measurements, for example timing how long it takes to download the source code of a web page. Note that this measurement reflects the time it takes for the raw page to load, but that isn't the complete user experience. For that, you must measure...
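A loading-time measurement of the raw source code can be as simple as the following sketch (again assuming `requests` is installed; the URL is a placeholder):

```python
# Sketch: time how long it takes to fetch the raw HTML of a page.
import time
import requests

URL = "https://www.example.com/"

start = time.perf_counter()
resp = requests.get(URL, timeout=10)
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"Status {resp.status_code}: {len(resp.content)} bytes of source "
      f"downloaded in {elapsed_ms:.0f} ms")
```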

3. Full Page Load Time including images, videos, etc.
This performance indicator is usually called End User Experience testing. It's the amount of time it takes for all the images, videos, dynamically loaded (AJAX) content, and everything else seen by the user to appear on their screen. This is different from the time it takes for the raw file to download to the device it's going to display on (as indicated above).

Both full page load time and page speed are important to measure because you can employ different strategies to optimize for both of them. Images, videos, and other static content can be cached on separate, dedicated systems or content delivery networks (CDNs), while dynamic content might need dedicated servers and fast databases. Knowing how your website behaves as it scales will help you put the right infrastructure in place.
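One common way to approximate full page load time is to drive a real browser and read its Navigation Timing data. The sketch below assumes the `selenium` package and a matching ChromeDriver are installed, and the URL is a placeholder; dedicated end-user-experience monitoring tools do this across many browsers and locations.

```python
# Sketch: measure full page load time (images, scripts, and other resources
# fetched before the load event) with a headless browser.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

URL = "https://www.example.com/"

opts = Options()
opts.add_argument("--headless=new")
driver = webdriver.Chrome(options=opts)
try:
    driver.get(URL)  # blocks until the page's load event fires
    # Navigation Timing: elapsed time from navigation start to load event end
    load_ms = driver.execute_script(
        "const t = window.performance.timing;"
        "return t.loadEventEnd - t.navigationStart;"
    )
    print(f"Full page load time: {load_ms} ms")
finally:
    driver.quit()
```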

4. Geographic Performance
If you are a globally active company or have customers in different parts of the world, understanding your geographic performance - your website's speed and availability in different locations - is extremely important. Your ultimate goal is to make sure your website is easily accessible to all visitors, regardless of their location, so that every one of them gets an excellent customer experience.

Many companies ignore this factor, only testing performance in familiar geographies. At a minimum, use your website analytics as a guide to put testing in place that shadows the locations from which your visitors are accessing your site.
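One hedged sketch of the idea: route the same request through probes in different regions and compare response times. The proxy endpoints below are purely hypothetical placeholders; in practice you would use a monitoring service's probe locations or your own machines in each region.

```python
# Sketch: compare response times as seen from different (hypothetical) regional probes.
import time
import requests

URL = "https://www.example.com/"
REGIONAL_PROXIES = {  # hypothetical probe proxies, one per region
    "us-east": "http://probe-us-east.example.net:3128",
    "eu-west": "http://probe-eu-west.example.net:3128",
    "ap-southeast": "http://probe-ap-southeast.example.net:3128",
}

for region, proxy in REGIONAL_PROXIES.items():
    start = time.perf_counter()
    try:
        requests.get(URL, proxies={"http": proxy, "https": proxy}, timeout=15)
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{region}: {elapsed_ms:.0f} ms")
    except requests.RequestException as exc:
        print(f"{region}: FAILED ({exc})")
```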

5. Website Load Tolerance
Do you know how many visitors it takes to considerably slow down your website? It's an important indicator to understand: if you are running aggressive marketing campaigns or get picked up by the press, your website can be flooded with visitors in a matter of minutes.

Regularly run stress tests and compare the results to your visitor numbers at peak times. Once you understand how much load your website can handle, you can adjust your infrastructure to meet the demand. Look for those "tipping points" so you won't be caught by surprise when traffic spikes.
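For a feel of what a tipping point looks like, here is a deliberately tiny stress-test sketch that ramps concurrency and watches median response time. It assumes `requests` is installed, the URL and step sizes are illustrative, and dedicated load-testing tools simulate realistic user behavior far better than this.

```python
# Sketch: ramp concurrency and watch where median latency starts to climb.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "https://www.example.com/"

def timed_request(_):
    start = time.perf_counter()
    try:
        requests.get(URL, timeout=10)
    except requests.RequestException:
        return None  # count as an error
    return (time.perf_counter() - start) * 1000

for concurrency in (5, 10, 20, 40):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(timed_request, range(concurrency * 5)))
    ok = [r for r in results if r is not None]
    errors = len(results) - len(ok)
    median = statistics.median(ok) if ok else float("nan")
    print(f"{concurrency} concurrent users: median {median:.0f} ms, {errors} errors")
```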

6. Web Server CPU Load
CPU usage is a common culprit in website failures. Too much processing bogs down absolutely everything on the server without much indication as to where the problem lies. You can prevent web server failures by monitoring CPU usage regularly. If you cannot install monitoring software on your web servers due to hosting arrangements or other constraints, consider running a script that publishes available disk space and CPU load to a very simple HTML page.
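A minimal version of that script might look like the following, using only the Python standard library. Note that os.getloadavg() is Unix-only and the output path is a placeholder for wherever your web server serves static files.

```python
# Sketch: periodically write CPU load and free disk space to a simple HTML
# status page that an external monitor can poll.
import os
import shutil
import time

STATUS_PAGE = "/var/www/html/status.html"   # hypothetical web-served path

while True:
    load1, load5, load15 = os.getloadavg()          # 1/5/15-minute load averages
    disk = shutil.disk_usage("/")
    free_gb = disk.free / (1024 ** 3)
    html = (
        "<html><body>"
        f"<p>load_1m={load1:.2f} load_5m={load5:.2f} load_15m={load15:.2f}</p>"
        f"<p>disk_free_gb={free_gb:.1f}</p>"
        f"<p>updated={time.strftime('%Y-%m-%d %H:%M:%S')}</p>"
        "</body></html>"
    )
    with open(STATUS_PAGE, "w") as f:
        f.write(html)
    time.sleep(60)
```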

7. Website Database Performance
Your database can be one of the most problematic parts of your website. A poorly optimized query, for example, can be the difference between a zippy site and an unusable one. It's important to monitor your database logs closely: create alerts that fire when the logs contain certain error messages or when queries return results outside of expected norms. Use the database's built-in capabilities to see which queries are taking the most time, and identify ways to optimize them through indexes and other techniques. Most importantly, monitor the overall performance of the database to make sure it's not a bottleneck.
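A simple starting point is a log scanner like the sketch below. The log path and line format are assumptions (PostgreSQL-style "duration: ... ms" entries, which appear when slow-statement logging is enabled); adapt the patterns to whatever your database writes.

```python
# Sketch: scan a database log for error messages and slow statements.
import re

LOG_PATH = "/var/log/postgresql/postgresql.log"   # hypothetical path
SLOW_MS = 500.0                                   # alert threshold

error_pattern = re.compile(r"\b(ERROR|FATAL|deadlock)\b", re.IGNORECASE)
duration_pattern = re.compile(r"duration:\s+([\d.]+)\s+ms")

with open(LOG_PATH) as log:
    for line in log:
        if error_pattern.search(line):
            print(f"ALERT (error): {line.strip()}")
        match = duration_pattern.search(line)
        if match and float(match.group(1)) > SLOW_MS:
            print(f"ALERT (slow query, {match.group(1)} ms): {line.strip()}")
```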

No Downtime = Happy Customers
If you can monitor all seven of these metrics, you should have a good idea of how your website performs and what needs to change when it doesn't perform well. Minimizing website downtime will keep your customers happy. If you have any questions about these metrics or load testing, let me know.

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
