
Load testing like a developer

Creating highly targeted load tests for your web server doesn’t have to be complicated. And it won’t be, at least not if you know what to look for.

The secret is to approach a load test with the mind of a developer. You’re not going to run a test just to punish your web server. You’re trying to gather valuable data that will aid you in keeping your web applications fast and responsive. To do this, let’s look at two simple steps to create a scientifically sound load test.

1. To plan your test, think like a user.
The first step is to outline exactly what type of web experience you’re testing for. Are you expecting users to log onto your site and remain for hours at a time? Or are a large number of users entering and leaving the site in short spurts? Or, are you expecting your about-to-be-released video to go viral, and you want to be sure your server can handle the load? No two websites are identical, and the same goes for the user experience on your site.

Let’s take a look at an example. An online shop might find it best to define the number of visits and purchases per hour that its servers must be able to handle. From those hourly targets, the shop can derive a rate of new visits and purchases per second. An example of this could be an hour-long test configured to send five new visitors to the site per second, simulate them surfing the product catalog, and have them start a new purchase every second, as in the sketch below.
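Here is a minimal sketch of that shop scenario written for Locust (https://locust.io), a Python load-testing tool. The endpoint paths and task weights below are hypothetical placeholders for our example shop, not a real API:

```python
from locust import HttpUser, task, between

class ShopVisitor(HttpUser):
    # Each simulated visitor pauses 1-5 seconds between actions,
    # roughly imitating a human browsing the catalog.
    wait_time = between(1, 5)

    @task(3)  # browsing is weighted as 3x more common than buying
    def browse_catalog(self):
        self.client.get("/catalog")
        self.client.get("/product/42")

    @task(1)
    def purchase(self):
        self.client.post("/purchase", json={"product_id": 42, "qty": 1})
```

You could then run it headless for an hour with something like `locust --headless -u 300 -r 5 --run-time 1h --host https://shop.example.com`. Note that Locust’s `-r` flag is a ramp-up rate toward the `-u` user cap, so this approximates, rather than exactly reproduces, a steady stream of five brand-new visitors per second.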

2. Find the data that really matters.
Now that your load test is up and running, your next step is to comb through the mountain of incoming data to understand what’s happening on the server. During the load test for our hypothetical online shop, new simulated users are arriving at the site while old simulated users are leaving it. To judge the response time of the web server, we need to look at the number of parallel users: at a fixed arrival rate, slow responses keep each user on the site longer, so the number of concurrent virtual users grows. If the server is fast, that number stays smaller.
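A useful rule of thumb here is Little’s Law: the average number of concurrent users equals the arrival rate multiplied by the average time each user spends on the site. A quick sketch with illustrative numbers (assumptions for this example, not measurements from a real test):

```python
# Little's Law: concurrent_users = arrival_rate * avg_time_on_site
arrival_rate = 5.0    # new visitors per second, from our test plan

fast_session_s = 60   # seconds per visit when the server responds quickly
slow_session_s = 180  # the same visit, stretched out by slow responses

print(arrival_rate * fast_session_s)  # 300.0 concurrent users
print(arrival_rate * slow_session_s)  # 900.0 concurrent users
```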

When the results of a load test show unacceptable response times, the network is often assumed to be the cause. In reality, this is seldom the case. Fortunately, it is easy to check using test measurement data: simply compare the response time for a static image (such as a JPEG, GIF, or PNG) with the response time for a dynamically created HTML page, scaling for any difference in size.

Consider this example:
A) The response time for a static GIF image of 200 kB is measured at 60 milliseconds.
B) The response time for a dynamically created HTML page with a size of 400 kB is measured at three seconds.

Assuming the response time of the static image depends solely on the network bandwidth (which is not always the case, but is useful for our purposes in this example), the dynamically created HTML page should have a response time of 120 milliseconds since it is only twice the size of the image. Since the actual response time of the HTML page is three seconds, the web application uses at least 2880 milliseconds (3000 minus 120) to produce and send the page.
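This back-of-the-envelope calculation is easy to automate. A small sketch, under the same network-bound assumption and using the numbers from the example above:

```python
def estimated_app_time_ms(static_kb, static_ms, dynamic_kb, dynamic_ms):
    """Estimate the time the application spends producing a dynamic page,
    assuming the static asset's response time is pure network transfer."""
    # Scale the network-only baseline to the dynamic page's size.
    expected_transfer_ms = static_ms * (dynamic_kb / static_kb)
    return dynamic_ms - expected_transfer_ms

# 200 kB GIF in 60 ms vs. 400 kB HTML page in 3000 ms:
print(estimated_app_time_ms(200, 60, 400, 3000))  # 2880.0
```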

However, one successful test does not guarantee a perfect application. You should run load tests after any modification to the web server or application (e.g., a new software release or a database upgrade). These tests should run for at least 30 minutes, because a web server often does not crash the instant it is overloaded. Many can handle a large load for a few minutes, but crash after 15 or 20 minutes.

With these tips in mind, you’ll be on track to pinpointing your application’s performance issues. Perfection takes time, but as you tweak your tests to emulate your real-world traffic, your response times should gradually improve.

More Stories By Sven Hammar

Sven Hammar is Co-Founder and CEO of Apica. In 2005, he had the vision of starting a new SaaS company focused on application testing and performance. Today, that concept is Apica, the third IT company he has helped found in his career.

Before Apica, he co-founded and launched Celo Communications, a security company built around PKI (e-ID) solutions. He served as CEO for three years and helped grow the company from five people to 85 in two years. Right before co-founding Apica, he served as Vice President of Marketing, Bank and Finance, at the security company Gemplus (GEMP).

Sven received his Master of Science in industrial economics from the Institute of Technology (LiTH) at Linköping University. When not working, you can find Sven golfing, working out, or spending time with family and friends.
