
Challenges in Tracing JavaScript Performance by Example

Week 8 of our 2010 Application Performance Almanac

In an earlier article I discussed several approaches to end-user experience (or performance) monitoring, including their pros and cons. In this article I will present a simple real-world sample that shows the limits of performance traceability in AJAX applications.

As I don’t like Hello World samples, I thought I’d rather build something a bit more useful. The sample uses the Twitter API to search for keywords. The search itself is triggered by typing into a textbox. While the sample isn’t spectacular from a technical perspective, I will make it more interesting by adding some “technical salt” – rather than sugar.

Building the Sample Page

So let us start looking at the code. Below you find the skeleton of our page. The code is straightforward: we have a textbox and a table. I am using jQuery for convenience here – especially to work around some innerHTML quirks in Internet Explorer. However, the sample does not require it.

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>This is the untraceable AJAX Demo</title>
<script type="text/javascript" src="./jquery.js"></script>
<script type="text/javascript">
// we add more code later here …
</script>
</head>
<body>
<input type="text" id="search">
<br />
<table id="myTable"></table>
</body>
</html>

The textbox has a keyup event listener that invokes the following code. Instead of directly triggering an XHR request, we only write the value of the textbox to a global variable. We do this to reduce the number of network requests.

var value = "";
 
$(function () {
  $('#search').keyup(function () {
    value = $(this).val();
  });
});
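The pattern above – write to a variable in the event handler, consume it on a timer – can be sketched framework-free. The names below (`createThrottledSearch`, `onKeyUp`, `poll`) are illustrative, not part of the original sample:

```javascript
// Sketch of the "write to a variable, poll on a timer" throttling pattern.
// The event handler only records the latest value; the timer callback fires
// at most one request per interval, no matter how fast the user types.
function createThrottledSearch(send) {
  var pending = "";            // last value typed since the previous poll
  return {
    onKeyUp: function (text) { // event handler: just record the value
      pending = text;
    },
    poll: function () {        // timer callback: fire at most one request
      if (pending !== "") {
        send(pending);
        pending = "";
      }
    }
  };
}

// Usage: in the browser you would wire poll() to setInterval(..., 1500).
var sent = [];
var search = createThrottledSearch(function (q) { sent.push(q); });
search.onKeyUp("dyna");
search.onKeyUp("dynatrace"); // overwrites: only the latest value is kept
search.poll();               // sends "dynatrace"
search.poll();               // nothing pending, no request
```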

Then we define another method which takes the variable value and dynamically creates a script tag with the entered search term as a query parameter. The callback parameter specifies which function will be called by the returned JSONP. This is straightforward thanks to the Twitter API. The method is invoked every 1.5 seconds using the setInterval call below.

function queryTwitter() {
  if (value != "") {
    var head = document.getElementsByTagName('head')[0];
    var elem = document.createElement('script');
    elem.setAttribute('src',
      'http://search.twitter.com/search.json?callback=writeToTable&q=' +
      encodeURIComponent(value));
    head.appendChild(elem);
    value = "";
  }
}
 
setInterval (queryTwitter, 1500);
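The URL construction is worth isolating, because the search term must be URL-encoded before it is concatenated into the query string. A minimal sketch – `buildSearchUrl` is a hypothetical helper, not part of the original sample:

```javascript
// Builds the JSONP search URL. encodeURIComponent guards against spaces
// and special characters in the user-typed search term.
function buildSearchUrl(term, callbackName) {
  return 'http://search.twitter.com/search.json' +
         '?callback=' + callbackName +
         '&q=' + encodeURIComponent(term);
}
```

For example, `buildSearchUrl('hello world', 'writeToTable')` yields `http://search.twitter.com/search.json?callback=writeToTable&q=hello%20world`.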

When the script is returned, the method below is invoked. It simply clears the table and then adds a row containing an image and text for each tweet.

function writeToTable(results) {
  var table = $('#myTable');
  table.children().remove();
  var tweets = results.results;
  for (var i = 0; i < tweets.length; i++) {
    table.append('<tr><td><img src="' + tweets[i].profile_image_url +
      '"/></td><td><b>' + tweets[i].from_user + ' says: </b><i>' +
      tweets[i].text + '</i></td></tr>');
  }
}

You can try the service here if you want. That's all the code for our sample. Really simple, isn't it? Now let's turn to the question we want to answer: how long does it take from entering the text until the result is shown on the screen? This is simply the performance perceived by the end user.

While this question looks simple at first sight, it is really tricky to answer. I must admit that I built the example in a way that makes it difficult to trace :-) . However, I am using only common AJAX techniques. Let's discuss the problems in detail:

Using a Variable for Execution Delay

Using a variable for delayed execution causes some problems. There is no direct link between the JavaScript code executed in the event handler and the code executed by the timer, so we cannot capture the interaction in a single JavaScript trace; the two calls have no direct relation to each other.
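If the application code can be modified, the missing link can also be restored by hand: stamp the pending value with a correlation id in the event handler and report the same id from the timer callback, so a trace tool or a plain log can join the two executions. A sketch under that assumption – all names here (`mark`, `onKeyUp`, `onTimer`) are hypothetical:

```javascript
// Correlation sketch: both trace entries carry the same id, which is the
// manual equivalent of what a marker API provides.
var nextId = 0;
var trace = [];

function mark(label, id) {   // stand-in for a tracing tool's marker API
  trace.push({ label: label, id: id, t: Date.now() });
}

var pending = null;

function onKeyUp(text) {     // event handler: record value plus an id
  pending = { value: text, id: ++nextId };
  mark('keyup', pending.id);
}

function onTimer() {         // timer callback: report the same id
  if (pending) {
    mark('queryTwitter', pending.id);
    pending = null;
  }
}

// Usage: the shared id links the two otherwise unrelated executions.
onKeyUp('performance');
onTimer();
```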

If we have access to the internals of the framework, we can overcome this problem by using explicit markers. I am using the dt_addMark JavaScript function, which creates a marker in the free dynaTrace AJAX Edition. I do this for the event handler as well as for the queryTwitter method. We can now correlate the keystroke to the timer method invocation.

 

Tracing Asynchronous Execution using Markers

Using a Script Tag for Populating the Table

The next problem we face is that we use a script tag rather than an XHR request to populate the table. Again there is no direct link between the queryTwitter method and the execution of the returned script block. However, we can find the execution of the script block in the PurePath view.

 

Relating Script Execution in the PurePath View

Using this information we find the respective script execution and can calculate the time from the initial typing to the populated table. Well, we have to be more precise: what we know is when the elements were added to the DOM. To understand when the user actually sees the data, we have to master another challenge.
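Once start and end points have been identified, the arithmetic itself is trivial. A minimal sketch over a hypothetical list of recorded markers – the format and labels are illustrative, not what any particular tool emits:

```javascript
// Computes the elapsed time between the first marker with startLabel and
// the last marker with endLabel; returns null if either is missing.
function elapsedMs(markers, startLabel, endLabel) {
  var start = null, end = null;
  for (var i = 0; i < markers.length; i++) {
    if (start === null && markers[i].label === startLabel) start = markers[i].t;
    if (markers[i].label === endLabel) end = markers[i].t;
  }
  return (start !== null && end !== null) ? end - start : null;
}

// Usage with invented timestamps (milliseconds):
var sample = [
  { label: 'keyup',        t: 1000 },  // user types the search term
  { label: 'queryTwitter', t: 2500 },  // timer fires, script tag inserted
  { label: 'domUpdated',   t: 2830 }   // writeToTable finished
];
// elapsedMs(sample, 'keyup', 'domUpdated') gives 1830 for this sample
```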

Relating Rendering to JavaScript Execution

This is the trickiest part of our analysis. We now need to correlate JavaScript execution to the rendering it causes. I've already explained how rendering in Internet Explorer works in another post. In our case, rendering is performed after inserting the elements into the DOM. In dynaTrace AJAX Edition we can identify this by looking at the JavaScript execution and searching for nodes saying Rendering (Scheduling Layout Task …). We then search further down the PurePaths until we find the related – by number – layout task and the following drawing task. Doing this, we now know when the user sees the first information after typing in a search string.

Untraceable?

So we managed to get all the information after all. Why did I call this untraceable? Well, first we required knowledge of the internal event processing and needed to add custom instrumentation to link the event listener code to the actual worker code. While this was easy in this specific case, it gets a lot harder if you try to do the same for a full-fledged JavaScript framework.

Secondly, we had to do a lot of manual correlation. dynaTrace AJAX Edition, as well as tools like SpeedTracer, is of great help here, as they provide all the required information – including rendering times. Nevertheless, this analysis required a thorough understanding of JavaScript techniques. Additionally, we have to keep in mind that we were working in a lab environment where we had full freedom in how to trace our code. The story will be a very different one as soon as we try to collect the same information from end users who have no plug-in installed and where the complete correlation must be performed automatically. In that case we will not be able to trace end-user performance.

Conclusion

So, what's the conclusion? Analyzing the performance of JavaScript execution easily becomes a complex task that requires proper tooling and a lot of human expertise. Given these preconditions, measuring end-user performance is doable. However, as soon as we move to real end users, the task becomes nearly impossible. Current approaches to end-user performance management still have to improve to provide the insights needed to measure accurate end-user performance. This is true for browsers, frameworks, and analysis toolkits.

Challenge Me ;-)

I tried my best to analyze the sample and give an accurate measurement of end-user performance using dynaTrace AJAX Edition. However, I am interested in other approaches to measuring end-user performance for this sample.

This article is part of the dynaTrace 2010 Application Performance Almanac


More Stories By Alois Reitbauer

Alois Reitbauer is Chief Technical Strategist at Dynatrace. He has spent most of his career building monitoring tools and fine-tuning application performance. A regular conference speaker, blogger, author, and sushi maniac, Alois currently shares his professional time between Linz, Boston, and San Francisco.
