

Challenges in Tracing JavaScript Performance by Example

Week 8 of our 2010 Application Performance Almanac

In an earlier article I discussed several approaches to end-user experience (or performance) monitoring, including their pros and cons. In this article I will present a simple real-world sample which shows the limits of performance traceability in AJAX applications.

As I don’t like Hello World samples, I thought I’d rather build something a bit more useful. The sample uses the Twitter API to search for keywords. The search itself is triggered by typing into a textbox. While the sample isn’t spectacular from a technical perspective, I will make it more interesting by adding some “technical salt” – rather than sugar.

Building the Sample Page

So let us start looking at the code. Below you find the skeleton of our page. The code is straightforward: we have a textbox and a table. I am using jQuery for convenience here – especially because of some innerHTML bugs in IE – but the sample does not require it.

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>This is the untraceable AJAX Demo</title>
<script type="text/javascript" src="./jquery.js"></script>
<script type="text/javascript">
// we add more code later here …
</script>
</head>
<body>
<input type="text" id="search">
<br />
<table id="myTable"></table>
</body>
</html>

The textbox has a keyup event listener which invokes the following code. Instead of directly triggering an XHR request, we only write the value of the textbox to a global variable. We do this to reduce the number of network requests.

var value = "";
$(function (){
  $('#search').keyup(function (){
    value = $(this).val();
  });
});

Then we define another method which takes the variable value and dynamically creates a script tag with the entered search term as a query parameter. The callback parameter specifies which function will be called as part of the returned JSONP. This is really straightforward thanks to the Twitter API. The method is invoked every 1.5 seconds by the setInterval call shown below.

function queryTwitter (){
  if(value != ""){
    var head = document.getElementsByTagName('head')[0];
    var elem = document.createElement('script');
    elem.setAttribute('src', 'http://search.twitter.com/search.json?callback=writeToTable&q=' + value);
    head.appendChild(elem);
    value = "";
  }
}
setInterval(queryTwitter, 1500);

When the script is returned the method below is invoked. It simply clears the table and then adds a row containing an image and text for each tweet.

function writeToTable (results){
  var table = $('#myTable');
  var tweets = results.results;
  table.empty();
  for (var i in tweets){
    table.append('<tr><td><img src="' + tweets[i].profile_image_url + '"/></td><td><b>' +
      tweets[i].from_user + ' says: </b> <i>' + tweets[i].text + '</i></td></tr>');
  }
}

You can try the service here if you want. That’s all the code for our sample – really simple, isn’t it? Now let’s come to the question we want to answer: how long does it take from entering the text until the result is shown on the screen? This is simply the performance perceived by the end user.
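Before we dig into the tool-based analysis, consider what a naive in-page measurement would look like: simply timestamp the keystroke and the table update. The following is only a sketch of my own (typedAt and domUpdatedAt are made-up helper names, not part of the sample), and it already hints at the core problem discussed below – it captures the DOM update, not the moment the browser actually paints the result.

var typedAt = 0; // hypothetical helper variable, not part of the original sample

$('#search').keyup(function (){
  typedAt = new Date().getTime(); // remember when the user typed
  value = $(this).val();
});

function writeToTable (results){
  // ... populate the table exactly as shown above ...
  var domUpdatedAt = new Date().getTime(); // elements are in the DOM now
  // domUpdatedAt - typedAt still misses the browser's layout and painting time
}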

While this question looks simple at first sight, it is really tricky to answer. I must admit that I built the example in a way that is difficult to trace :-) . However, I am using only common AJAX techniques. Let’s discuss the problems in detail:

Using a Variable for Execution Delay

Using a variable for delayed execution causes some problems. There is no direct link between the JavaScript code executed in the event handler and the code executed by the timer, so these two calls have no direct relation and we will not be able to capture them in a single JavaScript trace.

If we have access to the internals of the framework we can overcome this problem by using explicit markers. I am using the dt_addMark JavaScript function, which creates a marker in the free dynaTrace AJAX Edition. I am doing this for the event handler as well as the queryTwitter method. We can now correlate the keystroke to the timer method invocation.
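A minimal sketch of how this instrumentation might look – I am assuming here that dt_addMark takes a single marker-name string; the guard simply turns the calls into no-ops when dynaTrace AJAX Edition is not active:

function mark(name){ // no-op when dynaTrace AJAX Edition is not active
  if (typeof dt_addMark === 'function') dt_addMark(name); // assumed signature: one marker-name string
}

$('#search').keyup(function (){
  mark('search.keyup'); // marker in the event handler
  value = $(this).val();
});

function queryTwitter (){
  mark('queryTwitter'); // marker in the timer-driven worker method
  // ... rest of queryTwitter as shown above ...
}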


Figure: Tracing Asynchronous Execution using Markers

Using a Script Tag for Populating the Table

The next problem we face is that we use a script tag rather than an XHR request to populate the table. Therefore we again have no direct link between the queryTwitter method and the execution of the script block. However we can find the execution of the script block in the PurePath view.
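For contrast, a hypothetical XHR-based version of the same call would keep the request and its callback on one directly traceable path. This ignores the same-origin restriction that is the very reason the sample uses JSONP, so assume '/proxy/search.json' is a made-up same-origin proxy:

function queryTwitterViaXhr (){ // hypothetical alternative, not part of the sample
  if (value != ""){
    // jQuery issues a real XMLHttpRequest here, so the request and the
    // callback stay linked on a single traceable path
    $.getJSON('/proxy/search.json?q=' + encodeURIComponent(value), writeToTable);
    value = "";
  }
}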


Figure: Relating Script Execution in the PurePath View

Using this information we find the respective script execution and can calculate the time from the initial typing to the populated table. Well, we have to be more precise: what we actually know is when the elements were added to the DOM. In order to understand when the user sees the data, we have to master another challenge.

Relating Rendering to JavaScript Execution

This is the trickiest part of our analysis. We now need to correlate JavaScript execution to the rendering it causes. I’ve already explained how rendering in Internet Explorer works in another post. In our case, rendering will be performed after inserting the elements into the DOM. In dynaTrace AJAX Edition we can identify this by looking at the JavaScript execution and searching for nodes saying Rendering (Scheduling Layout Task …). We then search further down the PurePaths until we find the related – by number – layout task and the following drawing task. Doing this, we now know when the user sees the first information after typing in a search string.


So we managed to get all the information after all. Why did I call this untraceable? Well, first we required knowledge of the internal event processing and needed to add custom instrumentation to link the event listener code to the actual worker code. While this was easy in this specific case, it will get a lot harder if you try to do the same for a full-fledged JavaScript framework.

Secondly, we had to do a lot of manual correlation. dynaTrace AJAX Edition as well as tools like Speed Tracer are of great help here, as they provide all the required information – including rendering times. Nevertheless, this analysis required a thorough understanding of JavaScript techniques. Additionally, we have to keep in mind that we were doing this in a lab environment where we had full freedom regarding how to trace our code. The story will be a very different one as soon as we try to collect the same information from end users who have no plug-in installed and where the complete correlation must be performed automatically. In that case we will not be able to trace end-user performance.


So, what’s the conclusion? Analyzing the performance of JavaScript execution easily becomes a complex task which requires proper tooling and a lot of human expertise. Given these preconditions, measuring end-user performance is doable. However, as soon as we move to real end users the task becomes nearly impossible. Current approaches to end-user performance management still have to improve to provide the insights needed to measure accurate end-user performance. This is true for browsers, frameworks, and analysis toolkits.

Challenge Me ;-)

I tried my best to analyze the sample and give an accurate measurement of end-user performance using dynaTrace AJAX Edition. However, I am interested in other approaches to measuring end-user performance for this sample.

This article is part of the dynaTrace 2010 Application Performance Almanac

Related reading:

  1. Performance Analysis of dynamic JavaScript Menus
  2. Week 15 – Optimizing Data Intensive Web Pages by Example
  3. The Real Performance Overhead of CSS Expressions
  4. Garbage Collection in IE7 heavily impacted by number of JavaScript objects and string sizes
  5. Challenges of Monitoring, Tracing and Profiling your Applications running in “The Cloud”

More Stories By Alois Reitbauer

Alois Reitbauer is Chief Technical Strategist at Dynatrace. He has spent most of his career building monitoring tools and fine-tuning application performance. A regular conference speaker, blogger, author, and sushi maniac, Alois currently shares his professional time between Linz, Boston, and San Francisco.
