How Bad Outdated JavaScript Libraries Are for Page Load Time

Make sure you stay up-to-date with your library versions.

Last week at Velocity we hosted a Birds of a Feather (BoF) session and offered to analyze attendees' web sites using dynaTrace Ajax Edition. Besides the typical performance problems (no cache settings, too many images, non-minified content, …), we found several sites that had one interesting problem in common: OLD VERSIONS of JavaScript libraries such as YUI, jQuery or Spry.

Why are outdated JavaScript Libraries a problem?
JavaScript libraries such as jQuery provide functions that make it easy for web developers to accomplish common tasks, e.g., changing the style of certain DOM elements. Most of these libraries therefore provide methods called $, $$ or find that locate DOM elements by ID, tag name, CSS class name or specific DOM attribute values.
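To illustrate, here is a simplified sketch, not any library's actual source, of how such a shorthand method might dispatch on the selector prefix ("#" for ID, "." for class name, anything else treated as a tag name):

```javascript
// Simplified sketch of a library lookup shorthand (illustrative only,
// not taken from jQuery, Spry or any other real library).
function lookup(doc, selector) {
  if (selector.charAt(0) === '#') {
    // by ID, e.g. lookup(document, '#header')
    return [doc.getElementById(selector.slice(1))];
  }
  if (selector.charAt(0) === '.') {
    // by CSS class name, e.g. lookup(document, '.prev_label')
    return doc.getElementsByClassName(selector.slice(1));
  }
  // by tag name, e.g. lookup(document, 'div')
  return doc.getElementsByTagName(selector);
}
```

Which native DOM call backs each branch, and how fast it is, differs between browsers and browser versions, and that is exactly where the problem described below comes from.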

The following is a screenshot of the Performance Report analyzing the Boston Bruins Page on msn.foxsports.com using Firefox 3.6. The Performance Report highlights 3 interesting things:

  • High Client Time (3.5 seconds in JavaScript execution until the Page was Fully Loaded)
  • 39 JavaScript blocks that executed longer than 20ms
  • 38 calls to Spry.$$ for looking up an element by Class Name with an average of 80ms execution time
Long running CSS Class Name Lookups contribute about 80% to the Client Side Load Time


Looking up elements by class name takes 80ms on average. We see 38 calls with a total JavaScript execution time of 3 seconds. Is that normal? Even on Firefox?

The History of CSS Class Name Lookups
Looking up elements by CSS class name is a very popular lookup method. Browsers such as Firefox and Chrome have long supported this lookup natively via a method called getElementsByClassName. Older versions of Internet Explorer, however, do not support it.

The lack of a native implementation in IE led many JS libraries to implement the missing feature in pure JavaScript. These implementations traverse the complete DOM tree and perform a string match on each element's CSS class name property. This works well and lets web developers use one library across all browsers and browser versions. The downside of this approach is that traversing the DOM is one of the slowest operations you can perform in the browser: the larger the DOM, the longer these lookup methods take to execute. I've blogged about this several times: 101 on jQuery Selector Performance, 101 on Prototype CSS Selectors, Top 10 Client-Side Performance Problems.
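Such a pure-JavaScript emulation looks roughly like the following; this is an illustrative sketch, not the actual source of any of these libraries:

```javascript
// Illustrative sketch of the DOM-traversal emulation older libraries
// shipped for IE (not any library's actual source): recurse through the
// whole tree and string-match each element's className property.
function walkForClass(node, cls, results) {
  results = results || [];
  // substring match keeps the sketch short; real libraries match
  // whole, whitespace-separated class tokens
  if (node.className && node.className.indexOf(cls) !== -1) {
    results.push(node);
  }
  var children = node.childNodes || [];
  for (var i = 0; i < children.length; i++) {
    walkForClass(children[i], cls, results); // visits EVERY node
  }
  return results;
}
```

Every element in the document is visited on every lookup, which is why the cost grows linearly with DOM size.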

The good news is that newer versions of Internet Explorer support the Selector API, allowing JavaScript frameworks to leverage this native implementation and thereby optimize lookup performance.

Why is Spry.$$ so slow on Firefox?
Let’s get back to the example above. We learned that newer browsers support lookups by class name natively, and that Firefox, Chrome, Safari, Opera, … have had these native lookup methods for a while. Does this mean that 80ms for looking up elements by class name is normal? No, it is not! It should take a fraction of that. Here is why.

The following screenshot shows the JavaScript PurePath of Spry.$$(“.prev_label”). The PurePath shows us what happens internally in this method and why it takes 170ms for that call to complete:

Spry.$$ iterates through all DOM Elements and matches the CSS Class Name


Instead of leveraging Firefox's native getElementsByClassName method, the implementation of Spry.$$ uses its own. Calling getElementsByTagName with no specific tag name returns ALL DOM elements – 2787 in total on that page. The library then iterates through all of these DOM elements, reading the className DOM property and matching it against “prev_label”. Accessing 2787 DOM properties is a costly operation and takes 170ms in this instance. That is why Spry.$$ takes so long.
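Based on that description, the pattern looks roughly like this; a reconstruction for illustration, not Spry's actual code:

```javascript
// Reconstruction of the described pattern (illustrative, not Spry's
// actual source): fetch ALL elements, then read and compare the
// className property of each one -- 2787 reads on the page analyzed above.
function scanAllForClass(doc, cls) {
  var all = doc.getElementsByTagName('*'); // every element on the page
  var hits = [];
  for (var i = 0; i < all.length; i++) {
    // one costly DOM property access per element
    if ((' ' + all[i].className + ' ').indexOf(' ' + cls + ' ') !== -1) {
      hits.push(all[i]);
    }
  }
  return hits;
}
```

The loop itself is cheap JavaScript; it is the thousands of className property reads crossing into the DOM that dominate the 170ms.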

Solve this problem by upgrading your Libraries
Older versions of JavaScript libraries didn't necessarily check the browser's capabilities for looking up DOM elements; they always used their own implementation. Newer versions do check those capabilities and take advantage of every performance enhancement they can get from the available native implementations.
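The capability check such newer versions perform can be sketched like this (function shape and names are illustrative, not any specific library's code):

```javascript
// Sketch of a capability-checking lookup (illustrative): prefer the
// browser's native implementations and fall back to a manual scan only
// when they are missing.
function byClassName(doc, cls) {
  if (typeof doc.getElementsByClassName === 'function') {
    return doc.getElementsByClassName(cls);   // native fast path
  }
  if (typeof doc.querySelectorAll === 'function') {
    return doc.querySelectorAll('.' + cls);   // Selector API
  }
  // last resort: scan every element, as old IE versions require
  var all = doc.getElementsByTagName('*');
  var hits = [];
  for (var i = 0; i < all.length; i++) {
    if ((' ' + all[i].className + ' ').indexOf(' ' + cls + ' ') !== -1) {
      hits.push(all[i]);
    }
  }
  return hits;
}
```

On a modern browser this takes the native branch and never pays the traversal cost; only legacy IE falls through to the slow scan.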

On this particular web site – and also on the others we looked at during the BoF session – an older version of the JavaScript library was used. Upgrading to a newer version would solve this problem. I know, upgrading is not always as easy as it sounds: lots of testing is involved to ensure that your web site still works as expected. On the other hand, cutting page load time by 3 seconds by “just” updating your JavaScript library is a strong argument for doing so.

Conclusion
Make sure you stay up-to-date with your library versions. Library providers focus heavily on performance and make sure their libraries are optimized for the different browsers out there.

Tips and Tricks
The dynaTrace Performance Report identifies slow-running JavaScript handlers and also identifies the jQuery $ method. In the case of Spry.$$ (or other lookup methods your library provides), the report won't list these methods in the Contributor list. A little trick, however, helps you identify how much performance impact these methods have on your page: drill into the Hotspot view, type “Spry.$$” and sort by Total Sum. This shows you all invocations of Spry.$$ and how much time they take to execute:

Analyzing your JavaScript Lookup methods in the Hotspot View


Learn how to use the different views and features of dynaTrace Ajax Edition by watching the 15 available video tutorials.

Related reading:

  1. Slow Page Load Time in Firefox caused by old versions of YUI, jQuery, …
  2. Uncover JavaScript JSON Parsing errors with IE Compatibility Mode
  3. Testing and Optimizing Single Page Web 2.0/AJAX Applications – Why Best Practices alone don’t work any more
  4. Top 10 Client-Side Performance Problems in Web 2.0
  5. Antivirus Add-On for IE to cause 5 times slower page load times

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
