Razor Profiler - An Automated JavaScript Profiling Tool

Razor Profiler (beta), an online Ajax profiling tool, is now available for public review at http://www.razorspeed.com.

What Is It?
Razor Profiler (beta) is a web-based Ajax profiling tool that helps web developers understand and analyze the runtime behavior of their JavaScript code in a cross-browser environment. Razor Profiler can be accessed online as a service or downloaded to run locally. Some Razor Profiler screenshots are shown below:

Screenshots: Files Tab, Top Call Stacks, Call Stack Visualization

Why Razor Profiler?
The amount of JavaScript code on the client side is increasing significantly with the growing popularity of Ajax and Web 2.0.
Web developers rely on JavaScript heavily these days in order to deliver a richer user experience.
Including either a home-grown or third-party JavaScript library plus application-specific code, today’s web applications can easily contain several thousand, or even tens of thousands, of lines of JavaScript code.

The footprint of client-side JavaScript can range from tens of kilobytes to hundreds of kilobytes. This amount of JavaScript can work magic for application functionality and user experience, but it also raises many questions.

As an application developer:

  • How do you measure the runtime behavior of your code on the vast array of client platform and browser combinations?
  • Do you know why the same code works well on one browser but performs very poorly on a different browser?
  • Do you know whose code is causing problems: yours, or a third-party library's?
  • Do you know exactly where the performance bottleneck is?

If you are a JavaScript library developer, it is even more important for you to understand the runtime behavior of your code, since it will be used by other developers in many different ways.

However, there is no easy way to obtain answers to the above questions. It is difficult to study the runtime behavior of JavaScript applications in a cross-browser environment:

  • Different browsers (Internet Explorer, Firefox, Safari, etc.) have different runtime behaviors. The same code can behave very differently on different browsers;
  • There is a lack of tooling that supports JavaScript debugging and profiling. JavaScript has evolved from being perceived as a “toy language” to being a heavily used mainstream programming language for writing web applications, but JavaScript tooling has not caught up yet.

Razor Profiler aims to help solve this problem. It is a JavaScript profiling tool that makes it easy for web developers to profile their Ajax code in a cross-browser environment.

Razor Profiler Features
Razor Profiler automates JavaScript profiling:

  • Automation: no application code change required. Razor Profiler automatically collects all the necessary data and presents them to web developers for analysis.
  • Runs on any browser: web developers can profile any JavaScript application on any browser. There is nothing to install on the client side.
  • Rich lexical analysis: Razor Profiler presents rich lexical information about the application, such as file information (number, response status, size, MIME type, percentage, etc.),
    tokens (size, file, percent, count), and functions (size, file, name…), etc.
  • Profile scenario recording: Razor Profiler enables web developers to selectively record the scenarios that they are interested in. Only recorded scenarios will be used in analysis.
  • Call stack analysis: for each recorded scenario, Razor Profiler presents all the call stacks in the order of their occurrence. Web developers can drill into each call stack to find out
    the duration of the stack, all the function calls in the stack, and the duration of each call.
  • Function analysis: for each JavaScript function in the application, Razor Profiler presents the number of times it has been invoked, the duration of each invocation, and the call stacks that invoked it (see the sketch below).
  • Data visualization with graphing and charting: Razor Profiler presents top call stacks, top function calls of each stack, top recorded scenarios, etc. using visual charts and graphs to help web developers
    better understand the runtime behavior of their application. For example, each call stack is visualized as an intuitive Gantt chart.

Some Razor Profiler screenshots are available here.
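
To make the automation and function analysis features more concrete, the sketch below shows the kind of per-function data (invocation counts and call durations) that a developer would otherwise have to collect by hand. This is a hypothetical example, not Razor Profiler code or its API; the stats object, the profiled wrapper, and the renderGrid function are illustrative names only.

  // Hypothetical sketch: NOT Razor Profiler's code or API.
  // Shows the per-function data (invocation count, durations) that
  // would otherwise have to be collected manually.
  var stats = {};                      // functionName -> { count, durations }

  function profiled(name, fn) {
    // Wrap a function so that each invocation is timed and counted.
    return function () {
      var start = new Date().getTime();
      try {
        return fn.apply(this, arguments);
      } finally {
        var elapsed = new Date().getTime() - start;
        var entry = stats[name] || (stats[name] = { count: 0, durations: [] });
        entry.count += 1;
        entry.durations.push(elapsed);
      }
    };
  }

  // Example usage with a hypothetical application function:
  var renderGrid = profiled('renderGrid', function (rows) {
    for (var i = 0; i < rows; i++) { /* ... render one row ... */ }
  });
  renderGrid(1000);
  renderGrid(5000);
  // stats.renderGrid now holds the invocation count and per-call durations.

Razor Profiler collects this kind of information automatically for every function in the application, with no changes to the application code.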

How Does Razor Profiler Work?
Razor Profiler consists of a server component that runs inside a standard Java EE Servlet engine, and a JavaScript-based client component that runs inside any browser. Once the Razor server is started, you can profile your JavaScript application by entering the start URL of your application into Razor Profiler and running through your test scenarios.
Razor Profiler will automatically record the data and visualize it for your analysis. No client-side installation, browser configuration change, or application code change is required.
In order to achieve this, Razor Profiler goes through five different phases:

  • Application retrieval: Once a web developer enters the application start URL into Razor Profiler, the Razor Profiler client component ("the client") sends this URL to the Razor Profiler server component ("the server").
    The server performs the actual retrieval of this URL. After additional server processing (such as lexical analysis and code injection, see below), the retrieved content is sent to the client side to be displayed in a new browser window. From the developer's point of view, the application is launched and running in this new browser window.

    In this process, the Razor Profiler server acts like a "proxy server". However, it is not really a proxy server, and there is no need for developers to reconfigure their browser proxy settings.
  • Lexical analysis: Once the server retrieves the application URL, it performs lexical analysis of the returned content by identifying and analyzing JavaScript files, functions, tokens, etc. The results are sent to the client for display.
  • Code injection: Upon lexical analysis of the JavaScript code, the server injects "probe" code into the application's JavaScript sources before returning them to the client. These injected "probes" enable automatic collection of application runtime data and save developers from doing so manually (see the sketch after this list).
  • Runtime data capture: Once the application's JavaScript code is running on the client side and as developers run through the desired profile scenarios, the injected "probes" automatically collect all the necessary data and send it to the Razor Profiler client.
  • Data analysis: When the developer finishes recording scenarios and starts data analysis, the Razor Profiler client analyzes all the collected data and presents the results.
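
To illustrate the code injection and runtime data capture phases, here is a minimal, hypothetical sketch of the general function-wrapping technique that such "probes" are based on. It is not Razor Profiler's actual probe code; the enter/exit helpers, the callStack and recorded arrays, and the loadData function are assumptions made purely for illustration.

  // Hypothetical sketch: a simplified illustration of probe injection,
  // not Razor Profiler's actual probes. Assume the server has rewritten
  // each application function to call enter()/exit() around its body.
  var callStack = [];      // currently executing (instrumented) frames
  var recorded  = [];      // completed frames with timing data

  function enter(fnName) {
    var frame = { name: fnName, depth: callStack.length,
                  start: new Date().getTime() };
    callStack.push(frame);
    return frame;
  }

  function exit(frame) {
    frame.duration = new Date().getTime() - frame.start;
    callStack.pop();
    recorded.push(frame);  // later used by the client UI for analysis
  }

  // What an application function might look like after injection.
  // Original:  function loadData(url) { /* ... */ }
  function loadData(url) {
    var __frame = enter('loadData');
    try {
      /* ... original function body ... */
    } finally {
      exit(__frame);
    }
  }

In Razor Profiler, this kind of wrapping is performed automatically by the server during the code injection phase, so the developer never has to write or maintain the probes.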

How To Get Razor Profiler?
Just go to http://www.razorspeed.com and download it. Follow the installation instructions to install the Razor Profiler server. Or, you can try it as an online service.
Feel free to post comments at the Razor Profiler online forum.

More Stories By Coach Wei

Coach Wei is founder and CEO of Yottaa, a web performance optimization company. He is also founder and Chairman of Nexaweb, an enterprise application modernization software company. Coding, running, magic, robots, big data, speed... are among his favorite things (not necessarily in that order; his coding capability is really at PowerPoint level right now). Caffeine, doing something entrepreneurial, and getting out of sleeping are three reasons that he gets up in the morning and gets really excited.


