By Debu Panda
December 4, 2012 09:06 AM EST
I met several customers in the past few weeks who are evaluating Application Performance Management (APM) solutions. They are facing a lot of challenges with their existing investments in an older generation of APM solutions. In this blog, I will outline some of the shortcomings of APM 1.0 tools that make them unfit for today’s applications.
What is APM 1.0?
Customers have been managing application performance since the early days of the mainframe. However, Application Performance Management as a discipline has gained popularity only in the past decade.
Let me first explain what I mean by APM 1.0. Enterprise applications and technologies such as Java have evolved over the past two decades. The APM 1.0 tools were invented more than a decade ago, and they provided great benefits in resolving application issues that were prevalent with early versions of Java and .NET. However, Java and .NET application servers have since matured and no longer suffer from those issues. Enterprise application architectures and technologies have also changed drastically, and the APM 1.0 tools have not kept up. The following figure shows the evolution of enterprise Java over the past 15 years and when APM 1.0 and APM 2.0 tools started emerging.
Following are a few of the challenges with APM 1.0 tools that you will run into when trying to manage your enterprise applications.
Challenge 1: Not enough focus on end-user or visibility for business critical transactions
The application owner and the application support team primarily care about the user experience and the service levels delivered by their applications. APM 1.0 tools, however, were primarily built to monitor applications from an application infrastructure perspective.
These tools lack the capability to monitor applications from the real user’s perspective and to help you isolate whether an issue is caused by the network, the load balancers, an ADN such as Akamai, the application itself, or the database. Some of these solutions were quick to add basic end-user monitoring capabilities such as synthetic monitoring. However, application support personnel still have to switch between multiple consoles and manually correlate data between the end-user monitoring and deep-dive application monitoring tools.
These tools also do not let you trace a real user request down to the line of code. That means you are flying blind when users are impacted, and you struggle to find what is causing the application failure.
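To make the idea of tracing a real user request concrete, here is a minimal sketch of trace-ID propagation: a trace ID is minted at the edge, carried in a header across tiers, and every tier records a timed span against it, so a slow request can be followed back to the code that caused it. All names here (`Tracer`, `X-Trace-Id`) are hypothetical illustrations, not any vendor’s API.

```python
import time
import uuid
from contextlib import contextmanager

# Hypothetical header name for illustration only.
TRACE_HEADER = "X-Trace-Id"

class Tracer:
    """Collects timed spans for a single user request, keyed by a trace ID."""
    def __init__(self):
        self.spans = {}  # trace_id -> list of (tier, elapsed_seconds)

    def start_trace(self, headers):
        # Reuse the caller's trace ID if one arrived; otherwise mint one at the edge.
        trace_id = headers.get(TRACE_HEADER) or uuid.uuid4().hex
        headers[TRACE_HEADER] = trace_id  # propagate downstream
        return trace_id

    @contextmanager
    def span(self, trace_id, tier):
        start = time.perf_counter()
        try:
            yield
        finally:
            elapsed = time.perf_counter() - start
            self.spans.setdefault(trace_id, []).append((tier, elapsed))

# Simulated request passing through three tiers under one shared trace ID.
tracer = Tracer()
headers = {}
tid = tracer.start_trace(headers)
with tracer.span(tid, "load-balancer"):
    with tracer.span(tid, "app-server"):
        with tracer.span(tid, "database"):
            pass  # the real work would happen here

# Spans are recorded on exit, innermost first.
print([tier for tier, _ in tracer.spans[tid]])
# → ['database', 'app-server', 'load-balancer']
```

Because every tier tags its timing with the same ID, the support team can see in one place which tier consumed the time for a specific user’s request, instead of manually correlating separate consoles.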
Challenge 2: Built for Development and not suitable for production monitoring
APM 1.0 deep-dive monitoring tools were primarily built to diagnose issues during the application development lifecycle. They morphed into production deep-dive monitoring tools when the need arose for APM in production environments. As a result, these tools were never optimized for production monitoring and require a lot of effort to tune for production.
First, the complexity of agent installation and configuration hinders deployment in production environments. Second, these tools usually require configuration changes every time new application code is rolled out.
Most damagingly, they impose high overhead on application performance and do not scale beyond 100-150 application servers. As a result, most customers use them only in a test environment, or enable deep-dive monitoring retroactively after an application failure, hoping the problem will recur.
Finally, these tools do not provide operations-friendly UIs, because they were originally built for developers.
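One reason deep-dive agents overwhelm production servers is that they capture full detail on every call. A common way to bound that cost is sampling: record full detail for only one in N requests and take the cheap path otherwise. The sketch below illustrates the idea in miniature; the sampling rate and function names are assumptions for illustration, not any particular product’s behavior.

```python
import itertools
import time

_counter = itertools.count()
SAMPLE_EVERY = 100  # assumed rate: full detail for 1 in 100 calls
detailed_records = []

def sampled_trace(func):
    """Wrap a handler so that only sampled calls pay the measurement cost."""
    def wrapper(*args, **kwargs):
        if next(_counter) % SAMPLE_EVERY == 0:
            start = time.perf_counter()
            result = func(*args, **kwargs)
            detailed_records.append((func.__name__, time.perf_counter() - start))
            return result
        return func(*args, **kwargs)  # fast path: no timing overhead
    return wrapper

@sampled_trace
def handle_request():
    return "ok"

for _ in range(1000):
    handle_request()

print(len(detailed_records))  # 10 of the 1000 calls captured in detail
```

An APM 1.0 agent that instruments everything, all the time, pays the slow path on every request; that is the overhead that pushes customers to turn deep-dive monitoring off in production.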
Challenge 3: High Cost of Ownership
As I alluded to earlier, the old generation of APM tools is very complex to configure, because these tools require application knowledge, manual instrumentation, and complex agent deployment. Expensive consultants are therefore needed to deploy, configure, and maintain them. These tools also have multiple consoles, adding to the total cost of ownership. Some customers told me that they spend a lot of time managing these APM tools rather than managing their applications.
Conclusion: A poor fit for today’s applications
These tools were built more than a decade ago and have not evolved much, even though application architectures, technologies, and methodologies have gone through drastic changes.
Many of the customers I met were of the opinion that they spend more time managing their APM solution than managing their applications. If you use any of the APM 1.0 tools to manage a modern application, you are likely in the same boat. Here are some customer expectations for a modern APM solution:
- It reduces your MTTR by quickly pinpointing business-critical issues with always-on, user-centric, deep application visibility
- It is non-invasive: it requires no changes to application code, needs no manual instrumentation, and auto-discovers your transactions, frameworks, etc.
- It provides quick time to value and ease of use with a single, integrated APM console
- It is purpose-built for cloud applications
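The second expectation, non-invasive auto-discovery, can be sketched as follows: the application module is never edited; instead, an agent wraps its public methods at load time so every call is timed automatically. The class and method names are hypothetical, and this is only a toy illustration of the wrapping technique, not how any specific APM product works.

```python
import functools
import time

# --- application code (untouched by the agent) ---
class OrderService:
    def place_order(self, item):
        return f"ordered {item}"

# --- agent side: discover and wrap public methods ---
timings = []

def instrument(cls):
    """Wrap every public callable on cls to record call durations."""
    for name in dir(cls):
        if name.startswith("_"):
            continue
        original = getattr(cls, name)
        if not callable(original):
            continue
        @functools.wraps(original)
        def wrapper(self, *args, _orig=original, _name=name, **kwargs):
            start = time.perf_counter()
            try:
                return _orig(self, *args, **kwargs)
            finally:
                timings.append((_name, time.perf_counter() - start))
        setattr(cls, name, wrapper)

instrument(OrderService)  # no edits to OrderService itself
print(OrderService().place_order("book"))  # prints "ordered book"
```

After `instrument` runs, every call to `place_order` lands in `timings` with its duration, yet the application source never changed; that is the essence of a non-invasive agent.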
APM 1.0 tools certainly cannot satisfy these needs. In the next blog, I will discuss how an APM 2.0 solution like BMC Application Management addresses the challenges of APM 1.0 products and helps you manage applications better, improving customer satisfaction and, ultimately, your bottom line.