Monitor Your Java Application Logs in Four Easy Steps

As systems administrators, we often find that application logs are the key to our success, but also our biggest hassle. They provide clues to what’s going on when things go awry, and in those situations more detail is generally better. But when you don’t actually know that something is wrong, and just want a sense of whether things are normal, more detail can create so much noise that it’s all but impossible to glean any useful information.


In those situations, you’d rather just have statistical information about what’s in your logs. In this article, I present a simple and easy solution to turn your logs into useful graphs, in real time. If you ever need to measure the volume of your logs, or perhaps graph the frequency of certain log events, then read on.

The tools

The solution I present uses four key tools:

  • Log4J (though plain log files would fit as well)
  • Logstash
  • StatsD
  • Monitis

With so many moving parts, you might be tempted to think this could be an overcomplicated solution. But in fact — as in the long tradition of Unix command line tools — it is a composition of simple tools each doing one job very well. As with files piped from one Unix command to another, these four components act as a pipeline for log events, with each piece adding value to the stream along the way.

Log4J

All of the log events in this article start inside of Log4J. If you run Java applications, Log4J provides an easy way to hook into your logs and peel off an event stream that you want to see graphed in Monitis. That said, Log4J could easily be replaced in this solution with plain log files, syslog, or any number of other logging frameworks.

The key modification we make to Log4J is adding a SocketAppender that sends a copy of events from selected Loggers to our logstash server.
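
Here is a minimal sketch of what that could look like in log4j.properties; the host name, port, and threshold are placeholders for illustration, not values from the original setup:

    # Attach the new appender to the root logger (or to selected Loggers)
    # alongside whatever appenders you already use.
    log4j.rootLogger=INFO, file, logstash

    # SocketAppender ships serialized LoggingEvents to the logstash server.
    log4j.appender.logstash=org.apache.log4j.net.SocketAppender
    log4j.appender.logstash.RemoteHost=logstash.example.com
    log4j.appender.logstash.Port=4560
    log4j.appender.logstash.ReconnectionDelay=10000
    # Optional: only forward WARN and above to keep the stream light.
    log4j.appender.logstash.Threshold=WARN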

Logstash

The role of logstash in the pipeline is twofold. First, it listens for connections from Java application servers, accepting streams of logs when they connect. Second, it filters, modifies, and routes those streams to the appropriate outputs. In this case, we’ll be handling all of the incoming streams by notifying StatsD each time a log event is received, without actually sending the content of each event.
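
As a rough sketch, a logstash configuration for this stage might look like the following, using the log4j input and statsd output plugins; the port, host, and metric naming are assumptions for illustration:

    input {
      log4j {
        mode => "server"
        port => 4560            # must match the SocketAppender's Port
      }
    }

    output {
      statsd {
        host => "localhost"     # where StatsD is listening
        namespace => "logs"
        # Bump one counter per log level; the resulting metric looks like
        # logs.<sending host>.WARN, since the sender defaults to %{host}.
        increment => "%{priority}"
      }
    }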

StatsD

Logstash will be receiving log events very frequently, but Monitis only wants to receive updates at most once per minute. To resolve this mismatch, StatsD acts as our log stream bean counter, allowing logstash to send increment messages each time an event is received. StatsD records these in counters for each type of log message, and then sends the counts on to Monitis every 60 seconds.
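
On the wire, those increments are ordinary StatsD counter packets sent over UDP (for example, logs.appserver01.WARN:1|c, where the host name is hypothetical). A minimal sketch of an Etsy StatsD config.js with the 60-second flush described above; the backend listed is only a local-testing placeholder, not the actual Monitis backend:

    {
      port: 8125,                        // UDP port the logstash statsd output sends to
      flushInterval: 60000,              // flush aggregated counts every 60 seconds
      backends: ["./backends/console"]   // placeholder; swap in a backend that forwards to Monitis
    }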

Monitis

Finally, we get to the end of the pipeline, and Monitis receives the count messages. These are added to the appropriate custom monitors, which are created automatically if they don’t already exist. Once the data is in Monitis, it can be graphed in the Web UI, or used to trigger alerts when the rate of log events crosses a user-specified threshold.

The gory details

Now that you’ve seen the overview, let’s take a look at the configuration details that make it happen. Don’t worry: since each component in the pipeline does a simple job, there’s really not much to it.

Install and configure the software

Let’s look at installation details for the tools at each step in the pipeline. I’m assuming that you already have Java applications using Log4J. If not, modifying the pipeline to read from plain log files, receive events via syslog, or use other sources is pretty straightforward, but outside the scope of this article; refer to the logstash documentation on setting up other kinds of inputs.
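
For instance, a rough sketch of swapping the Log4J input for a plain log file input (the path and type are placeholders):

    input {
      file {
        path => "/var/log/myapp/*.log"
        type => "myapp"
      }
    }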

More Stories By Hovhannes Avoyan

Hovhannes Avoyan is the CEO of PicsArt, Inc.
