Behind the Scenes of Serialization in Java

The right serialization strategy is central for achieving performance and scalability

When building distributed applications, one of the central performance-critical components is serialization. Most modern frameworks make it very easy to send data over the wire, and in many cases you don't see at all what is going on behind the scenes. Choosing the right serialization strategy, however, is central for achieving good performance and scalability. Serialization problems affect CPU, memory, network load, and response times.

Java provides us with a large variety of serialization technologies, and the actual amount of data sent over the wire can vary substantially between them. We will use a very simple example where we send a first name, last name, and birthdate over the wire, and then see how big the actual payload gets.

Binary Data
As a reference we start by sending only the payload. This is the most efficient way of sending data, as there is no overhead involved. The downside is that, due to the missing metadata, the message can only be deserialized if we know the exact serialization method. This approach also involves a very high testing and maintenance effort, and we have to handle all implementation complexity ourselves. The figure below shows what our payload looks like in binary format.

Binary Representation of Address Entity
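
A minimal sketch of what such hand-rolled binary serialization could look like, using DataOutputStream; the field layout and the epoch-millisecond encoding of the birthdate are assumptions for illustration:

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class ManualBinarySerializer {

    // Writes only the raw field values; the receiver must know this exact layout.
    public static byte[] serialize(String firstName, String lastName,
                                   long birthDateMillis) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.writeUTF(firstName);        // length-prefixed modified UTF-8
        out.writeUTF(lastName);
        out.writeLong(birthDateMillis); // birthdate as epoch millis, 8 bytes
        out.flush();
        return bytes.toByteArray();
    }
}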

Java Serialization
Now we switch to standard Java serialization. As you can see below, we are now transferring much more metadata. This data is required by the Java runtime to rebuild the transferred object on the receiver side. Besides structural information, the metadata also contains versioning information, which allows communication across different versions of the same object. In reality, this feature often turns out to be harder to use than it initially looks. The metadata overhead in our example is rather high, which is caused by the large amount of data in the GregorianCalendar object we are using. The conclusion that Java serialization per se comes with a very high overhead, however, is not valid, as most of this metadata will be cached for subsequent invocations.

Person Entity with Java Serialization
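
A minimal sketch of standard Java serialization, assuming a hypothetical Person entity matching the example data:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Calendar;
import java.util.GregorianCalendar;

public class Person implements Serializable {

    // Versioning information that travels with the serialized metadata
    private static final long serialVersionUID = 1L;

    private String firstName = "Franz";
    private String lastName = "Musterman";
    private Calendar birthDate = new GregorianCalendar(1979, Calendar.AUGUST, 13);

    public byte[] toBytes() throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeObject(this); // writes the class descriptor (metadata) plus field values
        out.close();
        return bytes.toByteArray();
    }
}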
Java also provides the ability to override the serialization behavior using the Externalizable interface. This enables us to implement a more efficient serialization strategy. In our example we could serialize the birthdate as a single long rather than a full object. The downside again is the increased effort regarding testing and maintainability.
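
A sketch of such a custom strategy; the class name is hypothetical and a Date-typed birthdate field is assumed:

import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.util.Date;

public class ExternalizablePerson implements Externalizable {

    private String firstName;
    private String lastName;
    private Date birthDate;

    public ExternalizablePerson() { } // public no-arg constructor is required

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeUTF(firstName);
        out.writeUTF(lastName);
        out.writeLong(birthDate.getTime()); // 8 bytes instead of a full serialized object
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException {
        firstName = in.readUTF();
        lastName = in.readUTF();
        birthDate = new Date(in.readLong());
    }
}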

Java serialization is used by default in RMI communication when IIOP is not used as the protocol. Application server vendors also offer their own serialization stacks, which are more efficient than default serialization. If interoperability is not important, these provider-specific implementations are the better choice.

Alternatives
The Java ecosystem also provides interesting alternatives to Java serialization. A widely known one is Hessian, which can easily be used with Spring. Hessian allows an easy, straightforward implementation of services and uses a binary protocol underneath. The figure below shows our data serialized with Hessian. As you can see, the transferred data is very slim. Hessian therefore provides an interesting alternative to RMI.

Hessian Binary Representation of Person Object
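
A minimal sketch using the Caucho Hessian library directly; in a Spring setup the same wire format would typically be produced behind a remoted service interface instead:

import com.caucho.hessian.io.Hessian2Output;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class HessianExample {

    // Serializes a Serializable object using Hessian's binary protocol.
    public static byte[] toHessian(Object obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        Hessian2Output out = new Hessian2Output(bytes);
        out.writeObject(obj);
        out.close(); // flushes the internal buffer
        return bytes.toByteArray();
    }
}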

JSON
A newcomer among serialization formats is JSON (JavaScript Object Notation). Originally a text-based format for representing JavaScript objects, it has been adopted more and more in other languages as well. One reason is the rise of Ajax applications; another is the availability of frameworks for most programming languages.

As JSON is a purely text-based representation, it comes with a higher overhead than the previously shown serialization approaches. The advantage is that it is more lightweight than XML while still having good support for describing metadata. The listing below shows our person object represented in JSON.

{"firstName":"Franz",
"lastName":"Musterman",
"birthDate":"1979-08-13"
}
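
A sketch of producing this output with the Jackson library; the library choice is an assumption here, and note that the date format must be configured explicitly, since Jackson would otherwise emit a numeric timestamp:

import com.fasterxml.jackson.databind.ObjectMapper;
import java.text.SimpleDateFormat;

public class JsonExample {

    public static String toJson(Object person) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Render dates as "1979-08-13" instead of the default numeric timestamp
        mapper.setDateFormat(new SimpleDateFormat("yyyy-MM-dd"));
        return mapper.writeValueAsString(person);
    }
}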

XML
XML is certainly the standard format for exchanging data in heterogeneous systems. One nice feature of XML is its out-of-the-box support for data validation, which is especially important in integration scenarios. The amount of metadata, however, can become really high, depending on the mapping used. All data is transferred in text format by default; the use of CDATA sections, however, enables us to send binary data. The listing below shows our person object in XML. As you can see, the metadata overhead is quite high.

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<person>
  <birthDate>1979-08-13T00:00:00-07:00</birthDate>
  <firstName>Franz</firstName>
  <lastName>Musterman</lastName>
</person>
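
A minimal sketch that produces XML of this shape with JAXB; the class name is hypothetical, and JAXB maps the Calendar field to xsd:dateTime as shown above:

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.StringWriter;
import java.util.Calendar;

@XmlRootElement(name = "person")
class XmlPerson {
    public String firstName;
    public String lastName;
    public Calendar birthDate; // marshalled as xsd:dateTime
}

public class XmlExample {

    public static String toXml(XmlPerson person) throws Exception {
        Marshaller marshaller =
                JAXBContext.newInstance(XmlPerson.class).createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        StringWriter xml = new StringWriter();
        marshaller.marshal(person, xml);
        return xml.toString();
    }
}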

Fast InfoSet
Fast Infoset is becoming a very interesting alternative to XML. It is more or less a lightweight version of XML that reduces unnecessary overhead and redundancy in the data. This leads to smaller data sets and better serialization and deserialization performance.

When working with JAX-WS 2.0 you can enable Fast Infoset serialization by using the @FastInfoset annotation. The web service stack then automatically detects, via HTTP Accept headers, whether it can be used for cross-service communication.

When looking at data serialized using Fast Infoset, the main difference you will notice is that there are no end tags. Furthermore, after its first occurrence, each tag is referenced only by an index. There are a number of other indexes for content, namespaces, etc.

Data is prefixed with its length, which allows faster and more efficient parsing. Additionally, binary data can be sent directly, avoiding the Base64 encoding it would require in XML.

In tests with standard documents, the transfer size could be shrunk to only 20 percent of the original size and the serialization speed could be doubled. The listing below shows our person object now serialized with Fast Infoset. For illustration purposes I skipped the processing instructions and used a textual representation instead of binary values. Values in curly braces define indexed values; values in square brackets refer to the use of an index.

{0}<person>
{1}<birthDate>{0}1979-08-13T00:00:00-07:00
{2}<firstName>{1}Franz
{3}<lastName>{2}Musterman

The real advantage, however, can be seen when we look at what the next address object would look like. As the listing below shows, we can then work mostly with index data.

[0]<>
[1]<>{0}
[2]<>{3}Hans
[3]<>{4}Musterhaus
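
A sketch of producing a Fast Infoset document programmatically, assuming the Fast Infoset reference implementation (com.sun.xml.fastinfoset), which exposes the encoder through the standard StAX XMLStreamWriter interface:

import com.sun.xml.fastinfoset.stax.StAXDocumentSerializer;
import java.io.ByteArrayOutputStream;

public class FastInfosetExample {

    public static byte[] toFastInfoset() throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        // A StAX writer that emits the Fast Infoset binary encoding instead of text XML
        StAXDocumentSerializer writer = new StAXDocumentSerializer(bytes);
        writer.writeStartDocument();
        writer.writeStartElement("person");
        writer.writeStartElement("firstName");
        writer.writeCharacters("Franz");
        writer.writeEndElement();
        writer.writeStartElement("lastName");
        writer.writeCharacters("Musterman");
        writer.writeEndElement();
        writer.writeEndElement();
        writer.writeEndDocument();
        writer.close();
        return bytes.toByteArray();
    }
}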

Object Graphs
Object graphs can get quite tricky to serialize, and this form of serialization is not supported by all protocols. As we need to work with references to entities, the format used by the serialization approach must provide a proper language construct for them. While this is no problem for serialization formats used in (binary) RPC-style interactions, it is often not supported out-of-the-box by text-based protocols. XML itself, for example, supports serializing object graphs using references; the WS-I Basic Profile, however, forbids the use of the required language construct.

If a serialization strategy does not support this feature, it can lead to performance and functional problems, as entities get serialized individually for each occurrence of a reference. If we are, for example, serializing addresses which hold a reference to country information, this information will be serialized for each and every address object, leading to large serialization sizes.
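
A small sketch of this effect with hypothetical Address and Country classes: Java serialization writes the shared Country instance only once and emits back-references (handles) for further occurrences, whereas a format without reference support would duplicate it per address:

import java.io.*;

class Country implements Serializable {
    String name;
    Country(String name) { this.name = name; }
}

class Address implements Serializable {
    String city;
    Country country; // possibly shared by many addresses
    Address(String city, Country country) { this.city = city; this.country = country; }
}

public class GraphExample {
    public static void main(String[] args) throws Exception {
        Country austria = new Country("Austria");
        Address a1 = new Address("Linz", austria);
        Address a2 = new Address("Vienna", austria); // same Country instance

        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        // The shared Country is written once; its second occurrence
        // becomes a back-reference in the stream.
        out.writeObject(new Address[] { a1, a2 });
        out.close();

        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()));
        Address[] restored = (Address[]) in.readObject();
        System.out.println(restored[0].country == restored[1].country); // prints true
    }
}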

Conclusion
Today there are numerous ways to serialize data in Java. While binary serialization remains the most efficient approach, modern text-based formats like JSON or Fast Infoset provide valid alternatives, especially when interoperability is a primary concern. Modern frameworks often allow using multiple serialization strategies at the same time, so the approach can even be selected dynamically at runtime.


More Stories By Alois Reitbauer

Alois Reitbauer is Chief Technical Strategist at Dynatrace. He has spent most of his career building monitoring tools and fine-tuning application performance. A regular conference speaker, blogger, author, and sushi maniac, Alois currently shares his professional time between Linz, Boston, and San Francisco.
