Frank's Java Code Stack #4 Using Message Digest Stream


(November 12, 2002) - In Java Code Stacks #1 and #3, we looked at code snippets for both symmetric and asymmetric cryptography. But many of our applications, such as password authentication and logon verification, need a simpler way of creating a digest of a given string or message. A message digest is a hash algorithm that takes as input a message of arbitrary length and produces as output a fixed-length fingerprint, or message digest, of the input (160 bits in the case of SHA-1, which we use below). Digest algorithms are central to digital signature applications, where a large file must be condensed in a secure manner before being signed with the private key of a public-key crypto model. But instead of passing a byte array to the digest engine, we can pipe a stream to the digest object and read or write the digest directly through the stream. This week we'll build one such class, which computes the digest of a string and writes it to a stream.
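Before looking at the stream version, it may help to see the simplest form of MessageDigest use: hashing a byte array directly. This is a minimal sketch; the class and method names here are our own, not part of the article's listing.

```java
import java.security.MessageDigest;

public class Sha1Demo {
    static byte[] sha1(String s) throws Exception {
        // SHA-1 maps input of any length to a 160-bit (20-byte) digest
        MessageDigest md = MessageDigest.getInstance("SHA1");
        return md.digest(s.getBytes("UTF-8"));
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Digest length in bytes: " + sha1("hello").length); // 20
    }
}
```

Whatever the input length, the output is always 20 bytes, which is what makes a digest a practical "fingerprint" for comparison.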

Code:

  import java.io.*;
  import java.security.*;

  public class MsgStream {
      public static void main(String[] args) {
          try {
              /* Let us write the digest to a file stream */
              FileOutputStream fos = new FileOutputStream("MyDigest");

              /* We are using the SHA-1 algorithm */
              MessageDigest md = MessageDigest.getInstance("SHA1");

              /* A transparent stream that updates the associated
                 message digest using the bits going through the stream */
              DigestOutputStream dos = new DigestOutputStream(fos, md);
              ObjectOutputStream oos = new ObjectOutputStream(dos);

              /* String to be processed */
              String text = "This class works with Digest Streams";
              oos.writeObject(text);
              oos.flush();

              /* Before writing the digest itself to the stream, turn
                 the digest function OFF. While it is off, calls to the
                 write methods do not update the message digest. */
              dos.on(false);
              oos.writeObject(md.digest());
              oos.close();
          } catch (Exception e) {
              e.printStackTrace();
          }

          try {
              /* Read the digest back from the file stream */
              FileInputStream fis = new FileInputStream("MyDigest");

              MessageDigest md = MessageDigest.getInstance("SHA1");

              DigestInputStream dis = new DigestInputStream(fis, md);
              ObjectInputStream ois = new ObjectInputStream(dis);
              String text = (String) ois.readObject();

              /* We got the original text -- not the digest! */
              System.out.println(text);

              /* Stop updating the digest before reading the stored digest */
              dis.on(false);
              byte[] rdigest = (byte[]) ois.readObject();

              /* Compare the digest of the string with the original digest */
              if (MessageDigest.isEqual(md.digest(), rdigest))
                  System.out.println("Valid Messg.");
              else
                  System.out.println("Invalid Messg.");
              ois.close();
          } catch (Exception e) {
              e.printStackTrace();
          }
      }
  }

As you can see, DigestOutputStream allows us to write data to any output stream and calculate the message digest of that data transparently as it passes through the stream. Note that unlike the usual message digest calculation, which involves only the data, we are calculating the digest over the serialized String object, which can include additional information, such as class metadata, along with the data.
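That last point is worth seeing concretely. The following is a hypothetical helper, not part of the article's listing, that compares the digest of a string's raw bytes against the digest of its serialized form; they differ because serialization prepends a stream header and type information.

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.security.MessageDigest;

public class SerializedVsRaw {
    static boolean digestsMatch() throws Exception {
        String text = "This class works with Digest Streams";

        // Digest of the raw UTF-8 bytes only
        byte[] rawDigest = MessageDigest.getInstance("SHA1")
                .digest(text.getBytes("UTF-8"));

        // Digest of the serialized String object: the stream header
        // and type information are hashed along with the characters
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(buf);
        oos.writeObject(text);
        oos.flush();
        byte[] serDigest = MessageDigest.getInstance("SHA1")
                .digest(buf.toByteArray());

        return MessageDigest.isEqual(rawDigest, serDigest);
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Digests match: " + digestsMatch()); // false
    }
}
```

So two parties comparing digests this way must agree not just on the data, but on the exact bytes that went through the stream.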

Assignment:
Try to build a Message Authentication Code (MAC) -- a keyed, secure message digest -- by using any standard encryption engine.
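As a hint toward the assignment: rather than hand-rolling a MAC from an encryption engine, the JCE also exposes HMAC directly through the javax.crypto.Mac class. The sketch below (class and method names are our own) produces a keyed digest; unlike a plain digest, it cannot be recomputed without the secret key.

```java
import javax.crypto.KeyGenerator;
import javax.crypto.Mac;
import javax.crypto.SecretKey;

public class MacDemo {
    static byte[] macOf(String message) throws Exception {
        // Generate a fresh secret key suitable for HmacSHA1
        SecretKey key = KeyGenerator.getInstance("HmacSHA1").generateKey();

        // Keyed digest: same message with a different key yields a different tag
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(key);
        return mac.doFinal(message.getBytes("UTF-8"));
    }

    public static void main(String[] args) throws Exception {
        System.out.println("MAC length in bytes: " + macOf("message").length); // 20
    }
}
```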
