By James H. Wong
December 9, 2012
Designing and implementing a hybrid encryption application is a big challenge, and without a supporting infrastructure it's almost impossible. There are open source libraries that allow you to encrypt a file, but they only provide the translation technique. After the information has been encrypted, how do you know what algorithm was used, who it was encrypted for, which version was used, and so on? In order to decrypt the protected message or file, a well-defined cryptographic header provides all the information required. This also applies if the encrypted data is digitally signed and the recipient wants to validate the signature.
This article will address one of the critical components of a supporting infrastructure by providing the design of a cryptographic header used to precede encrypted and/or digitally signed messages and files. The header is used within an application known as DocuArmor, designed by Logical Answers Inc. and written in Java using the cryptography library from the Bouncy Castle organization. The header stores the information used when encrypting and/or digitally signing a message or file, and allows the recipient to decrypt the information and/or verify the digital signature. With a properly designed header, a person can encrypt their personal files as well as exchange confidential messages and authenticate the sender.
In order to encrypt personal files and exchange protected data, we use a hybrid technique that combines two types of encryption: symmetric and asymmetric.
Symmetric encryption uses a single key to both hide and reveal the message. There are several symmetric algorithms available, such as AES (the Advanced Encryption Standard), but the important thing to remember is that the file is encrypted and decrypted using the same key. A simple example is the Caesar cipher, which shifts the letters of the alphabet by a specific number. If the shift is 2 (the single key), we get the following translation: a=c, b=d, c=e, ..., z=b.
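The Caesar cipher above can be sketched in a few lines of Java; the same shift value acts as the single key for both encryption (shift forward) and decryption (shift back):

```java
// Minimal sketch of the Caesar cipher: a shift of 2 maps a->c, b->d, ..., z->b.
public class CaesarCipher {
    static String shift(String text, int key) {
        StringBuilder out = new StringBuilder();
        for (char c : text.toCharArray()) {
            if (c >= 'a' && c <= 'z') {
                // wrap around the alphabet; +26 keeps negative keys in range
                out.append((char) ('a' + (c - 'a' + key % 26 + 26) % 26));
            } else {
                out.append(c); // leave non-letters unchanged
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String cipher = shift("attack", 2);   // encrypt with key 2 -> "cvvcem"
        String plain  = shift(cipher, -2);    // decrypt with the same key
        System.out.println(cipher + " " + plain);
    }
}
```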
Asymmetric encryption uses a pair of keys (public and private) to hide and reveal the message; the RSA algorithm is the most commonly used. The RSA algorithm is credited to Ronald Rivest, Adi Shamir, and Leonard Adleman, who described it in 1977. Often used within a Public Key Infrastructure (PKI), the public key is used to encrypt data and the private key is used to decrypt data.
Figure 1: Public and Private Key Functions
The hybrid technique uses the symmetric key to encrypt a file. The asymmetric public key is used to encrypt the symmetric key, which is placed in the header. When the recipient receives an encrypted file, the encrypted symmetric key is extracted from the header, decrypted using the private key, and then used to decrypt the file.
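The wrap-and-unwrap steps can be sketched with the standard Java cryptography API; the class and method names below are illustrative, not the DocuArmor implementation (which uses the Bouncy Castle library), and the RSA padding mode is an assumption:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Arrays;

public class HybridSketch {
    // Wrap (encrypt) the AES key with the recipient's RSA public key;
    // the wrapped bytes are what would be stored in the header.
    static byte[] wrapKey(SecretKey aesKey, KeyPair rsaPair) throws Exception {
        Cipher wrapper = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        wrapper.init(Cipher.WRAP_MODE, rsaPair.getPublic());
        return wrapper.wrap(aesKey);
    }

    // Recover the AES key from the header with the recipient's private key.
    static SecretKey unwrapKey(byte[] wrapped, KeyPair rsaPair) throws Exception {
        Cipher unwrapper = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        unwrapper.init(Cipher.UNWRAP_MODE, rsaPair.getPrivate());
        return (SecretKey) unwrapper.unwrap(wrapped, "AES", Cipher.SECRET_KEY);
    }

    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair pair = kpg.generateKeyPair();                   // recipient's key pair
        SecretKey aesKey = KeyGenerator.getInstance("AES").generateKey();

        byte[] wrapped = wrapKey(aesKey, pair);                 // sender side
        SecretKey recovered = unwrapKey(wrapped, pair);         // recipient side
        System.out.println(Arrays.equals(aesKey.getEncoded(), recovered.getEncoded()));
    }
}
```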
The same pair of keys can be used for digital signatures. The private key is used to generate a digital signature from a file, which is inserted into the header. The public key is used to verify the authenticity of the signature.
When two people want to exchange encrypted files, each generates a pair of asymmetric keys and they exchange copies of their public keys. Using the other person's public key, either can encrypt a file, store the cryptographic information in the header, and e-mail it to the recipient. The recipient uses the header to recover the symmetric key with their private key and decrypt the accompanying file. If a digital signature is included, the recipient can also authenticate the sender.
Figure 2: Exchange of Encrypted Files
When a file is encrypted, digitally signed, or both, a cryptographic header is placed in front of the resulting file. The overall structure consists of two sections: the header and the encrypted (or plain) file contents.
Figure 3: Encrypted File Structure
The header structure contains the information required to reverse the encryption process and decrypt the contents of the file, or to verify the digital signature. The header contains the total length, an ID, a version, and two sections containing encryption and digital signature information. Using Java, you can write out the contents of the header as a byte stream and read them back in.
Figure 4: Cryptographic Header Structure
- Total Len: Contains the total length of the header (stored as a 4 byte integer)
- Header ID: Contains the string "LAHEADER" to identify the file (16 bytes)
- Header Version: Structural version of the header (stored as a 4 byte integer)
- Encryption Information: Holds the algorithm, mode, encrypted symmetric key, etc.
- Digital Signature Information: Holds the digital signature and related verification information
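As a sketch of how the fixed preamble above might be serialized (this is illustrative, not the DocuArmor implementation), Java's DataOutputStream and DataInputStream map directly onto the 4-byte integers and the fixed-width 16-byte ID field:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class HeaderPreamble {
    static final String HEADER_ID = "LAHEADER";

    // Serialize: total length (4 bytes), ID padded to 16 bytes, version (4 bytes).
    static byte[] write(int totalLen, int version) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(totalLen);                               // Total Len
        byte[] id = new byte[16];                             // Header ID field
        byte[] idBytes = HEADER_ID.getBytes(StandardCharsets.US_ASCII);
        System.arraycopy(idBytes, 0, id, 0, idBytes.length);  // zero-padded
        out.write(id);
        out.writeInt(version);                                // Header Version
        return buf.toByteArray();
    }

    // Read the preamble back in and return the header version.
    static int readVersion(byte[] header) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(header));
        in.readInt();                 // skip total length
        byte[] id = new byte[16];
        in.readFully(id);             // header ID
        return in.readInt();          // version
    }
}
```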
The Encryption Information structure contains the information that was used to encrypt the contents of the file and that is later needed to decrypt it. The symmetric key and initialization vector are encrypted with the recipient's asymmetric public key. The recipient could be the owner, if you are encrypting a file for yourself, or another user to whom you want to send confidential information.
An additional field has been allocated to allow the encryption of the symmetric key with another set of asymmetric keys. For example, if owner A is sending an encrypted file to another person B, the symmetric key can be encrypted with B's public key as well as A's public key so that either person can decrypt the file.
Alternatively, an employee can encrypt a file with their public key, and a corporation could insert an additional encrypted symmetric key into the header using its own asymmetric keys. The corporation's asymmetric keys can belong to a Certifying Authority (CA), which can also be used to issue employee keys.
Figure 5: Encryption Information Structure
- Encrypt Flag: (Y/N - 2 bytes) specifies whether the file is encrypted.
- Decrypt ID Length: (integer - 4 bytes) length in chars(bytes) of the Key ID.
- Decrypt ID: (size varies) an identifier of the RSA keys used in the encryption/decryption process. It is the alias associated to the asymmetric encryption keys (e.g., JaneDoe_12ff).
- Other Decrypt ID Length: (integer - 4 bytes) length in chars(bytes) of the Other Key ID.
- Other Decrypt ID: (size varies) an identifier of the RSA keys used in the encryption/decryption process. It can be the alias or the common name (e.g., JaneDoe_12ff or Logical Answers CA).
- Symmetric Key Algorithm: (integer - 4 bytes) specifies the symmetric key algorithm used to encrypt the file. The default value is 1=AES.
- Symmetric Key Mode: (integer - 4 bytes) specifies the symmetric key block cipher mode used to enhance confidentiality. The default value is 5=Segmented Integer Counter mode (CTR).
- Symmetric Key Padding: (integer - 4 bytes) specifies the type of padding for the block cipher. The default value is 1=No Padding.
- Wrapped Symmetric Key Length: (integer - 4 bytes)
- Wrapped Symmetric Key: (size varies) symmetric key used to encrypt/decrypt the file and encrypted with the asymmetric key.
- Initialization Vector Length: (integer - 4 bytes)
- Initialization Vector: (byte - size varies) vector used with the symmetric encryption process.
- Other Wrapped Symmetric Key Length: (integer - 4 bytes)
- Other Wrapped Symmetric Key: (size varies) symmetric key used to encrypt/decrypt the file and encrypted with another person's asymmetric key.
- Other Initialization Vector Length: (integer - 4 bytes)
- Other Initialization Vector: (byte - size varies) vector used with the symmetric encryption process.
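The default field values above (AES, CTR mode, no padding, plus an initialization vector stored alongside the key) correspond to the standard JCA transformation string "AES/CTR/NoPadding". A minimal sketch, not the DocuArmor code itself:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.security.SecureRandom;

public class CtrExample {
    // Encrypt or decrypt with AES in CTR mode and no padding - the
    // default algorithm/mode/padding values listed in the header fields.
    static byte[] crypt(int mode, SecretKey key, byte[] iv, byte[] data) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
        cipher.init(mode, key, new IvParameterSpec(iv));
        return cipher.doFinal(data);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] iv = new byte[16];            // AES block size; goes in the header
        new SecureRandom().nextBytes(iv);

        byte[] ciphertext = crypt(Cipher.ENCRYPT_MODE, key, iv, "confidential".getBytes());
        byte[] plaintext  = crypt(Cipher.DECRYPT_MODE, key, iv, ciphertext);
        System.out.println(new String(plaintext)); // confidential
    }
}
```

Note that CTR turns AES into a stream cipher, which is why no padding is required and the ciphertext is the same length as the plaintext.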
Digital Signature Information
The Digital Signature Information structure contains the information used to add or verify a digital signature generated from the contents of the file. The digital signature is generated with the owner's private key using a specific algorithm and then inserted into the header. When the recipient receives the signed file, they can use the signer's public key to validate its authenticity. If the signature is verified, it implies the file has not been altered and that the holder of the private key generated the signature.
Figure 6: Digital Signature Information Structure
- Signed Flag: (Y/N - 2 bytes) specifies whether the file contains a digital signature
- Signature Algorithm: (integer - 4 bytes) specifies the algorithm used to generate the digital signature. The default value is 12= SHA512WithRSAEncryption
- Verify Signature Cert Name Length: (integer - 4 bytes) length in chars(bytes) of the filename of the certificate used to verify a digital signature
- Verify Signature Cert Name: (size varies) filename of the certificate holding the RSA public key used to verify the digital signature of a file (e.g., JaneDoe_fa39.cer).
- Signature Date/Time: (long - 8 bytes) date the digital signature was generated.
- Signature Length: (integer - 4 bytes)
- Signature: (size varies) holds digital signature generated with RSA private key and signature engine
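The sign-then-verify flow above can be sketched with the standard JCA Signature engine; "SHA512withRSA" is the standard JCA name corresponding to the Bouncy Castle algorithm listed in the fields, and the code is illustrative rather than the DocuArmor implementation:

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;

public class SignatureExample {
    // Generate a signature over the file contents with the owner's private key;
    // the returned bytes are what would be stored in the header.
    static byte[] sign(byte[] contents, PrivateKey priv) throws Exception {
        Signature signer = Signature.getInstance("SHA512withRSA");
        signer.initSign(priv);
        signer.update(contents);
        return signer.sign();
    }

    // Verify the header's signature with the signer's public key.
    static boolean verify(byte[] contents, byte[] sig, PublicKey pub) throws Exception {
        Signature verifier = Signature.getInstance("SHA512withRSA");
        verifier.initVerify(pub);
        verifier.update(contents);
        return verifier.verify(sig);
    }

    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair pair = kpg.generateKeyPair();
        byte[] contents = "file contents".getBytes();

        byte[] sig = sign(contents, pair.getPrivate());
        System.out.println(verify(contents, sig, pair.getPublic())); // true
    }
}
```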
File Naming Conventions
The cryptographic header holds information that designates which keys were used to encrypt a file, but that information isn't accessible without first reading the header in. With proper naming conventions, you can determine who the intended recipient of an encrypted file is - whether it is yourself or a colleague. When you generate your pair of asymmetric encryption keys using Java, you store them in a file called a key store. The key store holds a pair of asymmetric keys as an entry with a unique alias. The alias typically consists of the initial of your first name followed by your last name. To make it more unique, you can extract 4 hex digits from your public key and append an underscore and the hex digits to the alias. For example, if the person's name is Jane Smith, the resulting unique alias would be jsmith_ad5e. A certificate holds a person's public key, and the alias is used in its filename, as in jsmith_ad5e.cer. Similarly, the key store holding the pair of asymmetric keys would be saved as jsmith_ad5e.jks.
Following the unique alias convention, Jane Smith could encrypt files for herself, and the file name would be appended with her alias and an appropriate file extension. For example, if Jane encrypted a personal file, myTaxes.txt, the result would be myTaxes.txt.jsmith_ad5e.aes. If Jane wanted to send her colleague Dick an encrypted document, she would use Dick's certificate to encrypt it. If the alias on Dick's certificate is djones_9fa2, Jane could encrypt the file comments.doc for Dick, and the resulting file would be comments.doc.djones_9fa2.aes. When Dick receives the file, he knows it is for him by recognizing his alias in the file name.
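The article doesn't specify exactly which 4 hex digits are extracted from the public key, so the sketch below makes an assumption: it takes the first two bytes of a SHA-1 digest of the encoded public key. Any stable derivation from the key bytes would serve the same purpose of making the alias unique:

```java
import java.security.KeyPairGenerator;
import java.security.MessageDigest;

public class AliasSuffix {
    // Derive a 4-hex-digit suffix from the encoded public key.
    // NOTE: the choice of SHA-1 and the first two digest bytes is an
    // assumption for illustration, not the documented DocuArmor rule.
    static String suffix(byte[] encodedPublicKey) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-1").digest(encodedPublicKey);
        return String.format("%02x%02x", digest[0], digest[1]);
    }

    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        byte[] pub = kpg.generateKeyPair().getPublic().getEncoded();
        System.out.println("jsmith_" + suffix(pub)); // e.g., jsmith_ad5e
    }
}
```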
The unique alias is stored within the header. This reinforces the importance of having a well-defined Cryptographic header for implementing encryption within your applications.
A well-defined cryptographic header stores the information required to encrypt, decrypt and digitally sign a file. Along with facilitating the implementation of standard cryptographic functions, the header also provides the following benefits:
- The header allows for the protection of personal files as well as the exchange of confidential data.
- Using the stored digital signature, the recipient can determine if the sender is valid and whether the file has been altered.
- The header allows either the sender or recipient to decrypt the encrypted file since both would encrypt the symmetric key with their public key.
- Using the concept of a Certifying Authority pair of asymmetric keys, a corporation, group, or family could issue pairs of asymmetric keys to their employees or members and decipher files encrypted by them in case of emergencies.
- The header allows for using different combinations of symmetric algorithms, modes, padding and key sizes to be used to encrypt information.
- The header version allows for enhancements to be added to the structure for implementing new functions and still support older versions.
References and Other Technical Notes
- Computer running Windows XP or higher...
- Java Runtime (JRE V1.6 or higher)
- The Legion of the Bouncy Castle Encryption Modules (no runtime fee)
- DocuArmor software modules by Logical Answers Inc.
- "Beginning Cryptography with Java" by David Hook.
- "The Code Book" by Simon Singh