


@CloudExpo: Article

How to Develop an Effective Security Strategy to Play in the Public Cloud

Develop an effective security strategy with the right blend of technology and processes

There is no shortage of press regarding the promises of cloud computing. Cloud evangelists have touted cloud computing as the next big thing, a game changer - a disruptive technology that will spark innovation and revolutionize the way businesses acquire and deliver IT services. The staggering volume of these sales pitches is to be expected, considering that cloud computing is at or near the peak of its hype cycle, but as with any new technology or model, reality will eventually set in and the public relations blitz will fade. As people continue to define cloud computing and debate its pros and cons, one thing is certain - one of the biggest obstacles to widespread cloud computing adoption will be security.

This article will deal with the security approach for the public cloud as opposed to a private, hybrid, or community cloud. The public cloud, as defined by the National Institute of Standards and Technology (NIST), is cloud infrastructure that is made available to the general public or a large industry group and is owned by an organization selling cloud services. An example of a public cloud implementation would be an application that is hosted in Amazon EC2. Anyone with a credit card can deploy a software application in this type of environment.

Cloud Computing Styles
There are three major styles of cloud computing: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). IaaS is delivery of the computing infrastructure as a fully outsourced service versus an in-house, capital investment-sourced model. The consumer rents processing, storage, networking components, etc. With PaaS, consumers are given everything they need to develop, test, and deploy applications to the cloud. Finally, SaaS provides the consumer with the capability to use a cloud provider's applications running on a cloud infrastructure. The software application is accessed through a thin client interface such as a standard web browser. While many of the recommendations presented are applicable across all three cloud styles, the security approach described in this article is most applicable to IaaS.

Benefits of the Cloud
Before we dive into the security approach for the public cloud, let's briefly examine the potential benefits. Once you cut through all the hype, a closer look at the benefits of moving to the cloud reveals a strong business case. The cloud offers a pay-as-you-go model that is highly reliable and scalable, and gives you tremendous flexibility and agility to boot. The McKinsey study, "Clearing the Air on Cloud Computing," states that the average server utilization in a data center is 10 percent. Anyone who has ever run a data center knows how enormously difficult it is to achieve high reliability, efficiency, and scalability.

In the cloud, enterprises can greatly reduce their capital costs and no longer have to worry about allocating time and resources to maintaining infrastructure, and patching servers and software. As a result, IT personnel can work more efficiently which in turn, can breed more innovation and help enterprises enter new markets. In the cloud, applications are accessible anywhere and at any time so employees now have more mobility. The cloud provides nearly infinite computing power and storage to enterprises and users at a mere fraction of what it would cost to actually purchase and maintain these resources. This is a huge advantage for technology startups that have limited capital. The case for moving to the cloud becomes even stronger when you consider how the troubled economy is putting pressure on businesses to cut costs.

Although surveys differ on what percentage of companies will adopt cloud computing in the next 12-24 months, enterprises are already taking cloud computing seriously. In fact, according to a recent Forrester study, one out of four large companies plans to use a cloud provider soon, or has already employed one. Furthermore, Intel predicts that by 2012, an estimated 20 to 25 percent of its server chips will be dedicated to cloud computing data centers.

Cloud Computing in the Private and Public Sectors
Anyone who has ever logged onto Facebook, Twitter, or Gmail or purchased an item from Amazon.com has either knowingly or unknowingly used a cloud-based application. There are numerous other examples of cloud computing implementations in the private sector, but it is also important to note that the public sector does not trail far behind.

Vivek Kundra, the federal CIO, is a big supporter of cloud computing. Under Kundra's leadership, the federal government has moved quickly on major cloud computing initiatives such as the General Services Administration (GSA) Storefront, an online store that will soon allow government agencies to easily procure cloud computing services. NIST has already released a working definition of cloud computing and is currently developing a Special Publication on cloud computing security.

In the defense sector, the Defense Information Systems Agency (DISA) has led the way with private cloud implementations such as Rapid Access Computing Environment (RACE) and Forge.mil. RACE gives DISA customers the ability to rent a basic computing environment. Customers purchase an environment on a monthly basis so the costs and risks of acquiring and sustaining a computing environment are significantly reduced. Forge.mil is essentially a mirror of SourceForge.net and allows developers to store and manage code for open source software projects.

Cloud Computing Security Risks
If the benefits are so clear, why isn't everyone adopting cloud computing right now? Research and polling indicate that the main obstacle is security. It probably comes as no surprise that the vast majority of surveys reveal security to be the number one concern of IT executives and CIOs who are considering cloud computing. Security within the cloud has received substantial press coverage, including publication of the Gartner top seven security risks associated with cloud computing, in a report entitled, "Assessing the Security Risks of Cloud Computing."

Earlier this year, a flaw in Google Docs led to the inadvertent sharing of some users' private documents with other users on the Internet without the owners' permission. There have been other highly publicized breaches and future incidents are inevitable.

Does this mean that the security risks of cloud computing outweigh its potential benefits?

Absolutely not, but customers must perform due diligence and practice due care. In addition to selecting a vendor that can comply with organizational security requirements, customers need to carefully plan and develop a defense-in-depth strategy that mitigates the security risks of cloud computing and addresses all layers of the cloud architecture.

Cloud Computing Security Approach
Given the highly distributed and federated nature of the cloud computing model and the constant threat of new attacks, the network-based perimeter defense strategy is clearly no longer adequate or relevant. Customers will now have to protect all the layers of the cloud architecture. To ensure the confidentiality, integrity, and availability of customer data, the security strategy for the cloud must address the following:

  • Physical and environmental security
  • Hypervisor security
  • Operating system security
  • The web tier
  • The application tier
  • The database tier
  • Network security
  • Auditing

The design of this approach is best accomplished through the use of defense-in-depth principles, but the traditional defense-in-depth approach will have to be expanded beyond on-premise security controls to distributed and federated ones that are agile enough to be implemented in many different types of cloud architectures.

Physical and Environmental Security
The first line of defense in an effective cloud security strategy is physical and environmental security. Data stored in the cloud can be just as secure as, if not more secure than, data stored in customer data centers, as reputable and well-established cloud providers will typically have greater dedicated resources and security solutions at their disposal than any single enterprise. Security mechanisms, ranging from robust authentication and access controls to disaster recovery, and their associated costs are distributed across multiple enterprises, resulting in capabilities that would otherwise be too expensive for many enterprises to employ or manage.

Cloud providers also have the advantage of possessing many years of experience in designing and operating world class, large-scale data centers and because they have to win and maintain the confidence of their customers to maintain their business, they are highly motivated to avoid a security breach. However, none of this implies that enterprises should blindly accept any cloud provider's claims.

In addition to addressing personnel security issues, enterprises need to perform due diligence by looking for certifications and accreditations such as WebTrust/SysTrust, Statement on Auditing Standards No. 70 (SAS 70) and International Organization for Standardization (ISO) certification, and verifying compliance with Sarbanes-Oxley (SOX), the Federal Information Security Management Act (FISMA), the Health Insurance Portability and Accountability Act (HIPAA), and the Payment Card Industry Data Security Standard (PCI DSS).

If you think that these certifications do not matter, think again. According to Verizon's "2009 Data Breach Investigations Report," 81 percent of the researched companies were not PCI compliant prior to being breached.

Hypervisor Security
When choosing a cloud provider, it is important to consider hypervisor security. In a public cloud, the customer is renting servers and the computing tasks are now being executed within the cloud provider's infrastructure. These virtual servers (or virtual machines) are actually guest instances running on a cloud provider's hypervisors. The hypervisor (also known as a virtual machine monitor) is software that controls the guest instances running on it. Anyone who exploits the hypervisor has all the proverbial keys to the kingdom and can modify or delete the customer data residing on the guest instances.

Customers will not have much control over the types of hypervisors their vendors will use, but it is important that they understand what security mechanisms and features are in place to secure the hypervisor layer. Proper implementation is crucial to hypervisor security as misconfiguration is one of the biggest security risks. Enterprises should understand hypervisor best practices and verify that cloud providers are incorporating them into their hypervisor solutions.

Operating System Security
In a virtualized environment, each operating system installed on an individual virtual machine (VM) needs to be hardened. Good operating system security boils down to three sets of practices:

  1. Server hardening
  2. Patch management
  3. Access control

Well-known hardening guides such as the DISA Security Technical Implementation Guides (STIGs) and Center for Internet Security (CIS) benchmarks can be used to effectively lock down operating system images.

By installing anti-virus software, and hardening and patching servers, the administrator protects instances against malware, keeps operating system patches current, removes all unused and unnecessary services, and ensures that only trusted parties may establish a connection to the operating system. Once an operating system image has been properly configured and hardened, the administrator can then develop a minimum security baseline and provision new, secure virtual machine images on demand. Fortunately, there are tools that can automatically assess and lock down systems.
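The minimum security baseline idea can be sketched in a few lines. This is a simplified illustration, not a hardening tool: the approved-service list is hypothetical, and a real baseline would be derived from a STIG or CIS benchmark and cover far more than running services.

```python
# Hypothetical baseline: the only daemons a hardened image is allowed to run.
APPROVED_SERVICES = {"sshd", "rsyslog", "chronyd"}

def baseline_violations(running_services):
    """Return the services on an image that the baseline does not allow."""
    return sorted(set(running_services) - APPROVED_SERVICES)

# Example: a freshly provisioned image found running two extra daemons.
print(baseline_violations(["sshd", "telnetd", "ftpd", "chronyd"]))
# → ['ftpd', 'telnetd']
```

Automated assessment tools apply the same pattern at scale: compare the observed state of each virtual machine against the baseline and flag every deviation.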

Web Security
The defense-in-depth strategy must also secure the web tier. Administrators must prevent unauthorized users from gaining access to web resources; if an unauthenticated user attempts to reach a protected web resource, the web container will automatically try to authenticate the user. Cloud customers should serve web resources over HTTPS and, where warranted, require client certificate authentication.
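Requiring client certificates can be sketched with Python's standard ssl module. This is a minimal configuration fragment, assuming a server-side TLS context; the certificate file paths in the comments are placeholders, not real files.

```python
import ssl

# Server-side TLS context that will refuse any connection that does not
# present a client certificate signed by a trusted CA.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
# context.load_cert_chain("server.crt", "server.key")      # server identity (placeholder paths)
# context.load_verify_locations("trusted_clients_ca.pem")  # CA that signs client certs
context.verify_mode = ssl.CERT_REQUIRED  # reject clients without a valid certificate

print(context.verify_mode == ssl.CERT_REQUIRED)  # → True
```

The key design choice is `CERT_REQUIRED`: with the default `CERT_NONE`, the handshake would succeed for any client, and authentication would have to happen at a higher layer.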

Administrators can apply a wide range of best practices to secure web servers. A wise approach is to organize the desired safeguards and the settings that need to be configured into categories. Categories allow administrators to walk systematically through the hardening process using a checklist, focusing on one category at a time and understanding all the specific steps required to apply a particular countermeasure.

Most web server best practice guides incorporate the following:

  • Patches and updates
  • The lockdown of unnecessary ports, protocols, and services
  • Account management
  • The proper securing of files and directories
  • The removal of all unnecessary file shares
  • Auditing and logging
  • The application of security policy settings

Application Security
Web applications are vulnerable to many different kinds of attacks (e.g., network eavesdropping, unauthorized access, and malware). To prevent eavesdropping, administrators can utilize strong authentication mechanisms (e.g., SSL with digital certificates) and secure communication channels (encrypting all traffic between the client, the application, and the database server).

Unauthorized access can be prevented by implementing firewall policies that block all traffic except authorized communication ports, disabling all unused services, limiting and periodically reviewing user membership to predefined administrative groups, restricting user access to administrative accounts created during product installation, practicing the principle of least privilege when granting permissions to new administration groups or roles, and restricting directory and file access. To mitigate the risks posed by malware, administrators should promptly apply the latest software patches, disable unused functionality, and run processes with least privileged accounts to reduce the scope of damage in the event of a compromise.
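The default-deny firewall posture described above can be expressed as a tiny policy function. The port numbers are illustrative, not a recommendation; a real policy would also consider source addresses, protocols, and direction.

```python
# Hypothetical allow list: every port is blocked unless explicitly authorized.
ALLOWED_PORTS = {443, 8443}

def is_allowed(port):
    """Default-deny: permit traffic only on explicitly authorized ports."""
    return port in ALLOWED_PORTS

# Of these candidate ports, only HTTPS traffic gets through.
print([p for p in (22, 80, 443, 3306) if is_allowed(p)])  # → [443]
```

The same default-deny principle underlies the other controls in this paragraph: start from zero access, then grant only what is explicitly required.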

Of course, the best way to protect the application tier is to design and build secure web applications. Until recently, organizations merely talked about developing secure web applications, but the steady rise in the number and sophistication of cyber attacks over the years has forced IT professionals to move beyond mere talk. Fortunately, some real progress is being made. For example, (ISC)2 introduced a new certification last year called the Certified Secure Software Lifecycle Professional (CSSLP).

The CSSLP certification is designed to help developers understand government standards and best practices for secure software development so that security is considered and implemented throughout the entire software lifecycle. More and more security professionals are leveraging tools such as web application scanners to detect vulnerabilities and weak configuration settings. Most of the more established automated security tools offer a selection of security engines and vulnerability tests ranging from the OWASP Top 10 and ISO 27002 to HIPAA and SOX. Users can select modules or let automatic crawlers map a site's tree structure, and apply all of the selected policies' attacks from thousands of security checks.
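One illustrative check in the spirit of the scanners described above is flagging HTTP responses that omit common security headers. The header list here is a small sample chosen for illustration, not an authoritative OWASP requirement.

```python
# Sample of headers a scanner might expect on every response.
RECOMMENDED_HEADERS = [
    "Strict-Transport-Security",  # force HTTPS on subsequent visits
    "X-Content-Type-Options",     # block MIME-type sniffing
    "X-Frame-Options",            # mitigate clickjacking
]

def missing_security_headers(response_headers):
    """Return the recommended headers absent from a response (case-insensitive)."""
    present = {h.lower() for h in response_headers}
    return [h for h in RECOMMENDED_HEADERS if h.lower() not in present]

print(missing_security_headers({"Content-Type": "text/html", "X-Frame-Options": "DENY"}))
# → ['Strict-Transport-Security', 'X-Content-Type-Options']
```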

Data Security
One of the biggest cloud computing concerns is data confidentiality. Data stored in the cloud has different privacy implications than data stored in an in-house data center. These are some questions that must be considered before storing data in the cloud:

  • What is the provider's privacy policy?
  • What are the terms of service?
  • Who owns the data? Who has access to the data?
  • How does the provider deal with subpoenas for customer data?
  • How many copies of the customer's data are kept and are they stored in different locations?
  • What are the provider's data and media sanitization methods?
  • When data is removed from the cloud, does the provider retain rights to customer information?
  • How is data isolated and separated from other customers' data?
  • Where is the data processed?
  • How does the provider protect customer data?

Many of the data confidentiality obstacles can be overcome by utilizing existing technologies and solutions. While it is important to encrypt network traffic, it is just as important to encrypt data at rest. It is wise to assume that all data in the cloud can be compromised. This means that network traffic, storage, and file systems must all be encrypted. Some other best practices for database security include using roles to simplify security administration, encapsulating privileges into stored procedures, using row-level access control to enforce security policies at a row level of granularity, and building web applications so that the application users are the database users.
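The row-level access control mentioned above can be sketched as a filter applied to every query: each row carries an owner, and a user's queries only ever see that user's rows. The table and data here are hypothetical; in practice this policy would be enforced inside the database, not in application code.

```python
# Hypothetical table: each row is tagged with the user who owns it.
rows = [
    {"id": 1, "owner": "alice", "balance": 100},
    {"id": 2, "owner": "bob",   "balance": 250},
    {"id": 3, "owner": "alice", "balance": 75},
]

def select_for_user(user, table):
    """Apply the row-level policy before any query logic runs."""
    return [r for r in table if r["owner"] == user]

print([r["id"] for r in select_for_user("alice", rows)])  # → [1, 3]
```

Making application users the database users, as the article suggests, is what lets the database apply this predicate itself rather than trusting the application tier to filter correctly.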

Network Security
A network-based perimeter defense alone is not effective for the cloud, but network security is still a vital piece of the defense-in-depth strategy. Most cloud providers utilize VLANs to provide traffic isolation channels and will offer some level of protection against the most common types of external attacks such as distributed denial of service, man-in-the-middle attacks, IP spoofing, port scanning, and packet sniffing, but it is the enterprise's responsibility to implement additional layers of security.

Virtualization brings with it a host of new threat vectors that cannot be secured with traditional security tools and methods. An owner of one VM instance may launch attacks against adjacent VMs or hackers may try to install a rogue hypervisor that can take complete control of a server. To prevent these types of attacks, enterprises need to deploy virtual firewalls and virtual IDS/IPS solutions.

These security tools are designed to protect each VM instance and can even secure live migrations of VM instances. Some VM security solutions offer protection against SQL injection attacks, cross-site scripting, and other web application vulnerabilities and can monitor unauthorized or unexpected changes to operating system files and application files.

Auditing
The importance of audit event logging has never been greater as the threat of cybercrime continues to increase. Auditing takes on even more importance in the cloud due to the dynamic nature of virtual machines. A good auditing solution for the cloud will collect and integrate real-time information from all the major systems in a cloud environment and enable the customer to detect intrusions, data leaks, misuse, or insider threats. A robust, centralized auditing solution provides a clear and comprehensive picture of the customer's changing cloud environment and enables IT professionals to spot trends and quickly assess and resolve security incidents. Ensuring that a continuous monitoring solution is implemented that includes these capabilities in a scalable nature is essential to maintaining an effective security presence within the cloud.
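One detection a centralized auditing solution might run over collected events can be sketched as follows. The event format and threshold are assumptions for illustration; real solutions correlate many event types across systems in real time.

```python
from collections import Counter

def flag_suspicious(events, threshold=3):
    """Flag accounts whose failed logins, aggregated across systems, hit the threshold."""
    failures = Counter(e["user"] for e in events if e["action"] == "login_failed")
    return sorted(u for u, n in failures.items() if n >= threshold)

# Example feed: one account fails repeatedly, another fails once then succeeds.
events = (
    [{"user": "mallory", "action": "login_failed"}] * 4
    + [{"user": "alice", "action": "login_failed"},
       {"user": "alice", "action": "login_ok"}]
)
print(flag_suspicious(events))  # → ['mallory']
```

The value of centralization is visible even in this toy: four failures spread across four different systems look harmless individually and only become a signal once the events are aggregated.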

Final Thoughts
Cloud computing brings many advantages that can improve application deployment, scalability, and flexibility while reducing costs. While there are multiple concerns, security in the cloud does not present radically new challenges. With cloud computing, we have the convergence of virtualization, SOA, and distributed computing - concepts that have been around for some time. This does not mean that every application should be deployed to the public cloud. Cloud computing standards and guidelines need more time to mature. For now, more security-sensitive applications should probably remain in-house or move to a private cloud, but enterprises that are considering appropriate applications for the public cloud should know that they can develop an effective security strategy with the right blend of technology and processes that takes into account all layers of the cloud architecture.

More Stories By Peter Choi

Peter Choi is the cloud computing security lead for Apptis, Inc. He has over 9 years of experience in certification and accreditation, vulnerability management, security auditing, network engineering, and systems administration. Most recently, he spoke about cloud computing security at the 2009 Special Operations Forces Industry Conference and worked with FEMA to demonstrate that a cloud prototype could be certified and accredited.

