Challenges in Virtualization

Companies looking at virtualization solutions need storage solutions that are flexible

By Sue Poremba

Virtualization has been a boon to enterprises because it makes IT operations more efficient. Some like its green qualities, since virtualization cuts energy consumption; others appreciate the storage capacity it frees up, as well as the data recovery options it offers if disaster strikes.

However, the virtual environment is invisible, and with that invisibility come more challenges in making sure it runs smoothly. The cloud may be simple to set up, but it becomes more complex over time. In addition, the more machines and data involved, the harder it becomes to monitor for space, CPU spikes, network security and other indicators.

“If there is a bug or a discrepancy, I need to know that there’s a problem before my customer does. And though that is the biggest challenge, it’s also a great opportunity,” said Russ Caldwell, CTO of Emcien Corporation.

One of those challenges is making sure storage in the virtualized environment is adequate. “We focus on storage and database environments that scale as the customers grow,” said Caldwell. “Determining how fast customers grow and change is the biggest factor for determining the adequate storage size.”

Companies looking at virtualization need storage solutions that are flexible, so they can add or remove storage as needed. What was the right size at the start of a project changes over time, and a flexible virtualization tool provides peace of mind when it does. For example, adequate storage size is easier to determine for slow-moving manufacturing data than for hundreds of millions of bank nodes, where growth is far more dramatic.
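To make the sizing question concrete, here is a minimal Python sketch of the kind of growth projection involved. The growth rates, starting footprint and time horizon are illustrative assumptions, not figures from the article.

    # Illustrative capacity projection; all numbers below are assumptions.
    def projected_storage_tb(current_tb, monthly_growth_rate, months):
        """Compound the current footprint forward to estimate future capacity needs."""
        return current_tb * (1 + monthly_growth_rate) ** months

    # Slow-moving data (~1% per month) vs. fast-growing data (~15% per month)
    print(round(projected_storage_tb(10.0, 0.01, 24), 1))   # roughly 12.7 TB after two years
    print(round(projected_storage_tb(10.0, 0.15, 24), 1))   # roughly 286 TB after two years

Even a back-of-the-envelope projection like this makes clear why a flexible storage pool matters more for the fast-growing workload.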

The key, according to John Ross with virtual solution company Phantom Business Development at Net Optics, is to truly assess the performance of the servers and the requirements of the virtual machines. That requires monitoring to be in place for the life of the systems, so utilization can be predicted and placement adjusted based on performance. “When this is not accounted for, it can appear as though there is high CPU utilization on the hosts as well as the VMs,” said Ross. “With the use of protocols such as NFS and iSCSI, it can put quite a load on the network.”
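As a rough illustration of that kind of ongoing monitoring, the Python sketch below samples host CPU and storage utilization and flags spikes. It assumes the third-party psutil library, and the alert thresholds are arbitrary examples, not values suggested by Ross.

    import psutil  # third-party library; install with: pip install psutil

    # Thresholds are illustrative only; tune them per environment.
    CPU_ALERT_PCT = 85.0
    DISK_ALERT_PCT = 90.0

    def sample_host():
        """Take one sample of host CPU and root-volume utilization."""
        cpu = psutil.cpu_percent(interval=1)   # percent busy over a 1-second window
        disk = psutil.disk_usage("/").percent  # percent of the root volume in use
        return cpu, disk

    if __name__ == "__main__":
        cpu, disk = sample_host()
        if cpu > CPU_ALERT_PCT:
            print("ALERT: CPU at %.0f%% -- possible spike or undersized host" % cpu)
        if disk > DISK_ALERT_PCT:
            print("ALERT: storage at %.0f%% -- plan to add capacity" % disk)

In practice this sampling would run continuously on every host and VM and feed a central dashboard, rather than printing to the console.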

Companies moving to the cloud also have to change how they think about networking. “It can be hard to understand how a network connection works when there aren’t wires to simply plug into a box, but instead virtual, invisible connections that need to be managed through APIs or online interfaces,” said Caldwell. One of the challenges for a company with multiple clients is keeping client data separate. Grouping machines together and isolating them in their own network is the best approach to tackling this challenge, and smart use of good monitoring tools helps keep the network as reliable as possible.
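One way to picture that kind of per-client isolation is the sketch below, which creates a dedicated virtual network for each client through a cloud API. It uses AWS’s boto3 SDK purely as an example, since the article doesn’t name a provider, and the client names and address ranges are hypothetical.

    import boto3  # AWS SDK for Python; the choice of AWS here is an assumption

    ec2 = boto3.client("ec2")

    def create_isolated_client_network(client_name, cidr):
        """Give each client its own VPC so tenant traffic never shares a network."""
        vpc = ec2.create_vpc(CidrBlock=cidr)
        vpc_id = vpc["Vpc"]["VpcId"]
        ec2.create_tags(Resources=[vpc_id], Tags=[{"Key": "client", "Value": client_name}])
        ec2.create_subnet(VpcId=vpc_id, CidrBlock=cidr)
        return vpc_id

    # Hypothetical clients and address ranges, for illustration only.
    for name, cidr in [("acme", "10.10.0.0/16"), ("globex", "10.20.0.0/16")]:
        print(name, create_isolated_client_network(name, cidr))

The same pattern applies on other platforms: one isolated network (or VLAN) per tenant, created and tracked through the provider’s API rather than by patching cables.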

“Network connectivity comes down to whether the network connection is a single point of failure: If your virtualization solution is off-site, it’s only as good as the quality of the Internet connection between you and your provider,” said William L. Horvath with DoX Systems. A single connection between you and the Internet is one problem (you can reduce the risk by contracting with two or more ISPs and using routers that support trunking). Likewise, if your virtualization provider’s facility sits in a single geographical location (say, Manhattan) that loses functionality for an extended period due to a natural disaster, you’re hosed. In one case Horvath described, a Chamber of Commerce lost access to a cloud-based service because someone in the data center, which wasn’t owned by the service provider, forgot to disable the fire suppression system during emergency testing, and its discharge destroyed most of the hard drives in the servers.
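A crude way to watch for that single point of failure is to probe each upstream link independently, as in the sketch below. The gateway addresses are placeholders, and a TCP probe to each gateway is only one rough heuristic for “this path still works.”

    import socket

    # Placeholder addresses for two independent ISP links; substitute your own gateways.
    UPLINKS = {"isp_a": "203.0.113.1", "isp_b": "198.51.100.1"}

    def link_is_up(gateway, port=53, timeout=3.0):
        """Crude reachability probe: try to open a TCP connection to the gateway."""
        try:
            with socket.create_connection((gateway, port), timeout=timeout):
                return True
        except OSError:
            return False

    statuses = {name: link_is_up(gw) for name, gw in UPLINKS.items()}
    if sum(statuses.values()) <= 1:
        print("WARNING: only one working path to the Internet -- single point of failure")
    print(statuses)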

To avoid the challenges involved in virtualization, Ross provided the following tips:

1. Plan on virtualizing everything — not just the servers but the network, the storage, the security … everything!

2. Standardize everything, from the operating systems up through middleware and applications. The more uniformity there is across configurations, the easier it will be to scale and move workloads around the environment.

3. Ensure network capabilities are met. Network demands will change dynamically and traffic will collapse onto shared links, with huge flow changes as utilization grows and cloud is adopted.

4. Implement resource monitoring. Existing legacy tools will not provide the data or detail needed.

5. Implement a decommissioning process. Ross repeatedly finds unused machines left running; in a virtual environment this becomes a major issue, consuming resources and driving up costs (see the sketch after this list for one way to flag idle VMs).

6. Plan for backup and disaster recovery. This will drastically change in virtualization and must be addressed.

7. Train your team for what ongoing management will look like, not just for the migration.
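As a minimal sketch of the decommissioning sweep mentioned in tip 5, the Python below flags VMs that look idle. The inventory format, thresholds and VM names are all assumptions for illustration; a real sweep would pull this data from the hypervisor or monitoring system.

    import csv
    from datetime import datetime, timedelta
    from io import StringIO

    # Hypothetical inventory export; a real one would come from your hypervisor's API.
    INVENTORY = """name,avg_cpu_pct,last_activity
    web-01,42.0,2015-03-01
    build-old,0.4,2014-06-12
    test-legacy,1.1,2014-09-30
    """

    IDLE_CPU_PCT = 2.0   # illustrative threshold
    IDLE_DAYS = 90       # illustrative threshold

    def decommission_candidates(inventory_csv, today):
        """Yield VMs that are both nearly idle and untouched for a long time."""
        cutoff = today - timedelta(days=IDLE_DAYS)
        for row in csv.DictReader(StringIO(inventory_csv)):
            idle = float(row["avg_cpu_pct"]) < IDLE_CPU_PCT
            stale = datetime.strptime(row["last_activity"], "%Y-%m-%d") < cutoff
            if idle and stale:
                yield row["name"]

    print(list(decommission_candidates(INVENTORY, datetime(2015, 3, 15))))

A regular report like this, reviewed with the machines’ owners, keeps forgotten VMs from quietly consuming capacity.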

The cloud solves certain problems really well, and it gives SMBs the flexible infrastructure they require without large capital, hardware or payroll costs. By using the cloud wisely with the right tools, companies can get a leg up.

Sue Poremba is a freelance writer focusing primarily on security and technology issues and occasionally blogs for Rackspace Hosting.
