By Lori MacVittie
May 25, 2011 01:30 PM EDT
Turns out that ‘unassailable’ economic argument for public cloud computing is very assailable
The economic arguments are unassailable. Economies of scale make cloud computing more cost effective than running their own servers for all but the largest organisations. Cloud computing is also a perfect fit for the smart mobile devices that are eating into PC and laptop market. -- Tim Anderson, “Let the Cloud Developer Wars Begin”
Ah, Tim. The arguments are not unassailable and, in fact, you may be guilty of tunnel vision – seeing only the list price and forgetting to factor in the associated costs that make public cloud computing far less economically attractive in many situations. Yes, on a per-hour, per-CPU-cycle, per-byte-of-RAM basis, public cloud computing is almost certainly cheaper than any other option. But that doesn’t mean the arguments for cloud computing (which is much more than cheap compute resources) are economically unassailable. Even setting aside the fact that basing a deployment strategy purely on costs is rarely clear cut, the variability in bandwidth and storage costs – along with other factors that generate both hard and soft costs for applications – must be considered.
MACRO versus MICRO ECONOMICS
The economic arguments for cloud computing almost always boil down to competing views of micro versus macro economics. Those in favor of public cloud computing are micro-economic enthusiasts, zeroing in on the cost per cycle or hour of a given resource. But micro-economics don’t work for an application, because an application is not an island of functionality; it is an integrated, dependent component in a larger, macro-economic environment in which other factors drive total cost.
The lack of control over resources in external environments can be problematic for IT organizations seeking to leverage cheaper, commodity resources in public cloud environments. Failing to impose constraints on auto-scaling – and to define processes for scaling back down – together with the inability to track and manage developer instances launched and left running, are two of the more common causes of “cloud sprawl.” Such scenarios can certainly lead to spiraling costs that, while not technically the fault of cloud computing or providers, may engender enough concern in enterprise IT to keep it from pushing the “launch” button.
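Catching that second cause – forgotten developer instances – doesn’t require exotic tooling. A minimal sketch, assuming a hypothetical inventory of instance records (real data would come from a provider’s “list instances” API; the field names here are illustrative only), might flag anything untagged or running past a policy limit:

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=14)  # assumed policy limit for dev instances

def audit_sprawl(instances, now):
    """Flag instances that have no owner tag or have outlived MAX_AGE."""
    flagged = []
    for inst in instances:
        age = now - inst["launched"]
        if not inst.get("owner") or age > MAX_AGE:
            flagged.append(inst["id"])
    return flagged

now = datetime(2011, 5, 25, tzinfo=timezone.utc)
inventory = [
    {"id": "dev-01", "owner": "alice", "launched": now - timedelta(days=2)},
    {"id": "dev-02", "owner": None,    "launched": now - timedelta(days=40)},
    {"id": "dev-03", "owner": "bob",   "launched": now - timedelta(days=30)},
]
print(audit_sprawl(inventory, now))  # → ['dev-02', 'dev-03']
```

The point is less the code than the policy behind it: sprawl is only controllable if someone decides, in advance, what “too long” and “unowned” mean.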
The touted cost savings associated with cloud services didn't pan out for Ernie Neuman, not because the savings weren't real, but because the use of the service got out of hand.
When he worked in IT for the Cole & Weber advertising firm in Seattle two and a half years ago, Neuman enlisted cloud services from a provider called Tier3, but had to bail because the costs quickly overran the budget, a victim of what he calls cloud sprawl - the uncontrolled growth of virtual servers as developers set them up at will, then abandoned them to work on other servers without shutting down the servers they no longer need.
Whereas he expected the developers to use up to 25 virtual servers, the actual number hit 70 or so. "The bills were out of control compared with what the business planned to spend," he says.
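The gap Neuman describes is easy to quantify. A back-of-the-envelope sketch – using an assumed rate of $0.10 per instance-hour, since actual pricing for the service isn’t given – shows how quickly 25 planned servers becoming 70 actual servers turns into a budget problem:

```python
HOURLY_RATE = 0.10      # assumed per-instance rate (USD); illustrative only
HOURS_PER_MONTH = 730   # average hours in a month

def monthly_cost(servers, rate=HOURLY_RATE):
    """Cost of running `servers` instances around the clock for a month."""
    return servers * rate * HOURS_PER_MONTH

planned = monthly_cost(25)  # what the business budgeted for
actual = monthly_cost(70)   # what cloud sprawl actually delivered
overrun = (actual - planned) / planned
print(f"planned ${planned:,.0f}, actual ${actual:,.0f}, overrun {overrun:.0%}")
```

At that assumed rate the bill nearly triples – and per-hour billing means every forgotten instance keeps charging until someone notices.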
But these are not the only causes of cost overruns in public cloud computing environments. In fact, uncontrolled provisioning – whether due to auto-scaling or developer forgetfulness – is not peculiar to public cloud; it can be a problem in private cloud implementations as well. Without the proper processes and policies – and the right infrastructure and systems to enforce them – cloud sprawl will certainly impact those large enterprises for whom private cloud is becoming such an attractive option.
While it’s vastly more difficult to implement the proper processes and procedures automatically in public than in private cloud computing environments – because of the relative immaturity of infrastructure services in the public arena – there are other, hotter issues in public cloud that will just as quickly burn through an IT or business budget if not recognized and addressed before deployment. And these are issues public cloud computing cannot necessarily address even by offering infrastructure services, which makes private cloud all the more attractive.
Though not quite technically accurate, we’ll use “traffic sprawl” to describe the increasing amount of unrelated traffic a cloud-deployed application must process. It’s the extra traffic – malicious attacks and the leftovers from the last application that occupied an IP address – that the application must field and ultimately reject. This traffic is nothing less than a money pit, burning up CPU cycles and RAM that translate directly into dollars for the customer. Every request an application handles – good or bad – costs money.
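To make that concrete, here is a minimal sketch – with an assumed hourly rate and an assumed per-instance request capacity, since the real numbers vary by provider and workload – of what a given share of junk traffic costs over a month. Because instances are provisioned for total traffic, the junk share of the bill is simply its share of the requests being processed:

```python
import math

HOURLY_RATE = 0.10        # assumed cost per instance-hour (USD); illustrative
CAPACITY = 500            # assumed requests/sec a single instance can handle
HOURS_PER_MONTH = 730

def junk_traffic_cost(total_rps, junk_fraction):
    """Monthly spend attributable to rejected (malicious/leftover) requests."""
    instances = math.ceil(total_rps / CAPACITY)   # provisioned for ALL traffic
    monthly = instances * HOURLY_RATE * HOURS_PER_MONTH
    return monthly * junk_fraction

# 2,000 req/s, of which 30% never should have reached the application:
print(f"${junk_traffic_cost(2000, 0.30):,.2f} per month spent on junk")
```

Under these assumptions nearly a third of the compute bill buys nothing but rejections – and the fraction, not the rate, is what the customer can’t control.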
The traditional answer to preventing the unnecessary consumption of server resources by malicious or unwanted traffic is a web application firewall (WAF) plus basic firewalling services. Both do, in fact, prevent that traffic from consuming resources on the server: they reject it, so it is never seen by the application. So far so good. But in a public cloud computing environment you also pay for the resources those services consume. In other words, you’re paying per hour to process illegitimate and unwanted traffic no matter what. Even if IaaS providers were to offer WAF and firewall services, you’d pay for those too, and all the unwanted, malicious traffic that comes your way will still cost you, burning through your budget faster than you can say “technological money pit.”
This is not to say that both types of firewall services are a bad idea in a public cloud environment; they are valuable regardless and should be part and parcel of any dynamic infrastructure. But in a public cloud environment they address only the security issues; they are unlikely to redress cost overruns and may instead help you further along the path to budget burnout.
HYBRID WILL DOMINATE CLOUD COMPUTING
I’ve made the statement before, and I’ll make it again: hybrid models will dominate cloud computing, due primarily to issues of control – control over processes, over budgets, and over services. The inability to effectively control traffic at the network layer imposes higher processing and server consumption rates in public environments than in private, controlled environments, even when public resources are leveraged in the private environment through hybrid architectures enabled by virtual private cloud computing technologies. Traffic sprawl caused by shared IP addresses in public cloud computing environments is simply not a factor in private – and even hybrid-style – architectures where public resources are never exposed via a publicly accessible IP address. Malicious traffic is never processed by applications and servers in a well-secured and well-architected private environment, because firewalls and application firewalls screen out such traffic and prevent it from unnecessarily increasing compute and network resource consumption, effectively expanding the capacity of existing resources. The costs of such technology and controls are shared across the organization and are fixed, leading to better forecasting in budgeting and planning and eliminating the concern that these essential services might themselves cause a budget overrun.
Control over the provisioning of resources in private environments is more easily achieved through existing and emerging technology, while public cloud computing environments still struggle to offer even the most rudimentary data center infrastructure services. Without the ability to apply enterprise-class controls and limits to public cloud computing resources, organizations are likely to find that the macro-economic costs of cloud end up negating the benefits initially realized from cheap, easy-to-provision resources. A clear strategy with defined boundaries and processes – both technical and people-related – must be in place before making the leap, lest sprawl overrun budgets and eliminate the micro-economic benefits that public cloud computing could deliver.