By Chris Evans
November 14, 2012 08:30 AM EST
In the first wave of solid-state storage arrays, we saw commodity-style SSDs (solid state drives) being added to traditional storage arrays. This provided an incremental performance benefit over spinning hard drives; however, the back-end technology in these arrays was developed up to 20 years ago and was focused purely on driving performance out of the slowest part of the infrastructure: the hard drive. SSDs are, of course, an order of magnitude faster than HDDs, so putting SSDs into traditional arrays all but guarantees underused resources at a premium price.
Wave 2 of SSD arrays saw the development of custom hardware, mostly still using commodity SSDs. At this point we saw full exploitation of solid state, with architectures designed to deliver the complete performance capability of the drives. These arrays removed unnecessary or bottlenecking features (like cache) and provided much more back-end scalability. Within the wave 2 group, Nimbus Data have chosen a hybrid approach and developed their own solid state drives, giving them more control over the management functionality of the SSDs and subsequently more control over performance and availability.
Notably, some startup vendors have taken a slightly different approach. Violin Memory have chosen from day one to use custom NAND memory cards called VIMMs (Violin Intelligent Memory Modules). This technology removes the need for NAND to emulate a hard drive, and for the interface between the processor/memory and the persistent memory (i.e. the NAND) to cross a hard drive interface like SAS using the SCSI protocol. Whilst it could be debated that the savings from removing the disk drive protocol are marginal, the use of NAND that doesn't emulate hard drives is about much more than that. SSD controllers have many features to extend the life of the drive itself, including wear levelling and garbage collection, both of which can have a direct impact on device performance. Custom NAND components can, for instance, allow wear levelling to be performed across the entire array, or individual cell failures to be managed more efficiently.
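To illustrate why array-wide wear levelling matters, here is a minimal sketch (hypothetical, and in no way Violin's actual implementation): when each drive can only level wear across its own blocks, a skewed workload burns out one drive's blocks while others sit idle; a single array-wide wear pool keeps the spread of erase counts tight.

```python
def pick_block(wear_counts):
    """Return the index of the least-worn block (greedy wear levelling)."""
    return min(range(len(wear_counts)), key=lambda i: wear_counts[i])

def write(wear_counts, n_writes):
    """Direct each write to the least-worn block in the pool."""
    for _ in range(n_writes):
        wear_counts[pick_block(wear_counts)] += 1

# Drive-level: each SSD levels wear only across its own 8 blocks.
drives = [[0] * 8 for _ in range(4)]   # 4 drives, 8 blocks each
write(drives[0], 800)                  # skewed workload hits one drive
for d in drives[1:]:
    write(d, 50)

# Array-wide: one controller levels the same 950 writes across all 32 blocks.
array_blocks = [0] * 32
write(array_blocks, 800 + 3 * 50)

drive_spread = max(max(d) for d in drives) - min(min(d) for d in drives)
array_spread = max(array_blocks) - min(array_blocks)
print(drive_spread, array_spread)      # prints: 94 1
```

The numbers are toy-sized, but the effect is the point: the per-drive pools end up with a wear spread of 94 erase cycles, while the array-wide pool stays within 1.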
Building bespoke NAND components isn't cheap. Violin have chosen to invest in technology they believe gives them a hardware advantage: no dependency on SSD manufacturers. The ability to build advanced functionality into their persistent memory means availability can be increased; components don't need to be swapped out as frequently, since failing components can still be partially used.
At this point we should give a call-out to Texas Memory Systems, recently acquired by IBM. They have also used custom NAND components; their RamSan-820 uses 500GB flash modules built on eMLC memory.
I believe the third wave will see many more vendors move away from the SSD form factor and build bespoke NAND components, as Violin have done. Currently Violin and TMS have a head start: they've done the hard work and built the foundations of their platforms. Their future innovations will probably revolve around bigger and faster devices, and around replacing NAND with whatever the next generation of persistent memory turns out to be.
Last week, HDS announced their approach to all-flash devices: a new custom-built Flash Module Drive (FMD) that can be added to the VSP platform. Each module provides 1.6TB or 3.2TB of storage (the higher capacity is due in March 2013), and modules can be stacked into an 8U shelf of 48 FMDs, giving up to 600TB of flash in a single VSP. Each FMD matches a traditional SSD in height and width but is much deeper, and it appears to the VSP as a traditional SSD.
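As a quick sanity check on those figures (my own back-of-the-envelope arithmetic, not part of the HDS announcement), the per-shelf capacity and the number of shelves implied by the 600TB headline figure work out as follows:

```python
# Back-of-the-envelope check of the quoted VSP flash capacities.
FMD_CAPACITY_TB = 3.2   # larger FMD module, due March 2013
FMDS_PER_SHELF = 48     # fully populated 8U shelf

shelf_tb = round(FMD_CAPACITY_TB * FMDS_PER_SHELF, 1)
print(shelf_tb)         # 153.6 TB per fully populated shelf

# The quoted 600TB per VSP therefore implies roughly four such shelves.
shelves_implied = round(600 / shelf_tb, 1)
print(shelves_implied)  # ~3.9 shelves
```

In other words, the 600TB maximum is a whole-VSP figure across multiple FMD shelves, not the capacity of a single 8U shelf.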
The FMD chassis is separate from the existing disk chassis deployed in the VSP, so FMDs can't be deployed alongside hard drives. Although this seems like a negative, the flash modules have higher-specification back-end directors (to fully utilise the flash performance), which, in addition to their physical size, explains why they wouldn't be mixed together.
Creating a discrete flash module provides Hitachi with a number of benefits compared to individual MLC SSDs, including:
- Higher performance on mixed workloads
- Inbuilt compression using the onboard custom chips
- Improved ECC error correction using onboard code and hardware
- Lower power consumption per TB from higher memory density
- More than 1,000,000 IOPS in a single array
The new FMDs can also be used with HDT (Hitachi Dynamic Tiering) to cater for mixed sub-LUN workloads, and of course Hitachi's upgraded microcode is already optimised to work with flash devices.
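To show the idea behind sub-LUN tiering, here is an illustrative sketch (not Hitachi's actual HDT algorithm): rather than pinning whole LUNs to a tier, the array tracks I/O density per page and keeps only the hottest pages on flash.

```python
def place_pages(page_io_counts, flash_pages):
    """Rank pages by I/O count and return the set placed on the flash tier."""
    ranked = sorted(page_io_counts, key=page_io_counts.get, reverse=True)
    return set(ranked[:flash_pages])

# Hypothetical per-page I/O counts for one LUN over a sampling interval.
io_counts = {"p0": 500, "p1": 10, "p2": 320, "p3": 5, "p4": 75}

# Flash tier only has room for two of the five pages.
flash_tier = place_pages(io_counts, flash_pages=2)
print(sorted(flash_tier))   # prints: ['p0', 'p2']
```

The hot pages (p0 and p2) land on flash while the cold pages stay on cheaper spinning disk, which is why a mixed workload can still see most of its I/O served at flash speed.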
The Architect’s View
Solid state storage continues to evolve. NAND flash is fast but has its foibles, and these can be overcome with dedicated NAND modules. Today, only four vendors have moved to dedicated solid-state components, while the rest continue to use commodity SSDs. At scale, performance and availability, viewed in terms of consistency, become much more important. Many vendors today are producing high-performance devices, but how well will they scale going forward, and how resilient will they be? As the market matures, these differences will be the dividing line between survival and failure.
Disclaimer: I recently attended the Hitachi Bloggers’ and Influencers’ Days 2012. My flights and accommodation were covered by Hitachi during the trip, however there is no requirement for me to blog about any of the content presented and I am not compensated in any way for my time when attending the event. Some materials presented were discussed under NDA and don’t form part of my blog posts, but could influence future discussions.
- Optimising Storage Architectures for SSD
- Hitachi Accelerated Flash Storage Ignites Flash From Enlightenment to Productivity
Comments are always welcome; please indicate if you work for a vendor as it’s only fair. If you have any related links of interest, please feel free to add them as a comment for consideration.