
Containers Expo Blog: Article

CIOs' Top Priority: Analytics and BI

How to Deal with the Data Integration Bottleneck

Whether as a driver for growth, a means to attract and retain customers, or a way to drive innovation and reduce costs, the business value of analytics and business intelligence has never been higher.

Gartner's Amplifying the Enterprise: The 2012 CIO Agenda and IBM's Global CIO Study 2011 both confirm this point, with analytics and BI sitting atop CIOs' technology priorities in each report.

Data Integration Is the Biggest Bottleneck
Providing analytics and BI solutions with the data required has always been difficult, with data integration long considered the biggest bottleneck in any analytics or BI project.

Complex data landscapes, diverse data types, and new sources such as big data and the cloud are but a few of the well-known barriers.

For the past two decades, the default solution has been to first consolidate the data into a data warehouse, and then provide users with tools to analyze and report on this consolidated data.
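The consolidate-first pattern can be sketched in a few lines. This is a minimal illustration only, not any particular warehouse product's API; the table, column, and function names are hypothetical, and SQLite stands in for both the source systems and the warehouse.

```python
# Sketch of the traditional consolidate-first approach: extract rows from
# separate source databases, then load them into one warehouse table before
# any analysis can run. All names here are hypothetical examples.
import os
import sqlite3
import tempfile

def make_source(rows):
    """Create a throwaway source database holding some sales rows."""
    path = os.path.join(tempfile.mkdtemp(), "src.db")
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    con.commit()
    con.close()
    return path

def build_warehouse(source_paths):
    """Copy every source's rows into one consolidated warehouse table."""
    wh = sqlite3.connect(":memory:")
    wh.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    for path in source_paths:
        src = sqlite3.connect(path)
        wh.executemany("INSERT INTO sales VALUES (?, ?)",
                       src.execute("SELECT region, amount FROM sales"))
        src.close()
    wh.commit()
    return wh

# Two hypothetical regional systems are copied into one store for reporting.
erp = make_source([("east", 100.0), ("west", 50.0)])
crm = make_source([("east", 25.0)])
warehouse = build_warehouse([erp, crm])
total = warehouse.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

Note that every new source adds another extract-and-load step that must be scheduled and kept in sync with the warehouse schema.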

However, data integration based on these traditional replication and consolidation approaches has numerous moving parts that must be synchronized. Getting this right extends lead times.

The Data Warehousing Institute confirms this lack of agility. Its recent studies found that the average time needed to add a new data source to an existing BI application was 8.4 weeks in 2009, 7.4 weeks in 2010, and 7.8 weeks in 2011, and that 33% of organizations needed more than three months to add a new data source.

Data Virtualization Brings Agility to Analytics and BI
According to Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, data virtualization significantly accelerates data integration agility. Key to this success has been data virtualization's ability to provide:

  • A more streamlined data integration approach
  • A more iterative development process
  • A more adaptable change management process

Using data virtualization as a complement to existing data integration approaches, the ten organizations profiled in the book cut analytics and BI project times in half or more.

This agility allowed the same teams to double their number of analytics and BI projects, significantly accelerating business benefits.

For more insights on data virtualization and business agility, check out my earlier articles on this topic.

Simplify to Overcome Historical IT Complexity

Data virtualization's simplified information access and faster time-to-solution are especially useful as enablers of more agile analytics and BI.

Is Data Virtualization the Fast Path to BI Agility? describes how the architectures of most business intelligence systems are based on a complex chain of data stores starting with production databases, data staging areas, a data warehouse, dependent data marts, and personal data stores. Simply maintaining this complexity overwhelms IT today.

These classic BI architectures served business well for the last twenty years. However, considering the need for more agility, they have some disadvantages:

  • Duplication of data
  • Non-shared meta data specifications
  • Limited flexibility
  • Decrease of data quality
  • Limited support for operational reporting
  • Limited support for reporting on unstructured and external data

From a different point of view, SOA World's Zettabytes of Data and Beyond describes the challenges of force-fitting development methods that were appropriate for earlier times when less data complexity was the norm.

In addition, the proliferation of fit-for-purpose data stores, including data warehouse appliances, Hadoop-based file systems, and a range of NoSQL data stores, is breaking the hegemony of the traditional data warehouse as the "best" solution to the enterprise-level data integration problem. The business and IT impact of these new approaches is explored in the Virtualization Magazine article NoSQL and Data Virtualization - Soon to Be Best Friends.

Self-Service Analytics and BI are Important Too!
Responding to constantly changing business demands for analytics and BI is a daunting effort.

Mergers and acquisitions and evolving supply chains require new comparisons and aggregations. The explosion of social media drives demand for new customer insights. Mobile computing changes form factors. And self-service BI puts users in the driver's seat.

Business Taking Charge of Analytics and BI

In true Darwinian fashion, the business side of most organizations is now taking greater responsibility for fulfilling its own information needs rather than depending solely on already-burdened IT resources.

For example, in Self-Service Business Intelligence: TDWI Best Practices Report (July 2011), a survey of over 625 business and IT professionals, The Data Warehousing Institute (TDWI) identified the following top five factors driving businesses toward self-service business intelligence:

  • Constantly changing business needs (65%)
  • IT's inability to satisfy new requests in a timely manner (57%)
  • The need to be a more analytics-driven organization (54%)
  • Slow and untimely access to information (47%)
  • Business user dissatisfaction with IT-delivered BI capabilities (34%)

In the same survey report, authors Claudia Imhoff and Colin White suggest that IT's focus shifts toward making it easier for business users "to access the growing number of dispersed data sources that exist in most organizations."

Examples Imhoff and White cite include:

  • providing friendlier business views of source data
  • improving on-demand access to data across multiple data sources
  • enabling data discovery and search functions
  • supporting access to other types of data, such as unstructured documents, and more

Data Virtualization to the Self-Service Rescue

In the TDWI survey, 60% of respondents rated business views of source data as "very important," and 44% said on-demand access to multiple data sources using data federation technologies was "very important."

According to Imhoff and White, "Data virtualization and associated data federation technologies enable BI/DW builders to build shared business views of multiple data sources so that the users do not have to be concerned about the physical location or structure of the data."

These views are sometimes known as virtual business views because, from an application perspective, the data appears to be consolidated in a single logical data store. In fact, it may be managed in multiple physical data structures on several different servers.

Data virtualization platforms such as the Composite Data Virtualization Platform support access to different types of data sources, including relational databases, non-relational systems, application package databases, flat files, Web data feeds, and Web services.
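The virtual business view idea can be illustrated with SQLite's ATTACH mechanism, which lets one connection query several physically separate database files through a single logical view. This is only an analogy for how a data virtualization platform federates sources, not the Composite platform's actual interface, and all table and column names are illustrative.

```python
# Sketch of a virtual business view: two physically separate databases are
# exposed through one logical view, with no data copied or consolidated.
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
orders_path = os.path.join(tmp, "orders.db")
customers_path = os.path.join(tmp, "customers.db")

# Source 1: a hypothetical order system.
con = sqlite3.connect(orders_path)
con.execute("CREATE TABLE orders (cust_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 100.0), (2, 40.0), (1, 10.0)])
con.commit()
con.close()

# Source 2: a hypothetical customer master.
con = sqlite3.connect(customers_path)
con.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
con.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
con.commit()
con.close()

# One federating connection attaches both stores; a temp view hides where
# the underlying tables physically live.
hub = sqlite3.connect(":memory:")
hub.execute(f"ATTACH DATABASE '{orders_path}' AS src_orders")
hub.execute(f"ATTACH DATABASE '{customers_path}' AS src_customers")
hub.execute("""CREATE TEMP VIEW customer_sales AS
               SELECT c.name, SUM(o.amount) AS total
               FROM src_customers.customers c
               JOIN src_orders.orders o ON o.cust_id = c.cust_id
               GROUP BY c.name""")
result = dict(hub.execute("SELECT name, total FROM customer_sales"))
```

The consuming application queries customer_sales as if it were one consolidated table, while the order and customer data stay in their separate stores.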

To Achieve Self-Service BI, Consider Using Data Virtualization provides additional insights about how data virtualization enables self-service analytics and BI.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
