Are You Thinking About Big Data When Doing IoT? – You Should Be | @ThingsExpo #ML #IoT #M2M #BigData

Based on all estimates by industry analysts and current trends, the IoT is growing at an incredible rate and is here to stay

There is no denying the Internet of Things (IoT) is a hot topic. Gartner positions IoT at the peak of its 'hype cycle.' From a size perspective, these 'Things' can be anything from a small sensor to a large appliance, and everything in between. The data transmitted by these devices, for the most part, tends to be small: tiny packets of information destined for consumption and analysis, bringing value to the business.

Is there hype? Yes. As with any new technology, there is always a level of hype involved. Are the data packets involved small? For the most part, yes (there are always exceptions). While both may be true, the Internet of Things is growing at breakneck speed. No matter which analyst you read, the growth predictions are staggering. Gartner predicts that we will hit over 20 billion (with a B) devices by 2020. IHS predicts even larger numbers, with 30 billion by 2020, and over 75 billion devices by 2025. No matter whose forecast you believe, that's a lot of devices, and even small packets, multiplied by billions of devices, add up to a lot of data.
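To put 'a lot of data' in rough perspective, here is a back-of-envelope calculation. The per-device message size and frequency are purely illustrative assumptions on my part, not figures from the analysts cited above.

```python
# Back-of-envelope estimate of aggregate IoT data volume.
# The per-device message size and frequency are illustrative assumptions.

devices = 20_000_000_000          # Gartner's ~20 billion devices by 2020
message_bytes = 1_024             # assume a tiny 1 KB payload per message
messages_per_day = 24 * 60        # assume one message per minute per device

bytes_per_day = devices * message_bytes * messages_per_day
petabytes_per_day = bytes_per_day / 1_000_000_000_000_000  # 10^15 bytes per PB

print(f"~{petabytes_per_day:.0f} PB of raw data per day")
# prints "~29 PB of raw data per day" -- from 'tiny' packets alone
```

Even with deliberately conservative assumptions, the aggregate lands in the tens of petabytes per day, which is exactly the point.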

It's not the things, it's the data
What I find interesting is that discussions about IoT often focus on the devices, the sensors, the hardware itself: the latest Fitbit or smartwatch, the new smart appliances, the new smart or self-driving cars (which are amalgamations of many 'things'). Yes, those technologies are interesting (okay, fascinating; I will admit my inner geek loves getting down into the actual technologies), but when we look at the world of IoT, we should take a step back and look at the big picture. What value are these devices providing?

What I am about to say may sound like heresy to many. IoT is not about the devices. The devices are not the end goal. The devices are tools, mechanisms, conduits: conduits of information. They provide (and consume) information, massive amounts of information. A former colleague of mine was fond of saying for years, 'Ed, it's all about the data.' In the burgeoning world of IoT, that statement captures the true business value: information.

Watching out for potholes
Recently, Ford announced they were testing a pothole detection and alert system for cars. Living in New England, let me tell you, potholes are the bane of a car driver's existence. Many a car ends up in the repair shop during pothole season. Given that, the concept is intriguing. Cameras mounted on the vehicle scan the roadway for signs of potholes, and image recognition makes the determination. If a pothole is detected, the system allows the car to avoid hitting it, and thus avoid potential damage to the vehicle.
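As a rough illustration of that in-vehicle loop (not Ford's actual system), the sketch below assumes a trained image-recognition model is available behind a classify_pothole() function that returns a confidence score; the camera feed, threshold, and avoidance hook are equally hypothetical.

```python
# Hypothetical sketch of an on-vehicle pothole detection loop.
# classify_pothole() stands in for a real trained model; the threshold,
# camera feed, and avoidance action are placeholders for illustration.

import random

POTHOLE_CONFIDENCE_THRESHOLD = 0.85   # assumed tuning value


def classify_pothole(frame) -> float:
    """Placeholder for image-recognition inference: returns a confidence score."""
    return random.random()            # a real system would run the model here


def avoid_pothole(frame_id: int) -> None:
    """Placeholder for the vehicle's local, near-real-time avoidance action."""
    print(f"frame {frame_id}: pothole likely, adjusting path locally")


def detection_loop(frames) -> None:
    """Scan roadway frames and react locally, with no round trip to the cloud."""
    for frame_id, frame in enumerate(frames):
        if classify_pothole(frame) >= POTHOLE_CONFIDENCE_THRESHOLD:
            avoid_pothole(frame_id)


if __name__ == "__main__":
    detection_loop(frames=[object() for _ in range(10)])  # stand-in camera feed
```

The point of the sketch is the division of labor: the decision happens on the vehicle, while the detections themselves become data worth feeding back, which is where big data enters the picture.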

Now some would say, 'What does that have to do with big data?' The system is self-contained within the vehicle. To be useful, it needs to react in near real time to the situation; it doesn't have time to send all the data back to the cloud for analysis to determine if there is a pothole. And what if it loses its network connection? All valid points. Let's take a step back and look at the bigger picture.

  • How does the system recognize a pothole? Image recognition. What does image recognition need? Lots of data about what potholes look like. Machine learning algorithms help it determine if it's seeing a pothole, and those algorithms need data to do that.
  • What will be the source of those pothole images? Wouldn't it be useful if images of any potholes the system encounters became part of the source data for the image recognition system, improving its detection? Wouldn't it be useful to send that data back to a central location to improve the algorithms and detection software, which could then be pushed back out to all the other vehicles to improve their capability? (A minimal sketch of such a report follows this list.)
  • What about all the cars without the system? Wouldn't it be nice if the pothole locations were flagged to the various GPS applications people use so they are aware of the pothole and its location?
  • What about the local public works department? Wouldn't it be nice if they were automatically notified about the new pothole identified so it could be repaired?
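Those bullet points describe a feedback loop. As a hedged sketch of what a single detection report in that loop might look like, the snippet below packages a pothole event (location, confidence, timestamp) that a vehicle could queue for upload to a central service; the field names and endpoint are illustrative assumptions, not a real API.

```python
# Hypothetical pothole detection report a vehicle might queue for upload.
# The schema and endpoint are illustrative assumptions, not a real API.

import json
from datetime import datetime, timezone

CENTRAL_ENDPOINT = "https://example.com/pothole-reports"  # placeholder URL


def build_report(lat: float, lon: float, confidence: float) -> str:
    """Package one detection as JSON for later transmission (for example,
    when connectivity returns), so central systems can retrain the models,
    update GPS map layers, and notify public works."""
    event = {
        "type": "pothole_detected",
        "latitude": lat,
        "longitude": lon,
        "confidence": round(confidence, 2),
        "detected_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(event)


if __name__ == "__main__":
    print(f"would POST to {CENTRAL_ENDPOINT}:")
    print(build_report(lat=42.3601, lon=-71.0589, confidence=0.91))
```

Multiply one small JSON document like this by millions of vehicles and millions of road miles, and the big data dimension of the pothole detector becomes hard to ignore.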

Ingestion considerations
Given the importance of data to the success of any IoT implementation, ingesting that information properly is critical. Several considerations come into play:

  • Data Quality - In the world of data, quality has always been an important consideration. Data cleansing and scrubbing is standard practice already in many organizations. It has become critical for IoT implementations. Ingesting dirty data into even the best IoT implementation will bring it to a grinding halt.
  • Data Volume - As I have mentioned already, the data packets from an individual device or sensor are often small. That said, multiplied by the sheer number of devices, the volume can quickly overwhelm a network or storage environment if not planned for appropriately. These considerations must also take location into account.
  • Data Timeliness - Besides volume, the freshness of the data is also a consideration. In the pothole example, if the last update was weeks ago, how valid is that location anymore?
  • Data Pedigree - Where did the data come from? Is it a valid source? Pedigree is less of a concern for internal systems, where the source is well known, but IoT systems, by their nature, frequently get their data from devices and sources outside the normal perimeter. This requires extra effort to ensure you trust the information being consumed, as the sketch below illustrates.
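To make the quality, timeliness, and pedigree considerations a bit more concrete (volume being a capacity-planning concern rather than a per-record check), here is a minimal ingestion-gate sketch; the field names, freshness window, and trusted-source list are illustrative assumptions, not a prescription.

```python
# Minimal, illustrative ingestion gate covering quality, timeliness, and pedigree.
# Field names, the freshness window, and the trusted-source list are assumptions.

from datetime import datetime, timedelta, timezone

TRUSTED_SOURCES = {"vehicle-fleet", "municipal-sensors"}   # pedigree allowlist
MAX_AGE = timedelta(hours=24)                              # timeliness window


def accept(reading: dict) -> bool:
    """Return True only if a reading passes basic quality, timeliness,
    and pedigree checks before it is ingested."""
    # Quality: required fields present and plausible
    required = {"source", "latitude", "longitude", "observed_at"}
    if not required.issubset(reading):
        return False
    if not (-90 <= reading["latitude"] <= 90 and -180 <= reading["longitude"] <= 180):
        return False

    # Timeliness: stale readings are rejected
    observed = datetime.fromisoformat(reading["observed_at"])
    if datetime.now(timezone.utc) - observed > MAX_AGE:
        return False

    # Pedigree: only sources we trust make it past the perimeter
    return reading["source"] in TRUSTED_SOURCES


if __name__ == "__main__":
    sample = {
        "source": "vehicle-fleet",
        "latitude": 42.36,
        "longitude": -71.06,
        "observed_at": datetime.now(timezone.utc).isoformat(),
    }
    print("accepted" if accept(sample) else "rejected")
```

In practice a gate like this would sit at the edge of the ingestion pipeline, rejecting or quarantining readings before they can pollute downstream analytics.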

No technology negates the need for good design and planning
Based on all estimates by industry analysts and current trends, the Internet of Things is growing at an incredible rate and is here to stay. There is a big radar blip of data outside your data center that is not going anywhere. That data provides great value, but it also brings many challenges that need to be taken into consideration. If you are doing IoT and are not looking at Big Data, you are missing an opportunity and business value. As many of my readers have heard me say frequently, no technology negates the need for good design and planning. The Internet of Things and the accompanying Big Data demand it if you are to be successful.

More Stories By Ed Featherston

Ed Featherston is VP, Principal Architect at Cloud Technology Partners. He brings 35 years of technology experience in designing, building, and implementing large complex solutions. He has significant expertise in systems integration, Internet/intranet, and cloud technologies. He has delivered projects in various industries, including financial services, pharmacy, government and retail.
