



Public Sector Big Data: Five Ways Big Data Must Evolve in 2013

2012 will go down as a “Big” year for Big Data in the public sector

By Ray Muslimani

Editor’s note: This guest post provides context on mission focused data analytics in the federal space by one of the leaders of the federal big data movement, Ray Muslimani. -bg

2012 will go down as a “Big” year for Big Data in the public sector. Rhetoric and hype have been followed by tangible action on the part of both government and industry. The $200 million Big Data initiative unveiled by the White House in March 2012 was an injection of R&D funding and credibility into efforts to develop tools and technologies to help solve the nation’s most pressing challenges.

On the industry side, the recently issued TechAmerica report, “Demystifying Big Data,” provides agencies with a roadmap for using Big Data to better serve citizens. It also offers a set of policy recommendations and practical steps agencies can take to get started with Big Data initiatives.

For all of the enthusiasm around Big Data this year, every indication is that 2013 will be the year when Big Data transforms the business of government. Below are 5 steps that need to be taken in order for Big Data to evolve in 2013 and deliver on its promise.

Demystify Big Data
Government agencies warmed to the potential of Big Data throughout 2012, but more education is required to help decision makers wade through their options and justify further investments. Removing the ambiguities surrounding Big Data requires an emphasis in 2013 on education from both industry and government.

The TechAmerica Big Data report is a good example of how industry can play an active role in guiding agencies through Big Data initiatives. It also underscores that vendors can’t generate more Big Data RFPs through marketing slicks and sales tactics alone. This approach will not demystify Big Data – it will simply seed further doubt if providers of Big Data tools and solutions focus only on poking holes in competitor alternatives.

Industry and government should follow proven templates for education in 2013. For example, agencies can arrange “Big Data Days” modeled on today’s Industry Tech Days. Big Data industry days can help IT providers gain better insight into how each agency plans to approach its Big Data challenges in 2013 and offer these agencies an opportunity to see a wide range of Big Data services.

The Big Data education process must also extend to contracting officers. Agencies need guidance on how RFPs can be constructed to address a service-based model.

Consumerize Big Data
While those within the public sector with the proper training and skills to analyze data have benefited from advanced Big Data tools, it has been far more difficult for everyday business users and decision makers to access the data in a useful way. Sluggish data query responses, data quality issues, and a clunky user experience are undermining the benefits Big Data analytics can deliver and requiring users to be de facto “data scientists” to make sense of it all.

Underscoring this challenge is a 2012 MeriTalk survey, “The Big Data Gap,” which finds that just 60 percent of IT professionals indicate their agency is analyzing the data it collects, and a modest 40 percent are using data to make strategic decisions. All of this despite the fact that 96 percent of those surveyed expect their agency’s stored data to grow in the next two years by an average of 64 percent. The gap here suggests a struggle for non “data scientists” to convert data into business decisions.

What if any government user could ask a question in natural language and receive the answer in a relevant visualization? For Big Data to evolve in 2013 we must consumerize the user experience by removing spreadsheets and reports, and placing the power of analytics in the hands of users at any level, without requiring analytics expertise.
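The idea above can be made concrete with a toy sketch. Production systems use real natural-language parsers and BI visualization engines; the keyword matching, the `answer` and `render_bar_chart` helpers, and the sample agency data below are all hypothetical, invented purely to illustrate the question-to-visualization flow.

```python
# Toy sketch: a natural-language question becomes an aggregate plus a
# simple visualization. All names and data here are illustrative.
records = [
    {"agency": "DOT", "year": 2012, "requests": 120},
    {"agency": "DOT", "year": 2013, "requests": 180},
    {"agency": "EPA", "year": 2012, "requests": 95},
    {"agency": "EPA", "year": 2013, "requests": 140},
]

def answer(question):
    """Map a plain-English question to a grouped total over the data set."""
    q = question.lower()
    # Crude stand-in for an NLP parser: pick the grouping field mentioned.
    group_by = "agency" if "agency" in q else "year"
    totals = {}
    for row in records:
        key = row[group_by]
        totals[key] = totals.get(key, 0) + row["requests"]
    return totals

def render_bar_chart(totals):
    """Render the answer as a text bar chart -- the 'relevant visualization'."""
    return "\n".join(f"{k}: {'#' * (v // 20)} {v}"
                     for k, v in sorted(totals.items()))

print(render_bar_chart(answer("total requests by agency")))
```

The point is the shape of the pipeline, not the parser: question in, aggregate computed, chart out, with no spreadsheet in between.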

Mobilize Big Data
IDC Government Insights predicts that in 2013, 35 percent of new Federal and state applications will be mobile. At the same time, 65 percent of Federal IT executives expect mobile device use to increase by 20 percent in 2013, according to The 2012-2013 Telework/Mobile IT Almanac.

Part of consumerizing Big Data means building it for any device so that users do not need to be tethered to their desktops to analyze data. Agency decision makers must be empowered to easily view and analyze data on tablets and smartphones, while the increase of teleworking in the public sector requires Big Data to be accessible from anywhere, at any time, and on any device.

Both established Federal IT providers and upstarts are showing promising innovation in taking a mobile-first path to Big Data, rather than the traditional approach of building BI dashboards for the desktop. The degree to which 2013 sees a shift in Big Data from the desktop to tablets and smartphones will depend on how forcefully solutions providers employ a mobile-first approach.

Act on Big Data
A tremendous amount of “thought” energy went into Big Data in 2012. For Big Data to evolve in a meaningful way in 2013, initiatives and studies must generate more action in the form of Big Data RFIs and RFPs.

In the tight budget climate, agencies will not act on Big Data if vendor proposals require massive investments in IT infrastructure and staffing. The financial and resource burden must shift, to the extent possible, from agency to vendor. For example, some vendors have developed “Big Data Clouds” that allow agencies to leverage a secure, scalable framework for storing and managing data, along with a toolset for performing consumer-grade search and analysis on that data.

Open Big Data
Adoption of Big Data solutions has been accelerated by open source tools such as Hadoop, MapReduce, Hive, and HBase. While some agencies will find it tempting to withdraw to the comfort of proprietary Big Data tools that they can control in closed systems, that path undermines the value Big Data can ultimately deliver.

One could argue that as open source goes in 2013, Big Data goes as well. If open source platforms and tools continue to address agency demands for security, scalability, and flexibility, the benefits from Big Data within and across agencies will increase exponentially. There are hundreds of thousands of viable open source technologies on the market today. Not all are suitable for agency requirements, but as agencies update and expand their uses of data, these tools offer limitless opportunities to innovate. Additionally, opting for open source instead of proprietary vendor solutions prevents an agency from being locked into a single vendor’s tool that it may at some point outgrow or find ill suited to its needs.
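The MapReduce model behind tools like Hadoop, named above, can be illustrated with the classic word-count example. This is a minimal in-process sketch of the pattern, not a real Hadoop job: in practice the `mapper` and `reducer` below would run as separate Hadoop Streaming processes across a cluster, with the framework handling the sort-and-shuffle between them.

```python
# Minimal sketch of the MapReduce model (word count), chained in-process.
# In a real Hadoop deployment, mapper and reducer run as distributed tasks.
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reducer(pairs):
    """Reduce phase: sum the counts for each word.

    Hadoop sorts mapper output by key before the reduce phase;
    sorted() stands in for that shuffle here.
    """
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

docs = ["big data big decisions", "open data"]
print(dict(reducer(mapper(docs))))
```

Higher-level tools in the same ecosystem, such as Hive, let analysts express this kind of aggregation as a SQL-like query that compiles down to jobs like the one sketched here.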


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
