Joe McKendrick | Forbes

This sentiment was recently expressed in an interview with Motley Fool analyst Tim Byers, who analyzed the zeitgeist coming out of the South by Southwest (SXSW) conference and observed that cloud computing and big data were now one and the same phenomenon, converging on enterprises of all shapes and sizes. Byers may have meant that the level of industry interest and innovation in big data parallels that seen in the cloud computing space in recent years.

But in many ways, the two are becoming one and the same: cloud resources are needed to support big data storage and projects, and big data is a huge business case for moving to the cloud. The innovations and buzz around big data and cloud at SXSW represent a “tipping point,” Byers relates, as it is now apparent that both cloud and big data are huge springboards to innovation.

Tech-cultural confabs such as SXSW are not the only places where the convergence is being discussed. The potential of these two converging forces was not lost on a more button-down affair, a recent workshop sponsored by the U.S. government’s National Institute of Standards and Technology (NIST).

Steven VanRoekel, U.S. chief information officer, says the combined initiatives to open up federal data, along with cloud computing, have the potential to create entirely new industries. As an example, he cites the opening up of Global Positioning System (GPS) data in the mid-1980s, which is now embedded in a range of commercial applications. “As a free open data stream, we almost overnight created $100 billion in value to the marketplace,” he says. Reaching back even further, he notes how the U.S. Weather Service, first launched within the Smithsonian Institution, provided an open-source approach to the collection and reporting of weather data from across the country — another amazing “big data stream.”

“The government is sitting on a treasure trove of data,” VanRoekel continues. “We’re opening data, and looking at what we can do. We can greatly impact the lives of every American by just unlocking simple pieces of data.” A key building block to this effort is the FedRAMP program, he says, in which agencies pursue a “cloud-first” policy for any IT engagement. This is a key part of the government’s digital strategy — built on shared, open platforms. “Government agencies are ordered to look at the data they produce, catalog data, start to publish data, and think about machine-readable as the new default inside government. Any time we’re building a new system, or amending a system, we focus on machine readability both on the collection, as well as getting agencies to develop [application programming interfaces] around their data.”
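To make “machine-readable as the new default” concrete, the sketch below shows roughly what consuming such an agency API could look like once the data is published. The endpoint, dataset, and field names are hypothetical placeholders standing in for the kind of machine-readable feeds cataloged on portals such as data.gov; this is an illustrative sketch, not a real government interface.

```python
import json
import urllib.request

# Hypothetical open-data endpoint; the URL, dataset, and field names below
# are placeholders, not a real agency service.
API_URL = "https://api.example-agency.gov/v1/weather-observations?format=json"

def fetch_records(url: str) -> list:
    """Download a machine-readable dataset and return its list of records."""
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    return payload.get("records", [])

def hottest_station(records: list) -> dict:
    """Example downstream use: find the station reporting the highest temperature."""
    return max(records, key=lambda r: r.get("temperature_f", float("-inf")))

if __name__ == "__main__":
    records = fetch_records(API_URL)
    if records:
        print("Hottest reporting station:", hottest_station(records))
```

The point of putting an API around the data is precisely that a developer can build this sort of downstream value without ever talking to the agency that collected the observations.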

“We’re at the verge, the tipping point of the data economy,” VanRoekel adds. “We’re just now starting to see companies founded on government data. The combination of cloud and big data can not only create useful insights, but also can create incredible value — not only the value we provide downstream in areas such as public safety and GPS, but real value in the way we think about decision making, the way we think about creating mission-based systems. There’s a definite multiplier effect at that intersection.”

In his interview, Byers provides some perspective on what exactly big data is to most organizations these days. It may have already been around for some time now, either in the cloud or within on-premises systems, but tools were not available to capture or analyze it. What has changed is that organizations now have tools to make sense of this data, he says. “We were talking about sort of the cast-off stuff, things that are unstructured, that don’t fit in the database. There was so much of it — registration logs on a website; all of these addresses would come in, and they’d have just bazillions of lines of data. It was kind of like the cast-offs, the garbage, the stuff that’s sort of just left to the side, because we didn’t have the tools to analyze that stuff.”
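As a rough illustration of the kind of “cast-off” data Byers describes, the following sketch tallies requests per client address in a raw web access log. The sample lines and the combined log format they follow are assumptions made for the example, not data from the interview.

```python
import re
from collections import Counter

# A few sample lines in the common combined access-log format (assumed here).
SAMPLE_LOG = """\
203.0.113.7 - - [10/Mar/2013:06:25:14 +0000] "GET /signup HTTP/1.1" 200 512
203.0.113.7 - - [10/Mar/2013:06:25:20 +0000] "POST /signup HTTP/1.1" 302 128
198.51.100.4 - - [10/Mar/2013:06:26:01 +0000] "GET /index.html HTTP/1.1" 200 2048
"""

# Capture the client address at the start of each well-formed log line.
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "')

def hits_per_address(log_text: str) -> Counter:
    """Count requests per client address in raw access-log text."""
    counts = Counter()
    for line in log_text.splitlines():
        match = LOG_PATTERN.match(line)
        if match:
            counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for address, count in hits_per_address(SAMPLE_LOG).most_common():
        print(address, count)
```

At real scale those “bazillions of lines” would sit in distributed storage and be processed with cluster tools rather than a single script, but the underlying task, pulling structure out of discarded text, is the same.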

But it’s more than just the cast-off data that can now be analyzed. Commercial, scientific and government organizations are figuring out ways to collect new types of data and bring them into the cloud. At the NIST conference, Dr. Thomas Cynkin, VP and GM of Fujitsu Ltd.’s Washington office, talked about the applications under development that take advantage of big data on a cloud platform. “We accumulate a wide variety and a massive amount of data, data collected from areas as diverse as the human body, automobiles, video, internet services,” he says. Potential applications include “analysis of customer behavior, community energy management systems, building energy management systems, and information analysis in smart cities.”

The most prominent application of big data on cloud, Cynkin predicts, will be in support of agribusiness, employing “big data inputs such as environmental conditions, thus enabling farmers to manage their production more effectively.” Another potential area includes “prediction of machinery failure in multi-function printers,” he adds. “By using a cloud platform and mobile technologies, big data analysis can provide meaningful real-time results that give businesses further insights into their actual operations,” Cynkin says.
