New Year, New Cloud? Managing “Extreme Data”

David Eisner


Companies are familiar with the concept of “big data”: every piece of information generated by day-to-day business processes or customer interactions is considered valuable, so long as it is quickly analyzed and converted into actionable results. But according to predictions from the Institute of Electrical and Electronics Engineers (IEEE), big data is just the beginning; 2014 marks the start of a race to establish leaders in the “extreme data” market.

Going to Extremes

Most IT professionals recognize the “three Vs” of big data: Volume, Variety and Velocity. To make the most of constant data flows, companies need a high volume of information for analysis, a wide variety of data to examine and a high rate of data transfer. One other V gaining ground is Veracity, which speaks to any inherent bias in collected data along with its relevance to the problem at hand. In other words, the three Vs are a good starting point but won’t point a company in the right direction if the data collected contains a massive bias or intersects only tangentially with the question being asked.

Making this market more complex is the increasing amount of data coming from previously untapped sources. This is an extension of the Internet of Things, a concept focused on bringing objects and products outside the technology spectrum into the online community via the cloud. Wireless sensors and radio frequency identification (RFID) chips are commonly used to track products as they move from assembly through quality control to final shipment, or to report “unstructured” data in real time, such as how consumers use appliances and technology in their homes. The result is a rapid increase in the amount of data available to companies, enough to justify the shift in market wording from merely big to extreme.

Tools and Training

Companies face two challenges when it comes to managing extreme data: Tools and training. Technology giants and startups alike can compete in this emerging market, since what matters is not the hardware they sell but the kind of analysis they can deliver through an accessible cloud portal. Where prescriptive analytics once ruled as a way to correct inefficient business processes, predictive algorithms have emerged that can intelligently consider data and deliver predictions about the future rather than mere confirmations of the present.

This leads to the second challenge: Training. While these tools are deceptively simple, they will never reach maximum efficiency without skilled hands guiding the wheel. Strides are being made in cognitive computing—the science of teaching computers to think like human beings—but there’s still no replacement for the human capacity to examine the “bigger picture” and put data sets in context. Analysis tools for extreme data have evolved at a steady pace and can now handle massive information volumes with virtually no hardware lag, but trained data scientists are needed to ask the right questions; even the best data means nothing without interpretation.

Finding the Time

Hiring data scientists and paying for analytics tools can be costly, especially if companies want to get it right the first time. As a recent Forbes article notes, however, analytics half-measures aren’t effective and can cripple efforts to make the best use of extreme data.

Companies, therefore, have several choices. If this is their first foray into extreme data, it’s worth considering strategic consultation for a single analysis project. Using a trusted, experienced IT service provider lets businesses skip the step of finding and vetting analytics providers in a suddenly crowded market. If more robust analysis tools are required, there are two options: Outsource the task entirely, or supplement existing IT staff with professionals from a reputable third party. In many cases, this kind of shared responsibility offers the best mix of flexibility and security; in-house IT administrators retain control of all mission-critical data while the outsourced team handles the day-to-day details. With IT staffers already hard-pressed to keep up with cloud deployments and mobile advancements, there’s nothing wrong with getting a little help with extreme data analysis.

The flow of data isn’t slowing—in fact, its pace continues to increase. Companies can’t ignore this data, nor can they blindly analyze portions hoping for actionable results. The right tools, the right training and the right help make all the difference in the transition from big to extreme.

