David Eisner

New Year, New Cloud? Managing “Extreme Data”

Companies are familiar with the concept of “big data”: every piece of information they generate through day-to-day business processes or customer interactions is considered valuable, so long as it is quickly analyzed and converted into actionable results. But according to predictions from the Institute of Electrical and Electronics Engineers (IEEE), big data is just the beginning—2014 marks the start of a race to establish leaders in the “extreme data” market.

Going to Extremes

Most IT professionals recognize the “three Vs” of big data: Volume, Variety and Velocity. To make the most of constant data flows, companies need a high volume of information for analysis, a wide variety of data to examine and a high rate of data transfer. One other V gaining ground is Veracity, which speaks to any inherent bias in collected data along with its relevance to the problem at hand. In other words, the three Vs are a good starting point but won’t point a company in the right direction if the data collected contains a massive bias or intersects only tangentially with the question being asked.

Making this market more complex is the increasing amount of data coming from previously untapped sources. This is an extension of a concept called the Internet of Things, which focuses on bringing objects and products outside the technology spectrum into the online community using the cloud. Wireless sensors and radio frequency identification (RFID) chips are commonly used to track products as they move from assembly through quality control to final shipment, or to report “unstructured” data in real time—for example, how consumers use appliances and technology in their homes. The result is a rapid increase in the amount of data available to companies, enough to justify the shift in market wording from merely big to extreme.
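The kind of real-time, “unstructured” sensor reporting described above can be sketched as a simple event payload. This is only an illustration of the shape of the data such devices might emit; the field names, stages, and `to_event` helper are hypothetical, not part of any specific IoT platform:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    """Hypothetical reading from an RFID tag or wireless sensor."""
    device_id: str    # which sensor or tag produced the event
    stage: str        # e.g. "assembly", "quality_control", "shipped"
    timestamp: float  # Unix epoch seconds
    payload: dict     # free-form extras -- the "unstructured" part

def to_event(reading: SensorReading) -> str:
    """Serialize a reading to JSON for transport to a cloud endpoint."""
    return json.dumps(asdict(reading))

# A product passing a quality-control scan might report:
reading = SensorReading(
    device_id="rfid-00427",
    stage="quality_control",
    timestamp=time.time(),
    payload={"temperature_c": 21.4, "scan_ok": True},
)
event = to_event(reading)
```

Because the `payload` field is free-form, each device type can report whatever it observes without a fixed schema—which is precisely what makes the aggregate stream “unstructured” and the analysis side of the problem hard.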

Tools and Training

Companies face two challenges when it comes to managing extreme data: Tools and training. Technology giants and startup companies alike are able to compete in this emerging market, since it’s not the hardware they sell that matters but rather the kind of analysis they can deliver through an accessible cloud portal. Where prescriptive analytics once ruled as a way to correct inefficient business processes, predictive algorithms have emerged that can intelligently consider data and deliver predictions about the future rather than mere confirmations of the present.

This leads to the second challenge: Training. While these tools appear simple to use, they will never reach maximum efficiency without skilled hands at the wheel. Strides are being made in cognitive computing—the science of teaching computers to think like human beings—but there’s still no replacement for the human capacity to examine the “bigger picture” and put data sets in context. Analysis tools for extreme data have evolved at a steady pace and can now handle massive information volumes with virtually no hardware lag, but trained data scientists are needed to ask the right questions; even the best data means nothing without interpretation.

Finding the Time

If hiring data scientists and paying for analytics tools sounds costly, it can be, especially if companies want to get it right the first time. As a recent Forbes article notes, however, analytics half-measures aren’t effective and can cripple efforts to make best use of extreme data.

Companies, therefore, have several choices. If this is their first foray into extreme data, it’s worth considering strategic consultation for a single analysis project. Using a trusted, experienced IT service provider lets businesses skip the step of finding and vetting analytics providers in a suddenly crowded market. If more robust analysis tools are required, there are two options: Outsource the task entirely, or supplement existing IT staff with professionals from a reputable third party. In many cases, this kind of shared responsibility offers the best mix of flexibility and security; in-house IT administrators retain control of all mission-critical data while outsourced IT handles the day-to-day details. With IT staffers already hard-pressed to keep up with cloud deployments and mobile advances, there’s nothing wrong with getting a little help with extreme data analysis.

The flow of data isn’t slowing—in fact, its pace continues to increase. Companies can’t ignore this data, nor can they blindly analyze portions hoping for actionable results. The right tools, the right training and the right help make all the difference in the transition from big to extreme.

By David Eisner

David is the President & CEO of Dataprise Cloud Services, an IT Services company based in Maryland. Visit Dataprise's site today to learn more about available services.
