David Eisner

New Year, New Cloud? Managing “Extreme Data”

Companies are familiar with the concept of “big data”; every piece of information they generate from day-to-day business processes or when interacting with customers is considered valuable, so long as these pieces are quickly analyzed and converted into actionable results. But according to predictions from the Institute of Electrical and Electronics Engineers (IEEE), big data is just the beginning—2014 marks the start of a race to establish leaders in the “extreme data” market.

Going to Extremes

Most IT professionals recognize the “three Vs” of big data: Volume, Variety and Velocity. To make the most of constant data flows, companies need a high volume of information for analysis, a wide variety of data to examine and a high rate of data transfer. One other V gaining ground is Veracity, which speaks to any inherent bias in collected data along with its relevance to the problem at hand. In other words, the three Vs are a good starting point but won’t point a company in the right direction if the data collected contains a massive bias or intersects only tangentially with the question being asked.

Making this market more complex is the increasing amount of data coming from previously untapped sources. This is an extension of a concept called the Internet of Things, which focuses on bringing objects and products outside the technology spectrum into the online community using the cloud. Wireless sensors and radio frequency identification (RFID) chips are commonly used to track products as they move from assembly through quality control to shipment, or to report “unstructured” data in real time—for example, how consumers use appliances and technology in their homes. The result is a rapid increase in the amount of data available to companies, enough to justify the shift in market wording from merely big to extreme.
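To make the tracking idea concrete, here is a minimal sketch of how RFID checkpoint scans might be rolled up into product status. The event records, checkpoint names, and `track_progress` function are hypothetical, invented purely for illustration—real sensor feeds would arrive through a cloud provider's ingestion API rather than a hard-coded list.

```python
from collections import Counter
from datetime import datetime

# Hypothetical RFID scan events as they might arrive from a cloud feed:
# each record carries a tag ID, the checkpoint that read it, and a timestamp.
events = [
    {"tag": "A1", "checkpoint": "assembly",        "ts": "2014-01-06T09:00:00"},
    {"tag": "A1", "checkpoint": "quality_control", "ts": "2014-01-06T11:30:00"},
    {"tag": "A1", "checkpoint": "shipping",        "ts": "2014-01-07T08:15:00"},
    {"tag": "B2", "checkpoint": "assembly",        "ts": "2014-01-06T09:05:00"},
    {"tag": "B2", "checkpoint": "quality_control", "ts": "2014-01-06T14:00:00"},
]

def track_progress(events):
    """Return the latest checkpoint seen for each tagged product."""
    latest = {}
    # Replay events in time order so the last write per tag wins.
    for e in sorted(events, key=lambda e: datetime.fromisoformat(e["ts"])):
        latest[e["tag"]] = e["checkpoint"]
    return latest

print(track_progress(events))
print(Counter(e["checkpoint"] for e in events))  # scan volume per checkpoint
```

Even this toy version shows the Volume/Velocity pressure: every physical scan becomes a record that must be stored, ordered, and summarized before it is useful.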

Tools and Training

Companies face two challenges when it comes to managing extreme data: Tools and training. Technology giants and startup companies alike are able to compete in this emerging market, since it’s not the hardware they sell that matters but rather the kind of analysis they can deliver through an accessible cloud portal. Where prescriptive analytics once ruled as a way to correct inefficient business processes, predictive algorithms have emerged that can intelligently weigh data and deliver forecasts about the future rather than mere confirmations of the present.
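The prescriptive-versus-predictive distinction can be sketched with a few lines of arithmetic. In this invented example—the order figures and variable names are assumptions, not data from any real business—the prescriptive step looks backward to flag the weakest month, while the predictive step fits an ordinary least-squares trend line and projects it one month ahead.

```python
# Hypothetical monthly order counts for six months.
orders = [120, 135, 150, 148, 170, 185]

# Prescriptive: look backward and flag the month furthest below average.
avg = sum(orders) / len(orders)
worst = min(range(len(orders)), key=lambda i: orders[i] - avg)

# Predictive: fit a least-squares line y = a + b*x and project month n.
n = len(orders)
xs = range(n)
x_mean = sum(xs) / n
y_mean = avg
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, orders)) / \
    sum((x - x_mean) ** 2 for x in xs)
a = y_mean - b * x_mean
forecast = a + b * n  # projected orders for the next month

print(f"Weakest month index: {worst}")
print(f"Forecast for month {n}: {forecast:.1f} orders")
```

Real predictive tools are of course far more sophisticated than a straight line, but the shape of the output is the same: a statement about what is likely to happen next, not just a report on what already did.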

This leads to the second challenge: Training. While these tools appear simple to use, they will never reach peak efficiency without skilled hands at the wheel. Strides are being made in cognitive computing—the science of teaching computers to think like human beings—but there’s still no replacement for the human capacity to examine the “bigger picture” and put data sets in context. Analysis tools for extreme data have evolved at a steady pace and can now handle massive information volumes with virtually no hardware lag, but trained data scientists are needed to ask the right questions; even the best data means nothing without interpretation.

Finding the Time

If hiring data scientists and paying for analytics tools sounds costly, it can be, especially if companies want to get it right the first time. As a recent Forbes article notes, however, analytics half-measures aren’t effective and can cripple efforts to make best use of extreme data.

Companies, therefore, have several choices. If this is their first foray into extreme data, it’s worth considering strategic consultation for a single analysis project. Using a trusted, experienced IT service provider lets businesses skip the step of finding and vetting analytics providers in a suddenly crowded market. If more robust analysis tools are required, there are two options: Outsource the task entirely, or supplement existing IT staff with professionals from a reputable third party. In many cases, this kind of shared responsibility offers the best mix of flexibility and security; in-house IT administrators retain control of all mission-critical data while outsourced IT handles the day-to-day details. With IT staffers already hard pressed to keep up with cloud deployments and mobile advancement, there’s nothing wrong with getting a little help for extreme data analysis.

The flow of data isn’t slowing—in fact, its pace continues to increase. Companies can’t ignore this data, nor can they blindly analyze portions hoping for actionable results. The right tools, the right training and the right help make all the difference in the transition from big to extreme.

By David Eisner

David is the President & CEO of Dataprise Cloud Services, an IT Services company based in Maryland. Visit Dataprise's site today to learn more about available services.
