January 21, 2014

New Year, New Cloud? Managing “Extreme Data”

By David Eisner

Companies are familiar with the concept of “big data”; every piece of information they generate from day-to-day business processes or when interacting with customers is considered valuable, so long as these pieces are quickly analyzed and converted into actionable results. But according to predictions from the Institute of Electrical and Electronics Engineers (IEEE), big data is just the beginning—2014 marks the start of a race to establish leaders in the “extreme data” market.

Going to Extremes

Most IT professionals recognize the “three Vs” of big data: Volume, Variety and Velocity. To make the most of constant data flows, companies need a high volume of information for analysis, a wide variety of data to examine and a high rate of data transfer. One other V gaining ground is Veracity, which speaks to any inherent bias in collected data along with its relevance to the problem at hand. In other words, the three Vs are a good starting point but won’t point a company in the right direction if the data collected contains a massive bias or intersects only tangentially with the question being asked.
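
A concrete check makes Veracity less abstract. The short Python sketch below is only an illustration, built on entirely hypothetical survey data and an arbitrary 10 percent tolerance: it compares the age mix of collected responses against a known population mix and flags segments that look over- or under-represented.

# A minimal veracity check, assuming hypothetical survey data.
# population_share is an assumed census mix; responses is an assumed sample.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
responses = ["18-34"] * 620 + ["35-54"] * 290 + ["55+"] * 90

total = len(responses)
for segment, expected in population_share.items():
    observed = responses.count(segment) / total
    # The 0.10 tolerance is chosen purely for illustration.
    if abs(observed - expected) > 0.10:
        print(f"{segment}: observed {observed:.0%} vs expected {expected:.0%} -- possible bias")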

Making this market more complex is the increasing amount of data coming from previously untapped sources. This is an extension of a concept called the Internet of Things, which uses the cloud to bring objects and products outside the traditional technology spectrum into the online community. Wireless sensors and radio frequency identification (RFID) chips are commonly used to track products as they move from assembly through quality control to final shipment, or to report “unstructured” data in real time, for example, how consumers use appliances and technology in their homes. The result is a rapid increase in the amount of data available to companies, enough to justify the shift in market wording from merely big to extreme.
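
As a rough sketch of what receiving one such event might look like, the Python snippet below parses a JSON payload and stamps it with an arrival time; the fields (tag_id, station, reading) are invented for this example and do not reflect any particular vendor's format.

# A minimal sketch of ingesting one sensor event, assuming a hypothetical
# JSON payload; real RFID middleware and cloud ingestion APIs vary widely.
import json
from datetime import datetime, timezone

raw_event = '{"tag_id": "RFID-00042", "station": "quality_control", "reading": 21.7}'

event = json.loads(raw_event)
event["received_at"] = datetime.now(timezone.utc).isoformat()  # stamp arrival time

# In practice this record would be pushed to a cloud queue or data store;
# here we simply print the enriched event.
print(event)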

Tools and Training

Companies face two challenges when it comes to managing extreme data: Tools and training. Technology giants and startup companies alike are able to compete in this emerging market, since it’s not the hardware they sell that matters but rather the kind of analysis they can deliver through an accessible cloud portal. Where descriptive analytics once ruled as a way to identify inefficient business processes, predictive algorithms have emerged that can intelligently consider data and deliver predictions about the future rather than mere confirmations about the present.
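
To illustrate that distinction rather than any particular vendor's product, the Python sketch below runs a descriptive step (reporting the average of hypothetical monthly order counts) and a predictive step (fitting a simple least-squares trend line and projecting the next month). Real predictive platforms use far richer models; the point is only that prediction extrapolates where description summarizes.

# Hypothetical monthly order counts for months 1-6.
orders = [120, 135, 150, 158, 171, 186]

# Descriptive: confirm what already happened.
print("Average monthly orders:", sum(orders) / len(orders))

# Predictive: fit y = a + b*x by ordinary least squares, project month 7.
n = len(orders)
xs = range(1, n + 1)
x_mean = sum(xs) / n
y_mean = sum(orders) / n
num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, orders))
den = sum((x - x_mean) ** 2 for x in xs)
b = num / den
a = y_mean - b * x_mean
print("Forecast for month 7:", round(a + b * 7))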

This leads to the second challenge: Training. While these tools appear deceptively simple, they will never reach maximum efficiency without skilled hands at the wheel. Strides are being made in cognitive computing—the science of teaching computers to think like human beings—but there’s still no replacement for the human capacity to examine the “bigger picture” and put data sets in context. Analysis tools for extreme data have evolved at a steady pace and can now handle massive information volumes with virtually no hardware lag, but trained data scientists are needed to ask the right questions; even the best data means nothing without interpretation.

Finding the Time

If hiring data scientists and paying for analytics tools sounds costly, it can be, especially if companies want to get it right the first time. As a recent Forbes article notes, however, analytics half-measures aren’t effective and can cripple efforts to make best use of extreme data.

Companies, therefore, have several choices. If this is their first foray into extreme data, it’s worth considering strategic consultation for a single analysis project. Using a trusted, experienced IT service provider lets businesses skip the step of finding and vetting analytics providers in a suddenly crowded market. If more robust analysis tools are required, there are two options: Outsource the task entirely, or supplement existing IT staff with professionals from a reputable third party. In many cases, this kind of shared responsibility offers the best mix of flexibility and security; in-house IT administrators retain control of all mission-critical data while outsourced IT handles the day-to-day details. With IT staffers already hard pressed to keep up with cloud deployments and mobile advancement, there’s nothing wrong with getting a little help for extreme data analysis.

The flow of data isn’t slowing—in fact, its pace continues to increase. Companies can’t ignore this data, nor can they blindly analyze portions hoping for actionable results. The right tools, the right training and the right help make all the difference in the transition from big to extreme.
