New Year, New Cloud? Managing “Extreme Data”

Companies are familiar with the concept of “big data”; every piece of information they generate through day-to-day business processes or interactions with customers is considered valuable, so long as it is quickly analyzed and converted into actionable results. But according to predictions from the Institute of Electrical and Electronics Engineers (IEEE), big data is just the beginning—2014 marks the start of a race to establish leaders in the “extreme data” market.

Going to Extremes

Most IT professionals recognize the “three Vs” of big data: Volume, Variety and Velocity. To make the most of constant data flows, companies need a high volume of information for analysis, a wide variety of data to examine and a high rate of data transfer. One other V gaining ground is Veracity, which speaks to any inherent bias in collected data along with its relevance to the problem at hand. In other words, the three Vs are a good starting point but won’t point a company in the right direction if the data collected contains a massive bias or intersects only tangentially with the question being asked.
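
To make the bias point concrete, here is a minimal Python sketch with invented numbers, purely for illustration: a survey promoted only inside a mobile app over-samples app users and wildly inflates the apparent usage rate.

```python
import random

random.seed(7)  # reproducible illustration

# Hypothetical population: 10,000 customers, 30% of whom use the mobile app.
population = [random.random() < 0.30 for _ in range(10_000)]

# Biased channel: the survey is promoted inside the app, so every app user
# is reached, but only ~15% of non-users ever see it.
biased_sample = [uses_app for uses_app in population
                 if uses_app or random.random() < 0.15]

print(f"True app-usage rate:   {sum(population) / len(population):.0%}")
print(f"Biased sample's rate:  {sum(biased_sample) / len(biased_sample):.0%}")
```

The second figure lands near 74 percent against a true rate of 30 percent: high volume and velocity, but without veracity the answer is confidently wrong.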

Making this market more complex is the increasing amount of data coming from previously untapped sources. This is an extension of a concept called the Internet of Things, which focuses on bringing objects and products outside the technology spectrum into the online community using the cloud. Wireless sensors and radio frequency identification (RFID) chips are commonly used to track products as they move from assembly through production and quality control to final shipment, or to report “unstructured” data in real time, such as how consumers use appliances and technology in their homes. The result is a rapid increase in the amount of data available to companies, enough to justify the shift in market wording from merely big to extreme.
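
As a rough sketch of what such a sensor stream might look like, the Python snippet below emits telemetry events for a hypothetical RFID-tagged product. The field names, stages, and values are illustrative assumptions rather than any real vendor's schema, and a production system would publish these events to a cloud ingestion service rather than print them.

```python
import json
import random
import time
from datetime import datetime, timezone

def read_sensor(tag_id: str) -> dict:
    """Build one telemetry event for a hypothetical RFID-tagged product."""
    return {
        "tag_id": tag_id,  # RFID tag identifier (made up for this sketch)
        "stage": random.choice(["assembly", "production", "quality_control"]),
        "temperature_c": round(random.uniform(18.0, 25.0), 1),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Emit a small burst of events; a real deployment would publish these to a
# cloud ingestion service (a message queue, say) rather than printing them.
for i in range(5):
    print(json.dumps(read_sensor(f"TAG-{1000 + i}")))
    time.sleep(0.1)
```

Multiply a burst like this by millions of tagged products reporting around the clock and the jump from big to extreme stops looking like marketing language.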

Tools and Training

Companies face two challenges when it comes to managing extreme data: Tools and training. Technology giants and startup companies alike can compete in this emerging market, since it’s not the hardware they sell that matters but the kind of analysis they can deliver through an accessible cloud portal. Where prescriptive analytics once ruled as a way to correct inefficient business processes, predictive algorithms have emerged that can intelligently weigh data and deliver predictions about the future rather than mere confirmations of the present.
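
To illustrate the shift, the short Python sketch below fits a trend model to invented weekly order volumes and projects the next four weeks. It is a toy example of predictive analytics under assumed data, not any vendor's product.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented history: weekly order volume over twelve weeks.
weeks = np.arange(12).reshape(-1, 1)
orders = np.array([120, 131, 128, 140, 152, 149, 161, 170, 168, 182, 190, 197])

# Fit a simple trend model to the past...
model = LinearRegression().fit(weeks, orders)

# ...then look forward: project the next four weeks instead of merely
# confirming what already happened.
future_weeks = np.arange(12, 16).reshape(-1, 1)
print(model.predict(future_weeks).round(1))
```

Even a toy model like this captures the change in posture: from describing what the business did to estimating what it will do next.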

This leads to the second challenge: Training. While these tools may look simple to use, they will never reach maximum efficiency without skilled hands at the wheel. Strides are being made in cognitive computing—the science of teaching computers to think like human beings—but there’s still no replacement for the human capacity to examine the “bigger picture” and put data sets in context. Analysis tools for extreme data have evolved at a steady pace and can now handle massive information volumes with virtually no hardware lag, but trained data scientists are still needed to ask the right questions; even the best data means nothing without interpretation.

Finding the Time

If hiring data scientists and paying for analytics tools sounds costly, it can be, especially if companies want to get it right the first time. As a recent Forbes article notes, however, analytics half-measures aren’t effective and can cripple efforts to make the best use of extreme data.

Companies, therefore, have several choices. If this is their first foray into extreme data, it’s worth considering strategic consultation for a single analysis project. Using a trusted, experienced IT service provider lets businesses skip the step of finding and vetting analytics providers in a suddenly crowded market. If more robust analysis tools are required, there are two options: Outsource the task entirely, or supplement existing IT staff with professionals from a reputable third party. In many cases, this kind of shared responsibility offers the best mix of flexibility and security; in-house IT administrators retain control of all mission-critical data while outsourced IT handles the day-to-day details. With IT staffers already hard-pressed to keep up with cloud deployments and mobile advancements, there’s nothing wrong with getting a little help with extreme data analysis.

The flow of data isn’t slowing—in fact, its pace continues to increase. Companies can’t ignore this data, nor can they blindly analyze portions hoping for actionable results. The right tools, the right training and the right help make all the difference in the transition from big to extreme.

By David Eisner

CLOUD MONITORING

The CloudTweaks technology lists include updated resources covering leading services from around the globe, such as IT monitoring services, bootcamps, VPNs, CDNs, and reseller programs.

  • Opsview

    Opsview is a global, privately held IT systems management software company whose core product, Opsview Enterprise, was released in 2009. The company has offices in the UK and USA and boasts some 35,000 corporate clients, including Cisco, MIT, Allianz, NewVoiceMedia, Active Network, and the University of Surrey.

  • Nagios

    Nagios is one of the leading vendors of IT monitoring and management tools, offering cloud monitoring capabilities for AWS services such as EC2 (Elastic Compute Cloud) and S3 (Simple Storage Service). Its products include infrastructure, server, and network monitoring solutions such as Nagios XI, Nagios Log Server, and Nagios Network Analyzer.

  • Datadog

    Datadog is a startup based in New York that secured $31 million in Series C funding. The company is quickly making a name for itself and has an impressive client list that includes Adobe, Salesforce, HP, Facebook, and many others.

  • Sematext

    Sematext bridges the gap between performance monitoring, real user monitoring, transaction tracing, and logs. Its all-in-one monitoring platform gives businesses full-stack visibility by exposing logs, metrics, and traces through a single cloud or on-premises solution, helping DevOps teams move faster.