IoT Data Centers – “We’ve Always Done IoT, We’re Just Terrible At Marketing It”

A phrase often repeated by data center professionals is “We’ve always done IoT, we’re just terrible at marketing it.”

They’re right, of course (at least regarding IoT!), and IoT proper is now firmly finding its role in three key areas regarding data centers: Within, Across, and Outside In. Let’s consider each of these in turn.

Within

A data center is a particularly hostile environment: tens of thousands of pieces of equipment, from numerous manufacturers, that don’t work well together, if at all. Inconsistent equipment behavior (the same model UPS from the same provider, manufactured at different times, can behave differently!), differing protocols, differing data semantics (is that value watts or kilowatts?) and differing security mechanisms result in an environment that quickly spirals in complexity, defying holistic management.

Providers (and even many end users) have responded by creating an abstraction layer atop the hardware to normalize the differences outlined above, allowing equipment to share information and state with higher level monitoring and control systems in a somewhat standard way.
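A minimal sketch of what such a normalization layer might look like is below. The vendor names, field names and readings are hypothetical; the point is that each adapter translates a vendor-specific report into one common schema, so higher-level monitoring and control systems see a single consistent format.

```python
# Hypothetical sketch of a normalization (adapter) layer: each adapter
# translates a vendor-specific reading into a common schema so that
# higher-level monitoring sees one consistent representation.
from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    metric: str
    value: float  # always normalized to base SI units (e.g., watts)
    unit: str

def adapt_vendor_a(raw: dict) -> Reading:
    # Vendor A (hypothetical) reports power in kilowatts under "pwr_kw".
    return Reading(raw["id"], "power", raw["pwr_kw"] * 1000.0, "W")

def adapt_vendor_b(raw: dict) -> Reading:
    # Vendor B (hypothetical) reports power in watts under "watts".
    return Reading(raw["serial"], "power", float(raw["watts"]), "W")

# Both devices now yield the same normalized shape.
r1 = adapt_vendor_a({"id": "ups-01", "pwr_kw": 4.5})
r2 = adapt_vendor_b({"serial": "pdu-07", "watts": 3650})
print(r1.value, r2.value)  # 4500.0 3650.0
```

In practice each adapter must be written and tested per model (and sometimes per firmware revision), which is exactly the scaling problem described below.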

Sound a lot like IoT? Of course, albeit with a large, recognized downside: this approach is difficult to scale, as each piece of equipment usually requires significant investment to define and test the appropriate adapter. These adapters are brittle: they are often created empirically and rely on data streams and behavior specific to a model, type or even serial number. Further, it is not uncommon for new firmware versions to change a unit’s behavior.

The great promise of IoT within the data center, then, is to address this complexity and these silos by providing “drop and play” functionality for equipment and critical infrastructure, with:

  • Standard interfaces and adapters, ideally provided and maintained by the manufacturer,
  • Well defined and expected behavior, and
  • Extensibility and flexibility for management and control.

The underlying benefit, of course, is increased standardization and interoperability across the many different systems and providers in the data center, resulting in streamlined deployment, greater choice and new functionality.
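One way to picture such a standard interface is as a contract every manufacturer-supplied adapter would implement. The sketch below is illustrative only (the interface, device class and readings are hypothetical, not an actual standard): management software depends only on the contract, never on the vendor.

```python
# Hypothetical sketch of a "drop and play" device contract: a standard
# interface every manufacturer adapter would implement, so management
# software can monitor any compliant unit without custom integration.
from abc import ABC, abstractmethod

class ManagedDevice(ABC):
    @abstractmethod
    def describe(self) -> dict:
        """Static identity: model, firmware, capabilities."""

    @abstractmethod
    def telemetry(self) -> dict:
        """Current readings, in well-defined units."""

class ExampleUPS(ManagedDevice):
    # A hypothetical manufacturer-maintained adapter for one UPS model.
    def describe(self) -> dict:
        return {"model": "example-ups", "firmware": "1.2",
                "capabilities": ["power", "battery"]}

    def telemetry(self) -> dict:
        return {"power_w": 4200.0, "battery_pct": 97.0}

def poll(devices: list[ManagedDevice]) -> list[dict]:
    # Management code is written once, against the interface.
    return [d.telemetry() for d in devices]

print(poll([ExampleUPS()]))
```

The design choice here is the key benefit named above: once behavior is well defined at the interface, swapping providers or adding equipment no longer requires new integration work.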

Across

Traditional data centers focused on the systems within the confines of the building proper. With the emergence of edge data centers and various cloud and colocation options, a much larger and richer topology needs to be considered. Edge data centers need not be standalone; they must be incorporated into the larger computing fabric. Colocation facilities must not be black boxes; they hold valuable equipment and computing power which need to be optimally used.

Further, federation and aggregation of data center monitoring, control and reporting must span all facilities, from core to colo to edge. Monitoring and control of remote facilities, where the luxury of “walking the aisles” doesn’t exist, is especially important given their outsized growth.

Achieving any of these goals requires effective communication across the many, differing nodes and, critically, must encompass a very rich set of equipment within those nodes, down to the component level. Why shouldn’t we know how much storage or networking capacity we have across all of our computing fabric? How much thermal capacity exists, and what’s its status and efficiency at any given time?
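Those fabric-wide capacity questions amount to a federation rollup. The sketch below (site names and figures are invented for illustration) shows the idea: each facility publishes a small inventory record, and a central service aggregates them to answer questions across core, colo and edge.

```python
# Hypothetical sketch of federated capacity reporting: each facility
# (core, colo, edge) publishes a small inventory record, and a central
# service rolls the records up into fabric-wide totals.
from collections import Counter

facilities = [
    {"site": "core-1",  "storage_tb": 900, "network_gbps": 400},
    {"site": "colo-3",  "storage_tb": 250, "network_gbps": 100},
    {"site": "edge-17", "storage_tb": 20,  "network_gbps": 10},
]

def fabric_totals(records: list[dict]) -> Counter:
    totals = Counter()
    for rec in records:
        for key, value in rec.items():
            if key != "site":  # skip the identifier, sum the capacities
                totals[key] += value
    return totals

print(dict(fabric_totals(facilities)))
# {'storage_tb': 1170, 'network_gbps': 510}
```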

IoT provides an ideal mechanism for linking together the many different nodes and devices within a rich data center topology. It is lightweight and open, can easily accommodate any type of device or data, has a wealth of tools and platforms, and the industry has started to provide purpose-built equipment such as IoT gateways for aggregating and transmitting IoT packets quickly and securely.
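The gateway's aggregation role can be sketched in a few lines. This is an illustrative simplification (device IDs and metrics are hypothetical, and the transmission step is omitted): readings from many lightweight devices are buffered, then batched into one compact payload for secure upstream delivery.

```python
# Hypothetical sketch of an IoT gateway's aggregation step: buffer
# lightweight readings from many devices, then batch them into one
# compact JSON payload for transmission upstream.
import json

buffer = []

def ingest(device_id: str, metric: str, value: float) -> None:
    # Short keys keep each reading small on constrained links.
    buffer.append({"d": device_id, "m": metric, "v": value})

def flush() -> str:
    """Serialize and clear the buffer; the result would be sent upstream."""
    payload = json.dumps({"batch": buffer}, separators=(",", ":"))
    buffer.clear()
    return payload

ingest("crah-02", "supply_temp_c", 18.5)
ingest("pdu-07", "power_w", 3650.0)
print(flush())
```

Batching like this is why purpose-built gateways matter: one authenticated, compressed upstream connection can carry traffic from hundreds of devices that could never each maintain their own.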

Today’s rich data center topologies must be cohesively linked and managed, providing an ideal use case for IoT.

Outside In

The most intriguing use of IoT regarding data centers involves “Outside In” data. Consider autonomous vehicles: They both generate and consume a tremendous amount of temporal and spatial data; sensors need to be placed at well-defined and known locations; and numerous other systems, such as emergency response, need to be incorporated. The resulting “intelligent grid” ultimately will be managed by data centers, likely via geographically dispersed edge facilities cooperating with core data centers. The ideal communication mechanism for this grid, anchored by computing facilities? IoT.

There are numerous other examples of very high scale, wide-area, lightweight devices driving traffic into data centers for optimal outcomes. Think of Smart Homes: When the numerous smart devices in our homes need to communicate with our utility to allow them to manage electrical supply better (hopefully with a discount for opting in!), what better mechanism than IoT?

Finally, consider IHS’s estimate that more than 20 billion IoT devices are already deployed worldwide.

Those 20+ billion devices today are, by definition, communicating. As these devices evolve from novelty to utility, from light switch to industrial control, they will flow an ever-growing amount of increasingly critical information up to data centers for processing. This will further drive growth in data centers and their rich topology, reinforcing the role of IoT and data centers: Within, Across and Outside In. This time we’ll get the marketing right.

By Enzo Greco

Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he has responsibility for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the Data Center market; his current focus is on Colocation Providers, Hybrid Cloud implementations and applying Analytics overall to Critical Infrastructure, Data Center and Application Performance Management markets.

Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire Data Center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity.

Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise grade transaction processing system; he received a fellowship for this work.

After AT&T Bell Laboratories, Enzo transitioned to Wall Street, where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period, he founded several companies, the largest of which, Planetworks, was acquired by IBM in April 1999.

Enzo then worked in IBM’s headquarters for 14 years, where he was heavily involved in IBM’s entering and growing several key markets, including Portal, Business Process Management and Smarter Commerce.

Enzo has a BS from Manhattan College in Riverdale, NY and an MS from Stanford University in Stanford, CA.

