IoT Data Centers – “We’ve Always Done IoT, We’re Just Terrible At Marketing It”

A phrase often repeated by data center professionals is “We’ve always done IoT, we’re just terrible at marketing it.”

They’re right, of course (at least about the IoT part!), and IoT proper is now firmly finding its role in three key areas of the data center: Within, Across, and Outside In. Let’s consider each in turn.

Within

A data center is a particularly hostile environment: tens of thousands of pieces of equipment, from numerous manufacturers, that don’t work well together, if at all. Inconsistent equipment behavior (the same UPS model from the same provider, manufactured at different times, can behave differently!), differing protocols, ambiguous data semantics (is that value in watts or kilowatts?) and mismatched security mechanisms produce an environment that quickly spirals in complexity, defying holistic management.

Providers (and even many end users) have responded by creating an abstraction layer atop the hardware to normalize these differences, allowing equipment to share information and state with higher-level monitoring and control systems in a reasonably standard way.

Sound a lot like IoT? Of course, albeit with a large, recognized downside: this approach is difficult to scale, because each piece of equipment usually requires significant investment to define and test an appropriate adapter. These adapters are brittle: they are often created empirically and rely on data streams and behavior specific to a model, type or even serial number. Further, it is not uncommon for a new firmware version to change the behavior of the unit.
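
To make the adapter problem concrete, here is a minimal sketch in Python; the vendor APIs and field names are entirely hypothetical. Each adapter absorbs one vendor’s quirks so that higher-level monitoring programs against a single, unit-consistent interface:

```python
from abc import ABC, abstractmethod


class DeviceAdapter(ABC):
    """Common interface the monitoring layer programs against."""

    @abstractmethod
    def read_power_watts(self) -> float:
        """Instantaneous output power, always normalized to watts."""


class VendorAUpsAdapter(DeviceAdapter):
    """Hypothetical vendor A reports power in kilowatts under 'power_kw'."""

    def __init__(self, raw_telemetry: dict):
        self.raw = raw_telemetry

    def read_power_watts(self) -> float:
        return self.raw["power_kw"] * 1000.0  # normalize kW -> W


class VendorBUpsAdapter(DeviceAdapter):
    """Hypothetical vendor B already reports watts, under 'w_out'."""

    def __init__(self, raw_telemetry: dict):
        self.raw = raw_telemetry

    def read_power_watts(self) -> float:
        return float(self.raw["w_out"])


# The management system no longer cares which vendor made which UPS.
fleet = [
    VendorAUpsAdapter({"power_kw": 4.2}),
    VendorBUpsAdapter({"w_out": 3800}),
]
print(sum(ups.read_power_watts() for ups in fleet))  # 8000.0
```

The brittleness described above lives inside each adapter: a firmware update that renames power_kw silently breaks VendorAUpsAdapter, which is exactly why manufacturer-provided and maintained interfaces matter.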

The great promise of IoT within the data center, then, is to address this complexity and these silos by providing “drop and play” functionality for equipment and critical infrastructure (a minimal sketch follows the list), with:

  • Standard interfaces and adapters, ideally provided and maintained by the manufacturer,
  • Well defined and expected behavior, and
  • Extensibility and flexibility for management and control.
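
What might “drop and play” look like in practice? A rough illustration, assuming an invented, manufacturer-supplied descriptor format that declares each device’s telemetry points and units so the management layer can onboard the device without bespoke adapter code:

```python
import json

# Invented descriptor format: the manufacturer ships this with the device,
# declaring every telemetry point, its unit, and where to read it.
DESCRIPTOR = json.loads("""
{
  "model": "ups-x200",
  "firmware": "2.4.1",
  "telemetry": [
    {"name": "output_power", "unit": "W", "path": "/metrics/power"},
    {"name": "battery_charge", "unit": "%", "path": "/metrics/battery"}
  ]
}
""")


def onboard(descriptor: dict, registry: dict) -> dict:
    """Register every declared point; no per-model adapter code required."""
    for point in descriptor["telemetry"]:
        registry[(descriptor["model"], point["name"])] = point
    return registry


registry = onboard(DESCRIPTOR, {})
print(sorted(registry))
# [('ups-x200', 'battery_charge'), ('ups-x200', 'output_power')]
```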

The underlying benefit, of course, is increased standardization and interoperability across the many different systems and providers in the data center, resulting in streamlined deployment, greater choice and new functionality.

Across

Traditional data centers focused on the systems within the confines of the building proper. With the emergence of edge data centers and various cloud and colocation options, a much larger and richer topology must be considered. Edge data centers cannot be standalone; they must be incorporated into the larger computing fabric. Colocation facilities must not be black boxes; they hold valuable equipment and computing power that need to be used optimally. Further, federation and aggregation of data center monitoring, control and reporting must span all facilities, from core to colo to edge. Monitoring and control of remote facilities, where the luxury of “walking the aisles” doesn’t exist, is especially important given their outsized growth.

Achieving any of these goals requires effective communication across many differing nodes and, critically, must encompass a very rich set of equipment within those nodes, down to the component level. Why shouldn’t we know how much storage or networking capacity we have across our entire computing fabric? How much thermal capacity exists, and what are its status and efficiency at any given time?

IoT provides an ideal mechanism for linking together the many different nodes and devices within a rich data center topology. It is lightweight and open, can easily accommodate any type of device or data, and has a wealth of tools and platforms; the industry has also started to provide purpose-built equipment, such as IoT gateways, for aggregating and transmitting IoT traffic quickly and securely.
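
As a hedged sketch of that linkage, the snippet below publishes edge-facility telemetry over MQTT, a common lightweight IoT protocol, using the paho-mqtt library; the broker address, topic hierarchy and telemetry values are illustrative assumptions, not a prescribed design:

```python
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x API shown)

BROKER = "gateway.example.com"  # hypothetical IoT gateway at the core site
# Hypothetical topic hierarchy: site / subsystem / device / stream.
TOPIC = "dc/edge-07/cooling/crac-2/telemetry"

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()

# An edge facility publishes component-level telemetry upstream; a core
# aggregator subscribing to "dc/+/#" sees the entire fabric, colo included.
payload = {"ts": time.time(), "supply_air_c": 18.4, "fan_speed_pct": 62}
client.publish(TOPIC, json.dumps(payload), qos=1)

client.loop_stop()
client.disconnect()
```

Because the topic path encodes the topology (site, subsystem, device), fabric-wide questions like those above reduce to subscribing with the right wildcard and aggregating the results.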

Today’s rich data center topologies must be cohesively linked and managed, providing an ideal use case for IoT.

Outside In

The most intriguing use of IoT regarding data centers involves “Outside In” data. Consider autonomous vehicles: They both generate and consume a tremendous amount of temporal and spatial data; sensors need to be placed at well-defined and known locations; and numerous other systems, such as emergency response, need to be incorporated. The resulting “intelligent grid” ultimately will be managed by data centers, likely via geographically dispersed edge facilities cooperating with core data centers. The ideal communication mechanism for this grid, anchored by computing facilities? IoT.

There are numerous other examples of very large fleets of wide-area, lightweight devices driving traffic into data centers for optimal outcomes. Think of smart homes: when the many smart devices in our homes need to communicate with our utility so it can better manage electrical supply (hopefully with a discount for opting in!), what better mechanism than IoT? A sketch of such an exchange follows.
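
One hedged sketch of that opt-in exchange, again over MQTT; the utility’s broker, topic and event format are invented for illustration:

```python
import json

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x API shown)

# Invented topic on which the utility broadcasts demand-response events.
EVENT_TOPIC = "utility/region-12/demand-response"


def on_message(client, userdata, msg):
    event = json.loads(msg.payload)
    # Opted-in thermostats shed load for the event window, e.g. by
    # raising their setpoint a degree or two.
    if event.get("action") == "curtail":
        print(f"Curtailing until {event.get('until', 'unknown')} "
              f"(delta {event.get('delta_c', 1)} C)")


client = mqtt.Client()
client.on_message = on_message
client.connect("broker.utility.example", 1883)
client.subscribe(EVENT_TOPIC, qos=1)
client.loop_forever()
```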

Finally, consider the data from IHS: the installed base of connected IoT devices already exceeds 20 billion worldwide.

Those 20+ billion devices are, by definition, communicating. As they evolve from novelty to utility, from light switch to industrial control, they will send an ever-growing stream of increasingly critical information to data centers for processing. This will further drive growth in data centers and their rich topology, reinforcing the role of IoT in data centers: Within, Across and Outside In. This time we’ll get the marketing right.

By Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he is responsible for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the data center market; his current focus is on colocation providers, hybrid cloud implementations and applying analytics across the Critical Infrastructure, Data Center and Application Performance Management markets.

Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire Data Center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity.

Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise-grade transaction processing system; he received a fellowship for this work.

After AT&T Bell Laboratories, Enzo transitioned to Wall Street, where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period he founded several companies, the largest of which, Planetworks, was acquired by IBM in April 1999.

Enzo then worked at IBM headquarters for 14 years, where he was heavily involved in IBM’s entry into and growth of several key markets, including Portal, Business Process Management and Smarter Commerce.

Enzo has a BS from Manhattan College in Riverdale, NY, and an MS from Stanford University in Stanford, CA.

