“Management by AI”: Analytics in the Data Center

Behind any cloud, hosted or enterprise computing environment are data centers with tens of thousands of servers, racks upon racks of networking equipment, and supporting critical infrastructure, from power distribution to thermal management.

The scale, complexity and optimization demands of modern data centers necessitate “Management by AI”: these facilities increasingly cannot be planned and managed with traditional rules and heuristics. AI brings many direct, and a few unexpected, benefits. The massive volume and variety of available data, from environmental sensors to critical infrastructure to IT systems and applications, when synthesized and analyzed by an AI system, can drive ever-increasing availability and optimization, helping operators meet SLAs and minimize operating expenses.

Numerous factors are contributing to the need for AI in data centers:

  • Efficiency and environmental impact: According to a U.S. Department of Energy report, a data center uses up to 50 times more energy per square foot than a typical commercial building, and as an industry, data centers consume more than 2% of all electricity in the U.S. The industry has faced intense scrutiny over its energy footprint; coupled with the cost of that consumption, operators are addressing efficiency in ever more creative and complex ways.
  • Data center consolidation: Data centers benefit from economies of scale, and whether corporate data centers are consolidated or moved to colocation facilities, the result is ever larger facilities, with density and power usage to match.
  • Growth of colocation providers: Colocation providers, such as Equinix and Digital Realty, for whom availability, efficiency and cost reduction are paramount, are growing five times faster than the overall market, according to a recent 451 Group report. With the scale of their facilities and their efficiency-driven business models, these providers stand to benefit disproportionately from AI and are thus driving its adoption.
  • Edge computing: The rise of Edge data centers (smaller, often geographically dispersed facilities) allows computing and data to be placed optimally. Rather than standing alone, these Edge nodes combine with central data centers or cloud computing to form a larger, cooperative computing fabric. This rich topology provides numerous inputs and controls for optimization and availability, which again are best managed by AI.

There are several areas where AI is being researched and applied in data centers today:

  • Optimizing availability by accurately predicting future application behavior down to the rack and server; workloads are preemptively moved within or across data centers based on predicted power, thermal or IT equipment behavior.
  • Optimizing energy usage by managing the numerous types of cooling, across room, row and rack, with great precision. It is not uncommon for different cooling systems to conflict with one another; with its continual feedback and optimization algorithms, AI provides an ideal mechanism for managing this complexity. Some of the most intriguing examples use weather-forecasting algorithms to predict and address hot spots in the data center.
  • Multivariate preventive maintenance, analyzing equipment down to the component level to predict failures.
  • Optimizing IT equipment placement by forecasting future states of the data center rather than considering only the current configuration.
  • Intelligently managing alarms and alerts by filtering and prioritizing significant events. A common problem in data centers is dealing with chained alerts, which make it difficult to identify the root cause. AI, when coupled with rate-of-change, deviation or similar algorithms, provides an ideal mechanism for surfacing the critical alerts.
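
To make the last point concrete, here is a minimal, illustrative sketch of rate-of-change alert prioritization. None of this code comes from a specific product; the function and sensor names are hypothetical. The idea is that in a storm of chained alerts, the metric changing fastest is often closest to the root cause, while downstream sensors drift up more slowly:

```python
def rate_of_change_score(samples, window=5):
    """Score a metric stream by its recent rate of change.

    Returns the absolute change per sample over the trailing window,
    so a sensor that is moving fast scores higher than one merely
    sitting at a high but stable level.
    """
    recent = list(samples)[-window:]
    if len(recent) < 2:
        return 0.0
    return abs(recent[-1] - recent[0]) / (len(recent) - 1)

def prioritize_alerts(alert_streams, window=5):
    """Rank alerting sensors so the fastest-moving one surfaces first.

    alert_streams maps a sensor name to its recent readings.
    """
    scored = {
        name: rate_of_change_score(readings, window)
        for name, readings in alert_streams.items()
    }
    return sorted(scored, key=scored.get, reverse=True)

# A cooling-unit supply temperature climbing rapidly outranks a rack
# inlet temperature that is high but nearly flat.
streams = {
    "crac_supply_temp": [21.0, 21.0, 24.0, 29.0, 35.0],
    "rack_inlet_temp": [24.0, 25.0, 25.0, 26.0, 26.0],
}
print(prioritize_alerts(streams))  # crac_supply_temp ranks first
```

A production system would combine several such signals (deviation from baseline, topology, alert age); this sketch shows only the rate-of-change component the text mentions.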

Although AI has numerous benefits and is a clear trend in data centers, two points are critical for a successful implementation:

  • AI thrives on rich and large data streams; the right systems must be in place to collect and aggregate this data across the key elements in the data center, from Critical Infrastructure to IT Systems to Applications.
  • Expectations need to be set for the outcomes of AI, especially regarding autonomous control. One of the largest benefits of AI is real-time analysis of rich, high-volume data streams; delaying action can negate many of the benefits an AI system provides. This is not a matter of relinquishing control but of putting the appropriate management systems in place to realize the full benefit of AI while still setting boundaries and limits.
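
The "boundaries and limits" idea can be sketched in a few lines: the AI proposes setpoints in real time, but every proposal passes through an operator-defined envelope before it is applied. This is an illustrative sketch, not any vendor's actual control loop; the class and function names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ControlBounds:
    """Operator-defined envelope for an autonomous setpoint."""
    min_value: float   # absolute floor (e.g. coldest allowed supply temp)
    max_value: float   # absolute ceiling
    max_step: float    # largest change permitted per control cycle

def apply_with_bounds(current, proposed, bounds):
    """Accept an AI-proposed setpoint only within the envelope.

    The change is clipped to the per-cycle step limit and then to the
    absolute min/max, so the AI can act in real time while the operator
    retains hard limits.
    """
    step = max(-bounds.max_step, min(bounds.max_step, proposed - current))
    candidate = current + step
    return max(bounds.min_value, min(bounds.max_value, candidate))

# Hypothetical cooling setpoint in degrees C: the AI asks for 30.0,
# but the envelope limits each cycle's change to 1.0 degree.
bounds = ControlBounds(min_value=16.0, max_value=27.0, max_step=1.0)
print(apply_with_bounds(22.0, 30.0, bounds))  # moves only to 23.0
```

The step limit keeps autonomous action fast but incremental, which is one simple way to reconcile real-time response with human-set boundaries.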

Data centers present an ideal use case for AI: complex, energy-intensive and critical, with a very large set of inputs and control points that can only be managed properly through an automated system. With ever-evolving innovations in the data center, from Application Performance Management linked with physical infrastructure to closely coupled multi-data center topologies, the need for and benefit of AI will only increase in the coming years.

By Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he has responsibility for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the Data Center market; his current focus is on Colocation Providers, Hybrid Cloud implementations and applying Analytics overall to Critical Infrastructure, Data Center and Application Performance Management markets.

Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire Data Center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity.

Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise grade transaction processing system; he received a fellowship for this work.

After AT&T Bell Laboratories, Enzo moved to Wall Street, where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period, he founded several companies, the largest of which, Planetworks, was acquired by IBM in April 1999.

Enzo then worked in IBM’s headquarters for 14 years, where he was heavily involved in IBM’s entering and growing several key markets, including Portal, Business Process Management and Smarter Commerce.

Enzo has a BS from Manhattan College in Riverdale, NY and an MS from Stanford University in Stanford, CA.
