
“Management by AI”: Analytics in the Data Center


Behind any cloud, hosted environment or enterprise computing environment are data centers with tens of thousands of servers, racks upon racks of networking equipment, and supporting critical infrastructure, from power distribution to thermal management.


The scale, complexity and required optimization of these modern data centers necessitate “Management by AI”: they increasingly cannot be planned and managed with traditional rules and heuristics. AI leads to many direct, and a few unexpected, benefits. The massive volume and variety of available data, from environmental sensors to critical infrastructure to IT systems and applications, when synthesized and analyzed by an AI system, yields better outcomes for ever-increasing availability and optimization, helping operators meet SLAs and minimize operating expenses.

Numerous factors are contributing to the need for AI in data centers:

  • Efficiency and environmental impact: According to a U.S. Department of Energy report, a data center uses up to 50 times more energy per square foot than a typical commercial building, and as an industry, data centers consume more than 2% of all electricity in the U.S. The industry has faced undeniable scrutiny over its energy footprint; coupled with the costs of consumption, operators are addressing efficiency in ever more creative and complex ways.
  • Data center consolidation: Data centers absolutely benefit from economies of scale, and whether corporate data centers are consolidated or moved to colocation facilities, the result is ever larger facilities, with density and power usage to match.
  • Growth of colocation providers: Colocation providers, such as Equinix and Digital Realty, for whom availability, efficiency and reducing costs are paramount, are growing five times faster than the overall market, according to a recent 451 Group report. These providers, with the necessary scale of their facilities and their efficiency-driven business models, stand to disproportionately benefit from, and are thus driving, AI.
  • Edge computing: The rise of Edge data centers (smaller data centers, often geographically dispersed) allows computing and data to be optimally placed. Rather than being stand-alone entities, these Edge nodes combine with central data centers or cloud computing to form a larger, cooperative computing fabric. This rich topology provides numerous inputs and controls for optimization and availability, which again are best managed by AI.

There are several areas where AI is being researched and applied in data centers today:

  • Optimizing availability by accurately predicting future application behavior down to the rack and server; workloads are pre-emptively moved within or across data centers based on future power, thermal or IT equipment behavior.
  • Optimizing energy usage by managing the numerous types of cooling, across room, row and rack, with great precision. It is not uncommon for different cooling systems to conflict with each other; with its continual feedback and optimization algorithms, AI provides an ideal mechanism for managing this complexity. Some of the best and most intriguing examples use weather algorithms to predict and address hot spots in the data center.
  • Multi-variate preventative maintenance, delving into the component level within equipment to predict failure.
  • Optimizing IT equipment placement by forecasting future states of the data center rather than simply the current configuration.
  • Intelligently managing alarms and alerts by filtering and prioritizing significant events. A common problem in data centers is dealing with chained alerts, which make it difficult to address the root cause. AI, when coupled with rate-of-change, deviation or similar algorithms, provides an ideal mechanism to identify critical alerts.
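As one concrete illustration of the alarm-filtering point above, a rate-of-change detector can suppress gradual drift and surface only abrupt events. The sketch below is a minimal Python example, not any vendor's implementation; the window size and threshold are arbitrary illustrative values:

```python
from collections import deque

def rate_of_change_alerts(readings, window=3, threshold=5.0):
    """Flag readings whose change over a sliding window exceeds a
    threshold, rather than alerting on every excursion.
    Returns a list of (index, delta) pairs for significant events."""
    alerts = []
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == recent.maxlen:
            delta = value - recent[0]  # change across the window
            if abs(delta) > threshold:
                alerts.append((i, delta))
        recent.append(value)
    return alerts

# A slow thermal drift produces no alerts; a sudden spike produces one.
steady = [20.0 + 0.1 * i for i in range(10)]
spike = steady + [40.0]
print(rate_of_change_alerts(steady))  # [] (drift stays below threshold)
print(rate_of_change_alerts(spike))   # one alert, at the spike's index
```

In a real deployment this per-sensor filter would feed a correlation layer that groups related alerts into a single incident, which is where the AI techniques described above come in.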

Although AI offers numerous benefits and is an unmistakable trend in data centers, two points are critical for a successful implementation:

  • AI thrives on rich and large data streams; the right systems must be in place to collect and aggregate this data across the key elements in the data center, from Critical Infrastructure to IT Systems to Applications.
  • Expectations need to be set for the outcomes of AI, especially regarding autonomous control. One of the largest benefits of AI is real-time analysis of rich, high-volume data streams; delaying action can negate many of the benefits an AI system provides. This is not a matter of relinquishing control but of putting the appropriate management systems in place to achieve the full benefit of AI while still setting boundaries and limits.
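The data-collection point above can be sketched as merging time-ordered telemetry streams from critical infrastructure, thermal systems and applications into a single chronological feed for analysis. This is a minimal sketch; the source names, metrics and record layout are hypothetical:

```python
import heapq

def merge_streams(*streams):
    """Merge multiple time-ordered telemetry streams of
    (timestamp, source, metric, value) records into one
    chronologically ordered feed for downstream AI analysis."""
    return list(heapq.merge(*streams, key=lambda rec: rec[0]))

# Hypothetical feeds from power distribution, cooling, and an application tier.
power   = [(1, "pdu-1",    "kW",      42.0), (4, "pdu-1", "kW", 43.5)]
thermal = [(2, "crac-2",   "inlet_c", 24.1), (3, "crac-2", "inlet_c", 24.8)]
apps    = [(2, "web-tier", "p99_ms",  180.0)]

feed = merge_streams(power, thermal, apps)
for record in feed:
    print(record)
```

A production pipeline would of course normalize units and stream continuously rather than merge finite lists, but the design point stands: the analysis layer sees one unified, time-aligned feed rather than per-silo data.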

Data centers present an ideal use case for AI: Complex, energy intensive and critical, with a very large set of inputs and control points that can only be properly managed through an automated system. With ever-evolving innovations in the data center, from Application Performance Management linked with physical infrastructure to closely linked multi-data center topologies, the need for and benefit of AI will only increase in the coming years.

By Enzo Greco

Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he has responsibility for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the Data Center market; his current focus is on Colocation Providers, Hybrid Cloud implementations and applying Analytics overall to Critical Infrastructure, Data Center and Application Performance Management markets.

Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire Data Center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity.

Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise grade transaction processing system; he received a fellowship for this work.

After AT&T Bell Laboratories, Enzo transitioned to Wall Street where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period, he founded several companies, the largest of which, Planetworks, was acquired by IBM in April, 1999.

Enzo then worked in IBM’s headquarters for 14 years, where he was heavily involved in IBM’s entering and growing several key markets, including Portal, Business Process Management and Smarter Commerce.

Enzo has a BS from Manhattan College in Riverdale, NY and an MS from Stanford University in Stanford, CA.


