
The Competitive Cloud Data Center


The corporate data center was long the de facto vehicle for all application deployment across an enterprise. Whether the data center reported to Real Estate, Finance or IT, this arrangement served both data centers and their users well: it allowed the organization to invest in specialized skills and facilities, and it provided the enterprise with reliable and trusted services.

The first major shift in this structure came with the success of Salesforce.com in the early 2000s. Salesforce removed a significant hurdle for its key buyer, the VP of Sales, by providing the application “off premises”. VPs of Sales typically have no significant relationship with their IT organization, so this hosted offering allowed the VP to authorize the purchase fairly autonomously. Fast forward to today, and nearly all functions, from HR to Finance to ERP, are offered “as a Service”, further pressuring the corporate data center.


A recent Gartner report estimated that 60% of workloads run on-prem today, a share expected to drop to 33% by 2021 and 20% by 2024. All major software companies, from Microsoft to IBM to Oracle, are focused on offering their software as a Service and in fact report this growth as a key metric in their financial results. Microsoft, in particular, noted “that by FY ’19 we’ll have two-thirds of commercial Office in Office 365, and Exchange will be north of that, 70 percent.” Traditionally, these applications have represented some of the largest enterprise workloads, so the trend is unmistakable.

There are many reasons enterprise data centers will still be required: location, security, financial considerations and regulatory requirements, to name a few; certain organizations will continue to use them as strategic and differentiating assets. However, enterprise data centers will increasingly need to justify themselves against alternatives ranging from Applications-as-a-Service to Amazon Web Services to colocation providers like Digital Realty and Equinix.

The market has accepted that most organizations will land on a hybrid middle ground: a combination of on-prem facilities and cloud or colocation options. A critical “point of inflection” occurs in this journey: the moment an organization extends its footprint beyond its in-house data centers, it must immediately address a host of new issues, including:

  • Management by SLA: Availability will always be key, but when the facilities are not its own, an organization must rely on the provider’s SLAs and escalation procedures. This is usually one of the biggest leaps in the journey to colocation, as many procedures and scenarios must be rethought and redefined, now with an outside party (see the SLA sketch after this list).
  • Application placement: Now that there are options, where does an application run best? On prem? In a colo? In AWS? What are the right metrics to manage this placement? Cost? Security? Availability? How dynamic should this workload placement be? This is a nascent area that many organizations overlook; vendors from startups to established players are investing heavily in tools and intelligence to assist (see the placement sketch after this list).
  • DevOps: Organizations have developed detailed DevOps procedures; when a platform such as AWS is brought into the mix, those procedures need to be reworked and tailored specifically for that platform.
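
A minimal way to ground those SLA conversations is to translate an availability percentage into the downtime it actually permits. The Python sketch below does exactly that; the SLA tiers shown are illustrative examples, not any particular provider’s terms.

```python
# Minimal sketch: convert an availability SLA into the annual
# downtime budget it implies. SLA tiers are illustrative only.

MINUTES_PER_YEAR = 365 * 24 * 60

def allowed_downtime_minutes(sla_percent: float) -> float:
    """Annual downtime, in minutes, permitted by an availability SLA."""
    return MINUTES_PER_YEAR * (1 - sla_percent / 100)

for sla in (99.9, 99.95, 99.99, 99.999):
    print(f"{sla}% availability -> {allowed_downtime_minutes(sla):.1f} minutes/year")
```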
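
A similar exercise applies to application placement: once the organization settles on its metrics, each candidate venue can be scored against them. The sketch below shows one simple weighted-scoring approach; the venues, weights and ratings are hypothetical and would in practice come from the organization’s own assessments or from the placement tools mentioned above.

```python
# Minimal sketch of a workload-placement scorer. The weights and
# per-venue ratings (0-1, higher is better) are hypothetical.

WEIGHTS = {"cost": 0.40, "security": 0.35, "availability": 0.25}

VENUES = {
    "on-prem": {"cost": 0.5, "security": 0.9, "availability": 0.80},
    "colo":    {"cost": 0.6, "security": 0.8, "availability": 0.90},
    "aws":     {"cost": 0.8, "security": 0.7, "availability": 0.95},
}

def score(ratings: dict) -> float:
    """Weighted score of one venue across the chosen metrics."""
    return sum(WEIGHTS[metric] * ratings[metric] for metric in WEIGHTS)

for venue, ratings in VENUES.items():
    print(f"{venue:8s} score = {score(ratings):.3f}")

best = max(VENUES, key=lambda v: score(VENUES[v]))
print(f"Best placement for this workload: {best}")
```

How dynamic the placement should be then becomes a question of how often these ratings are refreshed and the scores re-evaluated.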

The trend toward hybrid clouds is clear, with commensurate benefits, but an organization must balance and justify this hybridization. One of the best examples of this consideration process comes from the U.S. Federal government in its Data Center Optimization Initiative (https://datacenters.cio.gov/). DCOI is a federal mandate that “requires agencies to develop and report on data center strategies to

  • Consolidate inefficient infrastructure,
  • Optimize existing facilities,
  • Improve security posture,
  • Achieve cost savings,
  • and transition to more efficient infrastructure, such as cloud services and inter-agency shared services.”

Like numerous organizations, the U.S. Federal government has a cloud-first policy, but it has instituted a rigorous reporting and oversight process to prudently manage its data center footprint and computing strategy. Many other organizations, in both the public and private sectors, should consider similar processes and transparency as they weigh their many hybrid cloud-computing options.
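
As a rough illustration of what such a reporting process might track, the sketch below flags facilities against DCOI-style efficiency metrics such as power usage effectiveness (PUE) and server utilization. The facility data and thresholds are hypothetical stand-ins, not official DCOI targets.

```python
# Minimal sketch of a DCOI-style facility check. Facility data and
# targets are hypothetical; a real program would use the metrics and
# thresholds defined by its own oversight process.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power."""
    return total_facility_kw / it_equipment_kw

# (name, total facility kW, IT equipment kW, average server utilization)
FACILITIES = [
    ("DC-East", 1200.0, 900.0, 0.72),
    ("DC-West", 1500.0, 880.0, 0.41),
]

PUE_TARGET = 1.5            # illustrative efficiency ceiling
UTILIZATION_TARGET = 0.65   # illustrative utilization floor

for name, total_kw, it_kw, utilization in FACILITIES:
    flags = []
    if pue(total_kw, it_kw) > PUE_TARGET:
        flags.append("optimize facility efficiency")
    if utilization < UTILIZATION_TARGET:
        flags.append("consolidate underused servers")
    status = "; ".join(flags) or "meets targets"
    print(f"{name}: PUE={pue(total_kw, it_kw):.2f}, "
          f"utilization={utilization:.0%} -> {status}")
```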

Existing enterprise data centers represent significant assets; how should their role in the overall cloud-computing fabric be determined? Availability, cost and security will always be the dominant factors for any physical computing topology, with agility a recent addition. Define the key metrics and drivers, transparently and consistently, for the overall environment, and the best set of options will present itself.

By Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he is responsible for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the data center market; his current focus is on colocation providers, hybrid cloud implementations and applying analytics across the Critical Infrastructure, Data Center and Application Performance Management markets. Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire data center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity. Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise-grade transaction processing system; he received a fellowship for this work. After AT&T Bell Laboratories, Enzo moved to Wall Street, where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period, he founded several companies, the largest of which, Planetworks, was acquired by IBM in April 1999. Enzo then worked in IBM’s headquarters for 14 years, where he was heavily involved in IBM’s entry into and growth of several key markets, including Portal, Business Process Management and Smarter Commerce. Enzo holds a BS from Manhattan College in Riverdale, NY and an MS from Stanford University in Stanford, CA.