The Competitive Cloud Data Center

The corporate data center was long the de facto vehicle for all application deployment across an enterprise. Whether it reported to Real Estate, Finance, or IT, this arrangement served both data centers and their users well: it allowed the organization to invest in specialized skills and facilities, and it provided the enterprise with reliable, trusted services.

The first major shift in this structure came with the success of Salesforce.com in the early 2000s. Salesforce removed a significant hurdle for its key target, the VP of Sales, by providing the application “off premise”. VPs of Sales typically have no significant relationship with their IT organization, so this hosted offering allowed the VP to authorize the purchase fairly autonomously. Fast forward to today, and nearly every function, from HR to Finance to ERP, is offered “as a Service”, further pressuring the corporate data center.

A recent report from Gartner estimated that today, 60% of workloads are running on-prem, and this percentage is expected to drop to 33% by 2021 and 20% by 2024. All major software companies, from Microsoft to IBM to Oracle, are focused on offering their software as a Service and in fact report on this growth as a key metric in their financials. Microsoft, in particular, noted “that by FY ’19 we’ll have two-thirds of commercial Office in Office 365, and Exchange will be north of that, 70 percent.” Traditionally, these applications have represented some of the largest enterprise workloads, so the trend is unmistakable.

There are many reasons enterprise data centers will still be required: location, security, financial considerations and regulatory requirements, to name a few; certain organizations will continue to use them as strategic and differentiating assets. However, enterprise data centers will increasingly need to justify themselves against alternatives ranging from Applications-as-a-Service to Amazon Web Services to colocation providers like Digital Realty and Equinix.

The market has accepted that most organizations will settle on a hybrid mix of facilities, a combination of on-prem and cloud or colocation options. An important inflection point occurs in this journey: the moment an organization extends its footprint beyond its in-house data centers, it immediately needs to address a host of new issues, including:

  • Management by SLA: Availability will always be key, but when the facilities are not its own, an organization must rely on the SLAs and escalation procedures of the provider. This is usually one of the biggest leaps in the journey to colocation, as many procedures and scenarios need to be rethought and redefined, now with an outside party.
  • Application placement: Now that there are options, where does an application run best? On prem? In a colo? In AWS? What are the right metrics to manage this placement: cost, security, availability? How dynamic should workload placement be? This is a nascent area that many organizations overlook; vendors, from startups to established players, are investing heavily in tools and intelligence to assist (a simple scoring sketch follows this list).
  • DevOps: Organizations have developed detailed DevOps procedures; when a platform such as AWS is brought into the mix, those procedures need to be reworked and tailored specifically for that platform.
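
As a rough, hypothetical illustration of the first two items, the sketch below converts a provider's availability SLA into a monthly downtime budget and ranks a few candidate venues with a simple weighted score. The venues, metrics, weights, and numbers are assumptions made up for this example, not a recommended model.

```python
# Minimal sketch of two of the questions above:
#   1) turning a provider's availability SLA into a downtime budget, and
#   2) scoring candidate venues for a workload on a few weighted metrics.
# All venues, metrics, weights, and numbers are illustrative assumptions only.

MINUTES_PER_MONTH = 30 * 24 * 60  # ~43,200 minutes in a 30-day month

def downtime_budget_minutes(sla_percent: float) -> float:
    """Allowed downtime per 30-day month implied by an availability SLA."""
    return MINUTES_PER_MONTH * (1 - sla_percent / 100)

# Scores are normalized to 0..1, higher is better (for cost, cheaper scores higher).
WEIGHTS = {"cost": 0.4, "security": 0.3, "availability": 0.3}
VENUES = {
    "on-prem": {"cost": 0.5, "security": 0.9, "availability": 0.80},
    "colo":    {"cost": 0.6, "security": 0.8, "availability": 0.90},
    "aws":     {"cost": 0.7, "security": 0.7, "availability": 0.95},
}

def placement_score(metrics: dict, weights: dict = WEIGHTS) -> float:
    """Weighted sum of normalized metric scores for one venue."""
    return sum(weights[m] * metrics[m] for m in weights)

if __name__ == "__main__":
    print(f"A 99.95% SLA allows ~{downtime_budget_minutes(99.95):.1f} minutes of downtime per month")
    ranked = sorted(VENUES.items(), key=lambda kv: placement_score(kv[1]), reverse=True)
    for name, metrics in ranked:
        print(f"{name:8s} score={placement_score(metrics):.2f}")
```

Even a toy model like this has value: it forces the weights to be stated explicitly, which is exactly the discipline of defining key metrics and drivers, transparently and consistently, argued for below.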

The trend towards hybrid clouds is unmistakable, with commensurate benefits, but an organization must balance and justify this hybridization. One of the best examples of this consideration process comes from the U.S. Federal government in its Data Center Optimization Initiative (https://datacenters.cio.gov/). DCOI is a federal mandate that “requires agencies to develop and report on data center strategies to

  • Consolidate inefficient infrastructure,
  • Optimize existing facilities,
  • Improve security posture,
  • Achieve cost savings,
  • and transition to more efficient infrastructure, such as cloud services and inter-agency shared services.”

Like numerous organizations, the U.S. Federal government has a cloud-first policy, but it has instituted a rigorous reporting and oversight process to prudently manage its data center footprint and computing strategy. Many other organizations, in both the public and private sectors, should consider similar processes and transparency as they optimize their hybrid cloud-computing options.

Existing enterprise data centers represent significant assets; how should their role in the overall cloud computing fabric be determined? Availability, cost and security will always be the dominant factors for any physical computing topology, with agility a recent addition. Define the key metrics and drivers, transparently and consistently, for the overall environment, and the best set of options will present themselves.

By Enzo Greco

Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he has responsibility for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships and adjacent markets. He has deep knowledge of software and the Data Center market; his current focus is on Colocation Providers, Hybrid Cloud implementations and applying Analytics overall to Critical Infrastructure, Data Center and Application Performance Management markets.

Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire Data Center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity.

Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise-grade transaction processing system; he received a fellowship for this work.

After AT&T Bell Laboratories, Enzo transitioned to Wall Street, where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period, he founded several companies, the largest of which, Planetworks, was acquired by IBM in April 1999.

Enzo then worked in IBM’s headquarters for 14 years, where he was heavily involved in IBM’s entering and growing several key markets, including Portal, Business Process Management and Smarter Commerce.

Enzo has a BS from Manhattan College in Riverdale, NY and an MS from Stanford University in Stanford, CA.

