Going Virtual? Keep one eye on the hardware and the other on habits
Central to VMworld, the huge trade and education event, is the notion that everything is going virtual in a big way. Terms such as virtualization and software-defined networking are now becoming mainstream, or more precisely, must now become mainstream, and the 22,000+ conference attendees are listening and talking intently. VMware CEO Pat Gelsinger, Hybrid Cloud SVP Bill Fathers, and others demonstrated during the Monday morning opener that virtualization is not only crucial to the commercial success of organizations moving forward; many key players, including eBay and GE, have already embraced it.
This comes as no surprise to experts like Iddo Kadim, Director of Datacenter Technologies for Intel. He points out that this is no mere swing on the roundabout, a repetition of the growth of mainframe or networked technologies from previous eras; it is now a matter of scale across more than just the physical dimension. The high-tech environment has grown to a size where managing it has gone far beyond simply being a hardware issue. “The manual processes that are required to manage it the old way,” he says, “can’t keep up with the growth that is required to satisfy the demand for compute, storage and networking.” As an example he adds, “You can count 100 jelly beans, even if it takes time to get it right. But try counting 10,000.”
Kadim states that in this newest of new ages, there needs to be a disaggregation of control from physical infrastructure: an automated, separate layer where software resides above the hardware itself, in order to support multiple operating systems and applications.
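The idea of a control layer disaggregated from the hardware can be sketched in a few lines. The following is a toy illustration, not any vendor's actual product: a `ControlPlane` decides where virtual machines land, while the hosts themselves hold nothing but raw capacity. All names and numbers here are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Host:
    """A physical machine: the control layer sees only its capacity."""
    name: str
    cpus: int
    used: int = 0

@dataclass
class VM:
    name: str
    cpus: int

class ControlPlane:
    """Placement decisions live here, separated from the hardware itself."""
    def __init__(self, hosts):
        self.hosts = hosts
        self.placement = {}  # vm name -> host name

    def schedule(self, vm):
        # First-fit: place the VM on any host with spare CPU capacity.
        for host in self.hosts:
            if host.cpus - host.used >= vm.cpus:
                host.used += vm.cpus
                self.placement[vm.name] = host.name
                return host.name
        raise RuntimeError(f"no capacity for {vm.name}")

hosts = [Host("rack1-h1", cpus=8), Host("rack1-h2", cpus=8)]
plane = ControlPlane(hosts)
for vm in [VM("web", 4), VM("db", 6), VM("cache", 2)]:
    print(vm.name, "->", plane.schedule(vm))
```

The point of the separation is that the workloads never name a specific machine; the automated layer can grow, rebalance, or replace hardware underneath them.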
He points out that a great deal of confusion still surrounds the term cloud, let alone virtualization. Executives, he said, must be careful not to get carried away. As with all major change, they may expose their business to untenable risk if control is handed off wholesale, for example to a third-party provider who, by owning the information, effectively owns the business.
It makes sense, he says, for organizations to first build a private cloud. By doing this they can virtualize the separate elements and grow and scale as needed. He describes this as a type of automation umbrella: “build internally with an awareness for an eventual hybrid cloud later.”
Kadim also made an excellent point about the inevitable abstraction that comes from virtualization. Even as a company builds a virtualization strategy, at the end of the day the applications still run on hardware. They run differently, and more dynamically, but the machines are still there, somewhere. That means that to have real control, decision-makers still need information about their infrastructure. They still need to be able to say, “these are my walls, these are my machines.” Even though the data and the processing have become virtual, adequate information on the physical machinery is essential.
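A concrete way to read “these are my walls, these are my machines” is that virtual demand must still be reconciled against a physical inventory. The sketch below does exactly that with invented figures; the host names and reservation sizes are hypothetical.

```python
# Hypothetical inventory: even under a virtual layer, these physical
# totals are what the virtual reservations ultimately run on.
physical_hosts = {"h1": 64, "h2": 64, "h3": 32}           # GB RAM per host
vm_reservations = {"web": 24, "db": 48, "analytics": 40}  # GB RAM per VM

total_physical = sum(physical_hosts.values())
total_reserved = sum(vm_reservations.values())
overcommit = total_reserved / total_physical

print(f"physical: {total_physical} GB, reserved: {total_reserved} GB, "
      f"ratio: {overcommit:.2f}")
if total_reserved > total_physical:
    print("warning: virtual demand exceeds the hardware underneath it")
```

Dashboards in real virtualization suites do far more than this, but the underlying question is the same: does the abstraction still fit inside the machines that exist?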
CloudTweaks asked Kadim for his thoughts on the recent outages suffered by Amazon, Google, Apple, and Sony, and how they affect public perception and acceptance of major technological leaps such as those we now face. He echoed the sentiments of many leaders in the virtualization community, including Simon Crosby, formerly of Intel and Citrix and now founder and CTO of Bromium. At Interop 2011, Crosby observed that cloud technology, in terms of reliability and safety, is much like commercial aviation: big planes do not go down very often, and when they do the results can be painful, but overall the cloud remains safer than in-house IT infrastructure, just as planes are safer than private cars.
Bad things happen, Kadim continues, even when redundancy is built in, when there is a confluence of events. Sometimes the failure is total, and other times it is partial, as in the example of Netflix, where some services blacked out but others kept running. The bottom line, he says, is that there must be clear communication between a company and its cloud vendors: clear dialog, and clear understanding. As in many other areas of life, it is up to the customer to establish the rules and to demand clarity, ensuring that any loss of control, data, or presence is unlikely and never total.
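The partial-failure pattern Kadim describes, where some services black out while others keep serving, can be illustrated with a toy fallback. This is not Netflix's actual architecture; the function names and behavior are invented to show the idea of degrading gracefully instead of failing completely.

```python
def personalized_recommendations(user):
    # Simulate a dependency that is down during a partial outage.
    raise ConnectionError("recommendation service unreachable")

def default_recommendations():
    # A cached, non-personalized fallback that needs no live backend.
    return ["top-10 list"]

def get_recommendations(user):
    try:
        return personalized_recommendations(user)
    except ConnectionError:
        # Degrade the feature rather than blacking out the whole page.
        return default_recommendations()

print(get_recommendations("alice"))
```

The design choice is that a failed dependency costs one feature's quality, not the service's presence, which is exactly the difference between a partial and a total outage.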
There are many providers who will start to offer virtualization services to match the growth in demand. Some will come from a hosting background and bring an enterprise mindset. Others will embrace the Amazon model: a well-built, blank infrastructure with an impressive partner ecosystem layering other well-built services on top.
Kadim warns that as organizations large and small move into the new world of software-defined networking, effectively separating the intelligence from the hardware, they must do so without falling prey to either hype or outdated mindsets.
By Steve Prentice
Steve is an acclaimed author and professional speaker who delivers timely, relevant, entertaining and informative keynotes on technology, people and productivity in the workplace. As a mentor, he works with executives in one-on-one discussions, delivering answers and guidance on technology, personal time management and other practical skills. In addition, Steve is a technology writer and consultant for CloudTweaks Media.