The shift in both technology and mindset toward the cloud sometimes seems to be happening extremely quickly, and although innovation and competitiveness are key concerns for any CEO or executive, the speed at which a company’s entire IT infrastructure must transform can be overwhelming.
This is something the experts at VMware and Hitachi have understood for a long time, and it is why their 10-year relationship has served their clients so well: they are very good at explaining the hows and whys of the shift toward the cloud and why it is necessary going forward.
Take the Software Defined Data Center, for example. This VMware concept allows, in the most basic terms, for a transition to a virtual world in which the elements that drive an IT system are no longer tied to hardware but reside in a virtual layer, letting IT technicians make upgrades and add functionality without having to pop the hood and physically pull out and replace a component. Combined with flash-based virtual storage systems, this leaves a company better able to meet its ever-growing need for storage space, processing speed and data transfer without resorting to greater investment in “spinning disks.”
At the VMworld Conference held in San Francisco in August, CloudTweaks spoke with Hu Yoshida, VP and CTO of Hitachi Data Systems, who described how the virtual layer (which runs the workloads) and the physical layer (which provides low-cost, large-scale storage) must co-exist in a way that can be properly managed across the entire stack only through a Software Defined Data Center. He points out that the Hitachi Data Systems integration with vCenter makes this effectively seamless by providing a single, manageable interface that replaces the previous technique of working through APIs. Because this approach is available for all types of hypervisors, IT managers can manage more effectively and bring in new functions like vCloud without having to change the infrastructure.
In other words, the advent of the Software Defined Data Center gives IT a different look: a simplified environment in which each new function that comes in does not require a deep dive into the underlying elements. This makes it much simpler for companies to transition to new technologies.
The ongoing collaboration between VMware and Hitachi Data Systems helps pave the way toward a flexible and scalable approach to using the cloud. As Mr. Yoshida points out, “there still exists a current reluctance [for companies to] put core applications into the public cloud.” The idea of starting with a private cloud still appeals to many corporate decision-makers. However, when the time comes to share applications and resources between private and public clouds, the interface and procedures will already be in place and usable.
Although an organization can choose from a great many data-services providers to work alongside its own IT department, the long relationship between VMware and Hitachi Data Systems, not only as technology partners but as actual innovators and developers of these technologies and procedures, sets them apart. Organizations still struggling to decide how to manage their data and mission-critical applications into the future would do well to check out the discussions, videos and white papers available at http://maximizeyourit.com, where the wisdom of these two key players is available for review.
Post Sponsored By VMware/Hitachi Data Systems
By Steve Prentice