OpenStack Interoperability – Dawn Of A New Era?

The Interoperability Challenge!

OpenStack has always had interoperability as one of its unique selling points. Simply put: you can build on OpenStack on-premise, and what you develop will also work in other OpenStack environments. Open APIs and open source are the common denominator. Until now, however, it has been an elusive feature, really a dream that many talk about but that has never truly been implemented at scale. The technology has not been mature enough, and the few who have managed to interoperate between clouds have spent a lot of time getting there. The industry often says: build a private cloud, then scale out into the public cloud. While there are ways to do this, few have truly gotten it to work fluidly and efficiently. It is not just a technical question but equally one of management. Governance around what can run where is a tricky balancing act and requires solid, clear policies, not just technical capability. No vendor lock-in sounds great, but is it a reality? There are, of course, always degrees of lock-in.
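To make the portability claim concrete, here is a minimal sketch using the openstacksdk Python library. The cloud names private-cloud and public-cloud are assumptions, standing in for entries in a local clouds.yaml file; because both environments expose the same OpenStack APIs, the identical code runs against either one.

    import openstack  # openstacksdk

    # "private-cloud" and "public-cloud" are placeholder names assumed to be
    # defined in clouds.yaml; substitute your own OpenStack environments.
    for cloud_name in ("private-cloud", "public-cloud"):
        conn = openstack.connect(cloud=cloud_name)
        # The same Compute API call works unchanged against any OpenStack cloud.
        for server in conn.compute.servers():
            print(cloud_name, server.name, server.status)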

Connect And Build

There are many reasons why you would want to connect two clouds, whether private or public. Scaling out from private to public brings not only true scalability; there are also workloads that simply fit better in a different environment, even if the reason is as simple as needing a different price point. The same applies between two public clouds, from not putting all your eggs in one basket to actually exploiting different features, price points, or compliance requirements.

Enter OpenStack Kilo. It is a milestone for the promise of true interoperability in the OpenStack community. While DefCore was already created after the summit in Hong Kong in 2013, it is only now that the community is starting to put the critical pieces together. In the latest OpenStack interoperability press release, issued during the summit in Vancouver, 32 companies signed up to adhere to the guidelines, making sure it is not just software but also companies and people standing behind the promise of true interoperability. In the Kilo release, OpenStack allows for seamless federation between private and public clouds, and interest is strong from vendors and customers alike.
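The federation work centers on Keystone, whose Keystone-to-Keystone (K2K) support lets a user authenticated against one cloud obtain a token on another. Below is a minimal sketch using the keystoneauth1 Python library; the endpoint URL, credentials, and the service provider ID sp-public are illustrative assumptions, not real values.

    from keystoneauth1 import session
    from keystoneauth1.identity import v3

    # Authenticate against the private cloud, which acts as the identity
    # provider. The URL and credentials below are placeholders.
    password = v3.Password(
        auth_url='https://private.example.com:5000/v3',
        username='demo',
        password='secret',
        project_name='demo',
        user_domain_id='default',
        project_domain_id='default',
    )

    # Exchange that identity for a scoped token on the public cloud, which is
    # assumed to be registered in the private Keystone as service provider
    # 'sp-public'.
    k2k = v3.Keystone2Keystone(
        password,
        'sp-public',
        project_name='demo',
        project_domain_id='default',
    )

    sess = session.Session(auth=k2k)
    print(sess.get_token())  # token usable against the public cloud's APIs

Once such a session is established, the same service clients can be pointed at the public cloud's service catalog, which is what makes moving a workload between federated clouds feel like working within a single one.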

OpenStack has shown the way, and it is now up to the vendors to do more than just sign up and adhere to a set of guidelines. For true interoperability, cloud vendors will have to take it further than that. Tristan Goode, CEO of Aptira, gave a presentation in 2012 in which he outlined exactly what we have today; it took three more years to get to basic interoperability. I think it will take about the same amount of time for public cloud vendors to get governance and other management issues sorted out so that customers can easily move workloads between vendors. It also forces strict control of upgrades and functionality to make the experience as smooth as we would all like it to be.

Universal Connections

2015 is the year when we go from the dream of interoperability to actually planning for fluidly moving workloads between cloud vendors, as well as between private and public clouds. Private-to-public will come first, but I am sure we will soon see public-to-public as well. We are not there yet, but we can see the light: the future ahead for OpenStack is bright and will finally allow for true “no vendor lock-in”.

While I think OpenStack already has a lot to offer any company today, interoperability is what will make it truly unique, and it is an immense opportunity for both vendors and customers. As we move our businesses deeper and deeper into the cloud, it is important that we also have ways out of it, or at least the ability to move.

By Johan Christenson
