Category Archives: Cloud Computing

Private And Public Cloud Migration Standards

Since cloud computing reached mainstream adoption, the ability to migrate data between public and private clouds has become a key concern, especially given the volumes of data and the cost involved in the switch. The first question that comes to mind when discussing cloud migration is whether any cloud standards exist to ensure interoperability. In the absence of such frameworks, the effort of translation or manual data transfer can exceed budgets, especially for large public clouds, so it is essential to know which interoperability standards may help with the migration. It is equally important to understand the reasons behind a cloud migration, since assessing them is what justifies the shift.

Cloud interoperability is still a topic under discussion among various cloud providers, including HP, Red Hat, Rackspace and Citrix. The OpenStack project is promoting an open platform with open standards that cloud providers can integrate into their systems to make them more interoperable. The platform is backed by several players, as seen in its list of supporting companies, and many are hopeful that the cloud community will soon see a complete set of standards, along with guidelines for building an OpenStack-certified cloud system. The standard is expected to cover all major components in the cloud, including compute, networking, storage and OpenStack shared services, running on standard hardware and exposed through an OpenStack dashboard, with custom user applications running on top of this framework. Collectively this is known as the OpenStack cloud operating system, and its interoperability makes workloads easier to move between public and private cloud setups.
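As a rough illustration of what such interoperability buys in practice, here is a minimal sketch (assuming placeholder endpoints and credentials, not real values) that talks to the standard OpenStack Identity (Keystone v3) and Compute (Nova) REST APIs. Because these APIs are the same across OpenStack-based clouds, the same client code can target a public or a private deployment simply by pointing it at different endpoint URLs.

```python
# Minimal sketch: authenticate against Keystone v3, then list servers via Nova.
# Endpoint URLs, credentials and project names below are placeholders.
import requests

KEYSTONE_URL = "https://keystone.example.com:5000/v3"   # assumed Identity endpoint
NOVA_URL = "https://nova.example.com:8774/v2.1"         # assumed Compute endpoint

def get_token(username, password, project_name, domain="Default"):
    """Request a Keystone v3 token; it is returned in the X-Subject-Token header."""
    payload = {
        "auth": {
            "identity": {
                "methods": ["password"],
                "password": {
                    "user": {
                        "name": username,
                        "domain": {"name": domain},
                        "password": password,
                    }
                },
            },
            "scope": {"project": {"name": project_name, "domain": {"name": domain}}},
        }
    }
    resp = requests.post(f"{KEYSTONE_URL}/auth/tokens", json=payload)
    resp.raise_for_status()
    return resp.headers["X-Subject-Token"]

def list_servers(token):
    """List the compute instances visible to the token's project."""
    resp = requests.get(f"{NOVA_URL}/servers", headers={"X-Auth-Token": token})
    resp.raise_for_status()
    return [server["name"] for server in resp.json()["servers"]]

if __name__ == "__main__":
    token = get_token("demo", "secret", "demo-project")
    print(list_servers(token))
```

Swapping KEYSTONE_URL and NOVA_URL from a private OpenStack installation to an OpenStack-based public provider is, in principle, the only change this client needs, which is exactly the kind of portability such a standard is meant to deliver.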


Another organization, the Open Data Center Alliance (ODCA), is also working to formalize a set of specifications for enterprise-ready cloud. It has released a virtual machine interoperability usage white paper that describes its recommendations, with examples from a test bed featuring several of the offerings in the proposal, along with architecture diagrams. The paper concludes with the statement that, “A capability for VM interoperability is an important precondition to truly realize the oft expressed benefits of virtualized clouds, such as the ability to balance resources through fungible pools of resources, business continuity and load balancing by leveraging distributed publicly available resources, as well as demonstrable avoidance of lock in to a single Cloud Provider, platform or technology.”

There is a cost associated with cloud migration, and it becomes a significant factor in the decision. The key factors to consider include IT service reimplementation, data destruction and sanitization, developer training, user guidance, regulatory compliance, vendor lock-in and portability. Hence it comes down to whether or not the shift is justified in terms of expenditure, downtime and future gain for the business. Ideally, one should look for cloud systems that support at least one open standard to facilitate future migration.

By Salam UI Haq

How CloudCheckr Is Leading The Cloud Analytics Charge

Over the past few years, there has been explosive growth in cloud computing, mainly due to the cost efficiencies of its pay-as-you-go model as compared to legacy IT. However, many companies that have migrated to the cloud have done so for more than cost savings; for them, performance has been the deciding factor. Whatever the reason, these companies want to know what they are getting for their money.


(Image Source: CloudCheckr)

This is where the specialized field of cloud analytics comes in to deliver answers. From utilization and cost analytics that are more important in the pay-as-you-go model, to performance and security analytics that are the focus of different cloud architectures, cloud analytics is a field that has attracted a number of players.

Among the leading contenders is CloudCheckr, a company that has carved a niche for itself by concentrating on the most popular public cloud provider, Amazon Web Services (AWS). Unlike many of its competitors that focus solely on cost analytics, CloudCheckr places equal importance on performance analytics. As CloudCheckr founder Aaron Klein explained, “Cost is a major consideration for public cloud users, but it is not the only one. Users need tools for control beyond cost. Both SMBs and enterprises need visibility into performance. This means tracking resources, utilization and deployment changes. This means alerts around security and best practices. CloudCheckr’s goal is to deliver all of these capabilities while still retaining its best in class cost and utilization analytics.”

The company recently announced major service improvements that address the aforementioned user needs. It has added change monitoring alerts, map overlays, and utilization heat maps for both Glacier and ElastiCache, and usage predictors for services such as S3. While all these allow public cloud users to improve performance, CloudCheckr has given cost analytics equal importance by expanding its cost predictors for EC2, RDS and other AWS services. Additionally, to cater to the needs of growing SMBs and enterprises, it has added multiple sorting, tagging and cost allocation features.

The company is also growing on the ground, as evidenced by a fresh $2 million of funding in April and additions to its workforce. CloudCheckr offers new customers a full-featured free trial, a permanent free tier of service, and a competitively priced $179/month professional tier. With more and more businesses moving to the cloud and looking to optimize their deployments, CloudCheckr should continue to grow aggressively.

By Sourya Biswas

The Dark Side Of Cloud Computing Branding

The promises of a better IT world spouted by Cloud Computing are often followed by a shadow trying to brand all of its claims as lies and marketing propaganda invented by service vendors to make more profit. When something is claimed to be the end-all-be-all of computing, a lot of people will undoubtedly push back against that claim.


(Image Source: Shutterstock)

The claim is that Cloud Computing is a myth, a fabrication by marketing experts designed to resell services or software to people who already own them. Critics say that Cloud Computing is only a rebrand of technology already available to us as data centers, enterprise computing, and the internet itself, or rather delivered through the internet; that it is just a jumble of terms like Software as a Service, utility computing, virtualization and the World Wide Web. The confusion, supposedly, is deliberate and designed to sell: if people cannot understand it, then it must simply be too advanced and above their understanding, and they will eat it up.

In this view, Cloud Computing is just a combination of technical and economic trends that have been around for more than a decade, an old and rundown car with a new paint job. That statement is a blow to what Cloud Computing actually stands for: a fresh take, a new paradigm in the use of computing resources that enables everybody to take part in seemingly infinite computing power affordably. And this dark view is, sadly, backed by Private Cloud providers, knowingly or unwittingly.

Since real cloud providers enjoy economies of scale that allow them to set prices very low thanks to a massive user base, other companies are trying to take their own slice of the pie through subterfuge and sleek marketing. These companies highlight all the flaws and issues of Cloud Computing, blowing them out of proportion in order to sell their Private Clouds. This is one of the reasons that Cloud Computing is getting a lot of flak from many experts.

When a customer buys into a Private Cloud, he is buying or renting hardware powerful enough to accommodate the largest usage spikes within his network, setting up the hardware and software infrastructure in the same way as a Public Cloud, and then allowing only those with access to his private network into that Private Cloud. It might technically, by definition, be an implementation of Cloud Computing, but the essence of it is lost. And that essence, affordability, elasticity and flexibility, and an on-demand nature, is what Cloud Computing truly means. Companies getting into Private Cloud are the ones buying that old car which looks and smells new. As I have mentioned many times before, we have a term for this kind of implementation: a data center.

There may be many who disagree, but this view is a reality and just as valid as any. If it is not public, if it is not on-demand, if it does not show flexibility or elasticity through dynamic provisioning, if it is not affordable, then it is not Cloud Computing.

By Abdul Salam

IBM To Acquire SoftLayer To Accelerate Adoption Of Cloud Computing In The Enterprise

IBM to Form New Cloud Services Division

ARMONK, N.Y. – 04 Jun 2013: IBM (NYSE: IBM) today announced a definitive agreement to acquire SoftLayer Technologies, Inc., the world’s largest privately held cloud computing infrastructure provider. The acquisition will strengthen IBM’s leadership position in cloud computing and will help speed business adoption of public and private cloud solutions. Financial terms were not disclosed.

“As businesses add public cloud capabilities to their on-premise IT systems, they need enterprise-grade reliability, security and management. To address this opportunity, IBM has built a portfolio of high-value private, public and hybrid cloud offerings, as well as software-as-a-service business solutions,” said Erich Clementi, Senior Vice President, IBM Global Technology Services. “With SoftLayer, IBM will accelerate the build-out of our public cloud infrastructure to give clients the broadest choice of cloud offerings to drive business innovation.”

IBM is acquiring SoftLayer to make it easier and faster for clients around the world to incorporate cloud computing by marrying the speed and simplicity of SoftLayer’s public cloud services with the enterprise grade reliability, security and openness of the IBM SmartCloud portfolio.

SoftLayer accelerates IBM’s ability to integrate public and private clouds for its clients, with flexibility that provides deployment options that enable a faster, broader transformation for small, medium and large businesses with a range of performance and security models.

Headquartered in Dallas, Texas, SoftLayer serves approximately 21,000 customers with a global cloud infrastructure platform spanning 13 data centers in the U.S., Asia and Europe. Among its many innovative cloud infrastructure services, SoftLayer allows clients to buy enterprise-class cloud services on dedicated or shared servers, offering clients a choice of where to deploy their applications. These clients will benefit greatly as new enterprise grade functionality from IBM emerges for SoftLayer customers, who will then have a unique opportunity to incorporate it as their business grows.

“SoftLayer has a strong track record with born-on-the-cloud companies, and our move today with IBM will rapidly expand that footprint globally as well as allow us to go deep into the large enterprise market,” said Lance Crosby, CEO of SoftLayer. “The compelling opportunity is connecting IBM’s geographic reach, industry expertise and IBM’s SmartCloud breadth with our innovative technology. Together SoftLayer and IBM expand their reach to new clients – both born-on-the-cloud and born-in-the-enterprise.”  

Already one of the world’s leading cloud providers, IBM expects to reach $7 billion annually in cloud revenue by the end of 2015. IBM offers more than 100 SaaS solutions to help marketing, procurement, ecommerce, customer service, human resources, city management, and other professionals make better decisions and better serve their customers. IBM also offers Watson solutions such as Client Engagement Advisor in the cloud, superior solutions such as IBM PureSystems and SmartCloud Enterprise+, as well as mission critical cloud services for SAP.

IBM is a leader with enterprise customers based on its vertical industry expertise delivered from 10 cloud computing centers on five continents. The acquisition of SoftLayer will complement IBM’s existing SmartCloud portfolio, providing enterprises with easy access to a broader range of choices that transform their workloads while continuing to innovate with SoftLayer to meet the needs of born-on-the-cloud firms.

New IBM Cloud Services Division 

Recognizing the importance of cloud to global clients, IBM today is announcing the formation of a new Cloud Services division. Following the close of the acquisition of SoftLayer, which is expected in 3Q 2013, this new division will combine SoftLayer with IBM SmartCloud into a global platform. The new division will provide a broad range of choices to both IBM and SoftLayer clients, ISVs, channel partners and technology partners. SoftLayer’s services will complement the existing portfolio with its focus, simplicity and speed. The division will report to Erich Clementi, Senior Vice President, IBM Global Technology Services.

“Our clients are telling us they want to realize the transformative benefits of cloud today – not just for individual applications, but across their entire enterprise,” said Clementi. “SoftLayer is a perfect fit for IBM. It will help us smooth the transition of our global clients to the cloud faster, while enabling IBM to more efficiently offer them its broad portfolio of open IT infrastructure and software services.”

IBM intends to expand SoftLayer cloud offerings to include OpenStack capabilities, consistent with its entire SmartCloud portfolio and historic commitment to open standards such as Linux. Given that most companies will mix public and private cloud services, clouds need to interoperate. In that way, firms can better leverage cloud to run their social, mobile and Big Data applications.

IBM will also support and enrich SoftLayer’s cloud-centric partner ecosystem and its performance capabilities for Big Data and analytics. IBM will provide go-to-market and customizable resources for its expanding cloud ecosystem.

The Value of SoftLayer 

Among its many innovative cloud infrastructure services, SoftLayer allows clients to buy enterprise-class cloud services on dedicated or shared servers, offering clients a choice of where to deploy their applications. By building out a cloud with IBM and SoftLayer, a client can choose the work that belongs on a dedicated or a shared computing resource – thereby tailoring the privacy, data security and overall computing performance to the client’s needs. Importantly, this level of reliability and scale is critical for cloud-centric companies.  

SoftLayer provides the infrastructure for cloud-centric, performance-intensive applications in the areas of mobile, social media, gaming and analytics. The growing number of businesses incorporating mobile computing is helping drive SoftLayer growth.

  • In the last two quarters, more than 60 new gaming companies have moved to the SoftLayer global platform, frequently migrating from commodity cloud platforms because of problems with cost, latency, availability and raw performance.

  • SoftLayer’s architecture provides superior technical capabilities such as a software-definable environment critical to a cloud infrastructure, programmable interfaces, and hundreds of hardware and network configurations. This is designed to deliver a higher level of flexibility, mixing virtual and dedicated servers to fit a variety of workloads, along with automation of interfaces and hybrid deployment options.

  • SoftLayer’s automated networking infrastructure supports public, private and data center-to-data center architectures, and is designed to provide maximum flexibility and control for clients. The SoftLayer infrastructure enables connections with leading global network providers and Internet access networks.

  • IBM SaaS solutions for Smarter Cities, Smarter Commerce and other applications will be made available via SoftLayer over time, providing line-of-business clients improved time to value and new innovation across an increasingly integrated portfolio of solutions that accelerate business process innovation, provide analytics at the point of impact, and connect collaborative business networks within and across organizations.

The acquisition is expected to close following customary closing conditions including regulatory clearances.

About SoftLayer
SoftLayer, whose majority shareholder is GI Partners of Menlo Park, Calif., operates a global cloud infrastructure platform built for Internet scale. Spanning 13 data centers in the United States, Asia and Europe and a global footprint of network points of presence, SoftLayer’s modular architecture provides unparalleled performance and control, with a full-featured API and sophisticated automation controlling a flexible unified platform that seamlessly spans physical and virtual devices, and a global network for secure, low-latency communications. With 100,000 devices under management, SoftLayer is the largest privately held Infrastructure-as-a-Service (IaaS) provider in the world with a portfolio of leading-edge customers from Web startups to global enterprises.

About IBM Cloud Computing

IBM has helped thousands of clients adopt cloud models and manages millions of cloud-based transactions every day. IBM assists clients in areas as diverse as banking, communications, healthcare and government to build their own clouds or securely tap into IBM cloud-based business and infrastructure services. IBM is unique in bringing together key cloud technologies, deep process knowledge, a broad portfolio of cloud solutions and a network of global delivery centers.

Simplifying Workplace Collaboration Using Cloud Communications

Cloud computing gives enterprises an opportunity to streamline all their business operations into one big place: the “Cloud”. With multitudes of small, medium and large enterprises embracing the Cloud, and early adopters broadening their penetration, why should communications be left behind? For years, voice communication systems within an organization, big or small, were never considered good candidates for deployment on the Cloud under a SaaS or IaaS model. Thanks to the “liberalization” of communication technologies, Cloud-based communications have arrived and are here to stay. Services like Twilio now make it easier than ever to build enterprise communication services that reside entirely on the Cloud, saving customers the cost, complexity and time required to set up an on-premise voice communication infrastructure. Customer services, both internal and external, stand to benefit the most.


(Image Source: Shutterstock)

In the early days of phone-based customer service, complete voice routing and phone systems had to be installed in a dedicated customer care “call center”. This changed with IP-based voice systems, which promised significant cost savings, both in the infrastructure required to operate a service center and in the cost of voice calls. With Cloud, another major shift is happening: serving voice-based services directly from the Cloud, without the need for sophisticated and bulky infrastructure, while delivering better, if not the same, quality of service. Cloud communications have also achieved something that even IP telephony could not: liberalization of voice communication services. This is evident in the success Twilio has achieved, both in terms of adoption and the quality it delivers. What’s interesting is that even independent developers can put together a voice communication service within minutes! I personally tried it to route calls from an online number, which I purchased on Twilio, creating a voice menu for the number and then routing the call to the relevant person based on what the caller selects. It took me less than an hour to achieve this.
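As a rough sketch of that kind of setup (not the exact configuration described above), the following Flask application returns TwiML, Twilio’s XML instruction format, to present a keypad menu and then dial the chosen department. The phone numbers and webhook routes are hypothetical; the app’s public URL would be configured as the voice webhook for the purchased number.

```python
# A sketch of a simple Twilio voice menu: Twilio requests /voice when a call
# arrives, the caller presses a digit, and Twilio posts it back to /route.
from flask import Flask, request, Response

app = Flask(__name__)

DEPARTMENTS = {"1": "+15550001111", "2": "+15550002222"}  # hypothetical numbers

@app.route("/voice", methods=["POST"])
def voice_menu():
    # TwiML returned to Twilio: gather one digit, or say goodbye on no input.
    twiml = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        "<Response>"
        '<Gather numDigits="1" action="/route" method="POST">'
        "<Say>Press 1 for sales. Press 2 for support.</Say>"
        "</Gather>"
        "<Say>We did not receive any input. Goodbye.</Say>"
        "</Response>"
    )
    return Response(twiml, mimetype="text/xml")

@app.route("/route", methods=["POST"])
def route_call():
    # Twilio posts the caller's keypress as the "Digits" form parameter.
    digit = request.form.get("Digits", "")
    target = DEPARTMENTS.get(digit)
    if target:
        twiml = f"<Response><Dial>{target}</Dial></Response>"
    else:
        twiml = "<Response><Say>Invalid choice. Goodbye.</Say></Response>"
    return Response(twiml, mimetype="text/xml")

if __name__ == "__main__":
    app.run(port=5000)
```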

Enterprises now have a plethora of collaboration and communication services to meet the diverse and changing needs of the modern workforce, with the goal of increasing productivity and keeping the focus on the business instead of on managing infrastructure. Unifying the collaboration and communication services deployed in-house for employees with external customer care services not only improves employee collaboration and customer care but also addresses the changing workplace, with small teams placed remotely and home-based or mobile employees. Solutions served out of the Cloud give enterprises, small and big, the opportunity to deploy UC (Unified Communications) and extract all the benefits that come with this strategy while cutting IT costs, both in infrastructure spend and in its management.

Frost & Sullivan has announced an eBroadcast featuring talks from Cloud communication industry experts. This eBroadcast will give you an update on the Cloud UC (Unified Communications) and external customer care markets. It will also discuss some of the benefits of cloud communications and will recommend strategies for selecting the right cloud solution and provider.

To register, visit this page.

By Salam UI Haq

How Amazon Brainwashed Us All (and Joyent Too)

When you enjoy a first-mover advantage in a new market, as Amazon has for the last 7 years in the public cloud, you get to dictate the terms of the initial conversation (think Henry Ford: “You can have any color so long as it’s black”). That doesn’t mean we all have to keep listening, though, and let them brainwash everyone into thinking about cloud exclusively in a way that plays to their advantages. Instead of challenging Amazon to stick to the fundamentals of flexibility that supposedly necessitated the public cloud in the first place, companies like Joyent are following along…

Why cloud in the first place?

Remember this?



This is the diagram that justified public cloud, showing how in a classic on-premise solution you were forced into capital expense based on future capacity predictions. With cloud, we were told, you don’t have to make predictions, which are ultimately doomed to fail anyway, causing either an overcapacity spending nightmare or an undercapacity, business-limiting situation. It’s supposed to be about flexibility.
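To put rough numbers on that trade-off, here is a small sketch comparing a fixed, pre-purchased capacity against pay-as-you-go pricing for a hypothetical demand curve; every figure is invented purely for illustration.

```python
# Hypothetical numbers only: fixed, pre-provisioned capacity vs. paying per unit
# of capacity actually consumed each month.
monthly_demand = [40, 55, 70, 120, 65, 50]   # assumed capacity units used per month
fixed_capacity = 100                         # capacity bought up front, per month
cost_owned = 8.0                             # assumed monthly cost per owned unit
cost_on_demand = 10.0                        # assumed on-demand price per unit (a premium)

fixed_cost = fixed_capacity * cost_owned * len(monthly_demand)
unmet = sum(max(0, d - fixed_capacity) for d in monthly_demand)
on_demand_cost = sum(d * cost_on_demand for d in monthly_demand)

print(f"Fixed provisioning: cost {fixed_cost:.0f}, demand turned away: {unmet} units")
print(f"Pay-as-you-go:      cost {on_demand_cost:.0f}, demand turned away: 0 units")
```

With these made-up numbers the flexible option is both cheaper and never turns demand away, which is the whole argument of the diagram.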

At least, until it’s inconvenient for Amazon to be flexible.

The deal you REALLY make with Amazon

When you sign up for AWS, the deal you’re really making is that you’ll agree to the following:

  • I’ll let you dictate to me what sizes my VMs can be. Instead of letting me pick how many CPU cores and how much RAM I need for my workload, I’ll choose from among 18 sizes you pick for me and pay for resources I don’t need. After all, those cookie-cutter sizes make the multi-tenancy density easier and more profitable for you so I’ll do my part by paying for things I won’t use.
  • I can’t possibly expect consistent performance from my VMs. Is that fair to you, really? Instead of holding you accountable for any quality, I’ll design around it with a deployment strategy that launches five VMs, runs performance benchmarks on each, and keeps the one good one.
  • If I need to scale, I’ll do so horizontally. Why even consider a vertical scaling option? All my apps were designed with an on-premise solution in mind where we could add memory whenever we wanted to. I’m sure something intended for a single machine will run on multiple VMs just fine as my demand grows. What could possibly go wrong?
  • To get the best pricing, I’ll predict how much resource I expect to need and pay you a large upfront fee. I don’t even have to recoup all of it if my predictions were wrong. I’d just like to reserve that price for resources I might not use.

Wait a minute, what about that last one? Isn’t cloud supposed to be flexible resources on-demand instead of making a prediction on capacity need that will ultimately be incorrect? So how come the best AWS pricing comes with that exact same model?

Don’t ask Joyent, they just did the same exact thing.
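For concreteness, here is the break-even arithmetic behind that kind of reservation decision, sketched with made-up prices rather than actual AWS or Joyent rates: the upfront fee only pays off if your utilization prediction holds.

```python
# Illustrative prices only (not real rates): when does a one-year reservation
# with an upfront fee beat plain on-demand pricing?
HOURS_PER_YEAR = 8760
on_demand_hourly = 0.50      # assumed on-demand price per hour
reserved_upfront = 1000.00   # assumed one-time reservation fee
reserved_hourly = 0.20       # assumed discounted hourly rate with the reservation

def cost_on_demand(utilization):
    return on_demand_hourly * HOURS_PER_YEAR * utilization

def cost_reserved(utilization):
    return reserved_upfront + reserved_hourly * HOURS_PER_YEAR * utilization

# Break-even utilization: upfront / ((on_demand - reserved_hourly) * hours)
break_even = reserved_upfront / ((on_demand_hourly - reserved_hourly) * HOURS_PER_YEAR)
print(f"Reservation pays off only above ~{break_even:.0%} utilization")
for u in (0.2, 0.5, 0.9):
    print(f"utilization {u:.0%}: on-demand {cost_on_demand(u):.0f}, reserved {cost_reserved(u):.0f}")
```

In other words, the discount is a bet on your own capacity forecast, which is exactly the bet cloud was supposed to free you from.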

A hypnotized market

The conversation around public cloud is starting to resemble that old Jon Lovitz SNL skit about the hypnotist with a Broadway show (Amazing Alexander, thanks Hulu) where too many people keep repeating the same lines Amazon feeds them over and over again. Amazon’s business model is to buy commodity hardware at insane volume direct from manufacturers and then sell a notion of flexibility that isn’t what it could be. It’s better than the old on-premise world, but when you overhear someone at a trade show debating the merits of a m1.xlarge vs a m2.xlarge vs m2.2xlarge instead of just how many CPU cores and how much RAM they actually need, there’s room for improvement.
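To make that trade-show debate concrete, here is a sketch of the fitting problem buyers face: pick the cheapest catalog instance that covers what you actually need, then count what you are forced to pay for anyway. The specs and prices are rough, era-appropriate approximations and should be treated as assumptions.

```python
# Illustrative catalog only: name -> (vCPUs, RAM in GB, assumed $/hour).
CATALOG = {
    "m1.xlarge":  (4, 15.0, 0.48),
    "m2.xlarge":  (2, 17.1, 0.41),
    "m2.2xlarge": (4, 34.2, 0.82),
}

def cheapest_fit(need_cpu, need_ram_gb):
    """Return the cheapest instance covering the requested CPU and RAM, plus the waste."""
    candidates = [
        (price, name, cpu, ram)
        for name, (cpu, ram, price) in CATALOG.items()
        if cpu >= need_cpu and ram >= need_ram_gb
    ]
    if not candidates:
        return None
    price, name, cpu, ram = min(candidates)
    return name, price, cpu - need_cpu, ram - need_ram_gb

# A workload that really needs 3 cores and 20 GB of RAM gets pushed up to
# m2.2xlarge, paying for a spare core and roughly 14 GB it never asked for.
print(cheapest_fit(3, 20.0))
```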

So when Joyent announced support for reserved pricing in a model very similar to what Amazon is doing, not only are they chasing a competitor whose economies of scale they probably can’t match, they are also sending a dangerous signal to the marketplace that the status quo is just fine.

It isn’t.

Worldwide IT spend is estimated to be around $4 trillion and public cloud spend only $4 billion. What are the 99.9% waiting for? Something better.

Price/Performance > Price

There’s growing sentiment that price/performance should be part of the purchase decision. Value with anything, including public cloud, is derived from a combination of factors, and getting started with a performance characterization of public cloud providers has never been easier. Third-party cloud benchmarking and performance reports, like those available from Cloud Spectator, can provide a guide for narrowing choices among IaaS vendors before running your own application-specific tests. Only then, and when considering flexibility factors, can you truly judge the total cost of ownership for a cloud solution.
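As a toy illustration of the point, the snippet below ranks three hypothetical providers by raw price and by price/performance; the benchmark scores and hourly rates are invented, and in practice they would come from reports like those mentioned above plus your own tests.

```python
# Made-up numbers: name -> (benchmark score, $/hour).
providers = {
    "ProviderA": (100, 0.50),
    "ProviderB": (160, 0.65),
    "ProviderC": (90,  0.40),
}

by_price = sorted(providers, key=lambda p: providers[p][1])
by_value = sorted(providers, key=lambda p: providers[p][0] / providers[p][1], reverse=True)

print("Cheapest first:      ", by_price)   # ProviderC wins on price alone...
print("Best perf per dollar:", by_value)   # ...but ProviderB delivers the most per dollar
```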

The bottom line is that Amazon got to define the way the market thinks about public cloud functionality by going first, but that doesn’t mean they get to own the definition in perpetuity. As Rackspace CEO Lew Moorman recently said, “When public cloud came out, and you could suddenly provision a server in a minute when it used to take 3 months, those were intoxicating advances . . . you get drunk on them, but when things settle in there are tradeoffs.” As a consumer, you owe it to yourself to explore what those tradeoffs are and which choices Amazon is falling short in not providing.

By Pete Johnson,

Senior Director of Cloud Platform Evangelism, ProfitBricks

After a 19-year career with HP that included a 6-year stint running Enterprise Architecture, as well as being a founding member of HP’s public cloud efforts, Pete Johnson joined ProfitBricks in February 2013 as Senior Director of Cloud Platform Evangelism. @nerdguru on Twitter, Pete is active in social media, trade shows, and meetups to raise awareness of Cloud Computing 2.0 from ProfitBricks.

Implement 2013: The Ideal Data Backup And Cloud Trends

It’s likely you’ve heard phrases like “data is the new currency.” This is true, but more importantly data is how you and your small company make transactions with customers. It is important to almost every aspect of your small business, so the way you store it and back it up really matters.

There is no shortage of data storage options for you to choose from. For years, small and large businesses alike depended on tape-based and hard disk storage. In fact, many companies still do. But as data storage trends start to shape how we view and manage data, more businesses are migrating to cloud-based and/or virtual storage systems.

Let’s take a closer look at current data storage trends, and what you should look for in a data storage provider.

Small Business Data Storage Trends

Historically, SMBs have not been early adopters. They usually gravitate to tried and true methods for data storage that fit within their shoestring budget. However, in recent years localized storage has become clunky and inefficient.

Now, more than ever, SMBs are gravitating to larger, modern technologies like cloud computing and virtualization to manage their data storage and backups. Beyond the cloud, these same SMBs are looking to alternatives found in hard disk storage, solid-state disk (SSD) storage, hard disk/solid-state hybrid systems, and alternative storage methods like compression and data de-duplication.
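As a quick illustration of how data de-duplication shrinks repetitive data such as backups, here is a minimal content-hash sketch: split the data into fixed-size chunks, store each unique chunk once keyed by its SHA-256 digest, and keep a per-file list of chunk references. Real systems add variable-size chunking, compression and persistent storage, but the idea is the same.

```python
# Minimal fixed-size-chunk de-duplication sketch.
import hashlib

CHUNK_SIZE = 4096
chunk_store = {}          # digest -> chunk bytes, stored once

def store_file(data: bytes):
    """Return the list of chunk digests that reconstructs `data`."""
    refs = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)   # a duplicate chunk costs nothing extra
        refs.append(digest)
    return refs

def restore_file(refs):
    return b"".join(chunk_store[d] for d in refs)

backup_a = store_file(b"A" * 10000 + b"unique tail 1")
backup_b = store_file(b"A" * 10000 + b"unique tail 2")   # shares most chunks with backup_a
print(len(chunk_store), "unique chunks stored for two nearly identical backups")
```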


An Ideal Data Storage Solution

For many SMBs the data storage and backup landscape seems overwhelming. Navigating the world of cloud storage providers, as well as the wide range of on-premise data solutions still available, is nearly impossible if you’re not 100 percent sure what your data needs are. The bottom line is that small businesses everywhere need to start by looking at the bigger picture.

This means it’s time to get serious about your data management strategy. This also entails mapping out your current setup, and it often means being honest about what’s not working in your current data strategy. From there you’ll be better poised to start considering what the best storage option is for you.

Essential Cloud Storage Features

While we’ve mentioned all sorts of data storage options, most SMBs are gravitating toward the cloud. This is largely due to the fact that the cloud is more affordable than high-speed on-premise data storage solutions. Unfortunately, not all cloud providers are created equal. Some may not meet your needs, or they may be completely out of your price range. Let’s look at some essential cloud storage features that will help move your new data strategy forward.

  • Storage efficiency: This seems like a no-brainer since this is at the core of every cloud-based data storage business. But some storage systems simply aren’t efficient. There are lags in performance. Servers go down regularly. Make sure you get what you are paying for in stability, performance and bandwidth.
  • Responsive design: The cloud storage dashboard should scale seamlessly between laptops, desktops and mobile devices. Today’s small businesses are always on the go, and after all, one of the main advantages of cloud storage is that you can access your data from virtually anywhere in the world.
  • Easy-to-use Dashboard: For many cloud storage and backup services, the service is only as good as its dashboard. Is it easy to navigate? Can you chat with other administrators in a group chat session? Is the dashboard secure? Can you understand the basic functions of the dashboard in a quick glance? These are all important questions to answer.

Cloud Questions

As you continue your search for the right cloud provider, get a feel for how reputable the cloud storage provider is and how easy they are to work with. Here are three essential questions you should ask every service provider:

  1. How can we download/upload files and folders?
  2. Is my data secure?
  3. Is your service easy to implement with my existing data?

By Walter Bailey

Australia Follows US, UK Lead In Embracing The Cloud Nationally

The Land Down Under has finally prioritized the use of cloud computing for public bureaucracy, albeit with less commitment than the dedicated approach of the two economic powers on either side of the Atlantic. The Aussie approach to the cloud is one of gradual adoption, where necessary, whereas that of Britain and the United States is a ‘do or die’ unilateral approach. It was only last month that the United Kingdom joined up all IT departments in public offices through a single, mandatory cloud infrastructure. Now, Australia is using its National Broadband Network (NBN) to reach out to public institutions and maximize the storage, efficiency and cost attributes of cloud computing.


A Little Low-Key

Though the state has shown its willingness to aim high by acknowledging the power of technology nationally, this is nevertheless not an all-out run into the cloud. Rather, the government has specified that a large chunk of the $5b annual Information Technology budget will branch off to the cloud segment of the sector. Furthermore, the state advised that while it is not mandatory for all public offices to adopt the technology, particular departments should upgrade to the cloud where necessary.

Though jittery about the government’s lack of holistic immersion, analysts are lauding the move, calling it ‘smart,’ especially from an economic point of view. This is the initial step for an industry that the country will surely embrace fully in coming years. Observers are also tempering the hardline stance of critics who call for more government commitment in the sector, saying that even if the state declines to follow the British lead, it will still reap rewards.

They also call this a prudent move because the Land Down Under is not under the same financial constraints as its more illustrious counterparts, the US and UK. This explains why the administration did not regard migration to the Infrastructure as a Service (IaaS) niche as so critical. Indeed, most developing countries, by virtue of their economic dilemmas, have no choice but to use cheap compute services to save costs. Thus, it is no surprise that a well-off economy like Australia’s should bide its time, wait until its cloud industry has reached maturity, and then act. Who knows, the economy may be better off then than it is now.


There are some in Australia, and abroad, who beg to differ on cloud computing playing second fiddle to IT in general. They caution that the sector will be impossible to do without in the future, and thus the state should have produced a strong blueprint with deadlines to give this sector staying power. As a first move, they argue, there should have been a plan of gradual adoption of the technology, rather than the sketchy, purely financial tack of the current approach.

Others point out that offshore cloud providers stand to take advantage of this in the absence of a dynamic national program. They say that had the administration decided to strengthen its sector, it could have attracted open source app developers and cloud players from abroad who could have helped develop a unique framework within the country’s borders.

There is a light at the end of the tunnel, however, for Aussie companies. Firms are touting that this move tilts the table in their favor, in that they can wholly immerse themselves in the international market, which they say is where the future of cloud computing lies. Furthermore, they feel they can be more competitive when tackling rivals all over the World Wide Web.

By John Omwamba

(Image Source: Shutterstock)
