Mainframes -> PCs -> Cloud Computing?

“I think there is a world market for maybe five computers.”
- Thomas J. Watson (1874-1956), former president of IBM.

“640 KB is more memory than anyone will ever need.”
- Bill Gates, co-founder of Microsoft.

Although there is no documentary evidence that either quote actually originated with these two stalwarts of computing, the fact that they are so widely attributed to them in popular literature reflects the prevailing thinking at different points in computing history.

So, what do these two quotes tell us? Only that expectations in the computing industry change considerably over the decades. Watson lived in the era when computers first came into existence, while Gates saw their transition from humongous mainframes to tabletop PCs. Now, some analysts opine, we are at the next inflection point in the history of computing: the ideal time for the transition from PCs (or laptops) to cloud computing.

Senior Vice President and Chief Analyst Frank Gens of IT industry analyst firm IDC is a strong believer in this notion, and presented his views in the keynote address at the 46th annual IDC Directions conference in San Jose, California, last week. “In 1986, mainframes and terminals were the standard. Coming up was a new class of end-user device, and new types of networks and computing platforms driven by the PC radically expanded the users – and uses – of IT,” he said.


“IT companies looked at what was happening, made some strategic decisions and chose a direction. As you can imagine, some of them gauged what was happening correctly, and some did not. Now, 25 years later, we’re again at a crossroads, and taking the correct path is as crucial now as it was then,” Gens added.

(Image source: TIME magazine, 1982)

Let’s reflect on his comments from a historical perspective. The first computers were developed for defense applications, gradually finding their way into industrial use. Their sheer size – most of them occupied entire rooms and weighed tons – precluded their use in a personal setting.

Then came the age of miniaturization, with each month bringing news of more transistors being crammed onto a single integrated circuit. This was the time when Moore’s law really became part of geek speak – “The number of transistors on an integrated circuit doubles every 18 months.” The result was the rise of the personal computer.
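
To see why that rule of thumb mattered so much, here is a minimal sketch of the compounding it implies. The starting count (roughly that of an early-1970s microprocessor) and the 18-month doubling period are illustrative assumptions taken from the popular phrasing quoted above, not a claim about any particular chip roadmap.

    # A rough sketch of the 18-month doubling rule of thumb quoted above.
    # The starting count (~2,300 transistors, about an early-1970s CPU) is
    # an illustrative assumption, not a statement about any specific chip.
    def projected_transistors(years_elapsed, start_count=2300, doubling_period=1.5):
        """Projected transistor count after a number of years of steady doubling."""
        return start_count * 2 ** (years_elapsed / doubling_period)

    for years in (0, 6, 12, 18, 24):
        print(f"After {years:2d} years: ~{projected_transistors(years):,.0f} transistors")

Starting from a few thousand transistors, that curve reaches into the hundreds of millions within a couple of decades – roughly the leap from the earliest microprocessors to the chips that powered the PC boom.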

What started out as a curiosity for the well-heeled geek has today become ubiquitous. Although TIME had declared the computer its “Machine of the Year” way back in 1982, it is truly in the last decade that the PC, and its smaller cousin the notebook/laptop, have become an integral part of our lives.

However, the PC suffers from one drawback: if we need to work with a specific application, we have to purchase it and load it onto the machine. Even if we need it only intermittently, we are stuck with it forever, unless we choose to uninstall and reinstall it repeatedly. Moreover, our ability to process data is limited by what our machines can handle, and the price tag may keep the really expensive applications out of our reach altogether.

With cloud computing, all of these problems are resolved. We access an application over the network, without having to burden our machines with the software. Our ability to process data is no longer constrained by the capacities of our machines, and we pay for only what we need and use. This lets us access expensive applications for short periods without having to put down huge sums for outright purchases. Our personal machines can be smaller and cheaper, yet provide more functionality by being connected to the cloud.
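
To make the pay-for-what-you-use argument concrete, here is a minimal sketch comparing an outright software purchase with renting the same application in the cloud. The license price and hourly rate are hypothetical figures chosen purely for illustration, not real vendor pricing.

    # Hypothetical figures to illustrate pay-per-use vs. outright purchase;
    # neither number reflects any real vendor's pricing.
    LICENSE_PRICE = 3000.00   # one-time cost of buying the application
    HOURLY_RATE = 2.50        # cost per hour of using it in the cloud

    def cheaper_option(hours_per_month, months):
        """Compare buying the application outright with paying per hour of use."""
        pay_per_use_total = hours_per_month * months * HOURLY_RATE
        if pay_per_use_total < LICENSE_PRICE:
            return f"cloud (${pay_per_use_total:,.0f} vs ${LICENSE_PRICE:,.0f})"
        return f"purchase (${LICENSE_PRICE:,.0f} vs ${pay_per_use_total:,.0f})"

    # An occasional user (10 hours a month) vs. a heavy user (160 hours a month), over one year.
    print("Occasional user:", cheaper_option(10, 12))    # the cloud wins
    print("Heavy user:", cheaper_option(160, 12))        # outright purchase wins

The break-even point shifts with usage, which is exactly why intermittent or occasional needs are where the cloud’s pricing model looks most attractive.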

It is clear that there is a logical progression from mainframes to PCs to cloud computing, with enhanced user experience, lower costs and smaller equipment all part of the equation. Frank Gens’ belief, therefore, is not illogical. The computing community truly is at a crossroads here; the road taken may very well determine the advance of human civilization in the years to come.

By Sourya Biswas

Sourya Biswas is a former risk analyst who has worked with several financial organizations of international repute, besides being a freelance journalist with several articles published online. After six years of work, he decided to pursue further studies at the University of Notre Dame, where he completed his MBA. He holds a Bachelor’s in Engineering from the Indian Institute of Information Technology. He is also a member of the high-IQ organizations Mensa and Triple Nine Society and has been a prolific writer for CloudTweaks over the years... http://www.cloudtweaks.com/author/sourya/

Comments

  1. SteveW says

    As an old-timer in IT (30+ years), I remember sitting in an auditorium in the early 80s watching an IBM Research fellow’s presentation on “elastic VMs”. That term was used to describe virtual machines running on a bunch of IBM mainframes (running IBM’s Virtual Machine operating system) tied together in a shared system image that shared memory and disks. It’s an oversimplification, but I see “cloud computing” as the latest instantiation of the class “mainframe”. While it’s true that technologies have evolved and the “dumb” terminals have gotten a lot smarter, the concepts around virtualization and cloud computing have been around for a long time. Had software patents been in vogue in the early 80s, IBM could well have held the patents on most of what falls under the cloud computing umbrella today.

  2. says

    There is one basic flaw in the article. It assumes that “applications” work in isolation. This was exactly the reason why many people produced ROIs that showed that PCs were cheaper than Mainframes. For individuals, this may be the case, but for enterprises, it certainly is NOT. Data is the most valuable asset of any company; hence the current focus on BI etc. Integration of the various data sources is key for companies. Ask ANY company how much work is done each day to extract, combine, filter and clean data and you will understand.
    We were first introduced to this problem when people started buying applications that “came” with a distributed server. IT was soon asked to take care of the server AND worked frantically to integrate the “new” data with existing (legacy) systems. For one reason only: companies need one version of the truth. If anybody just goes out and acquires the application one needs in the Cloud, who is going to add the real value to the company (i.e., who is going to make sure we keep working with one version of the truth)? Many large companies who still own a Mainframe have recognized this, and this is one of the reasons why many of these companies are looking at using their Mainframe as an important component of their Cloud Architecture and Vision. Let’s learn from the past so we are not caught by surprise again… More on Mainframes & Cloud? http://bit.ly/i1gMaI
