

Intel Technology Journal 2012: The Past, The Present, and The Future of Cloud Computing


I am happy to announce that a special cloud computing issue of Intel's Technology Journal has been published. The Intel Technology Journal is a peer-reviewed technical journal published by Intel that highlights the development of different technologies on a quarterly basis. I was given the honor of writing both the foreword and a paper, "The Past, The Present, and The Future of Cloud Computing," which is included in its entirety below.

For a complete copy of the journal, please visit the Intel Press website. 

In this issue of the Intel Technology Journal we explore some of the technologies, trends, and opportunities, as well as the challenges, facing this exciting transition in our industry. We’ve assembled an experienced team of authors and contributors at the forefront of cloud computing who will act as your guide through this new world we call “the cloud.”

The year 2012 marks an important milestone in the emergence of cloud computing, including significant industry collaboration. We’ve seen a remarkable transformation in how we interact with Internet technologies on an individual basis and collectively as an industry. From new industry alliances to open application stacks, never before have we witnessed such a rapid transformation in how we work and interact. Often described as a revolution, cloud computing is an important transition, a paradigm shift in IT delivery. It refocuses how we view IT while creating new opportunities and challenges. Cloud computing has the potential to transform the design, development, and deployment of next-generation technologies.

The Past, The Present, and The Future of Cloud Computing

Introduction

To say cloud computing has entered the collective consciousness of the IT world would be putting it mildly. Over the last few years we’ve seen cloud computing emerge at the heart of a radical shift in the way we consume, deploy, and utilize computing technology within our digital lives. In this article I will explore the roots of the trend over several decades from desktop to mobile, to federated markets, as well as consider its future.

A Brief History

Cloud computing has been referred to as revolutionary, even magical. Like most trends in IT, cloud computing is a combination of a number of underlying trends that have long been in the works, a kind of evolutionary blend of our previous successes and failures. A key driver of adoption has been the term “the cloud” itself. In essence, “the cloud” is a metaphor for the Internet as an operational environment, one where applications are used over the Internet rather than through more traditional means such as a desktop. No longer are users bound by the limitations of a single computing device; instead they are free to move across a multitude of devices and platforms with full mobility (both social and physical).

To understand this trend we must follow its roots, which go back as far as the 1960s to Douglas F. Parkhill, who first envisioned the coming shift. In his 1966 book The Challenge of the Computer Utility[1], Parkhill, a Canadian electrical engineer, predicted that the computer industry would come to resemble a public utility “in which many remotely located users are connected via communication links to a central computing facility.” A primary tenet of today’s cloud platforms, Parkhill’s “computing utility” vision spoke directly to the shift we see taking place today.

For many years, Parkhill’s computing utility concept remained unrealized, in part because of the immaturity of the underlying networks and technologies themselves.

In the 1980s and 1990s, with the advent of the Internet and later its massive adoption, we began to see the emergence of ISPs and then ASPs, which formed the first hosted application services. These early service providers made it possible not only to host end-user applications that before this point were limited to the realm of desktops and single-user servers, but also to free enterprises from managing IT operations not essential to their core businesses. Now applications could be provided as a service over the Internet and managed by a third party. Just as outsourced services had quickly become part of many enterprise strategies, companies were able to outsource their IT in the same way.

Thirty years after Parkhill’s book, in the mid-1990s, another trend began to emerge, one where service providers realized that hosting single-tenant applications (applications hosted on a single server or computer) was not an efficient use of computing resources. The emergence of virtualization, the virtual representation of a computing resource, changed all this. With virtualization, computing resources become transient and adaptive, with the ability to adjust to the demands of the macro environment in near real time. A perfect storm was brewing, with the Internet playing a central role.

Virtualization was the evolutionary missing link, one that gave computational resources newfound manageability and efficiency. For the first time, “virtual machines” could scale not only vertically (more resources added to a machine as needed) but also horizontally, whereby clones of application components could be replicated at will. This newfound freedom opened a world of possibilities. Freed from the constraints of the client/server models of the past, a new breed of service providers rose to take advantage of this flexibility.

The Present State of Cloud Computing

What does cloud computing look like today? Service providers have realized that access to computing capacity is the great equalizer. For the first time, large-scale computing has been democratized. What previously had been limited to only the big players is now open to anyone, anywhere. The only limitation is your imagination.

As important as access to computing capacity is the data itself, and more importantly the information contained within. To understand this, consider what the personal computer (PC) has done to information. What the PC revolution started and the Internet supercharged is information creation. It could be said that more information is now created in a few months than was created in the hundreds of years before the Internet existed.

A recent article in The Economist[2] points out “that the world contains an unimaginably vast amount of digital information which is getting ever vaster ever more rapidly. This makes it possible to do many things that previously could not be done: spot business trends, prevent diseases, combat crime and so on. Managed well, the data can be used to unlock new sources of economic value, provide fresh insights into science and hold governments to account.”

For the most part, the majority of the information humankind has created has not been accessible. Most of this raw data has been sitting in various silos, be it a library, a single desktop, a server, a database, or even a data center. The most successful companies of the last decade have discovered how to tap into this raw data. They are better at analyzing, mining, and using this mountain of data, turning a useless raw resource into something much more valuable: information.

Whether big or small, data has become the oil powering the information age.

The aforementioned Economist article also puts the idea of data as power into perspective. “When the Sloan Digital Sky Survey started work in 2000, its telescope in New Mexico collected more data in its first few weeks than had been amassed in the entire history of astronomy. Now, a decade later, its archive contains a whopping 140 terabytes of information. A successor, the Large Synoptic Survey Telescope, due to come on stream in Chile in 2016, will acquire that quantity of data every five days.”

Cloud computing has become an established approach to the management and deployment of applications within a large and growing number of businesses. The reality is that most new software is developed with the cloud as its central architectural tenet. By 2015, over 2.5 billion people with more than 10 billion devices will access the Internet[3].

Gazing into the Future

Looking into the immediate future, an ever-increasing variety of choices is beginning to shape a market no longer made up of a few select providers, but instead a globally expanding selection of regional and industry-specific clouds, each built for the particular requirements of a business or vertical. Furthering this trend is a burgeoning array of open source software stacks, powered in part by a multitude of cloud-connected devices that will eventually connect anything and everything.

An ongoing transition to the cloud is fueling a massive cloud infrastructure build, projected to be worth some 82.9 billion US dollars (USD) by the year 2016[4]. This new market will feature thousands of vertically focused clouds of all shapes and sizes, ranging from business, infrastructure and developer offerings, to consumer, mobile and gaming services.

Over the next few years a key opportunity within the cloud industry will be the creation of federated cloud ecosystems. These marketplaces will be defined by interoperability among multiple competing cloud computing providers and platforms using agreed-upon standards and application interfaces. One of the key attributes that will help facilitate this shift is the concept of “Global Web Scale” whereby distributed components (services, nodes, applications, users, virtualized computers) come together to form a massive global environment. These federated clouds may consist of hundreds of thousands of computing nodes from a variety of distributed resources both internal and external, and federating them together will lead to a massive global scale. This scale will require new technologies and techniques.

The management of application content and components will be an important part of this change. Driven in part by ever-changing resource demand patterns and a lack of cooperation among end users’ applications, particular sets of resources will get swamped by excessive workloads, significantly undermining the overall utility delivered by the system. The ability to manage application content will become a critical part of future cloud management. How application components adapt to a constantly fluctuating environment will be an important aspect of future cloud deployments and architectures. Improved dynamic capacity allocation methods and components will enable applications to leave and join the system at will.

Finding commonality and standards within measurement, security, and power will help manage and define the growing cloud computing space. Transparency among providers and platforms will also be a significant aspect within globally federated environments. The ability to audit and define trust within a variety of clouds using common security procedures will become paramount. The ability to not only trust but verify the integrity of the entire stack is essential.

Establishing a common unit of measure, akin to the kilowatt for electricity, will be an important factor: the ability to understand in real time how many aggregate resources are required to get something done, and moreover how much those resources will cost, at the smallest possible increments. The correlation of economics and the measurement of resources will be a significant forthcoming trend. The ability to adapt workloads based on the amount of power consumed or required will become an even more important part of the equation as the cloud grows larger and more global in its deployments and consumes greater amounts of energy.

In the coming years anything that can either collect or display information will. This will open a world of opportunities to connect anything and everything to the cloud. How this “cloud of things” is managed and maintained will be an increasingly important aspect of both personal and professional computing environments. The line between a personal computer and personal computing will quickly become blurred as everyday things start replacing the more traditional ways we interact with computing technology.

While some may see globalization as the key, the true opportunities will be found on a local basis. Localization will also drive an important part of the landscape for cloud services in the future, as more and more emphasis is placed on emerging markets for growth across varied business sectors. Look to Sub-Saharan Africa and Asian markets such as China and India to lead the way in the use and adoption of cloud technology. These markets are uniquely positioned in that they have green-field opportunities with little legacy infrastructure in place. This lack of legacy infrastructure creates the ideal environment to skip the technologies of the past and move directly to the more efficient cloud-centric ones of the future, a technological advantage especially within high-growth markets.

In Summary

Over the last forty years we have witnessed change at an ever-increasing pace, from Parkhill’s concept of “connecting remotely located users” to complex horizontally and vertically scaled virtual machines. With the supercharged information creation taking place, this pace will only continue to accelerate as we enter the next stage of our connected future and federated cloud systems. Those who embrace the cloud will flourish, enabling the next generation of technology.

Author Biography

Reuven is an early innovator in the cloud computing space as the founder of Toronto-based Enomaly in 2004 (acquired by Virtustream in 2012). Enomaly was among the first to develop a self-service infrastructure as a service (IaaS) platform (ECP), circa 2005. He also launched SpotCloud (2011), the first commodity-style cloud computing spot market.

Apart from his current role as Senior Vice President at Virtustream, Reuven writes “The Digital Provocateur” column for Forbes Magazine and is the co-founder of CloudCamp, an unconference held in more than 100 cities around the globe where early adopters of cloud computing technologies exchange ideas; it is the largest of the “barcamp” style of events. He is also the co-host of the DigitalNibbles Podcast, sponsored by Intel.

You can learn more about Reuven at his website http://ruv.net or on Twitter at www.twitter.com/ruv

  • [1] Parkhill, Douglas F. 1966. The Challenge of the Computer Utility. Addison-Wesley Publishing Company, ASIN: B000O121OS
  • [2] The Economist, February 25, 2010, special report from the print edition, http://www.economist.com/node/15557443?story_id=15557443
  • [3] IDC, 2011, Intel’s Vision of the Ongoing Shift to Cloud Computing. Sources: IDC “Server Workloads Forecast,” 2009; and IDC “The Internet Reaches Late Adolescence,” Dec 2009
  • [4] Visiongain, November 7, 2011. Cloud 2011: Moving into the Realm of an Essential IT Strategy, http://www.visiongain.com/Report/643/Cloud-2011-Moving-into-the-Realm-of-an-Essential-IT-Strategy
©INTEL CORPORATION