
Virtualization: 6 Reasons To Finish What You Started

NetApp

Editor’s note: In this, the first of Peter H. Salus’s articles, he explains virtualization with an unusual metaphor, puts it in historical context, and suggests some often-overlooked benefits.


Children play games. They pretend they have a household, a ray-gun, a racing speedboat, a spaceship.  Fantasy sports leagues thrive: Hardly anyone owns a sports team, but many can simulate ownership.

Virtualization is similar: each user acts as though the hardware and software are at their fingertips, but it’s an illusion.


For the enterprise, virtualization can bring many advantages. In no particular order, here are six:

  1. Virtualization can help reduce capital expense through server consolidation: Each user appears to have their own machine.
  2. Management in a virtual structure is more efficient: Fewer machines require updating, and fewer copies of software need installing.
  3. Virtualization can also reduce risk by reducing the number of physical vulnerabilities, as only the source, not the virtual instance, can be attacked.
  4. Enterprises can become more energy-efficient through reduced hardware use, as a few fully utilized servers require less power than many under-utilized ones.
  5. Server consolidation makes policy compliance easier, as installations can be easily created from template images.
  6. A large number of enterprise apps—including Exchange, SAP, SharePoint, and SQL Server—can be virtualized, yet still provide high performance.

Decades ago, time-sharing and networking enabled users to employ the resources of “Big Iron.” Today the enterprise can be similarly enabled by employing virtual machines, virtual networks, virtual storage, and virtual software.

Over the past 40 years the impulse has been to have fewer pieces of hardware, fewer cables, and less software, while giving the user and the enterprise the illusion of having more at their disposal.

A Mercifully-Brief History Lesson

However, people miss many of the more subtle benefits of virtualization. As a technology historian, I also find that people think virtualization is somehow a recent thing.

A few years ago Margaret Rouse wrote:

Operating system virtualization is the use of software to allow a piece of hardware to run multiple operating system images at the same time. The technology got its start on mainframes decades ago, allowing administrators to avoid wasting expensive processing power.

In 2005, virtualization software was adopted faster than anyone imagined, including the experts. There are three areas of IT where virtualization is making inroads: network virtualization, storage virtualization, and server virtualization.

This is a typical viewpoint. There’s nothing wrong with it, as such, but virtualization theory and process go back as far as the 1960s.

It helps to fully understand the source and the paths taken, so that you can reap the benefits of virtualization.

The basis of everything was time-sharing, beginning with Fernando Corbató's Compatible Time-Sharing System (CTSS) at MIT in 1961. CTSS gave rise to several other systems and was soon employed by the manufacturers.

In 1967, IBM's CP-40 provided the control program for CP/CMS. CP/CMS was distributed for the next five years as what we would today call open source, and was re-implemented in 1972 as IBM's VM family of operating systems. Each CP/CMS user employed what appeared to be their own stand-alone computer. And every virtual machine had the complete capabilities of the underlying machine.

As early as 1969, ACM held a workshop on virtual machines and one of the papers there dealt with virtualizing a DEC PDP-10.

Virtualization Lite?

As well as “full fat” virtualization, software can also make one platform appear to be another, either to the user, to software applications, or to both. This is often known as a virtual interface. Instead of carrying the overhead of full virtualization, it’s a simpler system of layers that pretends the platform is something it’s not.
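
To make the layering concrete, here is a minimal, hypothetical sketch in Python (an illustration of the idea, not any product mentioned here): a thin class presents a UNIX-style file interface to its caller while quietly translating to whatever platform actually sits underneath.

    # A minimal sketch of a "virtual interface": callers see one UNIX-style
    # API, and a thin layer translates to the real platform underneath.
    # All names here are illustrative, not drawn from any real product.
    import os
    import ntpath
    import posixpath

    class VirtualFS:
        """Presents UNIX-style paths, whatever the host OS actually uses."""

        def __init__(self):
            # Pick the translation layer for the platform we are really on.
            self._backend = ntpath if os.name == "nt" else posixpath

        def join(self, *parts):
            # Callers always write UNIX-style paths...
            native = self._backend.join(*parts)
            # ...and always get UNIX-style paths back, even on Windows.
            return native.replace("\\", "/")

        def listdir(self, path="."):
            # The same call works, unchanged, on either platform.
            return sorted(os.listdir(path))

    fs = VirtualFS()
    print(fs.join("home", "alice", "docs"))  # 'home/alice/docs' on any host
    print(fs.listdir("."))                   # identical call on Windows or UNIX

The caller of VirtualFS never learns, and never needs to care, which operating system really sits underneath; that indistinguishability is the whole point of such a layer.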

In the mid-1970s, at Lawrence Berkeley Labs in California, Hall, Scherrer, and Sventek began the work they published as “A Virtual Operating System” (CACM, September 1980). The paper pointed out that “considerable time and effort” were always involved in “moving both software and people to a new computing environment.” The authors went on to say that many of those problems vanish if a virtual interface is used, because users cannot distinguish between the machine and the interface they are using.

For example, nearly 20 years ago Interix, a product of Softway Systems, offered software that enabled UNIX tools on desktops running Microsoft systems (its slogan was “UNIX system applications on Windows NT”). The company was acquired by Microsoft, and the software became the foundation for Services for UNIX (SFU). This soon expanded to become Microsoft's Subsystem for UNIX-based Applications (SUA).

A virtual interface, as Hall, Scherrer, and Sventek pointed out over 30 years ago, means that users need less retraining: their interfaces remain the same even as the underlying software and machinery change.

So Where Are You?

Today, as we've seen, the enterprise can be enabled through virtual machines, virtual networks, virtual storage, and virtual software. You may not be living and working on a “holodeck,” but your business can employ facilities you don't own: you save by virtually using the hardware, the storage, and the software your staff work with, rather than purchasing them.

In fact, most shops are virtually there. They focused on virtualizing the quick wins in the first few years, but many stopped right before the finish line.

Does that describe your organization? Is it time to finish what you started, to get the full benefit?

Peter H. Salus is an historian of science and technology, among whose works are A Quarter Century of UNIX (1994) and The Daemon, the Gnu, and the Penguin (2008).