
The Virtues of Virtualization

Virtualization technologies have graduated to the big time, but it didn’t happen overnight. Early experiments with virtualization date back to the 1960s, yet only in the past decade has this cost-saving technology gained wide acceptance.

The economics help explain why. A 2006 IDC analysis projected that by 2009 companies would spend more to power and cool their servers than they spent on the servers themselves. And a recent Goldman Sachs survey of corporate technology users found that 45% of respondents expect to virtualize more than 30% of their servers, up from only 7% today.

At the heart of virtualization is the hypervisor, software that provides a virtual machine environment and sits between the hardware and the virtual machines it hosts. A virtual machine is, in essence, data: its operating system and application files are stored on a physical server or on a remote storage device. The result is that the virtual machine is portable, which translates into a strategic advantage in adverse situations such as hardware failures.
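To make the "a virtual machine is just data" idea concrete, here is a minimal sketch in Python. The class and method names are hypothetical illustrations, not any real hypervisor's API; actual hypervisors such as KVM or ESXi manage far more state than this.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    """A VM reduced to what it is on disk: a bundle of data."""
    name: str
    disk_image: str   # path to the virtual disk file
    memory_mb: int

@dataclass
class Host:
    """A physical server running a hypervisor."""
    hostname: str
    vms: list = field(default_factory=list)

    def start(self, vm: VirtualMachine) -> None:
        self.vms.append(vm)

    def migrate(self, vm: VirtualMachine, target: "Host") -> None:
        # Because the VM is just data, "moving" it means handing its
        # disk image and configuration to another host and starting it
        # there, independent of the underlying physical hardware.
        self.vms.remove(vm)
        target.start(vm)

web = VirtualMachine("web01", "/vms/web01.qcow2", memory_mb=2048)
a, b = Host("rack1-srv1"), Host("rack2-srv7")
a.start(web)
a.migrate(web, b)      # portability: the VM survives trouble on host a
print(b.vms[0].name)   # -> web01
```

The point of the sketch is the `migrate` step: nothing about the virtual machine is tied to the first host, so relocating it is a data-copy operation rather than a hardware project.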

Virtualization technologies have come a long way, says James Geis, director of integrated solutions development at Boston IT consulting firm Forsythe Solutions. Evaluating capacity was once difficult, he notes, but improved capacity management tools have simplified the task, and virtualization has become a mainstream means of resource planning.

Geis also notes that, while adoption of virtualization solutions has become widespread, not all servers and applications are meant to be virtualized. The decision of when, where, and how to virtualize an application, he says, should be based on performance metrics. “There are cases where processing, memory, storage, and network requirements dictate a solely dedicated server.”

However, the value of virtualization as an enduring strategy for continued growth is enormous. Geis outlines the following benefits:

Capacity optimization. Virtualization places capacity planning and optimization at the forefront of data center management. Properly implemented, it produces the maximum return on investment per server dollar.

Rapid server provisioning. Speed and accuracy are essential in a fast-moving business environment. Using a server template, new virtual servers can be created with little effort. Geis says provisioning a new server takes minutes or even seconds, rather than the days or weeks required to procure a new box and install an operating system and software.

Server portability. Virtual servers and the applications they support can be easily moved or copied to other hardware, independent of physical location or processor type. This feature alone provides considerable flexibility for hosting servers and applications on any combination of physical hardware.

Reduced hardware, facilities, and HR expenses. Fewer server boxes cost less, take up less floor space, require less electricity and air conditioning, and require less maintenance, thus reducing costs related to hardware procurement, real estate, utilities, and human resources.
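The template-based provisioning Geis describes can be sketched in a few lines of Python. The template fields and the `provision` helper below are hypothetical, chosen only to illustrate why cloning a pre-built definition is so much faster than building a server from scratch.

```python
import copy
import itertools

# A server template: a pre-built OS image plus default settings.
# (Illustrative values, not tied to any vendor's format.)
TEMPLATE = {
    "os_image": "ubuntu-22.04-base.qcow2",
    "cpus": 2,
    "memory_mb": 4096,
    "packages": ["nginx", "openssh-server"],
}

_serial = itertools.count(1)  # simple unique-name counter

def provision(name_prefix: str = "app", **overrides) -> dict:
    """Clone the template into a new virtual server definition.

    Copying a template takes seconds, versus the days or weeks
    needed to procure hardware and install an OS by hand.
    """
    server = copy.deepcopy(TEMPLATE)
    server.update(overrides)  # per-server tweaks, e.g. more memory
    server["name"] = f"{name_prefix}{next(_serial):02d}"
    return server

db = provision("db", memory_mb=8192)
print(db["name"], db["memory_mb"])   # -> db01 8192
```

Because each new server is a copy of known-good data, the process is not only fast but repeatable: every clone starts from the same tested baseline, which is where the "accuracy" half of the benefit comes from.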

Larry Honarvar, vice president, consulting services, at CGI, a Montreal-based IT and systems integration consulting company, employs virtualization technologies in the following areas: managed services, software development and maintenance, and hosting solutions.

For software development, virtualization better leverages hardware and software investments, Honarvar says. This works well given that customers are often scattered around the globe, working in different time zones. “Virtualization makes better use of our infrastructure investments because it allows us to test different development and testing environments. It lets us control costs and redirect funding into product maintenance and enhancement,” he explains.

In hosting solutions, CGI employs virtualization solutions to maximize services and, at the same time, contain costs. Honarvar stresses that a compelling selling point for clients is that virtualization offers transparency. “They see the benefit of being able to have more environments pre-configured and quickly available to map their needs.”

The virtualization solutions marketplace gets bigger every year. Many companies are turning out half a dozen virtualization solutions a year. Here are two examples:

Toronto-based company Asigra has developed a line of backup and recovery services. Its Multi-Tiered Storage Billing System is designed to save the time and expense of developing or modifying an existing billing system, which the company says could run to thousands of dollars. Its features include “agentless simplicity” (software is installed on only one node, whether the customer has one PC or hundreds); advanced security features (authentication, encryption, and non-escrowed keys); and autonomic healing (managed backup and restore services for customers).

Ottawa-headquartered Mitel has introduced a number of communication tools for small and medium-size businesses, offering reporting and SIP (Session Initiation Protocol) capabilities. Mitel is aggressively promoting its Business Dashboard, which allows companies to track call activity on an internal IP network with both historical and real-time reporting. It collects trend data on call volumes, call times, and trunk usage. Its neatest feature is tracking the path of a single call through internal systems and departments, which makes for accurate management of calls.

And that’s just a brief sampling of the virtualization technologies on the market. Look for aggressive new startup companies from all over the globe to jump into this application-rich, expanding niche.

Bob Weinstein is a science and technology writer with Troy Media Corporation.