Planning for virtualization? Beware of server overload

Double-digit physical-to-virtual server ratios are things of the past

As virtualization stretches deeper into the enterprise to include mission-critical and resource-intensive applications, IT executives are learning that double-digit physical-to-virtual server ratios are things of the past.

Virtualization vendors may still be touting the potential of putting 20, 50 or even 100 virtual machines (VMs) on a single physical machine. But IT managers and industry experts say those ratios are dangerous in production environments and can cause performance problems or, worse, outages.

"In test and development environments, companies could put upwards of 50 virtual machines on a single physical host. But when it comes to mission-critical and resource-intensive applications, that number tends to plummet to less than 15," says Andi Mann, vice president of research at Enterprise Management Associates Inc. in Boulder, Colo.

In a 2009 study of 153 organizations with more than 500 end users, EMA found that, on average, enterprises were achieving 6:1 consolidation rates for applications such as ERP, CRM, e-mail and databases.

The variance between reality and expectations, whether it's due to vendor hype or internal ROI issues, could spell trouble for IT teams. That's because the consolidation rate affects just about every aspect of a virtualization project -- budget, capacity and executive buy-in. "If you go into these virtualization projects with a false expectation, you're going to get in trouble," Mann says.

Indeed, overestimating physical-to-virtual ratios can result in the need for more server hardware, rack space, cooling capacity and power consumption -- all of which cost money. Worse yet, users could be affected by poorly performing applications. "If a company thinks they're only going to need 10 servers at the end of a virtualization project and they actually need 15, it could have a significant impact on the overall cost of the consolidation and put them in the hole financially. Not a good thing, especially in this economy," says Charles King, president and principal analyst at consultancy Pund-IT Inc. in Hayward, Calif.

Why is there a disconnect between virtualization expectations and reality? King says that up to this point, many companies have focused on virtualizing low-end, low-use, low-I/O applications such as test, development, log, file and print servers. "When it comes to edge-of-network, non-mission-critical applications that don't require high availability, you can stack dozens on a single machine," he says.

Bob Gill, an analyst at TheInfoPro Inc., agrees. "Early on, people were virtualizing systems that had a less-than-5% utilization rate. These were the applications that, if they went down for an hour, no one got upset," he says.

That's not the case when applying virtualization to mission-critical, resource-intensive applications -- and virtualization vendors, on the whole, have been slow to explain this reality to customers, according to some analysts.

Once you consider applications with higher utilization rates, greater security risks, and increased performance and availability demands, consolidation ratios drop off considerably. "These applications will compete for bandwidth, memory, CPU and storage," King says. Even on machines with two quad-core processors, highly transactional applications that have been virtualized will experience network bottlenecks and performance hits as they vie for the same server's pool of resources.
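
It helps to sketch the arithmetic behind that drop-off. The sustainable number of VMs per host is capped by whichever resource -- CPU, memory, disk I/O or network -- saturates first, which is why light workloads stack by the dozen while transactional ones stall in the low double digits. Below is a minimal back-of-the-envelope model in Python; all capacity and demand figures are illustrative assumptions, not measurements from any vendor or company quoted here.

    # Back-of-the-envelope consolidation estimate: the sustainable
    # VM-per-host ratio is capped by whichever resource runs out first.
    # All capacity and demand figures are illustrative assumptions.

    HOST = {"cpu_ghz": 2 * 4 * 2.6,   # two quad-core 2.6 GHz CPUs
            "ram_gb": 96,
            "iops": 15000,
            "net_mbps": 10000}

    def max_vms_per_host(vm_demand, headroom=0.8):
        """VM count at which the first resource saturates, treating
        `headroom` (here 80%) of each resource as the usable ceiling."""
        return min(int(HOST[r] * headroom / vm_demand[r]) for r in vm_demand)

    light_vm = {"cpu_ghz": 0.2, "ram_gb": 1, "iops": 50,   "net_mbps": 20}
    heavy_vm = {"cpu_ghz": 1.5, "ram_gb": 8, "iops": 1200, "net_mbps": 400}

    print(max_vms_per_host(light_vm))  # 76 with these numbers: dozens fit
    print(max_vms_per_host(heavy_vm))  # 9: heavy workloads cap out early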

Here are four tips for avoiding server overload.

1. Start With Capacity Analysis

To combat the problem, IT teams have to rejigger their thinking and dial back everyone's expectations. The best place to start: a capacity analysis, says Kris Jmaeff, information security systems specialist at the Interior Health Authority, a British Columbia government agency.

Four years ago, the data center at Interior Health was growing at a rapid clip. There was a lot of pressure to virtualize the 500-server production environment to support a host of services, including DNS, Active Directory, Web servers, FTP, and many production application and database servers.

Before starting down that path, Jmaeff first used VMware tools to conduct an in-depth capacity analysis that monitored server hardware utilization. (Similar tools are also available from Cirba, Hewlett-Packard, Microsoft, PlateSpin and Vizioncore, among others.) Rather than looking at his hardware environment piece by piece, he instead considered everything as a pool of resources. "Capacity planning should focus on the resources that a server can contribute to the virtual pool," Jmaeff says.
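
In that pooled model, you total the measured demand across every candidate server, then size the host pool per resource dimension and let the scarcest resource set the host count. A minimal sketch of the idea, using made-up utilization figures rather than Interior Health's data:

    # Pool-based capacity planning: aggregate measured demand across all
    # candidate servers, then size the host pool per resource dimension.
    # The workload figures below are illustrative, not real survey data.
    import math

    # (avg CPU GHz used, avg RAM GB used, avg disk IOPS) per physical server
    measured = [(0.4, 2.0, 120), (1.1, 6.0, 900), (0.2, 1.5, 60)] * 80  # 240 servers

    demand = [sum(col) for col in zip(*measured)]   # total pool demand
    host_cap = (2 * 4 * 2.6, 96, 15000)             # one host's capacity
    usable = [c * 0.8 for c in host_cap]            # keep 20% headroom

    hosts_needed = max(math.ceil(d / u) for d, u in zip(demand, usable))
    print(f"{len(measured)} servers need {hosts_needed} hosts "
          f"({len(measured) / hosts_needed:.0f}:1 average ratio)")

Note that the pool is sized by the tightest dimension, not by an average across resources; averaging would undersize the host count whenever one resource (often I/O) is far scarcer than the rest.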

Already, the team has been able to consolidate 250 servers -- 50% of the server farm -- onto 12 physical hosts. And while Jmaeff's overall average data center ratio is 20:1, hosts that hold more-demanding applications either require much lower ratios or require that he balance out resource-intensive applications.

Jmaeff uses a combination of VMware vCenter and IBM Director to monitor each VM for telltale signs of ratio imbalances such as spikes in RAM and CPU usage, or performance degradation. "We've definitely had to bump applications around and adjust our conversion rates according to server resource demand to create a more balanced workload," he says. If necessary, it's easy to clone servers and quickly spread the application load, he adds.
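
The watchdog logic behind that kind of rebalancing is simple threshold-checking. A hypothetical sketch, with sample metrics standing in for what vCenter or Director would actually report:

    # Flag hosts whose VMs show the telltale signs of an overcommitted
    # ratio: sustained CPU or RAM pressure. Thresholds and sample metrics
    # are illustrative assumptions, not vCenter/Director output.

    CPU_LIMIT, RAM_LIMIT = 0.85, 0.90   # sustained-utilization alarm levels

    host_metrics = {
        "esx01": [("erp-db", 0.92, 0.88), ("crm-app", 0.40, 0.35)],
        "esx02": [("file-01", 0.10, 0.22), ("print-01", 0.05, 0.15)],
    }  # host -> [(vm_name, cpu_util, ram_util), ...]

    for host, vms in host_metrics.items():
        hot = [vm for vm, cpu, ram in vms if cpu > CPU_LIMIT or ram > RAM_LIMIT]
        if hot:
            print(f"{host}: rebalance candidates -> {', '.join(hot)}")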

"Because we did our homework with ratios of virtual servers by examining the load on CPU and memory and evaluated physical server workloads, we've been pleasantly surprised with our ratios," Jmaeff says.

2. Monitor Performance Continuously

At Network Data Center Host Inc., a Web service provider in San Clemente, Calif., the IT team quickly learned that when it comes to virtualizing mission-critical applications, you have to consider more than just RAM. "We originally thought, based on available RAM, we could have 40 small customers share a physical server. But we found that with heavier-used applications, it's not the RAM, it's the I/O," says Chief Technology Officer Shaun Retain.

The 40:1 ratio had to be pulled back to no greater than 20:1, he says. To help with that effort, the team developed a control panel that lets customers log in and see how their virtual machines are handling reads, writes, disk space usage and other performance-affecting activity. In addition, NDC Host uses homegrown monitoring tools to ensure that ratios aren't blown by a spike in a single VM's traffic.
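
That lesson reduces to a sizing rule: compute a ceiling per resource and plan to the lowest one, because a RAM-only budget hides the I/O wall. A small illustrative sketch (the capacity figures are assumptions, not NDC Host's actual numbers):

    # Sizing by RAM alone versus RAM plus I/O. Figures are illustrative;
    # they are not NDC Host's actual capacity data.

    host_ram_gb, host_iops = 128, 20000
    vm_ram_gb, vm_iops = 3, 900          # a "small customer" VM under load

    by_ram = host_ram_gb // vm_ram_gb    # 42 VMs -- RAM alone suggests 40:1
    by_io = host_iops // vm_iops         # 22 VMs -- I/O is the real ceiling

    print(f"RAM says {by_ram}:1, I/O says {by_io}:1 -> "
          f"plan for {min(by_ram, by_io)}:1")

With these sample numbers, the RAM budget alone suggests roughly 40:1, while the I/O budget pulls the plan back to about 20:1 -- the same correction NDC Host made in production.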

Sandra Gittlen, Computerworld (US)
