
All Together Now

By adopting a virtualized server approach, agencies can all but eliminate downtime and avoid the problems that arise when crucial systems grind to a halt.

The fact that the National Archives and Records Administration houses a lot of data is rather obvious, but figuring out how to manage it—especially the growing collection of electronic records—is less so.

It's a task that will occupy the thinking of some of the nation's brightest minds, and part of the solution will turn on server virtualization, says Robert Chadduck, director of research for NARA's Electronic Records Archives (ERA) initiative.

NARA awarded a $308 million ERA contract to Lockheed Martin in September to create a distributed database system that will serve as the archival backbone system for the government. When it rolls out in 2007, ERA will pull together a nationally distributed records database—of text records, images, sound files and video—and make it more easily searchable and retrievable.

An obvious approach for ERA is virtualization, Chadduck says. "The root issue is for the government to be able to manage, preserve and support robust, sustained access to distributed collections, wherever they may be in the NARA system," he says. "We want to create an electronic records archive that will serve a networked nation. Without virtualization, you can't do that."

Faster, Better, Cheaper

Virtualization software—tools such as EMC's VMware line and Microsoft's Virtual Server 2005—lets IT managers consolidate databases and other applications by partitioning servers and making full use of their capacity. Beyond consolidation, server virtualization also lets an organization pool its data and run it on any server in its network at any time. By doing so, an agency can load balance traffic and data across its servers and maintain uptime even if some systems fail. All the while, data remains available to users—to whom the virtualization is transparent.
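The consolidation idea can be sketched as a simple bin-packing exercise: place many lightly loaded workloads onto as few physical hosts as possible. The VM loads and capacity figures below are invented for illustration, not measurements from any real deployment.

```python
# Hypothetical sketch of consolidation: pack virtual machine workloads
# onto as few physical hosts as possible (first-fit decreasing bin packing).
# Load values are fractions of one host's capacity, made up for illustration.

def consolidate(vm_loads, host_capacity):
    """Assign each VM's average load to the first host with room."""
    hosts = []  # each entry is the load already placed on that host
    for load in sorted(vm_loads, reverse=True):  # largest first
        for i, used in enumerate(hosts):
            if used + load <= host_capacity:
                hosts[i] = used + load
                break
        else:
            hosts.append(load)  # no host had room; bring one more online

    return hosts

# Ten servers each idling at 15% of one host's capacity can share
# just two physical machines:
print(len(consolidate([0.15] * 10, 1.0)))  # -> 2
```

Real placement tools also weigh memory, I/O and affinity constraints; this sketch captures only the CPU-packing intuition behind "making full use of their capacity."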

So how does an agency decide whether server virtualization makes sense for it? Agency and industry experts say four steps are essential: inventory the existing servers, data and applications; calculate the cost of hardware, software and technical support; determine how many servers virtualization would eliminate; and, finally, tally the cost of operating the remaining servers plus the cost of the virtualization software and its support.
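The arithmetic behind those four steps is straightforward. A minimal sketch, with entirely made-up dollar figures and server counts:

```python
# Sketch of the four-step cost comparison. All figures are invented
# examples, not real agency numbers.

def virtualization_case(current_servers, cost_per_server,
                        servers_after, virt_software_cost, virt_support_cost):
    """Return (current annual cost, projected annual cost, savings)."""
    current = current_servers * cost_per_server
    projected = (servers_after * cost_per_server
                 + virt_software_cost + virt_support_cost)
    return current, projected, current - projected

current, projected, savings = virtualization_case(
    current_servers=40, cost_per_server=8_000,
    servers_after=12, virt_software_cost=60_000, virt_support_cost=20_000)
print(savings)  # -> 144000
```

If the savings come out near zero or negative, the "solution, not an initiative" test below suggests virtualization may not be the right answer for that agency.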

"It's a solution and not an initiative," says Al Gillen, research director for systems software for International Data Corp. of Framingham, Mass. "This is not something where people say, 'We need to implement this because it's cool.' Instead, they say, 'We need to solve a problem.' Virtualization helps users create an intelligent solution to challenges they already have"—multiple applications and data stores that do not make the most efficient use of the servers in place.

Starting Point

If the Answer is Yes, Consider Virtualization
• Is there unused capacity in the current server infrastructure?
• Does the processing workload fluctuate radically, making load balancing tricky?
• Is it a mixed operating system environment?
• Do you have high maintenance costs for server hardware?
• Do your data services require round-the-clock uptime?

"Assess, assess, assess," says Todd Holcomb, a technical consultant with Evolving Solutions of Hamel, Minn.

Holcomb, whose company specializes in providing virtualization services, recommends that an agency run a week's worth of server load statistics—gathered through event monitoring—to find out where the bottlenecks are and what system resources are needed to alleviate them.
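Holcomb's week-of-statistics advice can be sketched as a simple scan over utilization samples. The server names, readings and the 85 percent threshold below are assumptions for illustration only:

```python
# Hedged sketch: scan a week of per-server utilization samples and flag
# the bottlenecks. Sample data and the 85% threshold are invented.

def find_bottlenecks(samples, threshold=0.85):
    """samples: {server_name: [utilization readings over the week]}.
    Returns servers whose peak utilization crossed the threshold."""
    return sorted(name for name, readings in samples.items()
                  if max(readings) >= threshold)

week = {
    "db-01":   [0.40, 0.92, 0.88, 0.35],  # spikes past 85% -> bottleneck
    "web-01":  [0.20, 0.30, 0.25, 0.28],  # plenty of headroom
    "mail-01": [0.60, 0.70, 0.86, 0.50],  # also crosses the line
}
print(find_bottlenecks(week))  # -> ['db-01', 'mail-01']
```

The servers that never cross the threshold are the consolidation candidates; the ones that do point to where extra pooled capacity is needed.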

In addition, an agency must account for the costs of its current hardware base and the human resources necessary to maintain systems and services for its users.

Armed with these statistics, an agency's IT team has a baseline against which to measure the savings and performance enhancements possible from server virtualization.

Holcomb recommends challenging three hardware vendors to come up with the systems they think will do the agency's job, and then have them run a demo. At the ultra-high end, NARA's Chadduck has run tests using "test collections of proxy electronic collections that demonstrate real problems."

Through these tests, an agency can determine the total number of servers it will need to run its applications. By factoring in the reduced cost of maintaining those servers and the new cost of the virtualization software, an agency will begin to get a picture of its possible return on investment, the software consultants note.

But in addition to more efficient use of the existing infrastructure, there are other benefits that might lead to a higher ROI and a lower total cost of ownership.

For instance, "you never have to bring anything down again," Holcomb notes, and that lets an agency reduce what it's spending on maintenance.

"Because more and more of the individual components [in servers] have their own resident 'smarts,' they can be addressed independently," which gives users the "ability to pool all of those various resources that get strained at any given time," Holcomb says.

That means an overtaxed blade server, for instance, can be monitored by another blade, which picks up its mirrored image and takes over if the processing load becomes too heavy or some other problem arises. The result is no downtime when the IT staff pulls the failed unit offline to fix it; applications keep running.
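The monitor-and-take-over pattern described here can be sketched as a heartbeat watcher. The blade names and the three-missed-beats rule below are assumptions for illustration, not any real product's API:

```python
# Illustrative sketch of blade failover: a standby blade watches a
# primary's heartbeat and takes over when it goes quiet. Names and the
# 3-missed-beat rule are invented assumptions.

class Blade:
    def __init__(self, name):
        self.name = name
        self.missed_beats = 0
        self.active = False

def monitor(primary, standby, heartbeats, max_missed=3):
    """heartbeats: sequence of booleans, True = beat received."""
    primary.active, standby.active = True, False
    for beat in heartbeats:
        primary.missed_beats = 0 if beat else primary.missed_beats + 1
        if primary.missed_beats >= max_missed:
            # Standby picks up the mirrored image; applications keep running.
            primary.active, standby.active = False, True
    return standby if standby.active else primary

survivor = monitor(Blade("blade-1"), Blade("blade-2"),
                   [True, True, False, False, False])
print(survivor.name)  # -> blade-2
```

In a real cluster the mirrored state transfer is the hard part; the sketch shows only the detection-and-switch logic that keeps downtime at zero while staff pull the failed unit offline.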

Alternately, if processing strains system resources, one blade can borrow resources from another: 10 blades with 2 gigabytes of RAM each and 1GB of cache memory effectively become a single virtual machine with 20GB of RAM and 10GB of cache. "You can fail one server or app over to another," Holcomb says. "Now, you never have to worry about an outage to do what you have to do because everything can be failed over to" any other system on the network.
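The pooling arithmetic is just aggregation across blades. A tiny sketch, using the article's example figures:

```python
# Sketch of resource pooling: aggregate per-blade resources into one
# virtual machine's totals. Figures mirror the article's 10-blade example.

def pool(blades):
    """blades: list of (ram_gb, cache_gb) tuples; returns pooled totals."""
    ram = sum(b[0] for b in blades)
    cache = sum(b[1] for b in blades)
    return ram, cache

print(pool([(2, 1)] * 10))  # -> (20, 10)
```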

Final Tally

What's essential in all of this? For the NARA project, high-speed communications and supercomputing capabilities will come into play eventually. For less ambitious projects, systems require enough fault tolerance and redundancy to support the easy movement of data across systems, Holcomb says.


At NARA, the desire to make records now stored on disk available in real time is driving the move to server virtualization.

IDC's Gillen also advises that users take stock of the license agreements for their key software programs: Some might not allow automatic shifts from one machine to another; others might restrict the number of processors—real or virtual—on which an application can run. That would not preclude setting up a virtual server system, but it could drive up the startup costs.
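Gillen's license audit can be sketched as a pre-migration check. The license fields below (`max_processors`, `allows_auto_migration`) are invented for illustration and do not correspond to any real license format:

```python
# Hypothetical sketch of a pre-migration license check. The license
# dictionary keys are invented, not a real licensing schema.

def can_migrate(license_terms, target_cpus, auto_move=True):
    """license_terms: dict with optional 'max_processors' and
    'allows_auto_migration' keys. Returns True if the move is allowed."""
    if auto_move and not license_terms.get("allows_auto_migration", True):
        return False
    max_cpus = license_terms.get("max_processors")
    return max_cpus is None or target_cpus <= max_cpus

print(can_migrate({"max_processors": 4}, target_cpus=8))  # -> False
print(can_migrate({"allows_auto_migration": False}, 2))   # -> False
print(can_migrate({}, target_cpus=16))                    # -> True
```

Running such a check per application before the rollout surfaces the licenses that would drive up startup costs, rather than discovering them after the virtual infrastructure is in place.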

But, Holcomb says, the ROI would still be worthwhile because of other savings—on maintenance and fewer total hardware components, for instance—and increased uptime.

For Chadduck, refining and developing virtualized servers—even at ERA's massive scale—is vital. "The importance of the technology is to provide the basic foundation to manage and support access," he says, "to the electronic records that document our democracy and provide for the education of our citizens and basically provide for the continued strategic assets of electronic government, business and commerce, across a digital nation."

Dec 31 2009
