
Why 10 Gig-E Makes Sense

Federal agencies find that 10 Gigabit Ethernet supports growing data stores as well as data consolidation and virtualization projects.

The National Oceanic and Atmospheric Administration, a federal agency within the Department of Commerce, has a heritage that dates back to the early 1800s, with roots in early science-driven federal agencies. NOAA was officially established in 1970 to predict changes in the oceanic and atmospheric environments and living marine resources, and to provide related data, information and services. As a data-intensive organization, NOAA considers 10 Gigabit Ethernet a natural fit.

The agency has deployed 10 Gig-E over the past five years to support high-performance computers that drive much of NOAA’s research and the supercomputers used to deliver vast volumes of weather and climate information to the public. More recently, NOAA deployed 10 Gig-E into its data center to deliver more robust disaster recovery and business continuity potential for its financial systems.

“The implementation pace of 10 Gig-E networking in our data centers has been dependent on its function,” says David Michaud, director, High Performance Computing and Communications, with the agency. “NOAA is data-driven, so technologies that drive the speed, reliability and ease of use of data play a large role in how aggressively these technologies are implemented.”

Many organizations are upgrading their data center environments to aggregate traffic from multiple Gigabit Ethernet links connected to other switches, hypervisor machines or physical computers, according to Jeremy Littlejohn, president of consulting firm RISC Networks. Littlejohn works closely with CIOs and IT managers to help them optimize the reliability, scalability and performance of their entire IT infrastructures.
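On Linux hosts, the aggregation Littlejohn describes is commonly done with the kernel's bonding driver, which presents several physical Gigabit Ethernet ports as one logical link. The sketch below is illustrative only, assuming a host with two spare interfaces named eth0 and eth1 and a switch configured for LACP (IEEE 802.3ad); interface names and the choice of LACP mode are assumptions, not details from the article.

```shell
# Minimal sketch: aggregate two GigE NICs into one LACP bond (assumes
# root privileges, interfaces eth0/eth1, and an LACP-capable switch).
modprobe bonding                             # load the bonding driver
ip link add bond0 type bond mode 802.3ad     # create an LACP bond device
ip link set eth0 down
ip link set eth0 master bond0                # enslave first NIC
ip link set eth1 down
ip link set eth1 master bond0                # enslave second NIC
ip link set bond0 up                         # bring the aggregate up
```

Note that bonding raises aggregate throughput across many flows; any single flow is still limited to one member link's speed, which is one reason a true 10 Gig-E uplink differs from ten bonded GigE ports.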

As for the benefits of 10 Gig-E, Littlejohn says organizations that upgrade expect to see faster backups and improved data traffic flows. Despite these clear benefits, Littlejohn cautions that organizations need to find the root cause of their application performance and backup problems before they try to fix them with more bandwidth.

Census Refresh

The U.S. Census Bureau has been analyzing the networking infrastructure in its data center and is in the middle of a technology refresh.

“We have to conduct our refreshes around the census,” says Brian McGrath, the bureau’s associate director for information technology and CIO, adding that a refresh had to wait until the most recent decennial census was completed in 2010. But now the time is right, and capacity is top of mind as the countdown has started for the next decennial census in 2020.

“Capacity is always a concern, and we want to stay one step ahead of demand,” he says.  “For us, 10 Gigabit Ethernet is the next logical step in the migration path for expanding our data communications capability within the data center.”

Today, the Census Bureau runs Gigabit Ethernet through its data center and either 100 Megabit Ethernet or Gigabit Ethernet to local switches and desktops. An aggressive virtualization initiative to facilitate data center consolidation and improve the utilization of hardware assets is adding to the pressure for increased bandwidth. Thus far, the bureau has virtualized about 610 Windows servers and has just finished building a Linux-based virtual farm of 50 servers that are ready to deploy.

“As we continue to highly virtualize our environment, we expect to see higher capacity and utilization of our infrastructure, and we want to make sure the communications side stays in step with that,” McGrath says. “We’ve done all the technical analysis and have presented the business plan for the organization. We are anticipating it will be approved and we’ll start the refresh in the fiscal 2011 and 2012 cycle, with a completion in 2012.”

For NOAA, consolidation, continuity of operations and the timely delivery of products and information are the primary factors leading to its use of 10 Gig-E networking. In 2007, NOAA deployed a 10 Gig-E wide area network to move substantial volumes of weather and climate information among various supercomputing systems and then out to the public.

650%

The estimated growth in enterprise data through 2013

SOURCE: Gartner

“These operational systems required large communications bandwidth not solely based on the volume of information, but rather the volume of data that needs to move in a discrete period of time to enable timely delivery to the public and to the backup system for rapid failover,” says Michaud.
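Michaud's point, that the delivery window rather than total volume sets the bandwidth requirement, is easy to see with back-of-envelope arithmetic. The figures below are purely illustrative (a hypothetical 2 TB product set and an assumed 90% usable throughput after protocol overhead), not NOAA's actual numbers.

```python
def transfer_time_seconds(data_bytes, link_bits_per_s, efficiency=0.9):
    """Time to move data over a link, assuming roughly 90% of raw line
    rate is usable after protocol overhead (an assumption, not a
    measured value)."""
    usable_bits_per_s = link_bits_per_s * efficiency
    return data_bytes * 8 / usable_bits_per_s

data = 2 * 10**12        # hypothetical 2 TB of model output
gige = 10**9             # Gigabit Ethernet line rate
ten_gige = 10 * 10**9    # 10 Gigabit Ethernet line rate

# Roughly 4.9 hours at 1 Gig-E versus about 30 minutes at 10 Gig-E:
print(f"1 Gig-E:  {transfer_time_seconds(data, gige) / 3600:.1f} hours")
print(f"10 Gig-E: {transfer_time_seconds(data, ten_gige) / 60:.1f} minutes")
```

If the failover site must receive the data set within, say, an hour, the single-Gigabit link simply cannot meet the deadline no matter how the transfer is scheduled.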

10 Gig-E is also playing a role in NOAA's effort to consolidate its research and development HPC systems into more optimal data centers that are remote from NOAA scientists. “This required a network infrastructure to enable NOAA's scientists to move substantial amounts of data between these larger consolidated high-performance computing systems and the locations at which the scientists reside,” Michaud says.

Known as N-Wave, this national research network — funded through the American Recovery and Reinvestment Act — is a stable and secure network built using 10 Gig-E wavelength-division multiplexed fiber-optic links supplied by Internet2, the Global Research Network Operations Center and the National LambdaRail.

May 03 2011
