Anil Karmel developed cloud environments for Los Alamos National Lab and the National Nuclear Security Administration that include continuous monitoring and virtualization security software.

May 01 2012

Agencies Sharpen Their Cloud Defenses

Security remains a top priority for federal IT leaders as they increase their adoption of cloud services.

Last summer, Los Alamos National Laboratory built the federal government’s first secure, hybrid cloud. Now the lab’s parent agency, the Energy Department’s National Nuclear Security Administration (NNSA), is applying that work on a much grander scale.

The NNSA, whose mission is to ensure that the country’s nuclear weapons are secure, safe and effective, is developing its own private cloud, but also leveraging several public-cloud providers. The result is a secure, hybrid community cloud, launching this summer, which will provide virtual servers and storage on a pay-per-use basis throughout the agency (and in the future, to other agencies within the Energy Department), says NNSA Chief Technology Officer Travis Howerton.

Anil Karmel, the architect behind both clouds, is using the design he developed at Los Alamos as the foundation for the NNSA cloud. In doing so, he’s using the same multipronged approach to security, which includes continuous monitoring and virtualization software that quarantines virtual machines (VMs) if security threats or risks are detected.

“We’re taking the innovative model we created and giving users two things that heretofore could not be used in the same sentence: agility and security,” says Karmel, a former solutions architect at the lab and now the NNSA’s management and operations chief technology officer.

Security remains a top priority for federal IT leaders as they increase their adoption of cloud services to meet the government’s cloud-first policy. But the government is making good progress in establishing cloud security standards and providing implementation guidelines to address those security concerns.

The National Institute of Standards and Technology (NIST) — which has released several cloud security publications, including a draft cloud computing roadmap in November — is working on new guidance. This spring, the agency will release a special publication that identifies 17 security requirements that federal IT managers must address as they migrate to the cloud, says Fred Whiteside, project manager with the NIST cloud computing program.

In July, NIST is expected to finalize updates to Special Publication 800-53, which includes new security controls and enhancements that address new challenges, such as cloud computing and mobile devices.

Meanwhile, the General Services Administration (GSA) is busy preparing for its June launch of the Federal Risk and Authorization Management Program (FedRAMP). The interagency effort provides baseline security controls, based on NIST's 800-53 publication, that agencies and public-cloud providers must meet to offer cloud services to agencies. FedRAMP also provides a standard approach to continuously monitor and authorize cloud services.

The goal is to foster cloud adoption by creating uniform security requirements and by centralizing and streamlining the certification and accreditation process. Through FedRAMP, the government will certify a cloud provider's service against the baseline security controls once, so individual agencies don't have to go through the time-consuming and costly process on their own. If agencies require additional security controls based on their unique needs, they need to certify only those extra controls.

“There are no real cloud-specific standards in the industry today, so what FedRAMP does is give agencies a baseline on whether a cloud service meets minimum security standards, and the agencies can develop things beyond that,” says Amy Larsen DeCarlo, an analyst with Current Analysis.

GSA is implementing FedRAMP in phases over the next two years. This June, the agency expects to begin the full authorization process for two vendor services: a cloud infrastructure as a service, and e-mail and collaboration software as a service. During this initial startup phase, expected to last six to eight months, the agency will also assess any need to fine-tune FedRAMP's policies and operations. In early fiscal 2013, it will begin expanding its authorization process to include more cloud services, says Dave McClure, associate administrator of GSA's Office of Citizen Services and Innovative Technologies.

Agencies will have two years to make their existing cloud services compliant with security authorizations using FedRAMP processes. Previously, agencies handled cloud security assessments on an independent and ad hoc basis, using their own expertise and interpretation of existing federal requirements as their guide. Early adopters, such as the NNSA, say they are now incorporating FedRAMP’s security policies into their own cloud.

“We are going through all of FedRAMP’s security controls and ensuring that we meet all the requirements, so as we move forward, we can work toward a goal of having a ‘FedRAMP-able’ cloud offering,” Howerton says.

Inspiration Close to Home

Last year, when NNSA's leadership decided it was time to deploy a secure, hybrid community cloud to cut costs, improve computing services and operate more efficiently, it looked no further than one of its own facilities — Los Alamos National Laboratory (LANL) in New Mexico.

In 2010, the lab built a private infrastructure as a service cloud, called “Infrastructure-on-Demand” (IoD), which allows researchers and employees to use a self-service web portal to quickly request and provision virtual servers and storage. “Provisioning servers used to take 30 days. Now it takes 30 minutes,” Karmel says.
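
As a rough illustration of what such a self-service request might look like behind the portal, here is a minimal Python sketch. The lab's actual API is not public, so the endpoint, field names and credential below are hypothetical stand-ins.

```python
# Hypothetical sketch of a self-service provisioning request. The portal's
# real API is not documented publicly; the URL, fields and token here are
# invented for illustration only.
import requests

PORTAL = "https://iod.example.gov/api/v1"  # hypothetical endpoint

def request_vm(cpus: int, memory_gb: int, storage_gb: int, enclave: str) -> str:
    """Submit a virtual-server request and return its tracking ID."""
    resp = requests.post(
        f"{PORTAL}/servers",
        json={
            "cpus": cpus,
            "memory_gb": memory_gb,
            "storage_gb": storage_gb,
            "enclave": enclave,  # security enclave the VM will join
        },
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["request_id"]
```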

The lab had virtualized its servers in 2006, and the move to a private cloud was a natural progression, Karmel says. The IT team purchased new equipment for the cloud project, including HP blade servers and a 2-petabyte NetApp storage area network.

The IT staff also deployed four key software components: a web portal; VMware vCloud Director, to create and manage the cloud; VMware vShield, to implement cloud security policies; and Microsoft SharePoint, to handle workflow and lifecycle management. SharePoint also serves as part of the user interface for custom software, called Green IT SmartMeter, that calculates the cost of services and measures energy savings.
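
The SmartMeter's actual rates and formulas are not published, but the pay-per-use chargeback arithmetic it performs can be sketched along these lines, with invented rates for illustration:

```python
# Illustrative pay-per-use chargeback arithmetic. The Green IT SmartMeter's
# real rates and formulas are not public; these numbers are made up.
HOURLY_RATES = {"cpu": 0.02, "memory_gb": 0.01, "storage_gb": 0.0002}

def monthly_charge(cpus: int, memory_gb: int, storage_gb: int,
                   hours: float) -> float:
    """Charge for one VM: resource quantity x hourly rate x hours used."""
    per_hour = (cpus * HOURLY_RATES["cpu"]
                + memory_gb * HOURLY_RATES["memory_gb"]
                + storage_gb * HOURLY_RATES["storage_gb"])
    return round(per_hour * hours, 2)

# Example: a 4-vCPU, 16 GB VM with 200 GB of storage running all month.
print(monthly_charge(4, 16, 200, hours=720))  # -> 201.6
```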


Last summer, Karmel added a public-cloud provider to the mix, turning the private cloud into a hybrid cloud. It's a colocation arrangement, in which the public-cloud provider supplies the equipment, which the lab manages. Karmel linked his private cloud to the public-cloud provider through a secure virtual private network connection. Doing so lets researchers and employees provision more servers and storage resources as needed.

The lab took a multilayered approach to security. Using vShield software, the lab segregates VMs into specific security enclaves based on data security requirements. If the data is highly sensitive, for example, the IT staff can write security policies that prevent traffic flow from one enclave to another, Karmel says.
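
In effect, the policies amount to a default-deny rule between enclaves. The conceptual Python sketch below illustrates that check with made-up enclave names; it is not vShield's actual interface.

```python
# Conceptual model of enclave segregation -- not vShield's actual API.
# Traffic is denied unless an explicit rule allows the (source, dest) pair.
ALLOWED_FLOWS = {
    ("public", "public"),
    ("internal", "internal"),
    ("internal", "public"),      # internal VMs may reach the public enclave
    ("sensitive", "sensitive"),  # nothing else may reach "sensitive"
}

def traffic_permitted(src_enclave: str, dst_enclave: str) -> bool:
    """Default-deny check between security enclaves."""
    return (src_enclave, dst_enclave) in ALLOWED_FLOWS

assert not traffic_permitted("public", "sensitive")  # blocked by default
assert traffic_permitted("internal", "internal")
```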

The lab continuously monitors the security of its cloud through firewalls, antivirus software, intrusion detection and prevention systems and some custom security tools. If any security threats or risks are detected, such as malware or applications that need patching, vShield automatically moves the affected VMs into a quarantined zone with no network connectivity, Karmel says. Users or IT managers must manually resolve the security issues, and then the VMs are automatically moved back to their source enclave.
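
That quarantine workflow can be sketched as a simple state change on each VM. The functions below are illustrative stand-ins for the steps that, in the lab's deployment, vShield performs automatically:

```python
# Minimal sketch of the quarantine workflow described above; the enclave
# moves are stand-ins for actions vShield carries out in the lab's setup.
from dataclasses import dataclass

QUARANTINE = "quarantine"  # enclave with no network connectivity

@dataclass
class VM:
    name: str
    enclave: str
    source_enclave: str = ""

def on_threat_detected(vm: VM) -> None:
    """Automatically isolate a VM flagged by continuous monitoring."""
    vm.source_enclave = vm.enclave  # remember where it came from
    vm.enclave = QUARANTINE         # cut it off from the network

def on_issue_resolved(vm: VM) -> None:
    """After manual remediation, return the VM to its source enclave."""
    vm.enclave = vm.source_enclave

vm = VM("build-07", enclave="internal")
on_threat_detected(vm)   # malware or an unpatched app is found
assert vm.enclave == QUARANTINE
on_issue_resolved(vm)    # an admin patches or cleans, then releases
assert vm.enclave == "internal"
```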

When LANL linked its private cloud to the public-cloud provider, Karmel installed the same security architecture on the public-cloud equipment.

Scaling Up

The NNSA, which employs about 2,700 federal employees and about 30,000 contractors across nine facilities, is taking LANL’s cloud architecture and scaling it out to meet the NNSA’s enterprise needs. “We’re taking it to the next level and making it better,” Howerton says.

The cloud is part of a larger strategy by the NNSA to modernize and centralize its IT infrastructure. Previously, each of the agency’s facilities managed its own IT infrastructure.

The NNSA began building its community cloud, called YOURcloud, early this year, and expects to launch it this summer. The agency is building its own private cloud and will leverage multiple commercial data center providers in a private-cloud configuration that spans multiple regions of the country. LANL’s cloud — which houses 700 to 1,000 VMs — will be integrated into the NNSA’s cloud as well, Karmel says.

When YOURcloud is launched, users will be able to choose between different tiers of service, as well as the location of the data center, through a self-service web portal. A secure-cloud service broker developed at LANL will let users choose an ultra-green data center or an ultra-secure data center, whichever meets their requirements, Howerton says.

Because of NNSA’s role with nuclear weapons and safety, data security is critical, so Karmel is building a security architecture based on LANL’s design. NNSA is encrypting sensitive data and using role-based access control to make sure no unauthorized users access data.
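
NNSA has not published its role and permission model, but the role-based checks it describes follow a familiar default-deny pattern, sketched here with invented role names:

```python
# Generic role-based access control pattern; NNSA's actual roles and
# permissions are not public, so these names are illustrative.
ROLE_PERMISSIONS = {
    "researcher": {"read:unclassified"},
    "admin": {"read:unclassified", "read:sensitive", "write:sensitive"},
}

def authorized(role: str, action: str) -> bool:
    """Allow an action only if the user's role explicitly grants it."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert authorized("admin", "read:sensitive")
assert not authorized("researcher", "read:sensitive")  # denied by default
```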

69% of federal IT managers say the “cloud first” requirement to move three services to the cloud by mid-2012 is too fast, and 71% say the pressure to move to the cloud is inadvertently creating greater security risks for their agency.

Source: Ponemon Institute

“You need security controls up and down the technology stack — from the network layer down to the storage layer,” Karmel says.

While vetting public-cloud providers, the agency is looking at physical security as well. The NNSA is checking building access to the data centers and making sure that its employees have cleared background checks, Howerton says.

“Given what we do, information assurance is critical, so having this robust architecture makes sure none of our data or applications is tampered with,” he says.

To improve security further, NNSA has partnered with the Energy Department to build a Joint Cybersecurity Coordination Center, which will constantly monitor the health and security of its data, systems and networks.

Private Versus Public Clouds

The Department of Homeland Security, which will manage continuous monitoring for FedRAMP, has spent the past year implementing a private cloud to take advantage of the security built into its own data centers.

The department is building infrastructure, platform and software as a service offerings, providing virtual servers, storage, e-mail, business intelligence and other applications to its users. The department is currently using the public cloud only for nonsensitive data, such as website hosting, while important applications and sensitive data are housed in its private cloud, says CIO Richard Spires.

“Given the nature of what we do, we feel more comfortable with the private model,” he says. “We have a full set of security controls for FISMA [Federal Information Security Management Act]-low and FISMA-moderate built into our data centers. And anything new that we do, including those private-cloud offerings, will inherit those controls.”

Homeland Security is open to migrating some applications with sensitive data to the public cloud once FedRAMP becomes established, says Spires, who serves on FedRAMP’s Joint Authorization Board. He expects to see a significant increase in the use of public-cloud services by federal agencies in the next two years.

“We will look at the risk factors, but I feel strongly over the next couple of years that cloud service providers, with the right security controls, will match what we can do internally,” he says.

