Legacy Systems: Plan Appropriately

With the administration demanding data center consolidation, agencies need to make sure they address the potential risks of moving legacy apps to virtualized and cloud environments.

Most agencies are currently evaluating options to replace or update legacy systems within their environments. For many, these systems are mission-critical and need to stay in production for some time to come.

But the burden of managing aging technology, along with vast data stores, is leading some organizations to evaluate newer solutions such as virtualization and cloud computing. There are a number of risks inherent with these approaches that government IT decision-makers must evaluate before migration.

Into a Virtual Environment

Let’s start with virtualization. Virtualizing legacy systems and applications makes a lot of business sense, but there’s no magic bullet that enables highly customized or outdated systems to be ported to a virtual model in every case.

To effect this transition, some agencies may need help from virtualization vendors to upgrade or convert code or make highly customized and specific hardware emulation changes to virtualization platforms. In addition, virtualization technology can significantly increase the overall rate of change and amount of complexity in an IT environment.

Compliance and security controls must be re-engineered, and adding a new layer of software abstraction can introduce new system interdependencies.

The following are considerations (and the associated risks) that agencies should address when looking to virtualize:

Additional software complexity: All software programs have flaws, and virtualization technologies are no exception. Depending on the exposure level of the virtual systems (for instance, in a public demilitarized zone versus internal), some agencies may not be comfortable leveraging complex virtualization technologies that have been shown to have significant vulnerabilities in the past several years.

In reality, however, virtualization platforms tend to have a smaller code base than most other multipurpose platforms and have relatively fewer vulnerabilities as a result.

Configuration of hosts and guests: Virtualization platforms (hosts) and virtual machines (guests) must be properly hardened and configured for optimal security, including communications between the VMs and the underlying host. This adds additional IT overhead and management requirements that must be taken into account.
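As a sketch of what that hardening oversight might look like in practice, the snippet below compares a host's reported settings against an agency baseline. The setting names and values are hypothetical illustrations, not drawn from any specific virtualization platform.

```python
# Hypothetical hardening audit: compare a hypervisor host's settings
# against an agency baseline. Setting names are illustrative only.
BASELINE = {
    "guest_to_host_clipboard": "disabled",
    "vm_to_vm_network_isolation": "enabled",
    "management_interface_net": "dedicated",
    "hypervisor_patch_level": "current",
}

def audit_host(settings: dict) -> list:
    """Return (setting, expected, actual) tuples for every deviation."""
    findings = []
    for key, expected in BASELINE.items():
        actual = settings.get(key, "unset")
        if actual != expected:
            findings.append((key, expected, actual))
    return findings

host = {
    "guest_to_host_clipboard": "enabled",   # deviates from baseline
    "vm_to_vm_network_isolation": "enabled",
    "management_interface_net": "dedicated",
    "hypervisor_patch_level": "current",
}
print(audit_host(host))
```

In a real environment this kind of check would run continuously against every host and guest, which is part of the added management overhead described above.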

Data and system classification: How many different data classification types and levels will be involved in the migration? A general security best practice for virtualizing systems is to only host virtual machines with like data security and sensitivity categories on a single hypervisor platform.

This limits exposure if the underlying hypervisor is compromised in some way, which could grant an attacker access to all your VMs. Depending on an agency’s need for distinct data classification levels, the number of physical systems required for virtualization could grow significantly, reducing the overall cost savings from the project.
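To make the cost implication concrete, here is a minimal sketch, with made-up classification labels, of grouping VMs so that only like-classified workloads share a hypervisor pool; the number of distinct classification levels sets a floor on the number of physical platforms needed.

```python
from collections import defaultdict

# Illustrative VM inventory: (name, data classification).
# Names and labels are hypothetical.
vms = [
    ("web01", "public"),
    ("db01", "sensitive"),
    ("app01", "sensitive"),
    ("archive01", "internal"),
]

def group_by_classification(inventory):
    """Place VMs so each hypervisor pool holds only one classification."""
    pools = defaultdict(list)
    for name, level in inventory:
        pools[level].append(name)
    return dict(pools)

pools = group_by_classification(vms)
# At minimum, one physical hypervisor pool per classification level.
print(len(pools))  # prints 3
```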

Network security controls: Controls for virtual environments are still relatively immature. Do you need to monitor all system-to-system traffic? Virtual firewalls and intrusion detection systems are available as VMs that can be integrated into virtual environments, but they may not offer the same filtering and monitoring capabilities as traditional physical controls.

Operation and compliance: Are you able to integrate virtualization technology into your current change control and compliance evaluation processes? Virtualization enables rapid cloning, migration and provisioning of resources, capabilities that require modifications to change-management procedures and policies.

As for compliance, most certification and accreditation processes, as well as compliance mandates, such as the Federal Information Security Management Act, do not have explicit guidance for evaluating virtualization platforms.


Illustration: Elizabeth Hinshaw
"The burden of managing aging technology, along with vast data stores, is leading some organizations to evaluate newer solutions such as virtualization and cloud computing." — Dave Shackleford

Into the Cloud

Now, let’s look at what’s involved in migrating legacy systems and applications to cloud environments. As with virtualization, there can be tremendous incentives to do so, ranging from cost savings to operational flexibility and reduced management overhead.

In his recent report, State of Public Sector Cloud Computing, federal CIO Vivek Kundra outlines why cloud computing is something the U.S. government should pay attention to. Several case studies in the report describe legacy applications and systems moving to cloud environments, such as the Army's use of a customized Salesforce.com software-as-a-service deployment to host its Army Experience Center application.

The following outlines some major issues that agencies looking to move data, systems and applications to cloud environments need to consider.

Multitenancy: One cloud computing concept that agencies should focus on is multitenancy. In cloud environments, multiple parties’ data and services often share a single physical platform, especially given the use of virtualization technologies in most cloud service environments.

With limited ability to control data, application or virtual platform segmentation — or audit the controls a cloud provider uses to segment in a multitenant infrastructure — many potential public cloud customers are concerned about migrating to external cloud providers, and with good reason.

Within private clouds, multitenancy remains an architectural issue that should be addressed during design and implementation. Multitenancy can be enacted at many layers, including storage, application and database, as well as operating platform and hypervisor-based infrastructure.

Cloud delivery model: The most common cloud models are software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS).

The kind of model needed for legacy application hosting and data storage will depend on the sensitivity of the data, as well as the ease of migration.

SaaS is often the easiest to get started with but offers almost no security controls that agencies can monitor and manage themselves. PaaS environments are often used for development and may require porting legacy apps to newer coding standards that can make use of standard application programming interfaces. IaaS models afford the greatest security management and monitoring capabilities but require more operational investment, such as installing and managing operating platforms.

Cloud deployment model: There are many models from which to choose, including public (open to anyone), private (run by one organization for itself), hybrid (a combination of public and private) and community (cloud services for a specific group or organization type).

For agencies looking to migrate sensitive data and applications to the cloud, a private or community model may be most appropriate. A new initiative, the National Institute of Standards and Technology’s Federal Risk and Authorization Management Program, seeks to provide a common standard for defining and assessing security controls for a federal cloud computing infrastructure. FedRAMP includes security controls consistent with NIST Special Publication 800-53, the security implementation and audit process guide; templates for service providers to fill out regarding protection of sensitive data; and contract and service-level agreement language that potential customers can reference.

Community cloud vendors are readying their environments using FedRAMP as a template, even going so far as to seek FISMA certification. Some agencies, including the Defense Information Systems Agency, have created customized private clouds for their own use. DISA, for instance, has created the Rapid Access Computing Environment and Forge.mil private clouds for use within the Defense Department.

Identity and access management: An agency also will need to consider the kind of authentication (“Who are you?”) and authorization (“What are you allowed to do and access?”) needed for legacy applications and data.

When moving to the cloud, agencies potentially could lose granular control of this with SaaS and PaaS models. IaaS models offer more options, as the user can install and maintain robust identity management solutions if necessary.
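The distinction between the two questions can be sketched in a few lines; the user names, credentials and permission labels below are entirely hypothetical.

```python
# Toy illustration of authentication ("Who are you?") versus
# authorization ("What are you allowed to do?"). All names are hypothetical.
CREDENTIALS = {"analyst1": "s3cret"}            # authentication store
PERMISSIONS = {"analyst1": {"read_reports"}}    # authorization store

def authenticate(user: str, password: str) -> bool:
    """Verify identity against the credential store."""
    return CREDENTIALS.get(user) == password

def authorize(user: str, action: str) -> bool:
    """Check whether a verified identity may perform an action."""
    return action in PERMISSIONS.get(user, set())

print(authenticate("analyst1", "s3cret"))       # prints True
print(authorize("analyst1", "delete_reports"))  # prints False
```

In a SaaS or PaaS migration, both stores typically live with the provider; in an IaaS migration, the agency can keep running its own.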

Data protection and disposal: Do you need to encrypt and decrypt data and dispose of it securely? Depending on the cloud model and provider, this may or may not be possible, so review the capabilities and contract provisions.
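When physical media disposal is out of the agency's hands, one common pattern is crypto-shredding: encrypt the data before it reaches the provider, then "dispose" of it by destroying the key. The toy sketch below uses a stdlib one-time-pad XOR purely to illustrate the idea; a real deployment would use vetted AES tooling and managed key storage, not this.

```python
import secrets

def encrypt(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad XOR cipher; illustration only, NOT production crypto.
    # XOR-ing with the same key twice recovers the plaintext.
    return bytes(d ^ k for d, k in zip(data, key))

record = b"taxpayer record"                 # hypothetical sensitive data
key = secrets.token_bytes(len(record))      # key kept by the agency
ciphertext = encrypt(record, key)           # what the provider stores

# "Crypto-shredding": destroying the key renders the ciphertext
# unreadable even if the provider retains copies of the stored blob.
key = None
```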

Incident response: Most federal organizations have incident response programs. How will those programs mesh with a provider's ability to identify and handle potential incidents? This is a question all potential cloud users should ask; you also need to ensure proper SLAs are in place.

With the administration’s Federal Data Center Consolidation Initiative under way, agencies should already have conducted inventories of systems and begun developing plans to consolidate and reduce data center assets by the beginning of the year.

With new security and compliance mandates to implement continuous monitoring, agencies looking to implement virtualization or move data, applications and systems to the cloud will need to ensure they have the right tools and environment in which to do so. Cloud providers are standing by.

Robert Rounsavall, director of secure information services at cloud provider Terremark, says, “We have seen a significant increase in federal customers moving legacy systems, applications and data into our specialized federal facilities. We’ve built them to meet all 800-53 and 800-86 requirements and can even offer Defense customers true SCIF [Sensitive Compartmented Information Facility] capabilities.”

As agencies look for solutions to update legacy technology while simultaneously reducing data center footprints, interest in virtualization and cloud computing will continue to grow. Understanding the risks involved will help agencies make better, more informed decisions.


Nov 02 2010
