Obama administration aims for bottom-up approach to creating global standards for protection of IT and critical infrastructure.
Innovation is imperative as agencies advance their use of technology. Under the direction of Charles Romine, the National Institute of Standards and Technology’s IT Laboratory (ITL) is designed to promote innovation and enhance security.
Romine spoke with FedTech managing editor Matt McLaughlin about the work ITL is doing to meet this mission, as its scientists explore a wide variety of technologies, including cloud computing, electronic voting, smart grid, homeland security applications and health IT.
ROMINE: ITL is an amazing organization. Our mission to a large extent is inherited from NIST’s mission, which is, in ITL’s case, promoting innovation, competitiveness and security through research and development at the intersection of measurement science and IT to enhance our economy and improve our quality of life. You’ll see many of those words in the primary mission for NIST as a whole, and this is ITL’s component of that mission.
ROMINE: We have a large number of things going on. A big component of what ITL does is in the cybersecurity space. We’ve been in that space for decades, and we’ve got some of the best people in the world here working on various aspects of cybersecurity.
Our role is to work collaboratively with both the Department of Homeland Security and industry representatives — particularly the industries that own and operate the critical infrastructures identified by DHS under the executive order — and to develop a framework for improving the security of critical infrastructure across the U.S.
In addition to that, we have a primary role in improving the security of the non-national security parts of the federal government’s IT systems. We do that through the Federal Information Security Management Act, or FISMA. That has been a very successful and solid program.
We’ve got programs in collaboration with HHS in the area of improving adoption of electronic healthcare records. We’ve been working with them for quite a number of years. We’ve got a part to play in the collaboration with the Department of Energy on the smart grid and specifically the interoperability and security issues surrounding the smart grid. We’ve got a substantial effort in the area of trying to measure software quality. Those are just a smattering of some of the more successful things that we’ve done recently and continue to work on.
ROMINE: We’re in a challenging situation. Certain parts of the federal government tend to be a target of people who seek to obtain information or interfere with the smooth functioning of the government. There are all kinds of security threats, and those threats are evolving constantly. There are challenges of identifying the threats but also identifying and fixing the vulnerabilities across the federal government. It’s such a huge enterprise, and it’s very challenging to work with our federal partners to identify ways in which their security can be improved.
In a way, the rapid change in the threats and in the vulnerabilities is a reflection of the rapid change in IT, broadly speaking. The challenge that we have at NIST in ITL is reflective of the challenge that we face not just in cybersecurity but across the board — this idea that the IT field is just dramatically changing on an accelerating pace.
We have a challenge of doing very robust identity management to ensure that only those people who are authorized to use federal IT systems are able to use them. At the same time, agencies are moving into the cloud environment as part of the digital strategy articulated by the Office of Management and Budget several years ago — this idea of approaching cloud as a way of making government more efficient and streamlining processes.
That cloud movement has its own share of challenges with regard to security, and we’ve been addressing those with publications that discuss cloud security and working on FedRAMP, the Federal Risk and Authorization Management Program, with our other federal partners.
More recently, we have seen interest in moving the federal government to embrace mobile technologies for more efficient government operations. Mobility solutions are terrific, but they require different approaches to security that we need to deal with.
Security is challenging for two reasons: one is that the threats and the malefactors are evolving, and the other is that the IT space in which we operate is evolving just as quickly. Those two things are moving at the same time, and that’s a huge challenge for us.
ROMINE: A big part of it is that we get the very best people that we can find. To address the pace of change in IT, we get people who are agile and who are interested in interdisciplinary work so that they can work with ideas from other disciplines.
Our security effort has grown substantially over the past few years. We’ve been fortunate to get support from both the administration and Congress to build new capabilities in mobile security solutions. For example, we’ve established the National Cybersecurity Center of Excellence, or NCCoE, which gives us an opportunity to work with industry, academia and federal partners to identify specific challenges or use cases. We can come to a specific location and work on providing templates or modular solutions that combine multiple layers of defense.
We have a substantial effort through the NSTIC program, the National Strategy for Trusted Identities in Cyberspace. ITL is fortunate to house that program, and we’ve initiated some pilot studies to look at the ways in which identity management can be improved in both industry and government.
ROMINE: NIST has been a full partner with the General Services Administration, the Defense Department and the Office of Management and Budget, as well as other agencies, in establishing the framework for FedRAMP, which agencies can use to ensure that when they procure cloud services, they do so with the most confidence in the security and the interoperability of those services. NIST has been responsible for developing a series of security documents and also a roadmap with a reference architecture for the use of cloud computing in partnership with other agencies.
Mobility doesn’t have a centralized program like FedRAMP. Rather, we’ve undertaken activities to ensure the security and reliability of mobile solutions. Our computer security folks have worked to understand the threats that we face in the mobile space, and we’ve issued some guidelines with respect to security in mobile devices. We’ve also got some programs in mobile forensics — that is, ensuring that the tools that law enforcement uses to analyze mobile devices in an investigation are reliable and effective.
Recently, we established a wireless testbed to have a repeatable measurement capability for wireless communications as part of our advanced network technologies activities. That’s very exciting because many of the measurements of wireless technologies are taken in field conditions, which can vary quite a bit depending upon line of sight and many other things. Those measurements are not as repeatable as one might like for drawing conclusions about the performance of wireless systems, so we’re establishing a testbed here and working in partnership with the folks at NIST in Boulder, Colo., who are looking at wireless technologies for public safety. I think that measurement capability is going to provide a lot of insight into the performance of wireless technologies in general.
ROMINE: Like cloud computing several years ago, Big Data is generating a lot of interest, but people may have different ideas about what the term actually means. One of the services that we provided for cloud computing, among the advances that were spearheaded at NIST and ITL, was clarifying the definition of what we’re talking about.
Early documents that we worked on in cloud computing are now foundational documents for the field. People now are all discussing the same thing when they talk about cloud computing. I see an opportunity for ITL to do something similar in the space of Big Data.
Big Data is a bit amorphous. It’s unquestionably an important field to be working in, but I think some clarity with regard to what we’re discussing is going to be very beneficial. We’re looking at embarking on that kind of a definition or clarification phase in the early stages of our work in Big Data. Of course, we don’t do that in a vacuum; we do that in consultation with the community at large. In the case of the cloud, we held a series of workshops to work out with the community exactly what was meant by cloud computing.
In the same way, several months ago, we held a joint workshop on cloud computing and Big Data. That was really an exciting way to kick off this idea of us working with the community to clarify some of these issues.
One of the things that I am excited about is the notion that analysis of data is not a new field at all. The field of statistics is more than 150 years old, and we have some absolutely outstanding statisticians in our Statistical Engineering Division here. We have all of the brainpower necessary to confront some of the major analytical challenges with respect to Big Data.
We also face challenges in the management of Big Data, as well as fundamental IT challenges such as interoperability and privacy that emerge when you start talking about Big Data. We have an enormous reservoir of talent that we can use to tackle some of these challenging problems.
ROMINE: You mentioned weather; I think the National Oceanic and Atmospheric Administration does a really outstanding job. The Department of Energy has had a footprint in the Big Data space for a long time, even before it was called Big Data. I think they’ve had to grapple with some of the large-scale experimental apparatuses or installations that they managed. They used to generate terabytes of data per day, and now some of these things are generating terabytes of data in seconds or minutes. Now, the question is how to manage that in a reasonable way so that you’re not wasting all of the effort that goes into generating these experimental data.
Big Data also has a role to play in law enforcement, and there must be an understanding of the privacy issues associated with it. I’m certain that virtually every scientific agency — the National Institutes of Health, NASA, the Environmental Protection Agency, the National Science Foundation and so on — has issues of Big Data to deal with on a regular basis. Anything that we do going forward at NIST will involve invitations to all of those organizations and collaboration to understand the complexities.
ROMINE: I’m glad you mention the next couple of years and decades, because it really does follow a remarkable timeline. I should also issue a caveat here that predicting advances in IT is very challenging. You can lose a lot of money doing that.
The transformation to mobility seems inevitable. As technology advances, one of the challenges is trying to understand how people will interact with and use the systems. ITL must be very agile to cope with an environment that’s changing so rapidly. Further, IT is everywhere, and mobile technologies are just accelerating that. IT is also by its very nature a complex enterprise: software that runs to millions of lines and can be reused in many different contexts, along with the hardware and firmware generated to manage these things, is really quite remarkable.
Another exciting investment that we’re making, in partnership with our sister laboratory here at NIST, the Physical Measurement Laboratory, is investigating the power of quantum computing. I think the community at large is convinced that there are no major physics impediments to beginning to harness the power of quantum mechanics in a real sense, and there are several really important questions in the information space.
There are some real challenges in the manipulation of individual quantum bits, or qubits, on the physics side, but on the information side, there are some unbelievably fascinating problems. We don’t yet know very much about the capabilities of a quantum computer. We are investigating the implications of quantum computing, quantum communications and quantum complexity, particularly as they pertain to security. For example, there are some very interesting implications for the strength or weakness of our current cryptographic systems in the face of the computing paradigm that quantum computers represent.
We have some theorists in my organization who are working with the experimentalists in the Physical Measurement Laboratory on quantum information science. It’s a fascinating topic. It’ll be about 10 or 20 years before there are any sort of practical implications of this, but we’ve got to get started right away.
ROMINE: There are some things that are already known. Mathematician Peter Shor demonstrated, for example, that a quantum computer of relatively modest size would be able to circumvent certain cryptographic systems — in particular, the public-key systems most widely used today.
It’s pretty well established by Shor’s algorithm that a quantum computer will be able to circumvent that kind of encryption. The good news is that a practical quantum computer capable of doing that kind of computing isn’t imminent at this point. And that’s one of the reasons that NIST and ITL are preparing to establish another kind of cryptosystem, or at least investigate the possibilities of other security mechanisms that are resistant to that kind of attack. Quantum computing is in its infancy, and we need to know much more about what the implications are.
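To make the threat Romine describes concrete: RSA, the most common public-key system, is secure only because factoring the public modulus is infeasible classically. The toy sketch below (not NIST code; all numbers are deliberately tiny and illustrative) shows that anyone who can factor the modulus can reconstruct the private key. Shor’s algorithm performs that factoring step in polynomial time on a quantum computer, which is why real 2048-bit keys would fall.

```python
# Toy RSA with a deliberately tiny modulus, to illustrate why factoring
# breaks public-key crypto. Real keys use moduli of 2048+ bits, which
# classical trial division cannot touch -- but Shor's algorithm could.

def toy_rsa_keys(p, q, e=17):
    """Build a toy RSA keypair from two small primes (illustrative only)."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)
    return (n, e), d

def factor_by_trial(n):
    """Classical trial division: exponential in the bit length of n.
    This is the step Shor's algorithm makes efficient on a quantum computer."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("n is prime")

(public_n, e), d = toy_rsa_keys(61, 53)      # public modulus n = 3233
ciphertext = pow(42, e, public_n)            # encrypt the message 42

# An attacker, knowing only the public key (n, e), factors n...
p, q = factor_by_trial(public_n)
# ...and rebuilds the private exponent from the factors:
recovered_d = pow(e, -1, (p - 1) * (q - 1))
print(pow(ciphertext, recovered_d, public_n))  # → 42, the plaintext
```

The whole attack reduces to `factor_by_trial`; everything after it is cheap arithmetic. That is why post-quantum cryptography, which NIST has since pursued, replaces the factoring assumption entirely rather than just lengthening keys.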
ROMINE: It works in several different ways, all of which are productive but take on different characteristics.
The way that’s been most seamless is an organic collaboration where we identify other agencies with common areas of interest. ITL was a founding member of what's now known as the Networking and IT Research and Development Program, or the NITRD Program, an interagency forum specifically designed to collaborate and share information about common interests and challenges in IT. That kind of forum allows us to identify other agencies that have common interests and to work together on areas such as security, networking and the human factors involved in the use of IT.
Because of the strength of our staff and our productivity and our neutrality, we often get called upon by the administration or by Congress to undertake specific mandates associated with IT. For example, in the electronic voting space, we work with the Election Assistance Commission, the EAC, and we work with the Office of the National Coordinator for Health IT in the Health and Human Services Department. We also work collaboratively with DHS, and the list goes on.
The quality of our staff and the high quality of our work also lead other agencies to turn to us to work on areas of interest for them. We are funded through other agencies to do many different things, such as biometrics work. For example, we’ve got a decades-long history of working in the fingerprint arena on behalf of law enforcement — specifically the FBI, and also DHS and DOD. That has been extended beyond fingerprints to other kinds of biometrics, such as iris and face recognition, and even to methods such as identifying someone through gait or voice. Other agencies often recognize the value of providing funding to our folks in order to get their missions accomplished, and we are happy to partner with them where our missions coincide.
ROMINE: There’s a tremendous success story in the establishment of a fully interoperable system of biometrics. One of the things that’s not widely known is that ITL is accredited by the American National Standards Institute as a standards developer. We’re responsible for the ANSI standard for the interoperability of biometrics — fingerprints, but other modalities as well. That’s had an enormous impact on the ability of law enforcement to be much more effective in sharing and transmitting information for security and effective law enforcement across the country. I’d say that’s a huge win.
In the last couple of months, we’ve collaborated with ONC on the release of the second instantiation of meaningful use tests to determine whether physicians and other health care providers are adopting the electronic health records standards set forth by the Office of the National Coordinator. They rely on NIST-developed testing to verify the qualification of these folks for the incentive payments that Congress authorized under the HITECH Act several years ago. That’s an enormous success because Congress allocated approximately $20 billion in incentive payments and other funds to accelerate the adoption of these electronic health records. Without the tests that we developed in partnership with ONC, I think it would be impossible to determine the validity of claims on those incentive payments. That’s an enormous step forward and one that I’m very proud of.
Another example is the smart grid effort that we’ve been involved in, in partnership with the Engineering Laboratory here at NIST. Our role has been in the standards and guidelines for the interoperability and security of smart grid communications and smart grid data. That’s been an enormous success in accelerating the deployment of the smart grid. In each case, the involvement of ITL has made a significant difference in the adoption of these technologies, something I’m very proud of.