DHS satisfied with privacy safeguards in new data sharing programs
Monday - 2/3/2014, 4:02am EST
The Department of Homeland Security thinks it may have found a promising way forward to tie together information from its various components while protecting privacy and civil liberties.
The department is constructing several demonstration systems under the auspices of the broader DHS Data Framework.
The framework's overall goal is to gather huge stores of data from more than a dozen IT silos scattered across the various DHS components, translate their contents into a common, modern data format, and make them centrally searchable, but only by the DHS staff who have a legitimate need to access a given piece of information.
"We collect a lot of data under a lot of different authorities for a lot of different purposes, and so there are a lot of different legal and privacy issues that come with that," said Rebecca Richards, DHS' senior director for privacy compliance. "But there are also a lot of good uses for that data, and right now, when we're looking at an individual in front of us who might be on a terrorist watch list, we have to go to 17 different systems with 17 different logins. That's just physically difficult. There's nothing else to say about it."
Richards explained details of the program last week to the Data Privacy and Integrity Advisory Committee (DPIAC), a panel of outside experts, a day after announcing she would depart DHS. She'll soon assume duties as the National Security Agency's first-ever civil liberties and privacy officer.
Looking for a solution
DHS has been testing out the program, which is actually made up of three separate new IT systems, for the last several months.
While it's a long way from being ready for day-to-day mission use, officials say the pilots proved that the basic concept works technologically — which is a coup in and of itself — but also from the perspective of being able to maintain basic privacy safeguards.
The department has been searching for several years for a reasonable and cost-effective way to tie together all this information. Many of these various IT systems were designed and built decades before anyone even conceptualized the idea of consolidating the nation's internal security functions into one cabinet-level department.
The department says vastly widening the datasets that DHS analysts and enforcement officers have easy access to is great from a mission perspective; but absent new safeguards, it also presents a huge potential for abuse, either making privacy violations much easier or enabling the next would-be Edward Snowden to walk off with a vast trove of data.
"What we wanted to do was build on this concept of making sure that we can give access to data to the people who should have it, but control it in a way that you're not creating a honey pot for a whole bunch of people to come and look at," Richards said. "And we need to make sure people only have access to it for the purposes they're allowed to."
To make that possible, DHS is first building a "user attribute hub" that will serve as the foundation for both user-based and role-based access control. It will manage various characteristics of every user in the department, such as their security clearance level, their job function and training level.
In the new aggregated data environment, each piece of data will be tagged with identifiers that control which users can access it, and for what purpose. And when users do get access to the data for any reason, that activity will be stored in audit logs that DHS says won't be able to be changed or destroyed, even by system administrators.
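The access model Richards describes, in which records carry tags that are checked against each user's attributes and every access attempt is written to a log, can be sketched roughly as follows. All names, fields and clearance values here are illustrative assumptions, not the actual DHS implementation (which, per the article, also makes its logs unalterable even by administrators; an in-memory list is only a stand-in for that).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class User:
    name: str
    clearance: int            # attribute from the "user attribute hub" (illustrative scale)
    roles: frozenset          # job functions, e.g. {"screening"}

@dataclass(frozen=True)
class Record:
    payload: str
    min_clearance: int        # tag: minimum clearance required to read
    allowed_roles: frozenset  # tag: roles permitted to read
    allowed_purposes: frozenset  # tag: purposes the data may be used for

audit_log = []  # stand-in for tamper-proof storage; real systems would use WORM media

def read_record(user: User, record: Record, purpose: str) -> str:
    # Access requires sufficient clearance, a permitted role, AND an allowed purpose.
    permitted = (
        user.clearance >= record.min_clearance
        and bool(user.roles & record.allowed_roles)
        and purpose in record.allowed_purposes
    )
    # Every attempt, allowed or denied, is logged.
    audit_log.append((user.name, purpose, permitted))
    if not permitted:
        raise PermissionError(f"{user.name} may not read this record for {purpose!r}")
    return record.payload
```

The key design point, matching Richards' description, is that the purpose of access is checked alongside identity and role, so a user cleared to see a record for one mission cannot silently reuse it for another.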
Tested on three independent systems
The first piece of the project is called Neptune.
Its sole job is to scoop information from those 17 IT systems across DHS on a regular basis and translate it into a common format, adding data tags along the way to dictate who can and can't access it.
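The normalize-and-tag step Neptune performs can be sketched like this. The source-system names, field mappings and tag values below are hypothetical, invented purely to illustrate the pattern of translating heterogeneous legacy formats into one schema and attaching access tags at ingest time.

```python
def normalize(source_system: str, raw: dict) -> dict:
    """Map one legacy record into a common schema and attach access tags."""
    # Each legacy feed exports records in its own shape (illustrative mappings).
    if source_system == "flight_school":
        common = {"name": raw["applicant_name"], "dob": raw["date_of_birth"]}
    elif source_system == "visitor_auth":
        common = {"name": raw["traveler"], "dob": raw["birth_date"]}
    else:
        raise ValueError(f"unknown source system: {source_system}")
    # Tags are applied once, at ingest, so every downstream query
    # can enforce who may see the record and for what purpose.
    common["tags"] = {"origin": source_system, "min_clearance": 2}
    return common
```

Tagging at ingest, rather than at query time, is what lets a single aggregated store enforce the differing legal authorities under which each piece of data was originally collected.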
For the pilot, DHS officials pulled data from three separate systems in its components that weren't originally designed to interoperate with one another: the Transportation Security Administration's Alien Flight School Program, Immigration and Customs Enforcement's Student and Exchange Visitor Information System, and Customs and Border Protection's Electronic System for Travel Authorization.
Neptune managed to turn data from those three systems into something that's useful across organizational boundaries. But besides providing a proof-of-concept for the information sharing project, Richards said the pilot produced some other knock-on benefits, like pinpointing serious data quality issues in the DHS components' legacy systems.