Federal information sharing hamstrung by technology
Friday - 1/27/2012, 5:09am EST
Few would dispute that agencies have taken significant steps to securely make data available to their colleagues and even, to a limited extent, to industry. But the problem continues to be how agencies can use and manage the huge amount of information the government collects, especially when it comes to terrorism and homeland security.
David Shedd, deputy director of the Defense Intelligence Agency, said the problem is how analysts can process all that data with the right algorithms and the right technology to find the needles in the haystack.
"The problem for that analyst today is you can't possibly in a 24 hour day, if they were to work 24 hours a day, get through all that data even in their area of responsibility," he said.
Possible solutions to the data flood?
Despite this enormous challenge, there are some potential solutions starting to emerge.
Kshemendra Paul, the program manager for the Information Sharing Environment, said data standards and tagging are key to making the data searchable.
He said efforts such as the National Information Exchange Model (NIEM) or the National Suspicious Activity Reporting initiative are two examples of where this already is happening.
Paul said 18 agencies are using NIEM in one way or another. And this week, Paul said the Centers for Disease Control and Prevention is hosting officials from the U.S., Canada and Mexico to discuss how to implement NIEM around specific public health and law enforcement areas.
As for the Suspicious Activity Reporting (SAR) standard, Paul said more than 200,000 law enforcement officers have been trained and another 200,000 will receive education later this year.
And now, Paul said, the government is expanding the SAR initiative to offer training to fire and emergency medical services personnel, 911 operators, and critical infrastructure owners and operators.
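The tagging approach Paul describes, applying standard labels so records become searchable across agencies, can be illustrated with a small sketch. The record fields and tag names below are hypothetical, not actual NIEM elements.

```python
# Minimal sketch of standards-based tagging making records searchable.
# Field names and tag vocabularies here are illustrative, not real NIEM elements.

records = [
    {"id": 1, "summary": "Suspicious package report",
     "tags": {"domain": "law-enforcement", "type": "SAR"}},
    {"id": 2, "summary": "Flu outbreak cluster",
     "tags": {"domain": "public-health", "type": "case-report"}},
    {"id": 3, "summary": "Unattended vehicle report",
     "tags": {"domain": "law-enforcement", "type": "SAR"}},
]

def search(records, **criteria):
    """Return records whose tags match every key=value criterion."""
    return [r for r in records
            if all(r["tags"].get(k) == v for k, v in criteria.items())]

sar_reports = search(records, domain="law-enforcement", type="SAR")
print([r["id"] for r in sar_reports])  # → [1, 3]
```

The point of a shared tag vocabulary is exactly this: once every agency labels records the same way, a single query works across all of them.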
Better technology needed
DIA's Shedd added technology also must play a big role.
He said analysts can't just pull the data they need or think they need all the time. The systems must push the data based on a set of standards and rules.
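The push model Shedd describes, in which standing rules route new data to analysts instead of analysts pulling everything, resembles a simple publish/subscribe broker. This is a minimal sketch; the class, analyst names and rule shapes are all hypothetical.

```python
# Sketch of rule-driven "push" dissemination: standing subscriptions route
# new items to analysts automatically, rather than analysts pulling data.
# All names and rule shapes are hypothetical.

from collections import defaultdict

class PushBroker:
    def __init__(self):
        self.subscriptions = []          # (analyst, predicate) pairs
        self.inbox = defaultdict(list)   # analyst -> items pushed to them

    def subscribe(self, analyst, predicate):
        """Register a standing rule: push any item matching predicate."""
        self.subscriptions.append((analyst, predicate))

    def publish(self, item):
        """Push the item only to analysts whose standing rules match it."""
        for analyst, predicate in self.subscriptions:
            if predicate(item):
                self.inbox[analyst].append(item)

broker = PushBroker()
broker.subscribe("analyst_a", lambda item: item.get("region") == "AFRICOM")
broker.publish({"region": "AFRICOM", "topic": "logistics"})
broker.publish({"region": "PACOM", "topic": "maritime"})
print(len(broker.inbox["analyst_a"]))  # → 1
```

The rules stand in for the "standards and rules" Shedd mentions: the analyst sees only the slice of the flood that matches an area of responsibility.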
Paul and members of the intelligence community are working on a new national strategy for secure information sharing that likely will address the data overload challenge. The draft is in the works, and a final document could come out in the next three to six months.
The national strategy also will highlight expectations in the post-WikiLeaks environment.
The intelligence community already has made some changes and is in the middle of making others, said retired Air Force Lt. Gen. James Clapper, the director of national intelligence.
He said among the Office of the Director of National Intelligence's top six priorities is the need to share and safeguard information. Clapper said WikiLeaks has caused the IC to do more in terms of auditing, monitoring and controlling movable media.
"We have to do more to both tag data and ensure we can properly identify people," he said, "so if we are sharing information, we are assured that they have the bona fides and that they are authorized to receive the information."
He said greater identity management and improved labeling, cataloging and tagging of data actually improves sharing across the intelligence community.
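The check Clapper describes, comparing data labels against a recipient's verified identity attributes before release, can be sketched as a simple attribute-based rule. The clearance levels and compartment names below are illustrative, not a real IC access model.

```python
# Sketch of the tag-plus-identity check Clapper describes: an item's labels
# are compared against a recipient's verified attributes before release.
# Clearance levels and compartment names are illustrative only.

CLEARANCE_ORDER = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

def may_release(item_labels, recipient_attrs):
    """Release only if the recipient's clearance dominates the item's
    classification and the recipient holds every required compartment."""
    if (CLEARANCE_ORDER[recipient_attrs["clearance"]]
            < CLEARANCE_ORDER[item_labels["classification"]]):
        return False
    return item_labels["compartments"] <= recipient_attrs["compartments"]

item = {"classification": "SECRET", "compartments": {"CT"}}
alice = {"clearance": "TOP SECRET", "compartments": {"CT", "HUMINT"}}
bob = {"clearance": "SECRET", "compartments": set()}
print(may_release(item, alice), may_release(item, bob))  # → True False
```

This is the inducement Clapper points to: when the labels on the data and the attributes of the recipient are both trustworthy, the release decision becomes mechanical, which makes sharing safer rather than riskier.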
"If you can be sure the information you are sharing is actually going to an authorized recipient that actually is an inducement to do more sharing," Clapper said. "We will, of course as we always do, install all the appropriate IT mouse traps to prevent a recurrence of WikiLeaks, but in the end our system is based on personal trust."
President Barack Obama signed an executive order last October to codify many of the changes made in the wake of the WikiLeaks disclosures. It also created new offices to oversee the move to secure sharing.
IC defending against insider threats
Clapper said the IC has varying degrees of capabilities to do the auditing and monitoring. He said the intelligence community will invest in new technologies to address the insider threat.
ODNI is developing one system to tag data and another to monitor employees. Clapper said both are works in progress.
Clapper said he expects the systems to be part of the IC's new IT architecture, which will promote integration and efficiency and make it easier to share data more broadly within the intelligence community.
Clapper said once employees tag data, the IC can label, account for and catalog it, which will make establishing a community of interest quicker and more efficient than it is now.
"If you know what the data is in question, you know where it is and you know with whom it can be shared and then you can account for it when it is, you are in a much better posture both in terms of security and from a sharing standpoint," he said.
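The community-of-interest idea Clapper outlines follows mechanically from tagging and cataloging: once every item carries a topic label and every user a profile of interests, grouping them is an index lookup. This sketch uses hypothetical item, topic and analyst names.

```python
# Sketch of building a "community of interest" from tagged, cataloged data.
# Once items carry topic tags and users carry interest profiles, the grouping
# is a simple lookup. All names here are hypothetical.

from collections import defaultdict

items = [
    {"id": "r1", "topic": "counterterrorism"},
    {"id": "r2", "topic": "cyber"},
    {"id": "r3", "topic": "counterterrorism"},
]
users = [
    {"name": "analyst_a", "interests": {"counterterrorism"}},
    {"name": "analyst_b", "interests": {"cyber", "counterterrorism"}},
]

# Catalog: topic -> item ids (the "label, account for and catalog" step).
catalog = defaultdict(list)
for item in items:
    catalog[item["topic"]].append(item["id"])

# Community of interest: topic -> members whose profile includes that topic.
community = {topic: [u["name"] for u in users if topic in u["interests"]]
             for topic in catalog}
print(community["counterterrorism"])  # → ['analyst_a', 'analyst_b']
```

This also illustrates Clapper's point about accountability: because the catalog records where each item is and the community records who can see it, every act of sharing is traceable.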
Clapper said the IC has made some progress, but it will take about five years to establish such a system.
Beyond securing and safeguarding data, Clapper said the continued integration of the IC, the development of standards especially around IT, ensuring privacy and embracing a common operating model and shared services are among his top priorities.
In the end, Clapper said the goal is to ensure the data meets the needs of the analysts but the sources are kept secure.