Intelligence community cloud coming online in early 2013
Wednesday - 10/10/2012, 5:33am EDT
James Clapper, the director of National Intelligence, first announced the Intelligence Community Information Technology Enterprise (ICITE) a year ago at the annual GEOINT Symposium. Returning to the same conference a year later, he told attendees the shared IT infrastructure will reach initial operating capability in March 2013.
"The plan is centered on cloud computing and cloud storage and the security enhancements bound up in the bumper sticker slogan, 'tag the data, tag the people,'" he told the 2012 GEOINT symposium in Orlando, Fla., Tuesday. "If we execute this right, we'll save a lot of money, but maybe more importantly, the IC will be able to take intelligence integration to the next level. We'll transition from agency-centric IT in what I would charitably call a confederation to an enterprise model that shares both resources and data. This is something we've talked about for years, but we've never had the incentive to actually do it. Now we do have that incentive, and so we must do it."
There are several milestones planned along the way, but by 2018, Clapper said, the entire intelligence community will have moved to the new architecture.
"We've progressed from concept to design to development, and importantly, we've built our budgets around this integrated architecture, so we're putting our programmatic money where our mouths are," Clapper said. "As [Central Intelligence Agency Director] David Petraeus would say, and it's critical that he did, 'We're all in.'"
Common desktop across the IC
The CIA and the National Security Agency are building a secure cloud computing architecture for the entire IC. Meanwhile, the Defense Intelligence Agency and the National Geospatial-Intelligence Agency will work together to build an IC-wide common desktop. Clapper said it's likely to resemble the desktop computing environment NGA built for its recently opened campus in Springfield, Va. Letitia Long, NGA's director, also speaking at the GEOINT conference, said her agency already has started testing the new desktop environment. NGA and DIA plan to have 2,000 users plugged in by March 2013. The goal by March 2014 is 60,000 users.
"This enables all of us to log on from any computer anywhere in the community and get to our data and our apps," she said. "No more tunneling through networks or trying to find a computer that belongs to your own agency. Key also to us is the work that CIA and NSA are doing on the secure cloud to give us that common infrastructure. When we have that, it will enable us to rapidly scale our exploitation and processing capabilities and take advantage of all that is out there. We don't have to each build it all ourselves."
There are challenges involved in sharing IT resources between agencies, though, Long said. For one, agencies and their vendors are accustomed to "point-to-point" contracting relationships, in which one agency has a contract with one vendor to deliver a product or service.
"As DIA and NGA are working this common desktop environment, we have multiple contracts with our vendors, and as we try to put them together they have different terms. Some are better than others," she said. "It's a challenge, but I think it's workable."
New software licensing model needed
Another unanswered question is how the agencies will handle software licensing agreements in a shared, services-oriented technology environment that's built around interoperable data using open standards and modular apps.
"What's the compensation model when you talk about 10,000, 20,000, or 60,000 users? When an app goes viral and you have a license for 10,000, do you cut people off? Do I have to buy an unlimited license? That's unaffordable in today's environment," she said. "We need to work with our industry partners to figure out how we come up with a compensation model that works for both of us."
Some of those talks already are happening, Long said. In the target IT environment, the agency envisions that 75 percent of the apps it uses will be created by industry. She said NGA just held its first industry day for applications, where the agency proposed a new compensation model she said is based on commercial business practices.
"We would want developers creating apps speculatively, and then they'd be compensated based on the rating and the usage," she said. "We have incorporated a rating schema and business analytics into our app store, so we are ready to do that. We had very good conversations, and we're very helpful that we'll be able to implement such a compensation model."
Long said one of the benefits of the common IT architecture is that other agencies will be able to take a self-service approach to NGA's geospatial data, including raw data in some cases. To lay the groundwork for that, the agency is aggressively metatagging its new and existing data so that it's easy to find. And she said NGA has cleared out duplicate versions of imagery and other content so that there's now one authoritative source of each piece of data.
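The two pieces of groundwork Long describes, metatagging for discoverability and deduplication down to one authoritative copy, can be sketched together as a content-addressed catalog. The code below is an illustrative sketch only; the field names and tagging scheme are hypothetical, not NGA's.

```python
# Sketch of metatagging plus deduplication: hash the content so
# duplicates collapse onto one authoritative record, and merge
# any new tags into that record. Field names are hypothetical.
import hashlib

catalog = {}  # content hash -> authoritative record

def ingest(content: bytes, tags: dict) -> str:
    """Register content once; on a duplicate, enrich the
    existing record's tags instead of storing a second copy."""
    key = hashlib.sha256(content).hexdigest()
    if key in catalog:
        catalog[key]["tags"].update(tags)
    else:
        catalog[key] = {"content": content, "tags": dict(tags)}
    return key

k1 = ingest(b"image-bytes", {"region": "springfield", "type": "imagery"})
k2 = ingest(b"image-bytes", {"sensor": "eo"})  # same content, new tags
print(k1 == k2)      # True: both ingests resolve to one record
print(len(catalog))  # 1
```

Keying on a content hash is what guarantees "one authoritative source": two copies of the same imagery can never coexist in the catalog, while every tag anyone attaches makes that single copy easier to find.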
She said once the data is ready to be incorporated into a service-oriented architecture and accessible across the entire intelligence community, users will be able to serve themselves rather than asking NGA for help.
"You can access our content, you can tailor it, you can do what you need to do. What I ask in return is that when you do make enhancements or build a product or add to it that you share it back with us," she said. "That will enable us to learn, that will enable us to provide better service in the future. And in return, we will host it and serve it back out to the entire community. Exposing our content to tens of thousands of users, we don't know what will happen. It will be things that are un-thought of by NGA today. We've seen that happen."
Common geospatial data standards
Long said NGA also wants to host geospatial data generated by other agencies. First though, the data needs to conform to a common standard. She said as the functional manager for geospatial intelligence, NGA has the legal authority to create and publish those standards, and it's done so.
But it hasn't done a good job of enforcing their use by other agencies until recently. NGA has just finished helping the Air Force assess its compliance with the standards. A similar review is underway in the Army, and CIA is next in line.
"The feedback we got from Air Force was, 'Thanks. We found programs we didn't know we had, we found folks who were working on the same thing and didn't know each other, and by the way we found a few things we weren't in compliance with,'" she said. "As we've completed the first one, the rest of the community is saying they need to be a part of that too."