White House, agencies commit $200M to solving 'big data' quandary
Thursday - 3/29/2012, 8:53pm EDT
The push is led by a joint solicitation, from the National Science Foundation and the National Institutes of Health, for developing the core technologies needed to rein in big data. All told, six federal departments and agencies will take part in the program — committing more than $200 million in research-and-development investments.
The Big Data initiative includes the National Science Foundation, the National Institutes of Health, the U.S. Geological Survey, the Defense Department, DARPA and the Energy Department.
"Obviously, it's not the data per se that create value," said John Holdren, the head of the White House Office of Science and Technology Policy, at an event Thursday sponsored by the American Association for the Advancement of Science. "But what really matters is our ability to derive from them new insights, to recognize relationships, to make increasingly accurate predictions. Our ability, that is, to move from data to knowledge to action."
Holdren was joined onstage by officials from all of the participating agencies at the event, which was also webcast live.
The joint solicitation to help develop the "core techniques and technologies" required to manage the government's ever-increasing streams of complex data sets anchors the new effort.
Meanwhile, a number of agencies also announced specific big data programs.
NSF: In addition to that cross-agency solicitation, NSF is providing $10 million in funding to the University of California-Berkeley to undertake a "complete rethinking of data analysis," NSF Director Subra Suresh said, to "develop new ways to turn data into knowledge and insight."
NIH: Along with NIH's role in the joint solicitation, the agency will embark on separate projects as well, NIH Director Francis Collins said.
The National Human Genome Research Institute has formed a collaboration with Amazon's cloud services, agreeing to store genome-sequencing data from the 1000 Genomes Project — about 200 terabytes of data, or 16 million file cabinets — on Amazon's EC2.
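The scale comparison above can be sanity-checked with quick arithmetic. A minimal sketch (assuming decimal terabytes; the per-cabinet figure is purely illustrative, not from the article):

```python
# Rough check of the scale cited for the 1000 Genomes data set:
# ~200 TB of sequencing data, likened to 16 million file cabinets.
total_bytes = 200 * 10**12        # 200 TB, decimal bytes (assumption)
cabinets = 16_000_000             # "16 million file cabinets"

bytes_per_cabinet = total_bytes / cabinets
print(f"{bytes_per_cabinet / 1e6:.1f} MB of data per file cabinet")
```

That works out to 12.5 MB per cabinet, on the order of a cabinet's worth of plain-text documents, so the comparison is roughly consistent.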
USGS: Summing up the need for the new investments, Marcia McNutt, the head of USGS, said the agency is "in danger of drowning in data, while starving for understanding."
However, the John Wesley Powell Center for Analysis and Synthesis in Fort Collins, Colo., is bucking that trend, McNutt added. The center announced the latest crop of awardees under a grant program for designing new tools to tackle huge sets of data.
DoD: The Defense Department is "placing a big bet on big data" to the tune of a $250 million investment, said Zach Lemnios, the assistant secretary of defense for research and engineering.
DoD's end goal is to harness its massive amount of data toward the development of completely autonomous systems, ones "that understand and interpret the real world with computer speed, computer precision and human agility," Lemnios said.
DARPA: DARPA plans to invest about $25 million each year over the next four years, through its new XDATA program. The program aims to develop new software tools for analyzing large volumes of data, including unstructured data, such as text documents, emails and social media.
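The article does not detail XDATA's tools, but a toy sketch of the kind of unstructured-text analysis such software targets — here, simple word-frequency counting over free-form documents — might look like this (the function name and sample data are hypothetical, not from DARPA):

```python
import re
from collections import Counter

def term_frequencies(docs):
    """Count word occurrences across a collection of unstructured text documents."""
    counts = Counter()
    for doc in docs:
        # Lowercase and extract word-like tokens, ignoring punctuation
        counts.update(re.findall(r"[a-z']+", doc.lower()))
    return counts

docs = [
    "Big data needs new tools.",
    "New software tools for big data analysis.",
]
print(term_frequencies(docs).most_common(3))
```

Real large-scale systems distribute this kind of counting across many machines, but the core operation — turning raw text into structured, queryable statistics — is the same.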
Energy: Energy will provide $25 million to stand up a new institute specializing in data management and analysis.
White House officials also cited a role for the private sector and academia.
"We also want to challenge industry, research universities and non-profits to join with the administration to make the most of the opportunities created by Big Data," Tom Kalil, OSTP's deputy director for policy, wrote on the OSTP blog. "Clearly, the government can't do this on its own. We need what the President calls an 'all hands on deck' effort."
A December 2010 report from the President's Council of Advisors on Science and Technology (PCAST) identified an under-investment in big data and helped jump-start the push for solutions.