White House, agencies commit $200M to solving 'big data' quandary
Thursday - 3/29/2012, 8:53pm EDT
The push is led by a joint solicitation from the National Science Foundation and the National Institutes of Health to develop the core technologies needed to rein in big data. All told, six federal departments and agencies will take part in the program, committing more than $200 million in research-and-development investments.
The Big Data initiative includes NSF, NIH, the U.S. Geological Survey, the Defense Department, DARPA and the Energy Department.
"Obviously, it's not the data per se that create value," said John Holdren, the head of the White House Office of Science and Technology Policy, at an event Thursday sponsored by the American Association for the Advancement of Science. "But what really matters is our ability to derive from them new insights, to recognize relationships, to make increasingly accurate predictions. Our ability, that is, to move from data to knowledge to action."
Holdren was joined onstage by officials from all of the participating agencies at the event, which was also webcast live.
Anchoring the new effort is the joint solicitation to help develop the "core techniques and technologies" required to manage the government's ever-increasing streams of complex data sets.
Meanwhile, a number of agencies also announced specific big data programs.
NSF: In addition to that cross-agency solicitation, NSF is providing $10 million in funding to the University of California-Berkeley to undertake a "complete rethinking of data analysis," NSF Director Subra Suresh said, to "develop new ways to turn data into knowledge and insight."
NIH: Along with NIH's role in the joint solicitation, the agency will embark on separate projects as well, NIH Director Francis Collins said.
The National Human Genome Research Institute has formed a collaboration with Amazon's cloud services, agreeing to store genome-sequencing data from the 1000 Genomes Project — about 200 terabytes of data, or 16 million file cabinets — on Amazon's EC2.
USGS: Summing up the need for the new investments, Marcia McNutt, the head of USGS, said the agency is "in danger of drowning in data, while starving for understanding."
However, the John Wesley Powell Center for Analysis and Synthesis in Fort Collins, Colo. is bucking that trend, McNutt added. The center announced the latest crop of awardees under a grant program for designing new tools to tackle huge sets of data.
DoD: The Defense Department is "placing a big bet on big data" to the tune of a $250 million investment, said Zach Lemnios, the assistant secretary of defense for research and engineering.
DoD's end goal is to harness its massive amount of data toward the development of completely autonomous systems, ones "that understand and interpret the real world with computer speed, computer precision and human agility," Lemnios said.
DARPA: DARPA plans to invest about $25 million each year over the next four years through its new XDATA program. The program aims to develop new software tools for analyzing large volumes of data, including unstructured data such as text documents, emails and social media.
Energy: Energy will provide $25 million to stand up a new institute specializing in data management and analysis.
White House officials also cited a role for the private sector and academia.
"We also want to challenge industry, research universities and non-profits to join with the administration to make the most of the opportunities created by Big Data," Tom Kalil, OSTP's deputy director for policy, wrote on the OSTP blog. "Clearly, the government can't do this on its own. We need what the President calls an 'all hands on deck' effort."
A December 2010 report from the President's Council of Advisors on Science and Technology (PCAST) identified an under-investment in big data and helped jump-start the push for solutions.