White House, agencies commit $200M to solving ‘big data’ quandary

Federal technology leaders unveiled an initiative Thursday to develop better ways of harnessing the government’s growing volume of increasingly complicated data sets — or, more simply, big data.

The push is led by a joint solicitation, from the National Science Foundation and the National Institutes of Health, for developing the core technologies needed to rein in big data. All told, six federal departments and agencies will take part in the program — committing more than $200 million in research-and-development investments.

The Big Data initiative includes the following departments and agencies:

  • National Science Foundation
  • National Institutes of Health
  • Energy Department
  • Defense Department
  • Defense Advanced Research Projects Agency
  • U.S. Geological Survey

“Obviously, it’s not the data per se that create value,” said John Holdren, the head of the White House Office of Science and Technology Policy, at an event Thursday sponsored by the American Association for the Advancement of Science. “But what really matters is our ability to derive from them new insights, to recognize relationships, to make increasingly accurate predictions. Our ability, that is, to move from data to knowledge to action.”

Holdren was joined onstage by officials from all of the participating agencies at the event, which was also webcast live.

The joint solicitation to help develop the “core techniques and technologies” required to manage the government’s ever-increasing streams of complex data sets anchors the new effort.

Meanwhile, a number of agencies also announced specific big data programs.

NSF: In addition to that cross-agency solicitation, NSF is providing $10 million in funding to the University of California-Berkeley to undertake a “complete rethinking of data analysis,” NSF Director Subra Suresh said, to “develop new ways to turn data into knowledge and insight.”

NIH: Along with NIH’s role in the joint solicitation, the agency will embark on separate projects as well, NIH Director Francis Collins said.

The National Human Genome Research Institute has formed a collaboration with Amazon’s cloud services, agreeing to store genome-sequencing data from the 1000 Genomes Project — about 200 terabytes of data, or 16 million file cabinets — on Amazon’s EC2.

USGS: Summing up the need for the new investments, Marcia McNutt, the head of USGS, said the agency is “in danger of drowning in data, while starving for understanding.”

However, the John Wesley Powell Center for Analysis and Synthesis in Fort Collins, Colo., is bucking that trend, McNutt added. The center announced the latest crop of awardees under a grant program for designing new tools to tackle huge sets of data.

DoD: The Defense Department is “placing a big bet on big data” to the tune of a $250 million investment, said Zach Lemnios, the assistant secretary of defense for research and engineering.

DoD’s end goal is to harness its massive amount of data toward the development of completely autonomous systems, ones “that understand and interpret the real world with computer speed, computer precision and human agility,” Lemnios said.

DARPA: DARPA plans to invest about $25 million each year over the next four years through its new XDATA program. The program aims to develop new software tools for analyzing large volumes of data, including unstructured data, such as text documents, emails and social media.

Energy: Energy will provide $25 million to stand up a new institute specializing in data management and analysis.

White House officials also cited a role for the private sector and academia.

“We also want to challenge industry, research universities and non-profits to join with the administration to make the most of the opportunities created by Big Data,” Tom Kalil, OSTP’s deputy director for policy, wrote on the OSTP blog. “Clearly, the government can’t do this on its own. We need what the President calls an ‘all hands on deck’ effort.”

A December 2010 report from the President’s Council of Advisors on Science and Technology (PCAST) identified an under-investment in big data and helped jump-start the push for solutions.


Copyright © 2024 Federal News Network. All rights reserved.
