NIST taking on big data problem
Friday - 2/3/2012, 2:37pm EST
Federal News Radio
The National Institute of Standards and Technology is leading an effort to make better sense of the flood of data many organizations in the public and private sector face.
NIST is hosting a conference with other agencies and members of academia and the computer industry to find patterns among the billions of documents created by federal agencies.
At the annual Text Retrieval Conference (TREC), public and private-sector organizations are hoping to use the patterns to strengthen digital infrastructures and develop new research strategies.
TREC encourages friendly competition in creating algorithms to organize and interpret massive amounts of data, ranging from emails and tweets to medical records.
Ellen Voorhees is the project manager of TREC at the National Institute of Standards and Technology. She joined The Federal Drive with Tom Temin to discuss its goals and methods.
"TREC has been on the leading edge of being able to handle these large amounts of unstructured data," Voorhees said. "We keep these different focus areas and try to keep ourselves at the front."
The effort is not just for the government; commercial companies are also finding benefits. Industry participants include Microsoft and IBM; in fact, IBM's famous Watson computer grew out of research originally conducted through TREC.
NIST will hold the conference in November, but organizations hoping to join the effort to tackle unstructured data must submit their applications by the end of February.