NSF lends supercomputing help to oil spill
Thursday - 6/3/2010, 9:40am EDT
Senior Internet Editor
The National Science Foundation is making it possible to use the theoretical to deal with the reality of the largest oil spill in U.S. history.
Irene Qualters, program director for the National Science Foundation's Blue Waters Program, told Federal News Radio the key is using a supercomputer at the Texas Advanced Computing Center to model the movement of the oil.
"It predicts up to 72 hours in advance the exact movement and the volume as it heads into these fragile areas off the coast. And so it will be able to help those who are trying to mitigate, and plan for mitigating, the effect of the oil. It will help them focus their resources and be able to expect how much is coming and precisely where, because it's a pretty big coast."
Qualters said NSF is working with NOAA and DHS, exchanging data and other information in both directions. Satellite data tracking the oil comes in from the National Weather Service every six hours, said Qualters. The model is run and the information is fed back out to cleanup efforts.
With the start of hurricane season, Qualters said the program has begun just in time. "That's the strength of these researchers and the model they're using. It is particularly adept at storm surge and the effect within coastal regions."
With a three day head start, getting the word out should help, said Qualters.
"The results of this will be used both by NOAA and the Department of Homeland Security, who have existing websites. And they will use this to better predict where the oil is going to end up and in what volume. Because, I think from the early stages of the oil spill, it's been clear we don't have a very good idea of where it's going to be and when, and how. For example, is it going to be tar balls? What are the other attributes?"
Other areas of NSF research will look at the deepwater plume and at what will break up the oil effectively, according to Qualters. "This particular research is very important and hopefully will help those that are being directly affected. But I think there are some longer-term issues associated with the deep water and how best to break up the oil. There are independent efforts going on to do the necessary research quickly to help those efforts too."
Qualters noted the work being done now will last long after the cleanup is complete, since the model being run now builds on research done during Hurricane Katrina. "Hopefully as we go into this hurricane season, we're benefiting both from the research that's going on right now, but also we're learning from disasters that have occurred previously."