Shows & Panels
- AFCEA Answers
- Ask the CIO
- The Big Data Dilemma
- Carrying On with Continuity of Operations
- Connected Government
- Constituent Servicing
- Continuous Monitoring: Tools and Techniques for Trustworthy Government IT
- The Cyber Imperative
- Cyber Solutions for 2013 and Beyond
- Expert Voices
- Federal Executive Forum
- Federal IT Challenge
- Federal Tech Talk
- Mission-critical Apps in the Cloud
- The Path from Legacy Systems
- The Real Deal on Digital Government
- The Reality of Continuous Monitoring... Is Your Agency Secure?
- Veterans in Private Sector: Making the Transition
Government IT budgets are shrinking, and real-time, informed decisions are needed. Now more than ever, the federal government needs to track and understand the data terrain to bridge the gap from data to decision. Listen each week to Federal News Radio for The Big Data Report, brought to you by MarkLogic.
How is the government using big data currently?
USAID Chief Information Officer Jerry Horton said the agency is now linking its procurement and financial systems, among others, to take advantage of the large amounts of data it produces.
Big data enthusiasts from government, industry and academia are getting their hands dirty. The National Institute of Standards and Technology and the National Science Foundation recently held a two-day workshop to explore the technologies needed to collect and analyze big data. Attendees also examined how big data can enhance areas like science, health and security. The government announced in March its plans to invest $200 million in the growing field.
The Office of the National Coordinator for Health IT is putting its money where its mouth is when it comes to big data.
It's offering a $75,000 prize for the development of an application that mashes up personal health data with larger information sets. The goal? Making big data more beneficial for patients.
Entries are due September 5th.
Participants in the Health 2.0 Boston Big Data Code-a-Thon were challenged to create applications that turned large amounts of health data into usable information.
The winner - the "No Sleep Kills" website, which teaches people about the dangers of not getting enough sleep.
Developers used data from multiple sources including the Centers for Disease Control and Prevention and the Centers for Medicare and Medicaid Services to create the site.
The amount of data in the world doubles every 18 months. Now, federal agencies are figuring out how to manage their portion of that big data. PC World says IT managers need to remember that useful information can come from anywhere, including sources that may have been pushed aside in the past. It also reminds organizations that hiring the right people is key to turning everyday data into usable knowledge.
Government agencies will create enough data in the next two years to fill 20 million filing cabinets, according to a recent MeriTalk survey of federal CIOs and IT managers. Sixty percent of civilian agency officials and more than 40 percent of Defense and intelligence officials surveyed said they are now learning how big data initiatives can help solve this problem. Those surveyed said content storage and personnel issues are among their biggest challenges to using big data effectively.
The TechAmerica Foundation has announced the formation of a Big Data Commission. The group will be made up of industry members interested in tackling the most pertinent big data questions, such as how government agencies can secure such large volumes of information and how big data can be used to make intelligent decisions. TechAmerica is looking for commissioners to head the new group and is accepting applications on its website through May 14.
Federal Chief Technology Officer Todd Park is taking on the issue of big data - especially as it relates to the health care field. When asked during a recent Tweet-Up about the biggest barriers to big data, Park said the key is making information liquid and accessible while protecting privacy - which, he said, is doable. Park also encouraged agencies to make sure they release data in machine-readable formats. He pointed to healthdata.gov as a good example.