Shows & Panels
- Accelerate and Streamline for Better Customer Service
- Ask the CIO
- The Big Data Dilemma
- Carrying On with Continuity of Operations
- Client Virtualization Solutions
- Data Protection in a Virtual World
- Expert Voices
- Federal Executive Forum
- Federal IT Challenge
- Federal Tech Talk
- Feds in the Cloud
- Health IT: A Policy Change Agent
- Improving Healthcare Outcomes through IT Policy
- IT Innovation in the New Era of Government
- Making Dollars And Sense Out of Data Center Consolidation
- Navigating the Private Cloud
- One Step to the Cloud, Two Steps Toward Innovation
- Path to FDCCI Compliance
- Take Command of Your Mobility Initiative
Search Tags: big data
Mark Weber, NetApp's president of U.S. public sector, spoke to The Federal Drive with Tom Temin and Emily Kopp about a MeriTalk study his company sponsored, The Big Data Gap.
Assistant Secretary of Defense for Research and Engineering Zachary Lemnios says $60 million in new solicitations will be aimed at everything from autonomous systems to more natural interactions between machines and people.
On the In Depth show blog, you can listen to the interviews, find more information about each day's guests, and follow links to additional resources.
The Defense Department aims to spend $60 million in new procurements to sort out its data deluge. The Pentagon already spends $250 million annually on research projects under the heading of "big data."
The Navy is in the early stages of trying to figure out how to move from a net-centric view of its information systems to one that focuses on the data itself. The service is looking to the experiences of the intelligence community to improve data tagging and data sharing.
Cloudera CEO Mike Olson joins host John Gilroy to talk about how his company can help you with big data.
February 14, 2012
Tags: technology, open source software, Cloudera, Mike Olson, Yahoo, Google, Facebook, HBase, Hive, Sqoop, Flume, Oozie, cloud computing, cybersecurity, geospatial data, emergency response, bioinformatics, John Gilroy, Federal Tech Talk
The National Institute of Standards and Technology is sponsoring an upcoming conference to bring together industry and agencies to figure out ways to improve how they use unstructured data.
Host John Gilroy is joined by Oracle Group Vice President and Chief Technologist Peter Doolan to discuss new "big data" software developed by the company.
January 31, 2012
Tags: technology, Oracle, Peter Doolan, federal IT, open source software, license-based software, Ethernet, InfiniBand, John Gilroy, Federal Tech Talk, information sharing, Gov 2.0, Big Data appliance
October 20th, 2010 at 11 AM
The application of knowledge discovery within the cloud is immensely powerful, but it is not built in. We are collectively moving past the question of "What is cloud computing?" and swiftly toward "How does the cloud enable advanced analysis of massive volumes of data?" With industry and government leveraging multiple clouds, how do we successfully share and search large collections of data across systems, departments, and geographies? Organizations will continue to discuss and better understand the analytic power and economies of cloud computing for data storage, sharing, and management; but we are quickly discovering that creating knowledge from data is more than a discussion of technology. It is a discussion of what can be accomplished when massive data and cloud computing efficiencies combine to make advanced analysis and innovation possible.
Tags: technology, Booz Allen Distinguished Speaker Series, cloud, cloud computing, Michael Byrne, Jeff Jonas, David Mihelcic, DISA, Chris Nissen, Mike Olson, Cloudera, MITRE, IBM, FCC, massive data, data management, analytics, Chris Kelly, data models, IT
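The large-scale analysis the panel describes is commonly built on the map/reduce pattern popularized by Hadoop (the platform Cloudera, mentioned above, distributes): independent "map" tasks process shards of data in parallel, and a "reduce" step combines their results. The sketch below is a purely illustrative, single-process Python analogy of that pattern, not code drawn from any of the events or companies covered here.

```python
from collections import Counter
from itertools import chain

# Toy illustration of the map/shuffle/reduce pattern that frameworks
# like Hadoop apply across many machines. Here each "shard" stands in
# for one node's slice of a massive data set.

def map_phase(shard):
    """Emit (word, 1) pairs for one shard of text."""
    return [(word.lower(), 1) for word in shard.split()]

def reduce_phase(pairs):
    """Sum the counts for each word across all shards."""
    totals = Counter()
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

shards = ["big data big analysis", "cloud data"]
mapped = chain.from_iterable(map_phase(s) for s in shards)
counts = reduce_phase(mapped)
# counts == {"big": 2, "data": 2, "analysis": 1, "cloud": 1}
```

Because each map task touches only its own shard, the same logic scales from one process to thousands of nodes; the hard problems the panel raises (sharing and searching data across systems and geographies) live mostly in the shuffle and storage layers that sit between these two functions.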