Shows & Panels
- AFCEA Answers
- Ask the CIO
- The Big Data Dilemma
- Carrying On with Continuity of Operations
- Connected Government
- Constituent Servicing
- Continuous Monitoring: Tools and Techniques for Trustworthy Government IT
- The Cyber Imperative
- Cyber Solutions for 2013 and Beyond
- The Data Privacy Imperative: Safeguarding Sensitive Data
- Expert Voices
- Federal Executive Forum
- Federal IT Challenge
- Federal Tech Talk
- Mission-critical Apps in the Cloud
- The Modern Federal Threat Landscape
- The Path from Legacy Systems
- The Real Deal on Digital Government
- The Reality of Continuous Monitoring... Is Your Agency Secure?
- Veterans in Private Sector: Making the Transition
Handling 'Big Data'
Tuesday - 10/18/2011, 8:27pm EDT
Let's outline the problem: combine data from mobile devices, RFID tags, aerial sensors, software logs, and social media, and the sheer volume can overwhelm a typical analyst.
Much of that information also resides in secure silos and proprietary data stores. The challenge for federal IT professionals is to derive deep insights from this proliferation of information.
GCE Federal has earned its stripes helping federal agencies with financial management.
As agencies generate ever larger volumes of data, the company has developed expertise in handling what is now called "Big Data."
President Ray Muslimani gives a good technical overview of a technology called Hadoop.
Hadoop originated in 2006 as an open source project under the Apache Software Foundation.
It gives organizations a way to manage terabytes of information. James Kobielus of Forrester writes that "Hadoop will be the nucleus of next-generation data warehouses."
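To give a sense of how Hadoop spreads that kind of work across a cluster, here is a minimal sketch of the canonical MapReduce "word count" job written against the standard Apache Hadoop Java API. It is an illustration only, not anything from GCE Federal's work; the input and output paths are placeholders, and the exact Job setup call can differ slightly between Hadoop releases.

// Minimal sketch of the classic Hadoop MapReduce word-count job.
// Assumes a standard Apache Hadoop installation; paths are placeholders.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in its slice of the input.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the counts for each word across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory (placeholder)
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory (placeholder)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The point of the pattern is that the map tasks run in parallel on the nodes holding each block of input in HDFS, and the reduce tasks combine the partial results, which is how a job like this can scale from a sample file to terabytes of log or sensor data.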