Shows & Panels
- The 2014 Big Picture on Cyber Security
- AFCEA Answers
- Ask the CIO
- Building the Hybrid Cloud
- Connected Government: How to Build and Procure Network Services for the Future
- Continuous Diagnostics and Mitigation: Discussion of Progress and Next Steps
- Federal Executive Forum
- Federal Tech Talk
- The Intersection: Where Technology Meets Transformation
- Maximizing ROI Through Data Center Consolidation
- Moving to the Cloud: What's the Best Approach for Me?
- Navigating Tough Choices in Government Cloud Computing
- The New Generation of Database
- Satellite Communications: Acquiring SATCOM in Tight Times
- Targeting Advanced Threats: Proven Methods from Detection through Remediation
- Transformative Technology: Desktop Virtualization in Government
- The Truth About IT Opex and Software Defined Networking
- Value of Health IT
Handling 'Big Data'
Tuesday - 10/18/2011, 8:27pm EDT
Let's outline the problem: combine data from mobile devices, RFID tags, aerial sensors, software logs, and social media, and the sheer volume can overwhelm a typical analyst.
Worse, much of that information resides in secure silos and proprietary data stores. The challenge for federal IT professionals is to derive deep insight from this proliferation of information.
GCE Federal has earned its stripes helping federal agencies with financial management.
As agencies generate ever larger volumes of data, GCE Federal has developed expertise in handling what is now called "Big Data."
President Ray Muslimani gives a good technical overview of a technology called Hadoop.
Hadoop originated in 2006 as an outgrowth of the open source Apache Project.
It can give you a way to manage terabytes of information. James Kobielus of Forrester writes that "Hadoop will be the nucleus of next-generation data warehouses."
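To make the idea concrete, here is a minimal sketch of the MapReduce model that Hadoop is built around: map each input record into key/value pairs, shuffle the pairs by key, then reduce each group. This is a single-process illustration of the programming model only, not the Hadoop API; the sample log lines are invented for the example.

```python
from collections import defaultdict

def map_phase(records):
    # Map step: emit a (word, 1) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    # Shuffle step: group intermediate values by key,
    # as Hadoop does between the map and reduce stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce step: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

# Hypothetical software-log records standing in for a large dataset.
logs = ["error disk full", "error network down", "disk replaced"]
counts = reduce_phase(shuffle_phase(map_phase(logs)))
print(counts["error"])  # 2
```

In a real Hadoop cluster the map and reduce steps run in parallel across many machines, which is what lets the same simple model scale to terabytes.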