Handling ‘Big Data’

GCE Federal President Ray Muslimani talks about data storage with host John Gilroy. October 18, 2011

October 18, 2011 — Big data doesn’t necessarily mean big headaches.

Let’s outline the problem: combine data from mobile devices, RFID, aerial sensing, software logs, and social media, and the sheer volume can overwhelm a typical analyst.

Furthermore, information can reside in secure silos and proprietary data stores. The challenge for federal IT professionals is to derive deep insights from this proliferation of information.

GCE Federal has earned its stripes helping federal agencies in financial areas.

As agencies generate ever larger volumes of data, GCE Federal has developed expertise in handling what is now called “Big Data.”

President Ray Muslimani gives a good technical overview of a technology called Hadoop.

Hadoop originated in 2006 as an open source Apache project.

It can give you a way to manage terabytes of information. James Kobielus from Forrester writes that “Hadoop will be the nucleus of next-generation data warehouses.”
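
As a rough illustration of the MapReduce style of processing Hadoop popularized, here is a minimal word-count sketch using Hadoop Streaming with two small Python scripts. It is not taken from the interview; the file names, input and output paths, and the streaming jar location are assumptions chosen for illustration and will vary by installation.

    # mapper.py -- minimal Hadoop Streaming mapper (illustrative sketch).
    # Reads raw text on stdin and emits "word<TAB>1" for every word it sees.
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print("%s\t1" % word)

    # reducer.py -- minimal Hadoop Streaming reducer (illustrative sketch).
    # Hadoop sorts mapper output by key, so identical words arrive together
    # and a running total per word is enough.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        if not line.strip():
            continue
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

    # Submitting the job (paths are hypothetical; the streaming jar location
    # depends on the Hadoop install):
    #   hadoop jar hadoop-streaming.jar \
    #       -input /data/raw_logs -output /data/word_counts \
    #       -mapper mapper.py -reducer reducer.py \
    #       -file mapper.py -file reducer.py

The pattern scales because Hadoop splits the input across the cluster’s data nodes and runs many mapper and reducer tasks in parallel, which is how a job written in a few dozen lines can work through terabytes of information.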
