Shows & Panels
- The 2014 Big Picture on Cyber Security
- AFCEA Answers
- Ask the CIO
- Connected Government
- Consolidating Mission-critical Systems
- Constituent Servicing
- The Data Privacy Imperative: Safeguarding Sensitive Data
- Eliminating the Pitfalls: Steps to Virtualization in Government
- Federal Executive Forum
- Federal Tech Talk
- Government Cloud Brokerage: Who, What, When, Where, Why?
- Government Mobility
- The Intersection: Where Technology Meets Transformation
- Maximizing ROI Through Data Center Consolidation
- Mobile Device Management
- The Modern Federal Threat Landscape
- Moving to the Cloud: What's the Best Approach for Me?
- Navigating Tough Choices in Government Cloud Computing
- Satellite Communications: Acquiring SATCOM in Tight Times
- Transformative Technology: Desktop Virtualization in Government
- Understanding the Intersection of Customer Service and Security in the Cloud
Defining and handling big data
Tuesday - 11/13/2012, 11:17pm EST
Many companies dust off their existing offerings and simply re-brand them as tools for handling large data sets.
MarkLogic is one of the few companies with more than a decade of experience handling data sets so large that standard tools can't process them.
Today's interview is with two professionals from MarkLogic: Jon Bakke, vice president of Worldwide Consulting, and Rick Miller, director of Federal, State, and Local Sales.
They give a balanced presentation of how MarkLogic defines big data and how their company can assist federal IT professionals challenged with integrating widely divergent data sets.
During the interview you will learn about several agencies, including the Federal Aviation Administration, that have used MarkLogic's server to handle data sets consisting of petabytes of information.