Shows & Panels
- The 2014 Big Picture on Cyber Security
- AFCEA Answers
- Ask the CIO
- Building the Hybrid Cloud
- Connected Government: How to Build and Procure Network Services for the Future
- Continuous Diagnostics and Mitigation: Discussion of Progress and Next Steps
- Federal Executive Forum
- Federal Tech Talk
- The Intersection: Where Technology Meets Transformation
- Maximizing ROI Through Data Center Consolidation
- Moving to the Cloud: What's the Best Approach for Me?
- Navigating Tough Choices in Government Cloud Computing
- The New Generation of Database
- Satellite Communications: Acquiring SATCOM in Tight Times
- Targeting Advanced Threats: Proven Methods from Detection through Remediation
- Transformative Technology: Desktop Virtualization in Government
- The Truth About IT Opex and Software Defined Networking
- Value of Health IT
Defining and handling big data
Tuesday - 11/13/2012, 11:17pm EST
Many companies simply dust off their existing offerings and re-brand them as tools for handling large data sets.
MarkLogic is one of the few companies with more than a decade of experience handling data sets so large that standard tools can't process them.
Today's interview features two professionals from MarkLogic: Jon Bakke, vice president of Worldwide Consulting, and Rick Miller, director of Federal, State, and Local Sales.
They give a balanced presentation of how MarkLogic defines big data and how the company can assist federal IT professionals who face the challenge of integrating widely divergent data sets.
During the interview you will learn about several agencies, including the Federal Aviation Administration, that have used MarkLogic's server to handle data sets consisting of petabytes of information.