New data analytics tool gives Postal Service IG head start on cases
Thursday - 2/14/2013, 10:17pm EST
Investigators and auditors in the Postal Service's Office of Inspector General didn't jump at the chance to use new tools for analyzing data. They were skeptical that the new approach would really make a difference, and worried it would just waste their time.
But once the Counter Measures and Performance Evaluation (CAPE) team in the OIG developed the first dashboard to help investigators visualize the data more easily, they overcame that initial resistance.
Bryan Jones, the director of the CAPE team, said the IG's office had to understand the mind of an investigator and what would be compelling to them.
"It's a web-based interface, a map of the U.S. with hot spots," Jones said. "These are all hyperlinks ... If you are an investigator responsible for a certain area, your eyes are drawn to that area, there are circles that are red or green, depending on what's going on there; there are hyperlinks so you can drill into the details. Once you get behind the map, then the power of the analytics is right in front of you. You have a link analysis tool. It may link one contractor with another contractor. There are copies of invoices. There's risk scores that are assigned to whatever it is we are measuring. We are able to model every single contract or every single transaction or every single whatever it is that's being investigated. In the past, you'd have to do a statistical sample or you may have to wait until someone calls to look for something. It puts a lot of information in front of the investigator."
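The hotspot logic Jones describes — every transaction modeled and scored, with each region on the map turning red or green based on aggregate risk — could be sketched roughly as follows. The threshold, scoring rule, and sample data here are illustrative assumptions, not the OIG's actual model.

```python
# Rough sketch of the dashboard's red/green hotspot logic.
# RISK_THRESHOLD and the averaging rule are assumptions for
# illustration; the OIG's real risk models are not public.

RISK_THRESHOLD = 0.7  # assumed cutoff separating red from green hot spots

def score_region(transactions):
    """Aggregate a region's risk as the mean of its transaction scores."""
    if not transactions:
        return 0.0
    return sum(t["risk_score"] for t in transactions) / len(transactions)

def hotspot_color(transactions, threshold=RISK_THRESHOLD):
    """Red if the region's aggregate risk exceeds the threshold, else green."""
    return "red" if score_region(transactions) > threshold else "green"

# Hypothetical data: every transaction carries a modeled risk score.
regions = {
    "Northeast": [{"risk_score": 0.9}, {"risk_score": 0.8}],
    "Midwest":   [{"risk_score": 0.2}, {"risk_score": 0.3}],
}

for name, txns in regions.items():
    print(name, hotspot_color(txns))
```

An investigator's drill-down would then hyperlink from each colored region to the underlying transactions, invoices, and link-analysis views, rather than starting from a statistical sample.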
No longer starting from scratch
Jones said the investigator can then use his or her experience to decide how to proceed with the research and determine whether there is a problem.
"Hopefully, their starting point in there is a little closer to the crime than having to start from scratch," he said. "We've now developed this same interface for all four fraud models and, as we've done this, our user adoption rates have really increased because it's easy to learn. You don't have to spend a lot of time in training anymore. People seem to be so much more savvy with the computer. As long as there is a hyperlink, people can click on it and then behind that there is the data that means something to you. Then you are not focused on training, you are mostly focused on marketing and communications so people know it's out there. And then our user adoption — another reason it's grown — the investigators that see value in it are sharing it with other investigators. So you have a grassroots effort so it spreads in the field because, believe me, if it's not worth it, they will not come back. And if they like what they see, they will use it and tell others."
The CAPE team brought investigators in from the beginning, including a subject-matter expert in the development phase.
"We started without an off-the-shelf product ... Being that we had a very practical start and we didn't have a very big budget, we were guinea pigs and said 'We need you to go test and prove this out,'" Jones said. "Instead of investing the money we had into a software product, we invested into data modelers. We then asked ourselves, 'What do we already own?' We can find servers that aren't being utilized. We know we have an operating system and an Oracle database that we could potentially use. Can our modeler then develop a model and deploy it in an environment we already own?"
The answer to that question was yes. Now, the OIG's office is investing money into the program, and the CAPE team bought a commercial software tool to run in its environment.
Expanding tools to auditors
CAPE is now expanding the use of analytical tools to the audit side of the OIG.
Jones said the audit group has had a series of risk models in place, but some were older and didn't have the latest data sets.
"We are able to come in and look at this freshly. What new data could we add to these models? What is the business climate now? Do we need to ask a different question to be answered? And can we get you down to the invoice? Can we get you down to the transaction at the retail level? Can we show you that piece of mail?" he said. "We are able to get insight down to that level of transaction. We are taking that suite of tools that were already in existence on the audit side and working with our audit team. We are trying to enhance those and make those more valuable to them and to the Postal Service. We've collaborated on a couple of models directly with the Postal Service and they are seeing value there as well. We've reached a tipping point on the audit side too."