Lessons for managing dashboards
Thursday - 11/10/2011, 6:45pm EST
Federal News Radio
Your agency has at least one dashboard keeping track of your performance. Brand Niemann, director and senior data scientist at Semantic Community, compiled his four lessons for keeping your dashboard on target. He writes about those lessons in AOL Government and shared them Thursday with In Depth with Francis Rose.
Maintain data quality
"Data science is where you look at data with a scientific method," Niemann said. "You are critical of its quality, its pedigree, its usefulness to people." That differs from what Data.gov currently does. Using the scientific method means putting the data out there with real method, analysis and interpretation behind it.
Use best practices in dashboard design
Thanks to a recent IBM report, Niemann had the opportunity to review all 11 dashboards employed by the federal government. He discovered that he had already recreated seven of the 11 using a tool called Spotfire, redesigning them to be more open, transparent and understandable.
"These kinds of things ought to be created with software that makes them more interoperable and more understandable and reusable by the public," he said.
Current dashboards are built with a variety of software and interfaces, making it difficult for users to share data easily across the different platforms.
Performance measures should reflect mission goals
Niemann was disappointed when he discovered that Performance.gov did not have performance data readily available. "All I really found was data for what the White House calls 'excess properties.' Those were the ones they're trying to sell off," he said. "I didn't see any real what you think of as performance of agencies data that one could do some analytics with."
He said Performance.gov creates a false expectation for visitors that agency performance data is available to analyze, and thereby fails to fulfill the site's perceived purpose.
Dashboard effectiveness depends on use
"All this data and functionality should be within four mouse clicks," Niemann said.
On the first click, visitors should see the data. "You don't see a description of the data. You don't see that you can download it and unzip it. You don't see that it's in a format that you have to convert," Niemann said. "Step one is you see the data right away."
On the second click, you should be able to filter and sort the data.
"Step three is that you can actually download it, get it back out readily and use it in your own favorite tool," Niemann said.
The final step allows you to share the data with other people. "In my case, you can share what you see on the web on mobile devices like an iPad," he said.
Niemann would like to see agency dashboards employ these four steps to make their data readily available. "You see. You sort and search. You download and you share and collaborate around," he said.
Seven of the 11 websites he recreated failed to do that.
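The four-click workflow Niemann describes — see, sort and filter, download, share — can be sketched in code. This is a minimal illustration using pandas rather than Spotfire (the tool Niemann used), and the agency names and measures in the sample feed are hypothetical, invented only to make the steps concrete:

```python
import io
import pandas as pd

# Hypothetical performance feed standing in for an agency dashboard's data.
csv_feed = io.StringIO(
    "agency,measure,score\n"
    "Interior,excess_properties_sold,42\n"
    "Energy,excess_properties_sold,17\n"
    "Interior,data_center_closures,5\n"
)

# Click 1: see the data right away -- no zip files, no format conversion.
df = pd.read_csv(csv_feed)
print(df)

# Click 2: filter and sort.
sold = df[df["measure"] == "excess_properties_sold"]
sold = sold.sort_values("score", ascending=False)

# Click 3: download it in a format your own favorite tool can read.
sold.to_csv("excess_properties.csv", index=False)

# Click 4: share -- a portable file others can open on their own devices.
shared = pd.read_csv("excess_properties.csv")
print(shared)
```

The point of the sketch is that each step works on open, machine-readable data; a dashboard that only renders pictures of its data supports none of the four clicks.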