Agencies taking the 'Burger King' approach to strategic reviews
Friday - 6/20/2014, 4:04am EDT
Like the fast food chain Burger King's motto, agencies are having it their way when developing their strategic review processes. The Office of Management and Budget is giving agencies a lot of latitude to figure out how best to meet key parts of the Government Performance and Results Act Modernization Act of 2010 (GPRAMA).
GPRAMA required agencies to develop annual reviews and ratings of programs, and then publish them. This was the first year agencies both created and published their strategic plans, and now they are conducting the reviews and ratings.
Lisa Danzig, OMB's associate director for personnel and performance, said it's still early in the review process, but all signs point to good things coming from them.
"I see three parts to the strategic reviews. One piece of it is assessing progress. That's where we often talk about whether you are red, yellow or green, or we've used language here about significant challenges or noteworthy progress, and how well we are doing. That's kind of the very first step of diagnostic of how well we are doing," Danzig said Wednesday during an event sponsored by the National Academy of Public Administration in Washington. "The second step, then, is informing decision making, with an expectation that we will do a little bit of that in this 2014 cycle, but we are really expecting that to mature over time. So I think that's the second stage of this. How does it inform our long-term strategies? How does it inform budget formulation, legislative strategies, ideas or opportunities to incorporate better evidence and research demonstration? That's where I think there is a lot of richness. Then I think ultimately the third piece of this is how to take action to make improvements."
NAPA and OMB jointly released a new report on getting more from strategic reviews. The report highlighted steps taken so far at agencies such as the Department of Housing and Urban Development and the Department of Labor, as well as issues and challenges agencies must address to improve and build capacity for these processes.
In the short term, OMB will conduct these reviews of plans and goals through July in preparation for the fiscal 2016 budget request. Danzig said the review meetings usually include eight to 10 senior executives, including deputy secretaries, assorted CXOs and program folks.
More flexibility in the design
Conducting program reviews is not new for most of the government. But the difference this year is OMB didn't prescribe the approach. In fact, Danzig said it will take agencies two to three years to mature the review processes.
Danzig said too often agencies get overly caught up in the process and not in the overall goal of making improvements across program areas.
Beth Robinson, the CFO at NASA, said agency leaders meet monthly to go over their programs and cycle through every program four times a year.
"This is one of the few performance exercises where we were asked to do something in a less prescriptive way, and we could actually tailor it to our agency's needs," she said. "That was very welcome by the agency. The agency was like, 'OK, we really want to take this as a communication tool outside of our agency.'"
Robinson said during the early days of GPRA, it wasn't always this way. She said there was a time when one of the appropriations subcommittees would tell the agency to put all its budget data on white paper and all its performance information on green paper. The congressional staff members would tear out the green paper because they didn't understand the performance data.
"I think we've come a long way since then in using it as a communications tool. I think people are very excited to be able to do something in the spring where there is communication early on with OMB and others on what the challenges we face are," Robinson said. "It's an early indication of what we will bring forward as our solution set in September, so I think the timing of it was very welcome."
Robinson said usually the performance and budget data are sent to OMB all at once, and there's no way anyone can make heads or tails of what's really going on because of the volume of information.
A framework that fits their mission
The National Science Foundation approached the reviews called for under the GPRA Modernization Act in a different way than ever before.
Martha Rubenstein, the CFO of NSF, said too often in the past, the frameworks that OMB wanted agencies to use just didn't work for how they meet their mission.