DoD, DHS see more, earlier testing as a possible fix to troubled programs
Wednesday - 7/23/2014, 8:48am EDT
Two of the government's largest agencies see increased testing and evaluation of their often-troubled acquisition programs as a potential cure for systemic problems.
The Defense and Homeland Security departments are pushing project managers to test technology or weapons systems earlier in the acquisition lifecycle to understand and solve potential roadblocks sooner.
The idea of testing sooner may seem like common sense — the sooner you find the problems, the better it is for the program.
But as Michael Gilmore, DoD's director of operational test and evaluation, has found in his 24 years in government, common sense doesn't always rise to the top.
"It's clear that testing, in and of itself, is not the reason programs get delayed. It's not the reason costs grow in programs. The results of the testing, you bet. The results of the testing often lead program managers in the services and the acquisition authorities in OSD to try to fix the problems that the testing uncovered, which is what testing should do," said Gilmore during a speech Tuesday at the National Defense Industrial Association Test and Evaluation Conference in Washington. "So when people claim that testers are driving billions of dollars in costs to the programs, the facts simply do not support the claim. That claim is not factually based, period. That doesn't mean there aren't people who strongly believe it's true. But it is not factually based. The facts indicate the converse — the earlier that you test, the more information you get about the problems with the program, the sooner you will be able to fix them at lower costs. That's what the facts support."
Gilmore so strongly believes that DoD is short-changing the value of testing and evaluation that he submitted a series of recommendations to Congress as part of its acquisition reform efforts.
Sens. Carl Levin (D-Mich.) and John McCain (R-Ariz.), chairman and ranking member of the Senate Homeland Security and Governmental Affairs Permanent Subcommittee on Investigations, are collecting suggestions from Defense and industry experts on ways to improve the military's procurement processes.
Workers need better skills
Gilmore's recommendations included integrating test and evaluation planning into the requirements phase, bringing scientists and engineers into the acquisition lifecycle planning phases earlier, and making it easier for DoD to hire "technically excellent workers."
"I think engineers and scientists need to be just as involved in generating requirements as the operators. Certainly the operators must be involved. The operators are the people engaged in the fight, trying to kill the enemy so they aren't killed, trying to prevail on the battlefield and protect our security. So the operators must play a key role in generating the requirements. That's not a profound observation," he said. "But engineers and scientists must as well, and for that matter the test community should, which is composed of scientists and engineers. We know what the problems are. We know what the physical limitations are."
Gilmore said two prime examples of where scientists and engineers could have made a huge difference, and saved DoD tens of billions of dollars, were the Army's Future Combat Systems and Joint Tactical Radio System programs.
When the Army developed requirements for the Joint Tactical Radio System (JTRS), it asked for performance measures that weren't possible, and that engineers had known for decades weren't possible, he said. But since engineers weren't consulted, the requirements were included in the RFP and ended up costing the Army $8 billion to $10 billion for a system that, in the end, didn't deliver on its promises.
Gilmore said the Future Combat Systems (FCS) program also suffered from outsized requirements, in part because senior officials didn't heed the concerns or suggestions of engineers. The Army ended up canceling FCS, wasting between $15 billion and $20 billion over the last decade.
Culture change slowly happening
But while these two examples demonstrate a lack of understanding of the value of scientists and engineers, Gilmore said he already sees change across some parts of DoD.
He said the Army now is testing and evaluating technology much earlier in the process through the Network Integration Evaluation program.
Additionally, Adm. James Winnefeld, vice chairman of the Joint Chiefs of Staff, issued a memo in January 2013 letting military services and agencies submit a request to the Joint Requirements Oversight Council to change program requirements if needed.
In addition to including scientists and engineers in the requirements development process, Gilmore said programs can bring test and evaluation into acquisition planning earlier by developing a draft concept of operations and a draft test and evaluation plan even before they write the requirements and release the requests for proposals.