What does the future hold for American military might? By many accounts, the outlook appears bleak. For fiscal 2018, the federal budget is $4.094 trillion, of which roughly 62 percent, or $2.535 trillion, is mandatory spending on Social Security, Medicare and Medicaid. Those accounts will continue to grow as the population ages, placing ever more pressure on discretionary spending. One such discretionary account is the defense budget, which fell from 5.7 percent of gross domestic product in 2010 to 4.5 percent in 2015, and is projected to drop to 3.8 percent by 2020.
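For readers who want to check the arithmetic behind those shares, a short Python sketch reproduces the figures cited above. The inputs are simply the numbers quoted in this article, not an independent budget source.

```python
# Back-of-the-envelope check of the budget figures cited above.
total_budget_tn = 4.094        # FY2018 federal budget, $ trillions (as cited)
mandatory_tn = 2.535           # Social Security, Medicare, Medicaid, $ trillions (as cited)

mandatory_share = mandatory_tn / total_budget_tn
print(f"Mandatory share of the budget: {mandatory_share:.1%}")   # ~61.9%, i.e. "roughly 62 percent"

# Defense spending as a share of GDP, per the article's figures.
defense_share = {2010: 5.7, 2015: 4.5, 2020: 3.8}   # percent of GDP
drop_2010_2020 = defense_share[2010] - defense_share[2020]
print(f"Projected decline, 2010-2020: {drop_2010_2020:.1f} percentage points")
```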

With this as context, it is worth noting that it takes the Department of Defense decades to field new systems. As fiscal pressures mount, modernization is by no means certain unless we do something about the efficiency of our acquisition programs.

Whether we like it or not, the pace of our systems development does not match our needs. The Pentagon knows it must do things differently. The U.S. cannot count on either superior funding or technology to rule the day; both are increasingly commodities that other countries have access to. Our best and possibly only chance to remain competitive is to rely on that most American of capabilities: innovation.

Enter HPCMP, the High Performance Computing Modernization Program. HPCMP provides the DoD with massive supercomputing capabilities and all classes of networks to transport data, as well as software to help the DoD execute trillions of computations per second in support of its development programs. All that capability translates into the potential to dramatically accelerate development timelines. It is not so much that HPCMP is a new organization, because it is not. It’s that the DoD is changing the way it does development, and HPCMP is an enabler to help make that happen.

At the recent 20th Annual Systems Engineering Conference, the keynote speaker, Vice Adm. Paul Grosklags, commander of Naval Air Systems Command, had a simple message: “We need to increase the speed of capability development.” He said the way to do that was to design, develop and sustain fully integrated capabilities in a model-based digital environment.

What does that mean? In basic terms, it means taking the archaic design-build-test development cycle, which can take anywhere from 10 to 30 years to execute, and inserting a modeling step on the front end.

Modeling should not be an afterthought. To the contrary, good systems design begins with modeling, and the two must be closely linked in an iterative development process. Models are physics-based, high-fidelity tools that provide an authoritative digital surrogate, or “twin,” that can be used to rapidly test new designs, evaluate performance attributes and refine a system before any metal is cut. The digital twin, as it is called, is used to make informed decisions throughout a system’s life cycle.
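To make the idea of an iterative, model-in-the-loop design process concrete, here is a minimal Python sketch. The surrogate model, design parameters and requirement threshold are hypothetical stand-ins, not anything drawn from an actual DoD or HPCMP toolset; the point is only the loop structure: propose a candidate design, evaluate it against a digital surrogate, and feed the result into the next iteration.

```python
import random

def surrogate_model(span_m: float, chord_m: float) -> dict:
    """Hypothetical physics surrogate: returns predicted performance for a candidate design.
    A real digital twin would run high-fidelity CFD and structural codes on HPC resources."""
    lift = 0.8 * span_m * chord_m            # toy lift proxy
    weight = 1.2 * span_m * chord_m ** 2     # toy weight proxy
    return {"lift": lift, "weight": weight, "margin": lift - weight}

def iterate_designs(n_iterations: int = 1000, required_margin: float = 1.5):
    """Propose candidate designs, test each against the surrogate, keep the best that meets the requirement."""
    best = None
    for _ in range(n_iterations):
        candidate = {"span_m": random.uniform(8, 16), "chord_m": random.uniform(0.5, 2.0)}
        result = surrogate_model(**candidate)
        if result["margin"] >= required_margin and (best is None or result["margin"] > best[1]["margin"]):
            best = (candidate, result)
    return best

if __name__ == "__main__":
    winner = iterate_designs()
    print("Best design found in the digital-twin loop:", winner)
```

The design choice worth noting is that every candidate is evaluated entirely in software; only the surviving design would ever move on to hardware.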

Done right, a computational approach to systems development has the potential to cut years from development timelines and billions off acquisition costs — it is far less costly to fix system performance problems at the digital twin stage than it is when the system is in low-rate initial production.

The DoD has only begun to experiment with fully integrated, model-based design, but the benefits are already clear. The Army’s Joint Multi-Role Technology Demonstrator program used HPCMP capabilities to do an independent analysis of contractor proposals to winnow four concepts down to two prior to additional development.

In another example, the Army rotorcraft program, together with Boeing, used HPCMP models to generate early design-stage predictions of helicopter performance for a proposed rotor blade upgrade of the CH-47F Chinook. The computational model accurately predicted up to a 10 percent hover thrust performance improvement without material degradation of the forward-flight performance.
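For a sense of the physics such models capture, the snippet below uses classical momentum theory for rotor hover. This is textbook rotorcraft aerodynamics, not the actual HPCMP or Boeing model, and the thrust, rotor area and air density are illustrative values, not CH-47F data. It shows why a 10 percent hover thrust gain is significant: ideal hover power scales with thrust to the 3/2 power, so that much extra thrust would otherwise demand roughly 15 percent more power.

```python
import math

def ideal_hover_power(thrust_n: float, rotor_area_m2: float, rho: float = 1.225) -> float:
    """Ideal induced power in hover from momentum theory: P = T^(3/2) / sqrt(2 * rho * A)."""
    return thrust_n ** 1.5 / math.sqrt(2.0 * rho * rotor_area_m2)

# Illustrative numbers only (not CH-47F data): one rotor lifting 100 kN over a 450 m^2 disk.
baseline_thrust = 100_000.0     # N
rotor_area = 450.0              # m^2

p_baseline = ideal_hover_power(baseline_thrust, rotor_area)
p_upgraded = ideal_hover_power(1.10 * baseline_thrust, rotor_area)

print(f"Baseline ideal hover power: {p_baseline / 1e3:.0f} kW")
print(f"Power for +10% thrust:      {p_upgraded / 1e3:.0f} kW")
print(f"Relative increase:          {p_upgraded / p_baseline - 1:.1%}")   # ~15.4%, since 1.1**1.5
```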

In Navy applications, computational modeling has been used to generate tens of thousands of ship designs with varying hull forms and configurations to allow acquisition authorities to down-select a design much earlier in the acquisition process. The list of DoD modeling applications is endless, covering domains as varied as radar cross-section analysis, propulsion technologies, aerodynamics and ground mobility studies, to name just a few.
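The design-space exploration described above can be sketched in miniature. The hull parameters, the crude capacity and resistance proxies, and the down-select rule below are hypothetical toys, not a Navy design code; they only illustrate how generating many parameterized variants and keeping the non-dominated (Pareto-efficient) ones lets decision-makers narrow the field before detailed design begins.

```python
import random

def evaluate_hull(length_m: float, beam_m: float, draft_m: float) -> dict:
    """Toy proxies: displacement as a capacity stand-in, a wetted-area-like term as a resistance stand-in."""
    displacement = 0.55 * length_m * beam_m * draft_m      # block-coefficient-style volume proxy
    resistance = length_m * (beam_m + 2 * draft_m)         # crude wetted-area proxy
    return {"length_m": length_m, "beam_m": beam_m, "draft_m": draft_m,
            "displacement": displacement, "resistance": resistance}

def pareto_front(designs):
    """Non-dominated set for (minimize resistance, maximize displacement):
    sort by resistance, then keep each design that beats the best displacement seen so far."""
    front, best_displacement = [], float("-inf")
    for d in sorted(designs, key=lambda d: (d["resistance"], -d["displacement"])):
        if d["displacement"] > best_displacement:
            front.append(d)
            best_displacement = d["displacement"]
    return front

if __name__ == "__main__":
    variants = [evaluate_hull(random.uniform(120, 180), random.uniform(15, 22), random.uniform(5, 8))
                for _ in range(20_000)]
    shortlist = pareto_front(variants)
    print(f"Generated {len(variants)} variants; {len(shortlist)} survive the down-select.")
```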

Perhaps most importantly, the digital twin was once a nice-to-have design tool pursued by a handful of enthusiasts. Today, physics-based modeling is at the forefront of the DoD’s push to reform acquisition and is quickly becoming a policy mandate. The DoD cannot afford to skip modeling on the front end of weapons system development; doing so would prove too costly and time-consuming. Embracing physics-based modeling is one of the surest ways program managers can reduce risk in their programs.

John Walker leads Navigant’s defense and national security advisory practice.
