WASHINGTON — The U.S. Air Force’s concept for Combined Joint All-Domain Command and Control requires an intricate web connecting sensors and shooters, while employing emerging technologies such as artificial intelligence and machine learning to quickly sift through data. But to connect these platforms across domains and services, data must flow seamlessly across networks with disparate owners. And to do that, the services must agree on a common set of data standards.
This is a difficult task — one that several experts told C4ISRNET is made more difficult by cultural barriers.
“If you don’t have data standards, then, you know, what’s the accuracy and trustworthiness of the data? Defense data decisions can be life-or-death decisions. And in those situations, the data needs to be trusted,” said Kate Mercer, a vice president for Booz Allen Hamilton’s defense business.
Data standardization includes agreeing to common data formats and architectures to ensure disparate data sets across the services are easily accessible to sister services.
“This requires getting over the cultural problems of sharing data and being told what to do with your data,” retired Rear Adm. Danelle Barrett, former deputy chief information officer of the Navy, told C4ISRNET. “As long as I’ve been in this business, the friction between services, even intra-service, between commands and everybody, about how to try to put standardization on data has been a losing effort in most cases, just because that is the hard part.”
The Army and Air Force have made progress, with their respective chiefs of staff signing a two-year agreement to enable Combined Joint All-Domain Command and Control, starting with “mutual standards for data sharing and service interfacing.”
“The Army and Air Force are working together more closely on common data standards, common architectures, the use of open interfaces, common cloud tools,” Army Brig. Gen. Martin Klein, director of strategic operations in the office of the deputy chief of staff G-3/5/7, said in a statement to C4ISRNET. “While the Services have different views on moving data from Intelligence sources, the Army and Air Force are on parallel paths to achieve operational advantage for the Joint Force.”
The intent, Klein said, is to “drive a better understanding of data structures and architectures.” And next year, the Army wants to integrate the Air Force’s CJADC2 system, called the Advanced Battle Management System, into its Project Convergence, an Army experiment that aims to decrease the sensor-to-shooter timeline.
But industry experts told C4ISRNET that the key to enabling the data standardization piece of CJADC2 comes from top-level leadership prioritizing it. The problem, like so many IT challenges in the Department of Defense and civilian government, is cultural, not technological.
“Leaders need to be open and transparent in their conversations,” said Juliana Vida, chief technical adviser of the public sector at Splunk, who also served as deputy Navy CIO. “Decision-makers and leaders need to just jump in and accept and trust the processes that already exist so they can move forward and actually use the technology that is available.”
The data push
The DoD’s data strategy, released in early October, signaled a cultural push in this direction, listing “standards” as one of the four “essential capabilities” to enable joint war fighting. Data standards underpin several of the stated goals within the strategy, including ensuring data is understandable, accessible and linked.
The strategy stated that standards should be applied at the “earliest practical point in the data lifecycle” and follow industry standards for open-data architectures where practical. It also noted that “standards are not an end unto themselves, but rather, they provide value when enabling data and information to be readily and securely utilized and exchanged.”
Brett Loubert, a principal in Deloitte’s defense business, said open-data architectures and standards will unlock capabilities that would not be gained otherwise.
“You’re actually now sort of inviting them into this collaborative discussion and collaborative development of standards. And you might come up with scenarios, effects and ways of doing analysis that you haven’t thought of before,” Loubert told C4ISRNET in an interview.
Dave Spirk, the Pentagon’s chief data officer, said on a webinar in late October that the military has made significant cultural progress on data because officials know their counterparts across the services and regularly communicate with them.
“It’s about establishing those organizational relationships and those human connections,” Spirk said on the webinar. “Then we work through what probably in the past were challenging because we didn’t know who the right people to talk to [were] or how to communicate with the technical acumen.”
Spirk also said a data interoperability working group — under the cross-functional team focused on CJADC2 — has combined efforts with a similar working group on the DoD’s Chief Data Officers Council. He told C4ISRNET that standardization is a “team sport.”
“It is less about common standards across all systems and platforms,” he wrote in a statement. “It is more about standardizing any data that must be shared across services, components, or coalition partners to impact the readiness, efficiency, and precision of the warfighter.”
In an interview with Defense News on Oct. 15, Lt. Gen. Clinton Hinote, who leads the Air Force’s strategy office, said the services have agreed that “as much as we can, we will come up with common standards” while allowing access to each other’s data.
Still, if there are areas where the services can’t agree, technologies are available to ensure interoperability.
“Even if we can’t come up with common standards, we realize that translators are going to be something that will be with us for a long time, and we will build the translators necessary to make sure we can share,” Hinote told Defense News.
However, Barrett said, adding new tools to translate data will increase latency in the process.
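In software terms, such a translator is a mapping layer that converts records from one service’s schema into another’s, and every hop through it adds a conversion step — the latency Barrett describes. A minimal sketch of the idea follows; all field names, units and formats here are hypothetical illustrations, not actual service data standards:

```python
# Minimal sketch of a data "translator" between two hypothetical track-report
# schemas. All field names and conventions below are invented for illustration;
# they are not real DoD or service data standards. The point is that each
# translation hop is extra processing between sensor and shooter, whereas data
# that already shares a common standard skips this step entirely.

def translate_track(report_a: dict) -> dict:
    """Map a hypothetical Service-A track report into a Service-B schema."""
    return {
        "track_id": report_a["id"],
        # Service A reports flat lat/lon fields; Service B nests them.
        "position": {
            "lat": report_a["latitude"],
            "lon": report_a["longitude"],
        },
        # Unit conversion: A reports altitude in feet, B expects meters.
        "altitude_m": round(report_a["altitude_ft"] * 0.3048, 1),
        "timestamp": report_a["time"],
    }

service_a_report = {
    "id": "TRK-0042",
    "latitude": 36.17,
    "longitude": -115.14,
    "altitude_ft": 30000,
    "time": "2020-10-15T14:32:00Z",
}

service_b_report = translate_track(service_a_report)
```

A real translator would also need validation, error handling and versioning for each schema pair it supports, which is why maintaining many of them is costlier than converging on shared standards up front.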
Connecting sensors and shooters will require advanced capabilities such as artificial intelligence and machine learning. Basic, agreed-upon data standards will ease data ingestion and discoverability across the services, Barrett added. But the services must also grapple with legacy platforms that have been around for years or decades.
“You have to account for how you get those legacy data into this environment too. Now it becomes infinitely easier as you move forward setting data standards, to build those data standards into the design requirements of the systems, to make sure the data are more interoperable moving forward,” Barrett said.
“But that also requires that the services agree to formats that they can live with. And you know, that’s always the kind of hard part because these problems are not technical ... the hardest pieces are institutional and cultural.”