As hundreds of AI initiatives and programs are underway across the Department of Defense, many are facing new and diverse challenges when it comes to operationalization. Selecting a solution and putting it into practice are certainly not the same task, creating challenges that span both organizational and data domains.

For example, the rise of hybrid and multi-cloud architectures presents data integration, access and management challenges as agencies seek to leverage data assets that span both on-premises and hosted solutions.

This may be why the GSA Data Center and Cloud Optimization Initiative Program Management Office recently released a Multi-Cloud and Hybrid Cloud Guide to help agencies make better decisions about cloud architecture. Further complicating matters, the DoD is facing a groundbreaking December award of the Joint Warfighting Cloud Capability (JWCC) procurement.

Pentagon Chief Information Officer John Sherman describes the JWCC as a “multi-cloud effort that will provide enterprise cloud capabilities for the Defense Department at all three security classifications: unclassified, secret, and top secret all the way from the continental United States out to the tactical edge.” At the conclusion of this possible five-year procurement, the DoD plans to launch a full and open competition for a future multi-cloud acquisition. Until then, DoD data scientists may be forced to work in silos because connecting to live data may not always be possible.

How Technology Can Help

Many government agencies are overcoming these challenges using technologies such as data virtualization to implement a logical data fabric approach capable of ensuring trusted data access and sharing. Data virtualization is a modern data integration technique that integrates data in real time without having to physically replicate it.

Data virtualization can seamlessly combine views of data from different sources and feed AI/ML engines with data from a common data services layer. Using data virtualization, AI teams can work more efficiently and collaborate more effectively, because the technology provides views of data rather than replicating it.
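The idea of serving views over data where it lives, rather than copying it into a central store, can be sketched in miniature. The snippet below is only an illustration of the pattern, not any vendor's product: it treats two attached SQLite databases as stand-ins for separate data sources (the names `cloud_a`, `cloud_b`, and the sensor/label tables are invented for the example) and exposes a unified view that joins them at query time.

```python
import sqlite3

# Two independent "sources" -- in-memory stand-ins for separate systems.
con = sqlite3.connect(":memory:")
con.execute("ATTACH DATABASE ':memory:' AS cloud_a")
con.execute("ATTACH DATABASE ':memory:' AS cloud_b")
con.execute("CREATE TABLE cloud_a.sensors (id INTEGER, reading REAL)")
con.execute("CREATE TABLE cloud_b.labels (id INTEGER, label TEXT)")
con.executemany("INSERT INTO cloud_a.sensors VALUES (?, ?)",
                [(1, 0.7), (2, 0.2)])
con.executemany("INSERT INTO cloud_b.labels VALUES (?, ?)",
                [(1, "anomaly"), (2, "normal")])

# The view joins both sources at query time; no rows are replicated
# into a new repository -- consumers see one logical table.
con.execute("""
    CREATE TEMP VIEW unified AS
    SELECT s.id, s.reading, l.label
    FROM cloud_a.sensors AS s
    JOIN cloud_b.labels AS l ON s.id = l.id
""")

rows = con.execute("SELECT * FROM unified ORDER BY id").fetchall()
print(rows)
```

A real logical data fabric federates heterogeneous systems (cloud warehouses, on-premises databases, APIs) behind the same kind of query-time view, adding caching, security and governance layers on top.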

Because it provides a unified data-access layer, this not only saves access and storage costs but also enables stakeholders to implement governance controls from a single point across the department. Creating this “single source of truth” is one of the most valuable characteristics of unifying enterprise data through the use of data virtualization.

This combination of enterprise AI, multi-cloud architecture, and data virtualization is being used by many government organizations to leverage data more effectively and take advantage of the cost savings of the cloud. Together, these technologies underscore the fact that digital transformation is not just about technology, but about using technology in the most intelligent way and providing enhanced value to data science teams and internal and external data consumers.

Also, with the competition for market share between the major Cloud Service Providers (CSPs) promising both better value for government and access to a vast array of AI/ML tools to drive better, mission-specific results, there may be conflicting narratives on which CSP, and which AI/ML technology, is best for a given mission.

In this environment, a logical data fabric is rapidly emerging as the technologically elegant solution to the chaos of multi-cloud computing, as it simultaneously makes the best features of each CSP available to users.

Gartner defines a data fabric as a design concept that serves as an integrated layer (fabric) of data and connecting processes, utilizing continuous analytics over existing, discoverable and inferenced metadata assets to support the design, deployment and utilization of integrated and reusable data across all environments, including hybrid and multi-cloud platforms. This pure-play data fabric enables the best of all commercial CSP offerings without vendor lock-in and, in many instances, is proving to be the government’s best answer to these challenges.

Addressing the DoD’s pervasively siloed data by standardizing it and improving its quality and accessibility should be a precondition to having the data necessary to train algorithms for many defense uses. A logical data fabric approach that incorporates data virtualization promises to be a means for quickly collecting, processing, and using information from the DoD’s disparate data sources. It also ensures that an AI/ML model developed in a silo remains relevant to live data, and it can accelerate data flow and data access across the entire AI operationalization cycle.

The more data AI/ML models receive, the more they learn, producing the better and more accurate predictions the DoD requires for mission-critical decision-making. However, extracting data from multiple sources and then replicating it to a central repository is an old and inefficient way of gaining data access. The process is still prevalent across the federal government and often results in the majority of project time being spent on data acquisition and preparation tasks.

With the DoD touting its growing capabilities in artificial intelligence and machine learning technologies, integrating data to nourish disparate AI/ML models across its expanding data science teams is still a significant undertaking. Enterprise AI, leveraging a logical data fabric layer, overcomes these challenges. It can act as a central hub for data science teams between different AI/ML systems, reducing the need for data duplication and enabling highly sophisticated AI/ML initiatives with enhanced operationalization and quicker time-to-production.

Bill Sullivan is Vice President and General Manager, US Federal at data integration and management company Denodo.

Have an opinion?

This article is an Op-Ed and as such, the opinions expressed are those of the authors. If you would like to respond, or have an editorial of your own you would like to submit, please email C4ISRNET Senior Managing Editor Cary O’Reilly.
