ChatGPT is an artificial intelligence-enabled chatbot developed by U.S.-based AI lab OpenAI. Two months after its launch in November 2022, it became the fastest-growing consumer application in history to reach 100 million active users, and it has already started to have a noticeable impact in business and academia.
The extent to which ChatGPT (or its technological successors and relatives) will have an impact on national security and defense acquisitions is still unclear, as the technology has yet to replace humans in any meaningful capacity. However, based on initial use cases and reactions to the tool, ChatGPT is already creating benefits as well as risks for defense professionals.
ChatGPT, derived from a machine learning-based language model known as GPT-3.5, is considered a groundbreaking form of generative AI. This means that it is capable of interacting conversationally with users and generating detailed human-like responses to questions or prompts in a variety of text formats, proving itself a useful tool across various fields.
For example, across the health care, real estate, public relations, marketing, customer service, and media industries, some companies have started using iterations of generative AI, especially ChatGPT, for tasks ranging from scheduling appointments to writing articles. ChatGPT has also made headlines for passing a variety of benchmark exams, including correctly answering technical questions for an entry-level Google software engineering job and achieving passing grades on both a law school test and a Wharton business school exam.
In the defense sector, both small and large companies complain about time-consuming regulations, but smaller businesses sometimes face more difficulty complying with Federal Acquisition Regulation standards because they are less familiar with the processes and procedures associated with government contracting. ChatGPT can already help alleviate this problem.
As one YouTube creator has demonstrated, the tool can be used to write the skeleton of a government contract, and while this is not a comprehensive solution, it can certainly assist founders of small- and medium-sized businesses (SMBs) who are already firing on all cylinders to get proposals out and accepted. According to Megan VanHorn, president of Bridge4Acquisitions, ChatGPT is helpful both for drafting content and conducting reviews. She noted that without the right knowledge and experience in the government contracting world, the tool does not produce great results, but when coupled with the right expertise it is a disruptive technology that helps her save clients’ time and money on the path to securing government contracts.
The Department of Defense has also caught on to the benefits of using generative AI to speed up and simplify the federal acquisition process. This year, the DoD’s Chief Digital and AI Office announced that it is prototyping and testing an AI-powered contract-writing capability, called “Acqbot.” The tool is intended to help contracting officers write contracts and manage the contract lifecycle. Currently, the tool is still being developed and requires substantial input data and human supervision.
On the flip side of its potential benefits in the defense sector, ChatGPT also provides cybercriminals with an arsenal of potential use cases threatening the cybersecurity posture of American citizens and the federal government. It is already being used by non-state threat actors, including script kiddies, hacktivists, and scammers, to engage in various forms of cybercrime, but in the future, ChatGPT’s technology has the potential to be harnessed by nation-state actors to conduct cyberespionage, information operations, and cyberattacks to increasingly devastating effect. ChatGPT is easy to use and accessible to users with elementary cyber proficiency levels, lowering the barrier to entry for various criminal actors.
ChatGPT still has a long way to go before it can be relied upon for the most important tasks in national security or defense contracting. The information it produces is displayed confidently yet is often unreliable without further verification—ChatGPT is incapable of acknowledging potential misinformation. OpenAI’s own FAQ page notes that ChatGPT sometimes produces incorrect or biased answers, and it has limited knowledge of anything that occurred after 2021. Some software development forums have even banned ChatGPT answers because the tool often generates code with substantial errors, an obstacle to individuals using ChatGPT for either good or nefarious purposes.
Given the extent of its limitations, the reality is that the current version of ChatGPT will not revolutionize national security or government contracting immediately. Nevertheless, government contractors and the DoD workforce should develop a better understanding of the pros and cons that AI and ML-based capabilities will bring to their industry in the coming years.
When chatbots get even “smarter,” defense professionals who were shrewd enough to hone their awareness of advances in AI such as ChatGPT will be best poised to take advantage of the technology’s benefits and defend against its security risks.
Noah Rivers and Olivia Letts are research associates at the Greg and Camille Baroni Center for Government Contracting at the George Mason University School of Business.
Have an opinion?
This article is an Op-Ed and the opinions expressed are those of the author. If you would like to respond, or have an editorial of your own you would like to submit, please email C4ISRNET and Federal Times Senior Managing Editor Cary O’Reilly.