Artificial intelligence will factor into election security over the next few years, but the new cybersecurity agency at the Department of Homeland Security is still developing its role.

“We are starting at a crawl, walk, run,” said Chris Krebs, director of the Cybersecurity and Infrastructure Security Agency (CISA), speaking at a Chertoff Group event June 18. “We’re getting into that walking phase right now.”

Krebs said that CISA has deployed network sensors and intrusion detection capabilities across the country. The technology lets states take a deeper look at their NetFlow data, records that summarize the traffic crossing their networks. But those systems generate petabytes of data.

“No human can sort that,” Krebs said. So algorithms can “do that tier one, tier two analyst triage and get the human in the loop [to] focus on the higher order stuff and really innovating at the top.”
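Krebs did not describe a specific toolchain, but the kind of tier-one triage he alludes to can be as simple as scoring flow records against a baseline and only escalating outliers to a human analyst. The sketch below is a hypothetical illustration in Python; the flow fields, thresholds and port list are assumptions for the example, not a description of CISA's sensors.

```python
"""Hypothetical tier-one triage over NetFlow-style records.

Illustrative sketch only: field names, thresholds and the escalation
rule are assumptions, not any agency's actual sensor logic.
"""
from dataclasses import dataclass

# Ports this toy example treats as suspicious destinations.
KNOWN_BAD_PORTS = {23, 445, 3389}        # telnet, SMB, RDP exposed externally
BYTES_THRESHOLD = 500 * 1024 * 1024      # flag unusually large transfers (500 MB)


@dataclass
class FlowRecord:
    src_ip: str
    dst_ip: str
    dst_port: int
    bytes_sent: int


def triage_score(flow: FlowRecord) -> int:
    """Return a crude risk score; higher means more suspicious."""
    score = 0
    if flow.dst_port in KNOWN_BAD_PORTS:
        score += 2
    if flow.bytes_sent > BYTES_THRESHOLD:
        score += 3
    return score


def escalate(flows: list[FlowRecord], threshold: int = 3) -> list[FlowRecord]:
    """Keep only the flows worth a human analyst's time."""
    return [f for f in flows if triage_score(f) >= threshold]


if __name__ == "__main__":
    sample = [
        FlowRecord("10.0.0.5", "203.0.113.9", 443, 120_000),
        FlowRecord("10.0.0.7", "198.51.100.4", 3389, 900 * 1024 * 1024),
    ]
    for f in escalate(sample):
        print(f"escalate to analyst: {f}")
```

The point of the sketch is the division of labor Krebs describes: automation clears the routine flows, and only the small remainder reaches a person.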

Krebs noted that states' needs vary. Illinois, for example, was hit by a Structured Query Language (SQL) injection attack, in which an attacker slips malicious database commands into a web form or query to read or alter the underlying data; state officials there notified around 76,000 registered voters that their information had been exposed.
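The article does not detail how the Illinois attack was carried out, but the basic mechanics of SQL injection are simple: untrusted input is pasted straight into a query string, letting the attacker rewrite the query. The snippet below is a generic illustration using Python's built-in sqlite3 module; the table and column names are invented for the example, not drawn from the incident.

```python
"""Generic illustration of SQL injection; not the Illinois incident itself."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE voters (name TEXT, ssn TEXT)")
conn.execute("INSERT INTO voters VALUES ('Jane Doe', '123-45-6789')")

user_input = "nobody' OR '1'='1"   # attacker-controlled form field

# Vulnerable: the input is concatenated into the query, so the OR clause
# rewrites the statement and dumps every row in the table.
rows = conn.execute(
    "SELECT * FROM voters WHERE name = '" + user_input + "'"
).fetchall()
print("injected query returned:", rows)

# Safer: a parameterized query treats the input as data, not SQL,
# so the same string matches nothing.
rows = conn.execute(
    "SELECT * FROM voters WHERE name = ?", (user_input,)
).fetchall()
print("parameterized query returned:", rows)
```

Parameterized queries are the standard defense, which is why Krebs calls this class of problem "really basic."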

“We’re still dealing here with some really basic issues,” Krebs said. “But at the same time, as we continue to push out capabilities, it opens up opportunity for AI and machine learning.”

CISA is also helping states configure systems correctly and retire legacy systems. One challenge the agency faces in bringing AI to the states is that each state is at a different stage of that work, Krebs noted.

“There’s a huge opportunity space” for AI, Krebs said. “But we have to keep in mind, again, that there is increasingly a ‘haves and have nots’ community out in the enterprise space. We are really trying to shore up as much as possible the ‘have nots.’”

Moving forward with AI, Krebs wants to ensure that humans remain involved in the process, using AI to handle low-level problems so that analysts are freed up for higher-level issues.

“If we’re just talking about a Windows box behaving abnormally … I want to be able to have a tier one or tier two automated process where I can put them in a walled-up garden and we can inspect it or get it up to the policy and bring it back in,” Krebs said. “But if there’s an industrial control system application or there’s some lifesaving nexus, then we’re talking completely different governance model and decision-making process.”
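Krebs' distinction suggests a policy layer in which automation handles routine endpoints but defers to people when a system is safety-critical. The sketch below renders that idea hypothetically in Python; the asset classes, the quarantine action and the escalation rule are illustrative assumptions, not any agency playbook.

```python
"""Hypothetical tiered response: automate routine hosts, escalate critical ones."""
from enum import Enum, auto


class AssetClass(Enum):
    WORKSTATION = auto()          # e.g. an ordinary Windows box
    INDUSTRIAL_CONTROL = auto()   # ICS or other life-safety nexus


def respond(hostname: str, asset_class: AssetClass, anomalous: bool) -> str:
    """Decide what to do with a host that is behaving abnormally."""
    if not anomalous:
        return f"{hostname}: no action"
    if asset_class is AssetClass.WORKSTATION:
        # Tier-one/tier-two automation: isolate the host (the "walled garden"),
        # inspect it, and restore it once it meets policy.
        return f"{hostname}: auto-quarantine, inspect, restore when compliant"
    # Anything with an industrial-control or life-safety role goes to a
    # human decision-maker under a different governance model.
    return f"{hostname}: escalate to human incident commander"


if __name__ == "__main__":
    print(respond("clerk-pc-12", AssetClass.WORKSTATION, anomalous=True))
    print(respond("water-plant-plc", AssetClass.INDUSTRIAL_CONTROL, anomalous=True))
```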

Krebs also said he wants to keep humans in the process because leaning too heavily on AI would disrupt the pipeline in which experienced analysts train entry-level coders to understand the algorithms so they can later move into those higher-level roles.

“I’ve got really good, capable, trained analysts that understand the algorithms,” Krebs said. “And so if we lean on machine learning and algorithms down the road, then we’re not going to have that pipeline to get to that level five analyst. So what I want is this kind of virtuous cycle of trained analysts that are working with the base, entry-level coders, to help them understand the algorithms and you can help push them back on top of that development cycle.”

Andrew Eversden covers all things defense technology for C4ISRNET. He previously reported on federal IT and cybersecurity for Federal Times and Fifth Domain, and worked as a congressional reporting fellow for the Texas Tribune. He was also a Washington intern for the Durango Herald. Andrew is a graduate of American University.
