In May, agency leaders and governance stakeholders gathered in a hotel ballroom in Cambridge, Maryland, to figure out how they could solve the federal government's most pressing problems with the chic instrument du jour of public policy: analytics.

In a roundtable session at ACT-IAC's Management of Change conference, Johan Bos-Beijer, director of analytics services for the General Services Administration's Office of the Associate Administrator, split the ballroom's attendees into seven tables and tasked them with answering four questions:

  • What are the key drivers for needing analytics?
  • What emerging challenges should be addressed by analytics?
  • What types of analytics will be important to the future?
  • Why?

Once the questions were posed, the tables began buzzing with possibility. Could better analytics solve issues with student loan debt? Workforce challenges? Immigration, health care, mission delivery?

The federal government has always been a prodigious producer of data. Whether it is Social Security benefits or defense spending programs, the government has massive stores of information on how it conducts business, but until now there hasn’t been an efficient way of understanding it.

With greater insight into the numbers it generates, the federal government now seems poised to take better advantage of its data through a technological push, one that could give agency leaders a clearer picture than they've ever had of the inner workings of their missions and how to improve them.

While the possibility of better governance through data analysis seems within reach, putting the tools that provide that analysis to use has been at the root of a seismic shift in federal policy.

Before agencies can use those tools, they have to ensure that virtual mountains of unstructured, or "dirty," data can be standardized in a compatible format. And not every agency is running at the same pace.

"Traditionally, the information has gone from the top of the chain to the senior management chain to the front line manager chain to the employees doing the work and right back up the chain. And that type of vertical dissemination has taken some time," said Kris Rowley, GSA's deputy CIO and chief data officer. "How do we take these analytical tools that are now available and drive a culture that allows for more real-time analytics, faster movement of information for people doing the work, all the way up to [the leadership]?"

Rowley, who heads up GSA's Data to Decisions project — a cloud-based "analytics-as-a-service" platform — said the shift to better analytics and, therefore, better decision making is a multilayered movement that fundamentally changes how agencies process their information.

"I think there's really three things to think about when we are trying to consume data and make it more analytical," he said.  

"One is getting our hands around all of the enterprise data and trying to put together a strategy on how to manage that across an organization the size and variety of GSA. Second is the technology. Once we decide on our enterprise data sets — what they are and where they are living — what technologies do we need to really get the value out of that data and pull it into an environment that we can utilize and scale. Then the third piece is on culture. I think traditionally, in most federal agencies and probably in the private sector, this data and reporting has been done in stovepipes, by functions."

Breaking out of those stovepipes has been the chief goal of the analytics push: finding better ways to account for everything from rapid-response deployment of services to employee engagement.

The transformation is not only about fundamentally changing the way government does business, but also finding the best tools to do it in a sea of silver bullets. 

"We have technology right now that is coming out and advancing every day that makes data analytics and data work much easier," Rowley said. "But there are so many companies out there right now producing software to do data analytics, how do you know the right one?"

The promise

When the Ebola virus caused an epidemic outbreak in West Africa in 2014, Wendy Davis, director of USAID's Center for Accelerating Innovation and Impact, knew that informing people in the field with speed would be paramount in preventing new infections. Speed meant crunching data and disseminating the right elements quickly.

"Finding ways to arm community members and responders with information they needed to prevent Ebola transmission was critical to the response," she said. "We used a lot of data to think through how best to manage the response.

"There's always a challenge, you've got data that's originating in thousands of different places with thousands of different people, so how do you pull those all together and make sure you are getting the right information into the hands of people that can actually do something?"

To help their responders do something, the Obama administration launched the Ebola Grand Challenge, a partnership between USAID, the White House Office of Science and Technology Policy, the Centers for Disease Control and Prevention and the Department of Defense to help crowdsource private sector solutions through competition.  

The Ebola Grand Challenge netted more than 1,500 ideas, with the government ultimately investing in 14 for rapid deployment in West Africa.

"You need to have good, strong feedback loops as part of those networks," Davis said. "That was something that we were able to do in a few important ways. We invested in a couple of innovations that used mobile phones as their base and were able to connect those feedback loops."

To help improve communication in the field and track new infections, the Ebola Grand Challenge invested in software like IntraHealth International's Mobile Health Worker Electronic Response and Outreach, or mHero, to use on basic cellphone networks and deliver news to frontline workers and coordinators quickly. 

"That particular approach was deployed in Liberia, Guinea and Sierra Leone," Davis said. "In Liberia, they ended up with 22 communications campaigns developed that reached nearly 9,000 recipients who were primarily health care workers. 

"It was such a success that the Ministry of Health in Liberia is now weaving it into its own national health information system strategy. It was really paving the way for the enhanced health information systems over time."

USAID combined mHero's ease of use with another open-source tool called CommCare. The mobile platform provides health officials with a bird's-eye view of the health network, allowing them to coordinate screening, triage, diagnostics, lab tracking, contact tracing and map-based visuals.

"When you are trying to get control of an outbreak, the key tool that you have is tracing all contacts around any individual who has become sick," she said. "CommCare, their contact-tracing platform, was used in Guinea and Sierra Leone, and they ended up tracking over 20,000 contacts."

Davis said that the ability to deploy the tools interoperably across a wide range of devices and improve communication played a big role in containing the outbreak, and that they were discovered because of competition generated by the government's innovation outreach.

"The Grand Challenge, I think, has been a very useful tool in being able to source new and groundbreaking innovations from all over the world," she said. "What we hadn't tested out is the ability to source innovations during our crisis response, and we were able to see through Ebola that we could do that."

The Ebola Grand Challenge demonstrated what the federal government could achieve when it needed to source innovative solutions quickly to match a crisis, but when it is trying to revolutionize the way it does business, transformation gets tougher.

The problem

The Digital Accountability and Transparency Act of 2014 achieved what few pieces of legislation do in Washington these days: near-universal bipartisan agreement on how to make the federal government more efficient.

But whether it can completely transform the way government records and reports its data in the next eight months to achieve that efficiency is a question that has yet to be answered.

The 2014 law requires the Department of the Treasury to establish common standards for how federal agencies report spending and increases transparency by making that data open to the public.

The hope is that by making the federal government's books easier to understand, the law will also make agencies more efficient and empower them to discover new innovative policy solutions. 

"It's clear that as the federal government begins to manage its information as data, that opens up all sorts of new opportunities that didn't exist before," said Hudson Hollister, executive director of the Data Coalition — a trade association that supports the DATA Act as well as other data standardization efforts throughout the federal government.

"When federal information is tracked in documents, then federal leaders and agencies tend to deal with it using document methods, like reading them and summarizing them," Hollister said. "But because we are seeing a gigantic shift from documents to data, new ways of understanding the information are becoming available. It's really exciting."

But despite the support behind the DATA Act, its implementation has been a long and arduous process.

The law is scheduled to go into effect in May 2017, a full three years after its passage, to give Treasury and the Office of Management and Budget time to set baselines and monitor implementation.

But an April Government Accountability Office report found that an OMB pilot program designed to streamline contractor reporting under the DATA Act had run four months behind schedule, and that its plans did not include a methodology or explain how the data would be evaluated.

GAO again took OMB to task in August with a report saying that a four-month delay in releasing its technical guidance for agency data submission had further delayed software patches needed for reporting systems.

On Sept. 2, the Department of Housing and Urban Development's inspector general said the agency is not likely to meet the 2017 deadline to start standardizing its spending reports, citing "management turnover and indecision." HUD officials assured the inspector general they would be on time with the agency's DATA Act requirements.

Each agency hurdle illustrates that while data standardization may streamline the government's use of analytics, it's still far from the finish line. 

But Hollister said that once the DATA Act does come online, it could open a realm of possibility not seen before in the federal government. 

"They will create the most valuable open data set in the world," he said. "A single, fully searchable, standardized open data set that covers all of the spending of the executive branch. It's the largest, most complex organization in human history, and we are going to have one data set that shows where all the money goes.

"There will be an opportunity to do simple stuff like matching grantees and contractors with the accounts from the money they get. There will be opportunities to do more advanced things like figuring out which congressional appropriations are all funneled together into a particular program. We'll be able to find potential indicators of waste or fraud. All that stuff can happen once we have the spending information as one standardized data set."
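The kind of matching Hollister describes can be pictured with a small sketch: once spending records share one standardized format, linking recipients to the accounts that fund them becomes a straightforward grouping operation. The example below is purely illustrative; the field names and records are invented for this sketch and are not the actual DATA Act schema.

```python
# Hypothetical illustration of what a standardized spending data set
# enables. Field names and figures are invented, not the DATA Act schema.
from collections import defaultdict

# Each record links an appropriation account to a recipient and an amount.
spending_records = [
    {"appropriation": "A-101", "recipient": "Acme Grants LLC", "amount": 250_000},
    {"appropriation": "A-101", "recipient": "Beta Contractors", "amount": 400_000},
    {"appropriation": "B-202", "recipient": "Acme Grants LLC", "amount": 150_000},
]

# "Simple stuff": total dollars flowing to each grantee or contractor.
totals_by_recipient = defaultdict(int)
for record in spending_records:
    totals_by_recipient[record["recipient"]] += record["amount"]

# "More advanced": which appropriations are funneled to a given recipient.
appropriations_for = defaultdict(set)
for record in spending_records:
    appropriations_for[record["recipient"]].add(record["appropriation"])

print(totals_by_recipient["Acme Grants LLC"])          # 400000
print(sorted(appropriations_for["Acme Grants LLC"]))   # ['A-101', 'B-202']
```

The point of the sketch is that neither query requires anything exotic; the hard part, as the article's reporting on implementation delays shows, is getting every agency to emit records in the same format in the first place.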

While the DATA Act deals with agency spending data, the clarity that analytics provides could reach beyond spending into multiple facets of public policy, but only if leaders are able to apply the accessible tools to the information on hand.

"In a way, it's simple," Hollister said. "The DATA Act just says we need to have a consistent data format for all of these things and we've got to apply it to all of this information the same way. In another sense, it's complicated because no one has ever tried to do this across the entire federal government before."