Government's dirty little secret: data quality

Jun. 9, 2014 - 06:00AM | By STEVE WATKINS
Steve Watkins, Managing Editor of Federal Times.

There has been a big push recently from the Office of Management and Budget for agencies to do a better job of maintaining, connecting and leveraging the vast amounts of data they collect. This will become more vital as federal executives are increasingly pressed to adopt more “evidence-based” governance strategies and tactics, particularly as budgets get tighter and budget-cutting decisions get harder to make.

So, as Mark Reger, interim controller at OMB, said last week at the CFO-CIO Summit conference in Washington: “The next big surge is data.”

But when it comes to data, there is a big problem that many people don’t like to talk about: bad data. And as many feds know, there is a lot of it out there.

Thankfully, Kathryn Stack, OMB’s adviser for evidence-based innovation, is trying to address the problem.

“I think we need a really honest conversation about data, and sometimes it is hard for people to admit how much bad data we have that we collect and move around and push around,” she told the same conference.

A wider conversation is needed, she correctly argues, because untold millions of dollars’ worth of time and resources are dedicated daily to collecting, maintaining and using data, regardless of their quality.

“It’s a drain on our resources, our grantees, our contractors,” Stack said. “And our IGs and GAO folks know it, too.”

So what’s to be done about it? Among her recommendations: Stop collecting data we know is bad. Or streamline it, if possible, so it is more valuable. Or find alternative data sets — inside or outside of government — that can tell us the information we’re seeking.

At the Health and Human Services Department, Amy Haseltine, who oversees grant programs worth a combined $363 billion, said her department is making a big push to improve the quality of the data it receives from state and local agencies that get HHS grants. It does that using a so-called “data quality heat map,” a color-coded chart that shows those partners how the information they send to HHS rates on accuracy, timeliness and completeness.

“When you show it in a pictorial form, and then you point to the part that’s red and say, ‘That’s information that is not coming to us in a timely manner, it’s not accurate and it is certainly incomplete,’ it’s amazing the level of engagement you have,” Haseltine said.
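Haseltine did not describe HHS’s tool in any detail, but the idea is easy to sketch. Below is a minimal, hypothetical Python example: made-up quality scores for made-up grantees, colored red to green across the three dimensions she named. Every name and number in it is illustrative, not real HHS data.

```python
# Illustrative sketch of a "data quality heat map": score each reporting
# partner on accuracy, timeliness and completeness, then color-code the
# grid so weak scores show up red and strong scores green.
# All partner names and scores here are hypothetical.
import matplotlib.pyplot as plt
import numpy as np

dimensions = ["Accuracy", "Timeliness", "Completeness"]
partners = ["State A", "State B", "County C", "City D"]  # hypothetical grantees

# Hypothetical quality scores on a 0.0-1.0 scale, one row per partner.
scores = np.array([
    [0.95, 0.90, 0.88],
    [0.60, 0.35, 0.50],
    [0.80, 0.75, 0.40],
    [0.92, 0.85, 0.97],
])

fig, ax = plt.subplots()
# Red-yellow-green colormap: low scores render red, high scores green.
im = ax.imshow(scores, cmap="RdYlGn", vmin=0.0, vmax=1.0)

ax.set_xticks(range(len(dimensions)))
ax.set_xticklabels(dimensions)
ax.set_yticks(range(len(partners)))
ax.set_yticklabels(partners)

# Print each score inside its cell so the chart reads at a glance.
for i in range(len(partners)):
    for j in range(len(dimensions)):
        ax.text(j, i, f"{scores[i, j]:.2f}", ha="center", va="center")

ax.set_title("Data quality heat map (hypothetical scores)")
fig.colorbar(im, ax=ax, label="quality score")
plt.tight_layout()
plt.show()
```

The red-to-green scale is the whole point Haseltine makes: a partner can spot its red cells at a glance, without wading through a written report.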

The true value of fixing this problem is huge. Of course, there are savings in avoiding the costs of collecting, operating and maintaining poor-quality databases.

But the larger savings come from making better-quality decisions about how to manage hundreds of billions of dollars in program resources.

The problem of data quality needs to be elevated. It is not a conversation just for IT executives or the data science community. It is a conversation that affects everyone because it is a key to a smarter, data-driven government.
