There were no mosh pits or wailing guitar solos, but when it comes to the enthusiasm for the future of open data, Data Transparency 2016 might as well have been Woodstock. The annual conference highlighting the benefits of data sharing and analytic capability kicked off with a pep rally of sorts, led by Federal Chief Information Officer Tony Scott and U.S. Chief Technology Officer Megan Smith.

"What's really great and exciting to both of us is the tremendous set of new tools and data sets and technologies that we can now leverage," Scott said.

The conference highlighted a broad range of possibilities where open data and analytics could improve governance and enable policy breakthroughs, from the Cancer Moonshot to streamlining foster care operations and identifying fraud in government contracting.

But it also showcased the challenges that remain in bringing open data to the forefront. Here are three takeaways from the event.

The challenge of legacy systems

While there are new tools to help the government capitalize on the massive troves of information it possesses, deploying those tools onto outdated legacy IT systems requires a lot of time and resources.

Part of the solution to that problem, Scott said, is the government's ongoing mission to fund and modernize its IT infrastructure.

"Some of this stuff has been locked in our old legacy systems, and as we transform the way the government works and digitize the processes, we have a great opportunity to free up even more of this so it can be useful for our citizens," Scott said.

But for agencies struggling to scale up systems to process the data, a private sector solution may be the way to go.

Danny Harris, former chief information officer for the Department of Education, said during a panel on open data in grant reporting that technology advances mean agencies no longer need a massive storage footprint to handle standardized data.

"You don’t have to build a data warehouse anymore," he said. "There are tools and vendors that will help you pull data from anywhere in your universe."

Harris added that private industry has developed systems that can process analytic data from across an agency without the headache of navigating legacy systems with workarounds like crosswalk tables, which could provide an answer for federal leaders who are planning IT migrations but still want the benefits of analytics.

But to do that, the data has to be standardized.

The DATA Act Broker

The Digital Accountability and Transparency Act requires federal agencies to meet data standardization reporting goals by 2017. To help agencies assess the quality of their data for this purpose, DATA Act Solutions Architect Marcel Jemio demonstrated an early version of the Treasury Department's new data-processing tool.

The DATA Act Broker validates agency files against the DATA Act’s Information Model Schema, ensuring they meet the standardization guidelines for reporting.
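In practice, that kind of validation comes down to checking a submitted file against a published set of required fields and formats. The sketch below is illustrative only, not the broker's actual code or the real DATA Act Information Model Schema; the field names, file layout and rules are assumptions meant to show what schema-driven validation of an agency submission might look like.

```python
import csv


def _is_number(value):
    """Return True if the value parses as a number."""
    try:
        float(value)
        return True
    except ValueError:
        return False


# Hypothetical subset of required fields and rules; the real schema
# defines far more elements, formats and cross-file checks.
REQUIRED_FIELDS = {
    "agency_identifier": lambda v: len(v) == 3 and v.isdigit(),
    "object_class": lambda v: len(v) == 3 and v.isdigit(),
    "obligations_incurred": _is_number,
}


def validate_submission(path):
    """Return a list of validation errors for an agency submission file."""
    errors = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = set(REQUIRED_FIELDS) - set(reader.fieldnames or [])
        if missing:
            return [f"missing required columns: {sorted(missing)}"]
        # Start at 2 so reported row numbers account for the header line.
        for row_num, row in enumerate(reader, start=2):
            for field, rule in REQUIRED_FIELDS.items():
                if not rule(row[field].strip()):
                    errors.append(f"row {row_num}: invalid {field!r}: {row[field]!r}")
    return errors


if __name__ == "__main__":
    for problem in validate_submission("appropriations.csv"):
        print(problem)
```

The point of a shared schema is exactly what Jemio describes below: once every agency's file passes the same checks, the data can be combined and compared without agency-by-agency translation.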

"Everyone benefits when people implement a standard," Jemio said. "It’s not just about me and my agency; it’s about all of us. Implement the standard. It will help you, your partners, your customers and your stakeholders."

During the demonstration, Jemio uploaded data from the Department of Justice's Financial Management Initiatives Group to show how the system works.

While the data was validated, Reed Waller, assistant director of DOJ's Financial Management Initiatives Group, described the agency's progress toward standardization, including an IT migration, the formation of a Justice Enterprise Data Integration plan to tie disparate data systems together, and the establishment of data governance plans and automation.

The final version of the broker is set to be released this fall and will certify the validity of agencies' standardized datasets.

Making data useful

It’s not enough to have the data; it also must be applied proficiently.

In a panel on open data decision-making, Kelly Tshibaka, chief data officer for the U.S. Postal Service’s inspector general, outlined a data system her office built to track possible contract fraud.

Tshibaka said that the data tools her office set up have tracked almost $300 million in contract fraud using traditional markers identified in successful investigations. The team then applied algorithmic models to track those markers in existing contracts. The result is a map of contract risk profiles that gauges likely fraud targets throughout the nation.
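To make the approach concrete, here is a minimal, purely illustrative sketch of that kind of marker-based risk scoring. The fraud markers, weights and threshold below are invented for the example; the inspector general's actual models and indicators are not described in detail here.

```python
from dataclasses import dataclass

# Hypothetical fraud markers and weights, standing in for indicators
# identified in past successful investigations.
MARKERS = {
    "single_bid": 0.30,          # only one bidder on a competitive award
    "round_dollar_award": 0.15,  # award amount is a suspiciously round number
    "rapid_modifications": 0.25, # many contract modifications soon after award
    "shared_address": 0.30,      # vendor shares an address with another bidder
}


@dataclass
class Contract:
    contract_id: str
    markers: set


def risk_score(contract):
    """Sum the weights of the markers present on a contract (0.0 to 1.0)."""
    return sum(MARKERS[m] for m in contract.markers if m in MARKERS)


def flag_high_risk(contracts, threshold=0.5):
    """Return contracts whose combined marker weight meets the threshold."""
    return [c for c in contracts if risk_score(c) >= threshold]


if __name__ == "__main__":
    portfolio = [
        Contract("DOJ-001", {"single_bid"}),
        Contract("DOJ-002", {"single_bid", "rapid_modifications", "shared_address"}),
    ]
    for c in flag_high_risk(portfolio):
        print(c.contract_id, round(risk_score(c), 2))
```

A production system would combine scores like these with location and vendor data to produce the kind of nationwide risk map the article describes.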

"You can see how as we move more into open data and compliance with the DATA Act that we can use analytics tools like this to give us very valuable insights to make our jobs easier and our government more effective."
