Published on April 25th, 2014 | by Travis Korte
5 Q’s for Government Performance Expert Robert Shea
The Center for Data Innovation spoke with Robert Shea, a principal with accounting firm Grant Thornton's Global Public Sector team and former Associate Director of the U.S. Office of Management and Budget (OMB) for Administration and Government Performance. Shea discussed how his experiences in implementing the Federal Funding Accountability and Transparency Act of 2006 (FFATA), which requires the public disclosure of organizations receiving federal funds, mirror the development of the Digital Accountability and Transparency Act (DATA Act) of 2014, which would standardize and publish U.S. government financial reporting data. The DATA Act is expected to pass Congress on April 28. Shea will be speaking at the Data Transparency Summit, a conference around delivering transparency in federal spending, on April 29, 2014 in Washington, D.C.
This interview has been lightly edited.
Travis Korte: You were in charge of implementing the FFATA, and this was in many ways a predecessor to the DATA Act. How has the government response to the DATA Act been similar to your experience with FFATA?
Robert Shea: It’s not very different. The FFATA was opposed by OMB pretty strenuously, and the agencies said the implementation would be almost impossible. But after discussions with the Hill, it became clear that the Hill was willing to come a long way towards addressing the concerns of the administration, so we finally got a bill signed. It’s sort of the same thing here. OMB opposed the DATA Act as unnecessary or too difficult, but in the end transparency is a hard thing to argue against. When these requirements are put on, we figure out a way.
TK: What are the trade-offs between long-term savings from better data and short-term savings from staying with old systems? How do you convince agencies that it’s worth their while right now?
RS: It’s a hard sell, because there are a lot of requirements imposed on agencies that produce information that goes unused. Experience shows that reporting requirements don’t result in action. One of the exceptions is the Recovery Accountability and Transparency (RAT) Board [the federal agency tasked with overseeing spending under the American Recovery and Reinvestment Act of 2009]. A lot of reporting requirements were advertised as improving the accountability of recovery spending, and over time you’ve seen a real increase in the extent to which agencies are availing themselves of the analytical capabilities of the RAT Board’s Operation Center. That’s a really good case study of—I don’t want to say, “if you build it, they will come”—but there was a real purpose behind those requirements. More and more, agency inspectors general are going to the Operation Center to connect the dots.
TK: You also worked on the Program Assessment Rating Tool, a Bush Administration initiative that used data to evaluate federal programs’ effectiveness. Have you seen a growth in effectiveness evaluation within government? Would you say there’s been a turning point in recent years as data quality has improved or has this been happening quietly all along?
RS: I’m really excited about the increased reliance on program evaluation to improve programs. It’s something that began with the Program Assessment Rating Tool. And although that tool was abandoned, the Obama administration’s requirements to do more rigorous program evaluation have produced a body of evidence about program effectiveness, and it will be useful at the federal, state, local, and international levels, if we can get to where that information can be published and used.
TK: What agencies are doing a good job with that?
RS: You’re seeing it to some degree, for example, in the Department of Labor, where they’re really embedding evaluation results and evaluation requirements into programs.
TK: Now that you’re working in the private sector, are you seeing areas where government is lacking in data-driven performance management, areas where it can take inspiration from the private sector?
RS: In addition to evaluations, I think data-driven reviews that are being used more and more at the agency level are a real highlight. But if you look at the amount of federal investment that’s being spent in an evidence-based way, we’ve made progress but there’s still a long way to go.