The promise – and pitfalls – of the Aid Transparency Index

In our final 2020 Aid Transparency Index launch blog, Dr Kate Weaver of the University of Texas reflects on the influence of the Index and the limitations of the methodology. She calls for greater support to enable the Index to take the next steps in delivering the real promise of aid transparency – better, more accountable aid for all.

Congratulations to Publish What You Fund on the release of the 2020 Aid Transparency Index (the Index). It is no small feat to track the performance of 47 donors across 35 indicators with such diligent attention to peer review and quality coding. Hundreds of hours are put in each year to meticulously gather and validate data, with a heavy dose of donor cat herding. Was all that work worth it?

In short, yes. As my colleague Dan Honig and I found in our recent academic study, the Index is a remarkable case in which a relatively weak actor – the small but mighty Publish What You Fund – has managed, like David, to wield considerable influence over the many Goliaths of the aid donor world. To discern whether Publish What You Fund was effective in using the Index to pressure donors to become more transparent with their spending data, we set about to do what eggheads do: we drew on over 400 qualitative, primary interviews alongside quantitative analysis of the Index's effects on donor transparency performance over time. If you want to cure your insomnia, read more here. In the meantime, let me give you the punchline of our study: unlike many other global indices, the Aid Transparency Index works. The Index has not simply "named and shamed" international aid donors into opening up their troves of project-level data. It has also facilitated critical discussion of best and worst practices and incited peer learning and competition that have had pivotal impacts on donor efforts to improve transparency policies and practices. By many accounts, the Index is now an industry leader in assessing aid transparency and setting accountability standards for donors. It alters what transparency means in practice even as it encourages greater disclosure. The Index is more than a mere assessor of transparency practice; it is an active constructor of what it in fact means to be "transparent."

But will this influence last?

My overall opinion – both as a close observer of the Index for the past decade and, more recently, as a participant in the Index construction process as an external reviewer – is a caveated "yes". The Index, in comparison to the vast majority of other global indices, is remarkable for its own transparency in its methodology and metadata. Its influence also stems from its participatory process, wherein the very actors the Index is trying to influence (aid donors) are actively involved in reporting and validating the data. Without doubt, donors gripe about having to meet the Index's deadlines and conform to its particular definition of transparency. But the increasing level of compliance across donors, and the evidence of their own use of the indicators to calibrate internal reporting practices, signal the perceived utility of and respect for the Index.

So where are the pitfalls? First, in our qualitative interviews, Dan and I found that the Index does have some unintended and possibly negative effects. When asked whether the Index presents any concerns or risks, key informant interviewees were quick to point out that the systems put in place in response to the Index do not always represent the most efficient route to full transparency for their own organizations (in part because of the Index's link to the IATI common standard). The pressure to conform to the Index sometimes deters innovations more bespoke to organizations' business models. Many interviewees praised the role of the Index in getting donors to report in a timely manner, while simultaneously criticizing the appropriateness of the assessment criteria.

At the same time, a more existential question revolves around the Index's core norms: how it defines the end goal of aid transparency and, more critically, its focus on monitoring the supply, as opposed to the actual use, of aid data. This has not escaped Publish What You Fund's attention: the broader challenges of data engagement are already much on their minds. As many others in the open data space have learned over the past decade, the lack of demonstrable evidence that vested stakeholders are actually using open data detracts from the case for transparency at a key time when we need renewed investments to sustain open data work. It is thus an open question: while the Index has proven effective in moving the needle on donor transparency, will the resulting wealth of open data impact the planning, execution, and assessment of development projects?

Publish What You Fund has been extraordinarily reflective in its management of the Index over the years. As it continues to amend the Index to better capture important aspects of donor transparency, I humbly suggest that it would be useful to expand the Index in two ambitious ways – and I make a direct plea to foundations to support Publish What You Fund's work in these directions. First, develop a means to assess whether and how donors are using open data in their own decision-making (akin to Owen Barder's popular call for donors to "eat their own dog food"). Second, conduct surveys of key stakeholders in aid-receiving countries to assess how useful donor data is at the coal-face of aid projects on the ground.

These two steps – neither an easy task, to be sure – would empower the Index to measure not just how transparent donors are, but how meaningful their data is in practice. And therein lies the real promise of aid transparency in achieving better, more accountable aid for all.

Dr Catherine (Kate) Weaver is Associate Dean and Associate Professor at the LBJ School of Public Affairs at The University of Texas at Austin.