
1. Why do you produce the Aid Transparency Index?

Publish What You Fund produces the Index in order to assess the state of aid transparency among the world’s major donors, track and encourage progress and hold donors to account.

2. What kind of change do you hope to effect having published the ATI?

Our best days in the office are when a donor phones asking ‘How do we become number one on the Index?’ Our goal is to motivate and facilitate donors to improve the amount of publicly available information on the aid money they spend.

3. How do you choose which donors to include in the ATI?

Organisations are selected using three criteria, of which they have to meet a minimum of two:

  • They are a large donor (annual spend is more than USD 1bn);
  • They have a significant role and influence as a major aid agency and engagement with the Busan agenda;
  • They are an institution to which government or organisation-wide transparency commitments apply, for example members of the G7 or the Member States of the European Union.

There are some donor organisations that spend more than USD 1bn per annum that have not been included in the ATI, for example Saudi Arabia and Turkey. The ATI’s coverage of DFIs and providers of south-south cooperation is also limited. Ideally we would like to rank all large or influential aid providers but this is not possible at the present time due to resource and capacity constraints.

4. What do you think this year’s ATI shows?

  1. There is a race to the top, but the majority of organisations are lagging behind in meeting their international commitments.
  2. The lack of comparable, comprehensive and timely publication means that information on development cooperation is still difficult to access and use.
  3. Progress is achievable, if the political will exists.

5. In terms of the ATI generally, what does success look like? And failure?

A successful organisation in the 2014 Index is one that publishes comparable, comprehensive, accessible and timely information on its aid activities to the IATI Registry, in keeping with the changes that have taken place in the global aid transparency environment. Failure is the inability to keep pace with these changes and is reflected in the poor performance of organisations that publish little or no information on their aid activities, or publish information in less useful formats such as PDFs or hard-to-navigate webpages.


6. What are the most surprising findings from this year’s ATI?

One of the most surprising findings is how difficult it still is to connect the dots between the financial, descriptive and performance information related to individual activities. We’ve also seen some big improvements from some donors who have performed poorly in previous years – demonstrating that progress is achievable within a relatively short timeframe, if the political will exists.

7. What are the least surprising findings from this year’s ATI?

What is least surprising is that a majority of donors still publish very limited information on their aid activities. 

14. What did the MCC do wrong to lose the top spot?

Although MCC has dropped in the ranking relative to 2013, this is a reflection of very high quality IATI publication from DFID and UNDP. There is also a marginal decrease in MCC's overall score compared to 2013. This is because some of its information could not be assessed automatically for the Index: the activities concerned do not contain dates and therefore could not be identified as "current" projects. It does not reflect a reduction in MCC's transparency or a deceleration in its progress. In fact, MCC should be congratulated for continuing to demonstrate commitment to aid transparency and for the ongoing improvements to its already fairly comprehensive IATI publication.

Donor Ratings

8. Why do you select more than one agency for some donors?

The ATI assesses more than one agency for some large donors (EC, France, Germany, Japan, UK, UN, U.S. and the World Bank) with multiple ministries or organisations responsible for significant proportions of ODA. We have opted to maintain the disaggregation of agencies for several reasons. First, no two agencies from the same donor country or organisation in the ATI score the same. There is often wide variation in the amount of information made available by different agencies in a single country or multilateral organisation. Second, agencies often retain a large amount of autonomy in deciding how much information they make available and have different publication approaches, and should therefore be held accountable for them. Third, it would be unfair for high performing agencies within a country or organisation to be pulled down by lower performing agencies, and similarly lower performing agencies should not have their poor performance masked in an average score. Finally, it is unclear how we could aggregate agencies into a single country or organisation score in a way that reflects wide variations in performance. Moreover, it would be necessary to take into account the proportion of a country's aid delivered by each separate agency in order to create an aggregate country ranking that fairly reflects that country's level of aid transparency, and this information is not always available.

9. Do you rate donors that fund you?

In previous years, we included the Hewlett Foundation, our biggest funder. The Hewlett Foundation did not meet the donor selection criteria, which were adopted in 2013 after consultations with our reviewers, and was therefore dropped last year.

10. What explains some agencies’ big improvements in the ranking?

The big improvers in the 2014 ATI have the following in common: they publish more current information on their aid activities than in previous years and/or they publish information in IATI or other machine-readable formats.

11. How many agencies have actually got worse transparency scores? Why?

Improvements to the data quality tests this year mean that any drops in absolute scores must be carefully interpreted, especially for IATI publishers. Where organisations have dropped significantly in their absolute scores, it is because their information is not up to date. The U.S. Department of the Treasury has the biggest drop in score and ranking. This is because most of the information it publishes to the IATI Standard does not include activity dates and therefore does not pass the current data test for the Index. Several others have declined in the ranking relative to 2013. This is a result of being overtaken by other organisations that have started publishing more comprehensive and comparable information about their activities.
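As a rough illustration of why undated activities fail a "current data" test, the sketch below checks whether each activity in a fragment of IATI-style XML carries an activity date on or after a cutoff. This is a minimal, hypothetical version of such a test, not the Index's actual implementation; the element names follow the IATI activity schema, but the cutoff date and sample data are invented for illustration.

```python
import xml.etree.ElementTree as ET
from datetime import date

# Illustrative IATI-style activity data (invented for this example).
sample = """
<iati-activities>
  <iati-activity>
    <title>Rural health programme</title>
    <activity-date type="start-actual" iso-date="2014-01-15"/>
  </iati-activity>
  <iati-activity>
    <title>Legacy project with no dates</title>
  </iati-activity>
</iati-activities>
"""

def is_current(activity, cutoff=date(2013, 1, 1)):
    """An activity with no <activity-date> cannot be identified as current,
    however recent it may actually be."""
    dates = activity.findall("activity-date")
    if not dates:
        return False
    # Count the activity as current if any reported date is on/after the cutoff.
    for d in dates:
        iso = d.get("iso-date")
        if iso and date.fromisoformat(iso) >= cutoff:
            return True
    return False

root = ET.fromstring(sample)
current = [a for a in root.findall("iati-activity") if is_current(a)]
print(len(current))  # prints 1: only the dated activity can be counted
```

The second activity may well be ongoing, but without dates an automated test has no way to tell, which is why such activities drop out of the assessment.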

12. How can donors improve, generally?

Donors can improve by making the information they already publish comprehensive across all their activities, by publishing current information for the first time, or by publishing in more useful formats such as CSV or IATI XML.

13. How well are U.S. agencies doing?

The 2014 ATI shows the five U.S. agencies and one programme (PEPFAR) at very different stages of transparency. MCC remains a leader in aid transparency, ranking third overall and placed in the very good category. PEPFAR – one of the biggest improvers from the 2013 ATI – ranks 30th and is in the fair category, having recently made the political and technical commitment to greater transparency. USAID, ranked 31st, is in the fair category. The agency remains committed to aid transparency but faces internal systems challenges to the publication of its data. USAID is also contributing to improving data use by conducting a pilot study in three partner countries – Bangladesh, Gambia and Zambia. Just behind USAID is the Department of State, ranked 32nd, which has published to IATI for the first time in 2014, but with substantial data quality issues. Treasury ranks 36th and is in the poor category, having dropped substantially in the ranking in 2014 primarily because most of its IATI data lacks activity dates. DOD, ranked 38th, has only marginally engaged with the transparency agenda and remains in the poor category. 

International Aid Transparency Initiative (IATI)

15. Why is IATI so important? Aren’t other forms of publication just as good in their own way / for specific purposes?

IATI is the agreed standard for publishing current aid information in a common, comparable format. While donors may publish extensive information on their own website or to the DAC, it will always lack these vital elements of being current and comparable. IATI also includes some “added-value” fields, for example results, impact appraisals, sub-national location and a budget identifier.

16. If a donor publishes to IATI, surely they should be in the ‘good’ category (at least)?

In order to be placed in the good category, donors’ IATI publication must be comprehensive, i.e. information on the organisation and activity level indicators measured by the Index should be consistently available across all activities and recipient countries. This is currently not the case for some organisations that publish limited, aggregate or old information in their IATI data or those that currently publish only a limited number of IATI fields.  

17. Who actually uses IATI data?

Now that an increasing amount of information is being published to IATI, the challenge is to encourage wide-ranging use of the data. At Publish What You Fund, we use IATI data in the Index to measure whether and to what extent donors are delivering on their promise to make their aid transparent. The Netherlands is using its own information for internal management and reporting purposes and is working with two of its partner organisations publishing to IATI in order to stimulate exchange and learning, with the longer-term aim of including open data throughout its supply chains. The pilot studies on data use by partner countries, being conducted by organisations such as USAID, will also highlight existing data gaps and ways in which information can be made more useful. Together, these initiatives hold great promise for unlocking the potential of IATI. 

18. Are there any visualisations of IATI data?

A number of organisations are now using open data platforms driven by IATI data, marking an important shift from publishing raw data to visualising it in a meaningful way for users. One of the most exciting examples of this is Development Initiatives' Development Portal, a country-based information platform that tracks resource flows. In its first iteration, it contains current data published through IATI as well as the most recent (2012) data available from the OECD's Creditor Reporting System. Another great example is Akvo's portal, where visitors can customise interactive maps to see how projects are distributed geographically by region, country and sector. The portal uses all datasets from the IATI Registry, visualises them and also makes the data available through an API which allows for further platform development. AidData brings this approach to scale, producing a series of maps incorporating data from IATI and 90 bilateral agencies. The use of interactive graphs and menus, for example on Sweden's and the UK's Development Tracker, allows the exploration of aid volumes, projects and results across different sectors, locations and time periods at the click of a button. For more on tools using IATI data, see Box 4 in the ATI report.


24. What is the Aid Transparency Tracker?

The Aid Transparency Tracker is an online data collection platform that provides the main, underlying dataset for the Index. The Tracker includes three separate data collection tools:

  • An automated data quality assessment tool (for indicators where comparable and timely data is available via IATI)
  • A survey (for indicators where comparable and timely data is not currently available)
  • An implementation schedule analysis tool.

25. How did you select independent reviewers/CSO partners for the ATI?

We usually work with national NGO platforms for aid effectiveness and development. For most of the EU member states, we approach the AidWatch/CONCORD platform which then recommends members to us. If the platform members are unable to conduct the review, we ask them to recommend other organisations to us. Where there is no national NGO platform, we work with CSOs we have partnered with in the past on the ATI or in other advocacy efforts. For multilateral organisations or IFIs where there is no direct match with an NGO platform or CSO, we ask our peer reviewers to provide recommendations on who we can approach for the independent review. The independent review process is voluntary and unpaid. There are some organisations for whom we are unable to find independent reviewers. In these cases, Publish What You Fund undertakes the assessment.

26. Who are the peer reviewers?

  • Bill Anderson, IATI Secretariat
  • Neissan Besharati, South African Institute of International Affairs, University of the Witwatersrand
  • Laurence Chandy, Brookings Institution
  • Molly Elgin-Cossart, Center on International Cooperation, New York University
  • Brian Hammond, adviser, IATI Secretariat
  • Alan Hudson, Global Integrity
  • Rolf Kleef, Open for Change
  • Ben Leo, Center for Global Development
  • Marie Lintzer, Natural Resource Governance Institute
  • Afshin Mehrpouya, HEC Paris
  • Larry Nowels, independent consultant
  • Paolo de Renzio, International Budget Partnership, Center on Budget and Policy Priorities

19. What indicators does the ATI use? How many are there and how are they weighted?

The 2014 Aid Transparency Index uses 39 indicators, grouped into weighted categories, to assess how transparent donor organisations are about their aid activities. These categories cover overall commitment to aid transparency and publication of information at both organisation and activity level. Within the publication category, the organisation-level indicators account for 25% of the overall weight, while the activity-level indicators account for 65%. The two publication groups are further divided into subgroups, based largely upon the subgroups used in the Common Standard implementation schedules template. The subgroups are equally weighted.
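The weighting described above can be sketched as simple arithmetic. In the example below, the scores are invented, and the 10% weight on the commitment category is only an inference from the stated 25%/65% publication split, not a figure taken from the Index methodology.

```python
# Category weights: the 0.10 commitment weight is inferred as the remainder
# after the stated 25% organisation-level and 65% activity-level weights.
weights = {
    "commitment": 0.10,
    "organisation_level": 0.25,
    "activity_level": 0.65,
}

# Hypothetical category scores (0-100) for one donor, for illustration only.
scores = {
    "commitment": 80.0,
    "organisation_level": 60.0,
    "activity_level": 50.0,
}

total = sum(weights[k] * scores[k] for k in weights)
print(round(total, 1))  # 0.10*80 + 0.25*60 + 0.65*50 = 55.5
```

The heavy activity-level weight means that a donor's overall score is driven mainly by how fully it publishes information about individual activities.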

20. How is this year’s methodology different from 2013?

In 2013, we piloted a new methodology to reflect the increasing importance of the format of published aid information. We have tried to keep the methodology as stable as possible in 2014. All 39 indicators used in the 2013 ATI have been retained in 2014. However, please note:

  • IATI XML data needs to be available via the IATI Registry for it to be counted as being published in the most accessible and comparable format. XML data that is not on the Registry is scored the same as other machine-readable data.
  • Documents are sampled and checked more closely in 2014 to verify that they contain the information outlined in the indicator guidelines. Data on results, conditions and sub-national location published to the IATI Registry is also sampled and manually checked.
  • Based on feedback from donors and independent reviewers in 2013, and the public consultation held in January-February 2014, some of the data quality tests have been tightened up to improve the quality of the automated assessment of IATI data.

21. Isn’t it unfair to change the methodology (‘move the goalposts’)?

The ATI is an advocacy tool, and we are pushing donors to meet their own commitments. We are confident that this Index stands as a credible reflection of donors' current levels of aid transparency and can be used to monitor their progress over time. We will continue to review the methodology, take into account feedback received from donors and our CSO partners, and make adjustments if necessary.

22. What do you mean by “format” of the data?

There is a substantial difference between searching IATI XML data, where you can access and compare any number of worldwide projects across a number of fields, and trawling dozens of URLs or several different PDF files for the same information. This difference is quantified by allowing organisations to score more highly on 22 indicators depending on the format of publication.

23. How do you measure the quality of IATI data?

The quality of data published in IATI XML is assessed by running a series of tests on all activity and organisation data packages being published. These tests have been designed to assess the availability, comprehensiveness and comparability of aid information and to determine whether an organisation's IATI data conforms to the IATI Standard appropriately. Most of the tests have been derived directly from the IATI schemas, which provide formats for reporting data on various information fields in the IATI XML format. The tests return results for the percentage of activities within an organisation's data packages that contain correctly coded information on the specific indicator being tested. For example: what percentage of activities reported contain a title? Or what percentage of completed activities contain information on results? The full list of tests can be accessed on the Aid Transparency Tracker site.
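A percentage-of-activities test of the kind described above can be sketched in a few lines. This is a minimal, hypothetical example, not one of the Tracker's actual tests; the element names follow the IATI activity schema, and the sample data is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Illustrative IATI-style activity data (invented for this example).
xml_data = """
<iati-activities>
  <iati-activity><title>Water project</title></iati-activity>
  <iati-activity><title></title></iati-activity>
  <iati-activity/>
  <iati-activity><title>School rehabilitation</title></iati-activity>
</iati-activities>
"""

def percent_with_title(root):
    """Percentage of activities containing a non-empty <title> element."""
    activities = root.findall("iati-activity")
    if not activities:
        return 0.0
    with_title = sum(
        1 for a in activities
        if a.find("title") is not None and (a.find("title").text or "").strip()
    )
    return 100.0 * with_title / len(activities)

root = ET.fromstring(xml_data)
print(percent_with_title(root))  # prints 50.0: 2 of 4 activities have a usable title
```

Note that the test checks for a non-empty title, not just the presence of the element, since an empty `<title>` tag is no more useful to a data user than a missing one.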