The 2013 Index represents an evolution in methodology, recognising changes in the global environment since Busan and the significant progress donors have made in increasing their aid transparency, both in terms of commitments and publication.
As in previous years, the Index monitors the availability of aid information. For the first time in 2013, it also looks at the format of the information. This is in response to donor and CSO feedback on the previous methodology, particularly regarding activity sample selection and the need to assess the quality of the information being made available. Looking at the format of the data helps us to assess how easy the information is to access, use and compare.
Why do this and why now?
Since the launch of the 2011 pilot Index, donors have shifted from making high-level commitments to practical implementation. As the Index evolves, it needs to reflect the progress made by donors in making their aid information more accessible in line with these commitments. In the 2012 Index, we made clear that we wanted to measure the quality of published aid data better, by focusing much more on the format in which the data is provided and how comprehensive it is.
Publish What You Fund reviewed the Index methodology in consultation with peer reviewers, CSO partners and donors who expressed interest in giving feedback. Reviewers were asked to consider whether 43 indicators, assessed using a manual data collection process, were still needed. They were also asked to consider how best to show differences in organisations’ publication, the quality of that data, and ultimately how the Index could be used to encourage publication of more and better aid information.
Feedback from the consultations emphasised that Publish What You Fund should assess organisations on their progress with implementing the Busan common standard and that this should start in 2013, in order to assess progress against the target of full implementation by the end of 2015. There was also a strong preference for the Index to include a mixture of both automatically and manually collected data (in order to include as many different types of organisations and publishers as possible, and especially to include organisations not yet publishing to the IATI standard), and that it continue to look at overall commitment to aid transparency as well as current publication, including at the activity level.
What has changed in 2013
The revised methodology represents a shift that better assesses the quality of published data. As a consequence, selected indicators and the data collection process are somewhat different in 2013. The new methodology uses 39 indicators to monitor aid transparency. These are largely drawn from the indicators used in 2011 and 2012 – see table 2 for the full list of indicators. As in previous years, the indicators are grouped into weighted categories covering commitment to aid transparency and publication of data (at both organisation and activity level).
A new, graduated scoring methodology has been used for some of the publication indicators. For 22 of the indicators, the scoring takes into account the format that the data is provided in, depending on how accessible and comparable the information is. For example, data published in PDFs scores lower than data published in machine-readable formats (see box 4 for more on data formats and why they are scored differently). Data that is published in the most open, comparable format of IATI XML can score up to 100% for most indicators, depending on quality. More detail on scoring is provided below, with a full explanation provided in the technical paper.
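The graduated, format-aware scoring described above can be sketched in code. This is an illustrative sketch, not the official scoring implementation: the PDF figure (16.67%) and the 50–100% IATI XML band are taken from the report, while the intermediate "machine-readable" tier value is an assumption for illustration.

```python
# Illustrative sketch (not the official scoring code) of a graduated,
# format-aware score for one publication indicator.

# Score ceilings by publication format, as fractions of a full score.
# The "machine-readable" tier value is a hypothetical intermediate level.
FORMAT_CEILINGS = {
    "pdf": 1 / 6,               # ~16.67%: published, but not machine-readable
    "machine-readable": 2 / 3,  # assumed intermediate tier (e.g. CSV/Excel)
}

def indicator_score(fmt: str, quality: float = 1.0) -> float:
    """Return a 0-100 score for an indicator published in `fmt`.

    For IATI XML, the report says scores range from 50% to 100%
    depending on quality, so `quality` (0-1) scales within that band.
    """
    if fmt == "iati-xml":
        return (0.5 + 0.5 * quality) * 100
    return FORMAT_CEILINGS.get(fmt, 0.0) * 100

# A PDF caps out at ~16.67, while high-quality IATI XML can reach 100.
print(round(indicator_score("pdf"), 2))   # 16.67
print(indicator_score("iati-xml", 1.0))   # 100.0
print(indicator_score("iati-xml", 0.0))   # 50.0
```

This makes concrete why an organisation publishing only PDFs cannot approach the score of one publishing good-quality IATI XML, even if both technically "publish" the same information.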
Feedback on the 2012 Index highlighted the need for a more systematic approach to selecting which donors to include in the Index. In previous years, organisations were selected on the basis of their membership of the DAC or IATI signatory status, with additional organisations included later to test the methodology, such as Development Finance Institutions (DFIs) and climate finance providers.
In 2013, the number of donor organisations included in the Index has decreased from 72 to 67. Organisations were selected using three criteria:
- They are a large donor (annual ODA spend is more than USD 1bn);
- They have a significant role and influence as a major aid agency and are engaged with the Busan agenda;
- They are an institution to which government or organisation-wide transparency commitments apply, for example members of the G8 or all EU Member States.
Organisations need to meet two of these criteria to be included in the Index. Some donors spending more than USD 1bn per annum have not been included, for example Saudi Arabia and Turkey. Ideally we would like to rank all large donors, but this is not possible at present. The Aid Transparency Tracker, the online platform used to collect the Index data, has been designed so that others can use it to collect and analyse data on different organisations. Please get in touch if you are interested in doing this.
For more detail on the methodological review please see the separate technical paper.
 Four indicators that were included in 2012 have been removed in 2013: ‘forward planning country budgets’; ‘current activities in recipient country’; ‘centralised public country database’; and ‘design documents and/or log frame for the activity’. There are two new commitment indicators in 2013: ‘implementation schedules’ has replaced ‘engagement in IATI’ and ‘accessibility of the data’ has replaced ‘centralised, online database’. See the separate technical paper for more on why some indicators have changed or been removed.
 The data source for calculating annual ODA spend is the OECD DAC’s Creditor Reporting System. The most recent CRS data available is from 2011. For those organisations that do not report to the DAC, the spend was calculated based on the most recent annual financial report. In the case of IFIs or DFIs that spend ODF as well as or instead of ODA, their ODF and ODA spend was calculated. Where no ODA or ODF data source was available, the figure was calculated based on total investment programme budget (for the EBRD and IFC) or loans disbursed to partner countries (for the EIB).
The Index uses 39 indicators in total, divided into those that measure commitment to aid transparency (three indicators) and those that measure publication of aid data (36 indicators). The publication indicators are further divided into organisation level and activity level, as in previous years. These two categories are further divided into sub-groups, based largely upon the sub-groups used in the common standard implementation schedule template. The commitment category indicators account for 10% of the overall weight. Publication accounts for 90% of the overall weight: the organisation-level indicators account for 25%, while the activity-level indicators account for 65%. Within these categories, the indicator sub-groups are equally weighted.
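The weighting arithmetic above can be shown with a small worked example. The 10/25/65 category split and the equal weighting of sub-groups within categories come from the text; the sub-group scores themselves are hypothetical.

```python
# Minimal sketch of how the category weights combine into an overall
# Index score. The 10/25/65 split and equal sub-group weighting are
# from the methodology; the example scores below are hypothetical.

WEIGHTS = {"commitment": 0.10, "organisation": 0.25, "activity": 0.65}

def category_score(subgroup_scores):
    """Equally weighted mean of sub-group scores (each 0-100)."""
    return sum(subgroup_scores) / len(subgroup_scores)

def overall_score(scores_by_category):
    """Weighted sum of the three category scores."""
    return sum(WEIGHTS[cat] * category_score(subs)
               for cat, subs in scores_by_category.items())

# Hypothetical donor: strong commitment, weaker activity-level publication.
example = {
    "commitment": [90, 80],
    "organisation": [70, 60, 50],
    "activity": [40, 30, 50, 20],
}
# 0.10*85 + 0.25*60 + 0.65*35 = 8.5 + 15 + 22.75
print(round(overall_score(example), 2))  # 46.25
```

Because activity-level publication carries 65% of the weight, weak activity data drags the overall score down sharply even when commitment scores are high.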
Reweight the data in line with your own prioritisation and assessment of the importance of different types of information with our Explore the Data tool. There are three indicator categories: commitment to aid transparency – reflecting the extent to which organisations have demonstrated an overall commitment to making their aid more transparent; publication at organisation level – looking at the availability of general planning and financial information; and publication at activity level – reflecting the extent to which organisations make aid information available on specific project activities in-country.
Current data: Data for each indicator must be current for an organisation to be able to score on the indicator. “Current” is defined as published within the 12 months immediately prior to the data collection period (1 May–31 July 2013), so information published on 1 May 2012 or later and that relates to that date or later is accepted as current. Information published after 1 May 2012 but relating to a period prior to then, for example 2011 DAC CRS data, is not accepted as current. Documents that are not current under this definition were accepted only if they are up to date with their regular cycle of publication, for example, annual audits and evaluation reports, or if they have explicit extensions into the current period written into them.
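The currency rule above amounts to a two-part date check, which can be sketched as follows. The 1 May 2012 cutoff is from the text; the function and field names are illustrative.

```python
# Sketch of the "current" rule: information counts as current if it was
# published on or after 1 May 2012 (12 months before the collection
# window opened) AND relates to that date or later. Names are illustrative.

from datetime import date

CUTOFF = date(2012, 5, 1)  # 12 months before the 1 May 2013 collection start

def is_current(published: date, relates_to: date) -> bool:
    """Both the publication date and the period covered must meet the cutoff."""
    return published >= CUTOFF and relates_to >= CUTOFF

# 2011 DAC CRS data released in 2013 fails: it relates to a period
# before the cutoff, even though it was published recently.
print(is_current(date(2013, 1, 15), date(2011, 12, 31)))  # False
print(is_current(date(2012, 6, 1), date(2012, 6, 1)))     # True
```

Note that the document-cycle exceptions described above (e.g. annual audits on a regular publication cycle) sit outside this simple check and were handled case by case.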
Date information: For indicators with a date component (e.g. actual dates, planned dates), both the month and the year are required in order to score. In previous years, just the year was accepted for such indicators. They have been interpreted more strictly in 2013 in recognition of recipient countries’ need to map activities to their individual financial years rather than the calendar year.
Development focused: For the handful of organisations whose primary mandate is not providing development assistance, the assessment of their aid transparency relates only to the development assistance aspect of their operations and not the transparency of the organisation more broadly.
Parent or subsidiary organisations: Information for some organisations is held or managed by other organisations. In such cases, we look at both organisations for the information, i.e. the primary organisation under assessment as well as the organisation holding/publishing the information. For example, in the case of Norway, the majority of development assistance is administered by the Ministry of Foreign Affairs (MFA) but most activity-level information is found on the Norwegian Agency for Development Cooperation (Norad) website. In such cases, information published by both the MFA and Norad is accepted.
Multiple sources: For organisations which publish information to multiple databases or websites, information from all sources is accepted. For example, DG ECHO’s data is published to both the European Disaster Response Information System (EDRIS) and the Financial Tracking Service (FTS) and both sources are accepted. If there are differences between the two information sources, the most recent information or the most accessible source is used.
We have designed a new, more user-friendly data collection tool, the Aid Transparency Tracker, to collect and share the data included in the Index. The Tracker is an online platform that provides the underlying dataset for the Index. It includes three components – an automated data quality assessment tool; an online survey tool; and an implementation schedules tool. The Tracker highlights what information donors have committed to publish in their implementation schedules, as well as what they are currently publishing.
Most information included in the Index is gathered from what is published online by each organisation – either on their website, the IATI Registry, national platforms such as the U.S. Foreign Assistance Dashboard or the OECD common standard website (for implementation schedules). One indicator uses a secondary data source to assess the quality of Freedom of Information (FOI) legislation.
There was a defined data collection period (1 May–31 July 2013) to ensure that all organisations are compared fairly at the same point in time. For organisations that are not IATI publishers, all information was collected via the manual survey. Surveys were completed in-house by Publish What You Fund. For the activity-level indicators, we look for information pertaining to the recipient country receiving the largest amount of aid by value from that donor agency. To establish that information is consistently published at the activity level, a minimum of five activities are selected within the largest recipient country (or thematic sector, if the donor organises itself along thematic areas or sectors rather than by countries). As in previous years, donors and partner CSOs were invited to review the surveys and provide updates or corrections as necessary. While checking and verifying the data, organisations are also asked to confirm whether the responses are representative of their publication as a whole.
For organisations that are publishing in IATI XML format, data collection follows a two-step process:
- First, their data is run through the data quality tool, which is designed to run automated checks and tests on each organisation’s data, providing both a comparative view across organisations and granular details on each organisation’s data. These tests are aggregated to produce scores for indicators to which they are relevant.
- Next, for those indicators for which information is not published in IATI XML or does not pass the necessary tests, the data is collected via the survey.
The data quality tool automatically assesses the quality of donors’ data published to IATI. The initial assessment was made available to donors via the Tracker in May 2013 and remained open for review and comment for three months, until the end of July. The final set of IATI data was automatically collected on 31 July, so any improvements or changes to an organisation’s IATI data during that period are reflected in the final dataset used to compile the Index.
Only IATI data is collected and assessed automatically. The tests used to assess the data were designed by Publish What You Fund in consultation with IATI data experts. Several donors also provided feedback on the tests. As in previous years, all organisations are assessed against the same indicators, meaning that a mixture of automatically and manually collected data can be used for the 28 IATI publishers included in 2013.
Measuring quality for IATI XML data
The quality of data published in IATI XML is assessed by running a series of tests on all current activity and organisation data packages being published. These tests have been designed to assess the availability, comprehensiveness and comparability of aid information and to determine whether an organisation’s IATI data conforms to the IATI standard appropriately. Most of the tests have been derived directly from the IATI schemas, which provide formats for reporting data on various fields in IATI XML format. Some additional tests have been designed to check that data published in IATI XML is presented in a manner which allows for comparison across organisations. Tests are run only against activities that are still ongoing or that ended at most 13 months ago, and only if the organisation’s current IATI data accounts for 20% or more of its country programmable aid budget. This is to ensure that the information gathered pertains to the period of assessment covered by the Index, i.e. information published on or after 1 May 2012, relating to that date or later. IATI files that are XML conversions of CRS data and do not contain any updated information for activities relating to or starting after 1 May 2012 were not accepted as current. For more information on the data quality and frequency tests conducted, please see the separate technical paper.
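The 13-month activity window described above can be sketched as a simple filter. The 31 July 2013 collection date and the 13-month rule are from the text; the field names and month arithmetic are illustrative, not the actual IATI element names used by the tool.

```python
# Illustrative filter for which activities the data quality tests cover:
# only activities still ongoing, or that ended at most 13 months before
# the 31 July 2013 collection date. Names here are assumptions.

from datetime import date

COLLECTION_DATE = date(2013, 7, 31)

def months_between(earlier: date, later: date) -> int:
    """Whole-month difference between two dates (calendar months)."""
    return (later.year - earlier.year) * 12 + (later.month - earlier.month)

def in_scope(end_date):
    """True if the activity is ongoing (no end date) or ended <= 13 months ago."""
    if end_date is None:
        return True
    return months_between(end_date, COLLECTION_DATE) <= 13

print(in_scope(None))               # ongoing -> True
print(in_scope(date(2012, 7, 1)))   # ended ~12 months ago -> True
print(in_scope(date(2012, 5, 1)))   # ended ~14 months ago -> False
```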
IATI XML data and the IATI Registry
The IATI Registry is an important component of IATI publication, as it makes data discoverable and easier to access. IATI publishers “register” their IATI XML data, providing links back to the original source data – which remains on donors’ own websites – and other useful metadata.
For the purposes of the 2013 Index, some donors were unable to register their IATI XML data with the Registry by the 31 July data collection deadline. Publish What You Fund accepted IATI XML data from the donors’ own websites, even if it was not registered with the IATI Registry, on the understanding that it would be registered in the near future. Donors provided Publish What You Fund with links to the source files via public URLs, where the data could be downloaded and automatically assessed. The URLs for the files are now published on the Aid Transparency Tracker. All four organisations were strongly encouraged to register their data with the IATI Registry and some have since done so.
Given that the rest of the Index methodology permits information to be taken into account no matter which website it is provided on, it was felt that it would be unfair to penalise these organisations. The focus in 2013 is on the format that data is published in, not the location. In the 2014 Index, however, registering data with the IATI Registry will be a criterion on which donors are assessed: the discoverability of IATI data, and the fact that it is accessible through a machine-readable list of the locations of files from different publishers, is an important aspect of its accessibility, in addition to the structure of the files themselves.
 The Global Right to Information (RTI) Rating is used as the data source to assess the quality of FOI legislation. The RTI Rating scores the strength of the legal framework in guaranteeing the right to information in a country. Based on a 61-indicator survey, the legislation is graded on a 150-point scale. This has been adapted to a three-point framework for the Index indicator. As in 2012, a second scale was developed to score disclosure policies for non-bilaterals. This was guided by the principle that, while non-bilateral donors may not be legally obliged to disclose their information, many of them have disclosure policies, and these should be taken into consideration rather than leaving a data gap or awarding an average score for this indicator. For the RTI Rating methodology and full dataset, visit: http://www.rti-rating.org/index.html.
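The shape of the 150-point-to-three-point adaptation can be sketched as a banding function. The cut-points used by the Index are not given here, so the thresholds below are purely hypothetical, to show the form of the mapping rather than its actual values.

```python
# Hypothetical sketch of collapsing the RTI Rating's 150-point scale
# into a three-point Index framework. The band thresholds (100, 50)
# are invented for illustration; the real cut-points are not stated here.

def rti_to_index_points(rti_score: int) -> int:
    """Map a 0-150 RTI Rating score onto a 0-2 scale (hypothetical bands)."""
    if rti_score >= 100:   # hypothetical: strong legal framework
        return 2
    if rti_score >= 50:    # hypothetical: partial framework
        return 1
    return 0               # weak or no framework

print(rti_to_index_points(120))  # 2
print(rti_to_index_points(75))   # 1
print(rti_to_index_points(30))   # 0
```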
 Finland and Spain are not included in this list of 28 IATI publishers. Although they both publish IATI data, it was not taken into consideration for the purposes of the Index as the data is historic. Current IATI data is only taken into consideration if it accounts for 20% or more of the organisation’s country programmable aid budget.
 EC ECHO, EC Enlargement, U.S. MCC and U.S. Treasury.
There have been substantial improvements to the methodology, which means absolute scores in 2013 cannot be compared with absolute scores in previous years. Taking the publication format into account gives a more accurate picture of aid transparency. In 2012, organisations scored either 0% or 100% for an indicator regardless of format. In 2013, for 22 indicators, publishing in IATI XML format can score between 50% and 100%, while publishing a PDF can score only 16.67%. So an organisation that scored 100% for an indicator in 2012 may score only 16.67% in 2013 without changing its practice, due to the change in scoring method. The new, more nuanced methodology will be used in future years, making it possible to compare absolute scores going forward.
The set of organisations included in the Index changes slightly year on year, so the ranking of 72 organisations in 2012 is not fully comparable with the ranking of 67 organisations in 2013. It is possible, however, to compare individual indicators, such as whether a higher proportion of organisations are now publishing annual reports or forward budgets.
The performance of each organisation will affect the ranking of every other organisation, so a change in rank may not reflect a change in an organisation’s own practice. However, it is likely that a large move up the ranking reflects a genuine change in practice since 2012.