Review of members' reporting practices to OECD DAC
Valérie Gaveau, OECD
Outline
- Status of members’ reporting (Room Doc 2)
- Application of methodology (Room Doc 1, section I)
- Strengths and weaknesses of the Rio markers (Room Doc 1, section III)
- Options for improvement in application (Room Doc 3, section I.1)
- Options for improvement in methodology (Room Doc 3, section I.2)
Status of members’ reporting (Room Doc 2)
Environment and Rio marker data on bilateral ODA activities are currently reported to the DAC by all members (excluding the three members who recently joined the DAC). A few caveats on coverage:
- incomplete data mainly from the United States (which lacks a statistical tool for tracking in its financial system) and from the European Union (where the EIB is not yet reporting);
- for non-export-credit OOF, only two countries report on Rio markers: France (AFD) and Germany (KfW);
- coverage for disbursements (vs. commitments) needs to be confirmed.
Application of methodology (Room Doc 1, section I)
The vast majority of members apply the methodological approach described in the DAC Statistical Reporting Directives to assign the environment and Rio markers.
However, it is generally felt that the Directives leave too much room for interpretation: broad definitions may be interpreted differently depending on who is using them, especially with respect to the difference between the “significant” and “principal” markers, which is not easy for non-expert staff to follow.
Some members adjust the methodology for their internal purposes:
- France (AFD) has developed a different methodology, based on an assessment of the carbon footprint of projects for mitigation (tracking at the project-component level) and on a positive list of activities for biodiversity.
- Three members describe a more granular scoring system used internally, including the application of percentages to total project support.
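The percentage-based internal scoring described above could look like the following minimal sketch. The coefficient values (100% for a principal marker, 40% for a significant one) and the function name are illustrative assumptions, not figures prescribed by the Directives; members choose their own internal coefficients.

```python
# Illustrative sketch of applying percentages to total project support.
# Rio marker scores as reported to the CRS: 2 = principal, 1 = significant,
# 0 = not targeted. The 100% / 40% coefficients are assumed for illustration.
COEFFICIENTS = {2: 1.0, 1: 0.4, 0: 0.0}

def marked_amount(commitment_usd, marker_score):
    """Return the share of a commitment counted toward a Rio objective."""
    return commitment_usd * COEFFICIENTS[marker_score]

# Example: a 10 MUSD project marked 'significant' (score 1) for mitigation.
print(marked_amount(10_000_000, 1))  # 4000000.0
```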
The score is determined based on a case-by-case assessment:
- markers are in most cases not applied by default to certain activities or sectors;
- markers are generally assigned at the design stage of projects, when objectives are determined;
- the responsibility for applying markers is, for most members, shared between project officers, sector experts and possibly the central statistical unit in charge of reporting to the DAC.
Quality control mechanisms are in place for all members:
- for more than half of members, controls are systematic and take place prior to annual CRS reporting;
- for the other half, there are basic consistency checks and/or periodic quality reviews;
- in many cases, quality controls are led by sector experts.
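The kind of basic consistency check mentioned above could look like the sketch below. The record layout and both rules are illustrative assumptions, not the Directives’ actual validation rules; the cross-field rule echoes option O11 from Room Doc 3 (environment marker expected when a Rio marker is applied).

```python
# Illustrative sketch of basic consistency checks prior to CRS reporting.
# Marker scores: 2 = principal, 1 = significant, 0 = not targeted.
# Field names and rules are assumptions for illustration only.
VALID_SCORES = {0, 1, 2}

def check_record(record):
    """Return a list of problems found in one activity record."""
    problems = []
    markers = ("environment", "mitigation", "adaptation",
               "biodiversity", "desertification")
    for field in markers:
        if record.get(field, 0) not in VALID_SCORES:
            problems.append(f"{field}: invalid score {record.get(field)!r}")
            return problems  # stop before cross-field rules on bad data
    # Cross-field rule: a Rio-marked activity with no environment marker
    # may deserve review (cf. option O11 in Room Doc 3).
    rio = (record.get("mitigation", 0), record.get("adaptation", 0),
           record.get("biodiversity", 0), record.get("desertification", 0))
    if record.get("environment", 0) == 0 and any(s > 0 for s in rio):
        problems.append("Rio marker set but environment marker is 0")
    return problems

rec = {"environment": 0, "mitigation": 2, "adaptation": 0,
       "biodiversity": 0, "desertification": 0}
print(check_record(rec))  # ['Rio marker set but environment marker is 0']
```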
Training to support officers in their marking task:
- training and internal guidelines (translated from the DAC Statistical Reporting Directives into other languages or adjusted to fit members’ own needs);
- lists of typical sectors for each convention (e.g. climate change or biodiversity), though never exhaustive, as mainstreaming can happen in any and all sectors.
Half of members report that they consult with recipient countries on the contribution of activities to environment and Rio Conventions’ objectives.
Strengths and weaknesses (Room Doc 1, section III)
Strengths:
- internationally agreed definitions,
- simple methodology,
- unique tool to track the environmental relevance of projects,
- flexibility and comprehensiveness,
- transparency,
- useful as a planning tool,
- value for mainstreaming purposes.
Weaknesses:
- Differences in interpretation among officers in charge of marking are considered a major challenge, as applying the markers requires a certain level of environmental expertise; a significant level of subjective judgement and interpretation is involved.
- Lack of granularity leads to overestimation:
  - overlap between the markers;
  - Rio markers are qualitative policy markers and cannot be used to track actual expenditures (particularly for “Significant”).
- Lack of comparability between members.
Options for improvement in application (Room Doc 3, section I.1)
O1. Timeliness
O2. Coverage on OOF
O3. Improvements to activity descriptions reported and recorded in the CRS
O4. Members’ internal quality assurance mechanisms
O5. Improve the harmonisation in the application of Rio and environmental markers across members
O6. Promote partner country involvement/raise awareness of statistical marking
Options for improvement in methodology (Room Doc 3, section I.2)
O7. Improve definitions of the Rio markers
O8. Adjustments to the mitigation marker
O9. Adjustments to the adaptation marker
O10. Improvements/updates to Handbook and FAQs
O11. Automatically mark against the Environment marker, i.e. marking by default, if one (or more) of the Rio markers are applied
O12. Differentiated solutions for different modalities of aid
Options for improvement in methodology (Room Doc 3, section I.2), continued
O13. Reflect General Budget Support (GBS) in Rio-marker-based statistics
O14. Better quantify Rio marker data within CRS reporting:
- consider increased granularity; consider scoring at component (instead of activity) level for the largest activities;
- consider options for how to treat the overlap between the Adaptation and Mitigation markers (and Biodiversity to a lesser extent).
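One way to treat the overlap mentioned under O14, when building a combined climate-related total from the mitigation and adaptation markers, is to count each project once at the higher of its two marker coefficients so it is not double counted. The max-based rule and the 100%/40% coefficients below are one possible convention, assumed for illustration, not a DAC-agreed method.

```python
# Sketch of one possible overlap treatment when aggregating a combined
# "climate-related" total across the mitigation and adaptation markers.
# Scores: 2 = principal, 1 = significant, 0 = not targeted; coefficients
# are assumed for illustration.
COEFFICIENTS = {2: 1.0, 1: 0.4, 0: 0.0}

def climate_amount(commitment_usd, mitigation_score, adaptation_score):
    """Count a commitment once, at the higher of the two marker coefficients."""
    coeff = max(COEFFICIENTS[mitigation_score], COEFFICIENTS[adaptation_score])
    return commitment_usd * coeff

projects = [
    (10_000_000, 2, 1),  # principal mitigation, significant adaptation
    (5_000_000, 0, 1),   # significant adaptation only
]
total = sum(climate_amount(c, m, a) for c, m, a in projects)
print(total)  # 12000000.0 (not 14 MUSD: the first project is counted once)
```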