
A stitch in time? Vol. 2: Appendices

Appendix 1: Terms of Reference: Independent Evaluation - Southern Africa Crisis Appeal
A1.01 Background

The Disasters Emergency Committee launched an appeal for this crisis on the 25th July 2002. All the Members supported the appeal and 12 requested funds for their emergency responses. The DEC proposal to the Broadcasters quoted WFP's James Morris: "Throughout the region people are walking a thin tightrope between life and death. The combination of widespread hunger, chronic poverty and the HIV/AIDS pandemic is devastating and may soon lead to a catastrophe. Policy failures and mismanagement have only exacerbated an already serious situation."

And went on to say:

The launching of the appeal now is crucial to enable the Member Agencies to meet the needs of the majority at risk before their situation deteriorates further and we are faced with harrowing pictures of another famine in Africa.

"Disaster Response Programme proposals" were submitted to the Secretariat by the 29th September on the basis of the Fund allocation set out below. The Secretariat reviewed with the members the most strategic use of the Appeal Fund, which resulted in Merlin withdrawing their proposal and some modifications to specific proposals.

Agency Share of Disaster Response Programme Fund

    #   Agency    IOC      Final IOC [103]  Appeal Funds (£)  Revised Allocation (£)  Revised Percentage
    1   AAID      6.32%    5.85%            468,188           482,146                 6.03
    2   BRC       16.63%   15.41%           1,232,933         1,269,691               15.87
    3   CAFOD     5.29%    4.90%            392,092           403,781                 5.05
    4   CAID      9.80%    9.08%            726,675           748,340                 9.35
    5   CARE      8.44%    7.82%            625,806           644,463                 8.06
    6   CCF       0.64%    3.00%            240,000           240,000                 3.00
    7   CONCERN   1.61%    3.00%            240,000           240,000                 3.00
    8   HTA       1.44%    3.00%            240,000           240,000                 3.00
    9   MERLIN    1.33%    3.00%            240,000           30,115                  0.38
    10  OXFAM     21.74%   20.14%           1,611,537         1,659,582               20.74
    11  SCF       15.73%   14.57%           1,165,824         1,200,581               15.01
    12  TFUND     6.16%    5.71%            456,810           470,429                 5.88
    13  WVISN     4.86%    4.50%            360,136           370,873                 4.64
        TOTAL     100%     100.00%          8,000,000         8,000,000               100.00
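The relationship between the original IOC shares and the final shares in the table follows the rule in footnote 103: agencies below the 3% floor are raised to it (3% of the £8m fund being £240,000), and the remaining share is redistributed pro rata among the other agencies, subject to the 25% cap. The adjustment can be sketched in Python as follows (an illustration only; the function and variable names are not from the ToR, and small differences from the printed figures reflect rounding in the table):

```python
def adjust_shares(shares, floor=3.0, cap=25.0):
    """Raise shares below `floor` to the floor, then rescale the rest
    pro rata so the total remains 100%, respecting `cap`."""
    low = {name for name, pct in shares.items() if pct < floor}
    remaining = 100.0 - floor * len(low)  # share left for the other agencies
    others_total = sum(pct for name, pct in shares.items() if name not in low)
    return {
        name: floor if name in low else min(cap, pct / others_total * remaining)
        for name, pct in shares.items()
    }

# Original IOC shares from the table above
ioc = {"AAID": 6.32, "BRC": 16.63, "CAFOD": 5.29, "CAID": 9.80,
       "CARE": 8.44, "CCF": 0.64, "CONCERN": 1.61, "HTA": 1.44,
       "MERLIN": 1.33, "OXFAM": 21.74, "SCF": 15.73, "TFUND": 6.16,
       "WVISN": 4.86}

final = adjust_shares(ioc)
# e.g. CCF rises to the 3.00% floor; AAID rescales to roughly 5.85%
```

This reproduces the "Final IOC" column to within rounding: the four agencies below 3% (CCF, CONCERN, HTA and MERLIN) each take exactly 3%, and the remaining 88% is shared among the other nine in proportion to their original shares.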

A second round of funding following the format of the Extended Response Programme was completed by the 16th May 2003 and the share by agency was as follows:

    #    Member Agency        Allocation (£)
    1    ActionAid            130,000
    2    British Red Cross    130,000
    3    CAFOD                130,000
    4    Christian Aid        130,000
    5    Care Int'l           130,000
    7    CONCERN              90,210
    8    Help the Aged        128,708
    10   Oxfam                130,000
    11   Save the Children    144,641
    12   TEARFUND             55,000
    13   World Vision UK      130,000
         TOTAL                1,328,559

A1.02 The DEC Evaluation Policy

An independent evaluation is an integral part of the DEC's approach. This evaluation enables the DEC Secretariat and its Members to meet their two prime responsibilities of accountability and improved performance. The evaluation policy highlights four priorities:

- Financial accountability to the donors
- Accountability to beneficiaries
- Impact with regard to the membership criteria, especially the Red Cross/NGO Code of Conduct, the Humanitarian Charter and Sphere standards
- Lessons on topics pertinent to the nature of the emergency

There is a tension in the evaluation process between evaluating the impact and strategic use of the DEC funds as a whole and the performance of the individual Members. The DEC evaluation should review the overall impact of the Members, but a real effort should be made to root these conclusions in a substantial number of specific examples of individual Members' performance. The collective impact is best addressed by looking at the Members' adherence to and performance against the standards associated with the Membership criteria, especially points iv, v and vii of the Membership criteria in Appendix 3. Specific conclusions should be drawn in the evaluation on how the principles and standards of the Code, Sphere and People in Aid have been addressed. To facilitate this, the DEC has developed the Red Cross/NGO Code as an evaluation tool, and the Evaluators should refer to Appendix 2 below in framing their methodology. Reference should be made to how the DEC funds complemented other funding sources, and to any lessons concerning the extent of functional and/or geographic cohesion in the combined efforts of the DEC Members.

The evaluation methodology to establish the impact and effectiveness of the relief effort should also use the adapted DAC criteria, which have been used systematically in previous DEC evaluations:

- Timeliness and appropriateness of response - this would also cover issues of capacity and preparedness to enable a rapid and sensitive response
- Cost effectiveness - the efficiency of the response
- Impact - effectiveness in the reduction in mortality, morbidity and suffering achieved by the Members' actions
- Coverage - scale and ability to reach those most in need, given the political, religious and social context of the emergency
- Connectedness - links into local capacity, plans and aspirations, and the collaboration and coordination of the Members' efforts
- Coherence - the integration of relief activities with the policy and practice changes needed to address root causes

A1.03 Key areas of special concern to the DEC evaluation of the Southern Africa Crisis

The scale and scope of this relief effort is unusual, so the evaluation and the field visits should focus on three countries: Malawi, Zambia and Zimbabwe. An effort must be made to draw out lessons from the three countries and to set these findings against an understanding of the crisis as a complex regional phenomenon, as set out in the Monitoring Report. Did a regional analysis or approach improve the effectiveness of the programmes in the three countries? The evaluators should make use of existing studies and evaluations done by the Members in Southern Africa and draw out any lessons identified and examples of best or innovative practice.

This Appeal was unique for the DEC in its overt preventive strategy. The evaluation needs to spell out how effective the DEC Members were in achieving this goal. It should address the role the Members played in ensuring the food security of the populations at risk in the three countries. This should cover the extent to which the Members added value to:

- the conceptual framework of the international humanitarian response, especially given the HIV/AIDS pandemic
- the analysis and identification of the needs
- the targeting and delivery of the food aid
- the Members' ability to build in other strategies to complement the basic ration distributions, especially targeted feeding schemes and the role of the seeds and tools programmes

Accountability to beneficiaries is central to the DEC and is underlined in the Red Cross/Red Crescent and NGO Code. It is important for the evaluation to assess to what extent the interventions met the expectations of the people concerned and how they were involved in decision making. Tenders should spell out how the evaluation team intends to enable the voice of the rural and urban beneficiaries to be heard, especially how they viewed the crisis and the response that followed. This should be in depth in at least one of the three countries covered by the evaluation.
It will be important to identify examples of good practice by the Members in their efforts to involve and be accountable to the beneficiaries.

Financial accountability in this evaluation should be addressed by the team assuring themselves that the overall funding to each Agency in the given countries translated into value for money for the donors. The evaluation team is not expected to audit the Members, but to ascertain that the necessary professional skills and systems were in place and well managed by the Member Agencies to ensure that funds reached the intended beneficiaries. The review should analyse the types of contract the Members have with partner agencies in terms of financial controls, and how well the Members have used either internal or external audits in meeting their responsibility for financial control.

A1.04 The Monitoring Report

An initial monitoring visit was carried out in October 2002. The Executive Summary is set out in Appendix 1, which forms part of the basis for this evaluation. Issues and areas of focus identified in the monitoring report should be developed where relevant.

A1.05 Support and Documentation

Participating DEC agencies are required to submit the following material (in both hard copy and electronic format) to the Secretariat to assist the evaluation:

- a summary chronology and key documents on the agency's response to the emergency and their use of DEC funds, especially any appraisal, monitoring, evaluation or audit reports
- names, contact details and roles of key agency and partner personnel in the head office and in the relevant field offices who can be interviewed by the evaluators

The Secretariat will prepare a package of materials on each participating agency to be given to the evaluation team, as well as appeal-related documentation on financial and other actions and the full Monitoring Report undertaken in October 2002.

It will be important that the Consultants review the existing DEC evaluations and the Vine Management Summary Report, so that this evaluation builds on the existing body of knowledge and lessons learnt available to the DEC Members.

A1.06 Evaluation Methodology

The evaluation team will begin with a review of available documentation, especially existing Member evaluations. The team will be responsible for ensuring appropriate data collection is undertaken in the field. The DEC evaluation mission has been delayed to coincide with the early phase of the Extended Response Programmes of the Members, so that the evaluation covers the total expenditure of the appeal fund and not just the Disaster Response Programme. The evaluation team's schedule, accommodation and transport arrangements will be finalised and communicated to all agencies at least one week prior to the field visits.

The evaluation and the methodology need to recognise that DEC Members have a range of operating cultures, from being directly operational to funding independent organisations in the affected countries, or a mixture of both. During their time with each agency the team will interview key personnel remaining in country (contacting others prior to the field visits or on their return) and undertake visits to selected project sites/areas. The field visits must include at least one DEC-funded project site for each agency active in the three countries covered by the evaluation (see Appendix 4 for a map of agencies in each country). It is recognised that the actual relief effort might well have been phased out or have changed its focus by the time of the evaluators' visit, but this should not stop the evaluation visiting and reviewing the impact of the programme. It should be noted that some agencies will also have funds from their own donors and from global counterparts/sister agencies; nevertheless an effort should be made to distinguish DEC funding. As well as interviewing the agencies' project officers, key officials in co-ordinating agencies (e.g. UNICEF, OCHA and regional governments), and partner agencies, a sample of beneficiaries will be selected and interviewed by the evaluators.

These interviews will be conducted without agency personnel being present, using interpreters (where necessary) hired directly by the evaluation team. The beneficiaries will be questioned on their views of the assistance provided, the way they were selected and their overall views of the agency and the relief offered. Interviews with individuals may be complemented by discussions with groups of beneficiaries. So as to assess the agency's targeting and beneficiary selection methods, the evaluation team will also interview a selection of potential beneficiaries who did not receive assistance, where at all possible. It is expected that the evaluation team will disaggregate their findings on the basis of gender and use gender-aware and participatory approaches to seek the views of beneficiaries and, where appropriate, non-beneficiaries. Inclusive techniques will be expected of the evaluators, to seek active participation in the evaluation by members of local emergency committees, staff of implementing partner agencies and member agencies, and representatives of local and central governments. Confidentiality should be respected. Before leaving each country, the lead consultant will indicate the broad findings to the Country Representative and senior staff of each agency and note their comments.

A1.07 The Report

The evaluation report should consist of:

- an executive summary and recommendations (not more than five pages)
- commentary and analysis addressing the issues raised in the ToR, conclusions and recommendations, with a section dedicated to suggestions for taking forward particular lessons learned (not more than thirty-five pages in all)
- appendices, to include the evaluation terms of reference, maps, sample framework, beneficiary research and bibliography

(All material collected in the undertaking of the evaluation process should be lodged with the Secretariat prior to termination of the contract.)

A1.08 Evaluation team and timeframe

It is anticipated there will be a core team of three people. The Team Leader should have a proven background in emergency evaluations, with a balance of analytical skills and practical experience. The team will need the relevant skills to do the participatory research to establish the beneficiaries' view of the relief effort, a good understanding of the international community's mechanisms for large-scale food aid programmes, and the ability to review the management and administrative systems and skills available to the agencies.

It is anticipated that the evaluation will last a maximum of six weeks, with one week in the UK at the beginning, three to four weeks in the Region and a further one week of consultation and writing up. The evaluation timeframe should allow for the circulation of a first draft before the end of September 2003. A workshop should then be held in London to discuss the draft report of the evaluation. The report should be circulated at least one week prior to the workshop to allow for preliminary review by agencies, and for agencies to communicate factual errors to the authors and highlight issues that they want to address in the workshop. The completion date for the Final Evaluation Report will be end October 2003, the consultants having addressed agencies' comments as appropriate. The Team Leader should alert the Secretariat immediately if serious problems or delays are encountered. Approval for any significant changes to the evaluation timetable will be referred to the Chief Executive.

A1.09 Follow up

The DEC Board, which includes the Chief Executives of the Member agencies, will review the findings of the evaluation and will monitor the Members' follow-up of the recommendations made to specific agencies and to the DEC as a whole. The DEC Secretariat will hold a lesson-learning workshop on the findings and set them against the cumulative lessons of previous evaluations. It will be important for the evaluation to note how well prior lessons have been learnt by the Members.

A1.10 Tenders and Evaluation Management

Tenders should be submitted to the DEC Secretariat by the closing date of 18th July 2003. A maximum five-page summary should be submitted, with appendices of team member CVs (each CV a maximum of three pages) and an indication of availability. The DEC may wish to see substantive pieces of work or to take up references of short-listed consultants. The final decision on tenders will be taken by the DEC Chief Executive, following short-listing and possible interviews. Key factors in the decision process will include:

- provisional framework, methodology, skills, African experience, timeframe and budget (realism, not just competitiveness)
- an appreciation by the bidder of key constraints and any commentary on the above terms of reference, especially on the use of the Red Cross/NGO Code and the balance between collective and individual agency impact
- gender and cultural balance of the team
- clear written English

Tenders will be accepted from ad hoc groups of freelance consultants as well as from commercial companies, NGOs or academics. Tenders are particularly welcome from joint UK/Africa-based teams. It is anticipated the selection process will be complete by end July 2003.

A1.11 Further information

For further information please contact:

Brendan Gormley, Chief Executive
Disasters Emergency Committee (DEC)
15 Warren Mews
London W1T 6AZ
Tel: 0207 387 0200
Email: info@dec.org
Or visit the DEC's website: www.dec.org.uk

A1.12 ToR Appendix 1: Southern Africa Crisis Monitoring Report

A1.12.1 EXECUTIVE SUMMARY

The Disasters Emergency Committee (DEC) launched an appeal for the Southern Africa Crisis on the 25th July 2002. At end November 2002 the total amount raised by the DEC was £15 million, of which £8.7 million was pooled income. All DEC Members supported the appeal; twelve agencies requested funds to directly support their emergency programming and one agency received funds to support their assessment process. Agencies' four-week proposals/plans of action indicated that £5 million would be spent by the end of November, and indicated a tension between the need for a response that would have a preventive impact and the growing understanding that the underlying causes of the Crisis would require a sustained response.

The DEC Monitoring Mission visited the region during November and made field visits to Zambia, Zimbabwe and Malawi. All twelve agencies were visited in the field (where discussions were held with field representatives, partners and beneficiaries) and a DEC interagency meeting was held in each country. Where possible, meetings were also held with government and UN representatives, donors and non-DEC agencies.

The initial overall response (which included the response of most DEC Members) was one of early recognition of the existence of the crisis, but also one where agencies lacked the capacity and skills to analyse and respond immediately and effectively. It has taken all actors time to scale up to meet the challenge. The actors who responded most promptly had often recently invested in emergency preparedness activities. The response is now being driven by macro-level co-ordination mechanisms and consortiums which, while reported to be efficient, run the risk of excluding non-consortium members and local capacities. In the absence of community-level analysis and the traditional strengths and capacities of NGOs, there is a danger that the response could become increasingly supply driven.

Working within the system, agencies could increasingly become contractors to the donors and fail to maintain their independence, while concerns surrounding consultation, effective targeting and local ownership become sidelined. While much of the DEC agencies' programming is of a high quality, coverage is limited and DEC agencies as a whole are uncertain about their capacity to scale up further in the short term. There is little sharing of lessons or discussion about qualitative issues between agencies. The co-ordination mechanism is focused on the functional issues related to distribution and information gathering. A strong and repeated emphasis was placed by the DEC agencies upon the detrimental impact of delayed decision making by donors. Donors in their turn expressed doubts as to the quality of the agencies' analysis. DEC agencies appear to have little individual or collective impact at the macro policy level. Awareness of the existence of the Code of Conduct was generally good, but application of the principles was inconsistent. It did not appear to be utilised as an advocacy tool.

Agencies claimed knowledge of Sphere, but Sphere appeared to be utilised in a purely mechanistic way, as a series of measurements; there was little reference to Sphere as a holistic management tool. There was extremely little awareness of the existence of People in Aid amongst managers at country level, and no awareness at field level.

The Monitoring Report recommends that the DEC evaluation should focus upon only two or three countries at most. A full beneficiary consultation process is recommended, but the geographical scope of this should be limited. The DEC Secretariat should consult with the DEC agencies to determine whether there would be potential added value in the DEC leading a joint impact evaluation.

[103] This is subject to a 25% maximum limit and a 3% minimum.
