WORLD VISION SOMALIA
END OF PROJECT EVALUATION – BHA MULTISECTOR EMERGENCY RESPONSE PROJECT
Statement of Work
1.0 INTRODUCTION
1.1 Project Summary
Project name(s)
Multisector Emergency Response Project with primary focus on WASH, Health & Nutrition, Agriculture and Economic Recovery and Protection Support
Project goal
To restore the wellbeing of vulnerable communities in Somalia affected by the aftermath of severe flooding, drought, and locust infestations, including IDPs and host communities in Somalia and Somaliland.
Project Purposes
Purpose 1: Disaster-affected communities have increased access to food through their own production and the protection of productive assets.
Purpose 2: The health and nutrition of the targeted community is improved through enhancing healthcare services and prevention and treatment of malnutrition.
Purpose 3: Households and communities have improved access to clean water, sanitation, and critical WASH items, as well as increased adoption of positive hygiene and environmental health practices.
Target beneficiaries
The number of people targeted is 556,454 (of those, 223,197 are internally displaced persons).
Project locations
a. Jubaland: Gedo Region – Luuq & Dollow districts
b. South West State:
Bay Region: Berdale, Qansadhere and Burhakaba districts
Bakool Region: Hudur & Wajid districts
c. Somaliland:
Waqooyi Galbeed: Gebilay and Berbera
Togdheer: Burao & Odweyne
Implementing Partners by Sector and Location
Partner Name, Sectors Covered and Geographic Locations
Project Start Date: July 2021
Project End Date: April 2023
Evaluation Type: Summative Evaluation
Evaluation Purpose
World Vision Somalia will conduct an evaluation of the project to identify its impact and to determine whether the project achieved the intended outcomes with respect to the related indicators.
The evaluation should document evidence of impact and highlight and strengthen best practices emerging from the project. World Vision would also like the evaluation to assess the appropriateness of the approaches/project models used in the program, as well as the potential sustainability markers that have been achieved through the project’s implementation.
This study combines an end-of-project (summative) evaluation, in which project outcome results will be compared against the desired targets to deduce impact, with an annual outcome review, whose findings will be used to establish outcome indicator targets for the cost extension.
Methodology
The evaluation study will adopt a mix of quantitative and qualitative techniques to compare project outcomes against the desired targets. The evaluation will also adopt population-based and participant-based surveys to collect data at the end of the project and measure project performance. The methods are summarized below:
Quantitative:
Qualitative:
Evaluation Duration
15 March – 30 April 2024
Available Project Documents: Project proposal, baseline reports, M&E plan, ITT, Detailed Implementation Plan (DIP), semi-annual reports
2.0 PROJECT DESCRIPTION
World Vision and four local partners are implementing the Multisector Emergency Response Project (MERP) in Somalia and Somaliland with funding support from the United States Agency for International Development (USAID) through the Bureau for Humanitarian Assistance (BHA). The partners, African Relief and Development (ARD), Centre for Research and Integrated Development (CeRID), Mandher Relief and Development Organization (MARDO), and Somali Relief and Development Action (SRDA), work in difficult-to-access areas in South West State (Bakool, Bay) and Jubaland (Luuq district of Gedo region), while WV is a direct implementer in Somaliland (Woqooyi Galbeed and Togdheer regions) and Doolow district of Jubaland State. The action is also undertaken together with selected governmental line ministries (Ministry of Health and Ministry of Agriculture), local authorities, community-level partners, and additional relevant stakeholders across the project districts. The goal of the project is to restore the wellbeing of vulnerable communities in Somalia affected by the aftermath of severe flooding, drought, and locust infestations, including IDPs and host communities in Somalia and Somaliland. The number of people targeted is 556,454; of those, 223,197 are internally displaced persons affected by recurrent drought and insecurity.
WV proposes a performance evaluation that will measure the impact and effectiveness of the program in the target locations with the communities and the provincial authorities, verify that the project met its intended goals and outcomes, and identify how it can be strengthened in the future. The study will be carried out externally through a participatory approach involving partner organizations, line ministries, and beneficiary communities. The evaluation results will help the key stakeholders measure the level of project success concerning service delivery to the project beneficiaries. More importantly, this end-of-project evaluation will generate recommendations and lessons learnt for ongoing and upcoming World Vision projects.
2.1 Geographic Scope and Target Population
The project targets 11 districts, including Berbera and Gabiley districts of the Woqooyi Galbeed region; Burco and Odweyne districts of the Togdheer region; Wajid and Huddur districts of the Bakool region; Luuq and Doolow districts in the Gedo region; and Qansaxdhere and Berdale districts of the Bay region in Somalia. The evaluation will take place in all the targeted districts. In addition, this survey will assess progress made from the inception and the overall achievement of project outcomes.
2.2 Context
This evaluation is to be done in relation to the context, taking into consideration the changes that took place during the project implementation period (COVID-19, political tension/elections in Somalia, droughts, flooding, and insecurity). The survey team will also analyze how the changes in context affected project implementation and impact, including the effectiveness of the adaptation efforts made by the implementing partners, communities, and other stakeholders.
2.3 Community Level Partners
The project end-line evaluation will involve all stakeholders who were involved in the project design and who were expected to engage in implementation, such as:
3. METHODOLOGY
This section provides a detailed description of the end-line evaluation process. It focuses on the evaluation design, data collection and analysis and reporting process.
3.1. Scope of the Evaluation
This summative evaluation will provide a detailed analysis of achievements against the anticipated objectives (baseline values) and measure the up-to-date performance of the project. It aims to assess the relevance, efficiency, effectiveness, and impact of the project interventions, as well as their prospects for future sustainability.
3.2 Evaluation Type
This summative evaluation is aimed at assessing the impact and effectiveness of the program in the target locations with the communities and the local authorities. It will also verify that the project met its intended goal of restoring the wellbeing of vulnerable communities affected by the aftermath of severe flooding, drought, and locust infestations (including IDPs and host communities in Somalia and Somaliland) and its sectoral outcomes, and identify how the program can be strengthened in the future. World Vision Somalia will hire an external consultant through a competitive process to carry out a quality end-of-project evaluation.
3.3 Evaluation Purpose and Objectives
World Vision Somalia will conduct this evaluation of the project to identify its impact and to assess the relevance of the project logic and the effectiveness of the project models in realizing the project outcomes. It further aims to assess the effectiveness of the integration model of the BHA project. World Vision would also like the evaluation to assess the appropriateness of the approaches used in the program, as well as the potential sustainability markers that have been achieved through the project’s implementation. The end-line evaluation will also help to draw key lessons learnt and best practices for the project stakeholders.
The objectives of the evaluation are to:
3.4 Evaluation Questions
a) Relevance:
1. To what extent has the intervention appropriately assisted the affected population?
2. How has management adapted the project design or implementation based on monitoring information and feedback from the target population?
3. How equitably has the project benefited women, men, boys and girls, IDPs and the host community, and special interest groups such as persons with disabilities?
b) Effectiveness:
c) Efficiency:
d) Sustainability:
e) Impact:
It is expected that the evaluation will provide lessons learned, areas of improvement, and recommendations for similar programs.
3.5 Study Design
The project will employ a mixed-methods design for the final evaluation. A quantitative survey will be used to produce indicator values and collect demographic information about the beneficiaries. A qualitative component using Key Informant Interviews (KIIs), Focus Group Discussions (FGDs), and observation will provide insights into the context and help to explain the reasons for the changes observed.
3.6 Data Collection Methods
3.6.1 Quantitative Data Collection
The quantitative Population-Based Survey will be administered among a probability sample of participants and households in the target areas. The sample size, sampling frame, and data collection tools will be designed so that the data can be statistically compared to test for differences. The questionnaire will be designed and administered using Kobo Toolbox/FieldTask.
3.6.2 Sampling Design
The evaluation will adopt a two-stage cluster sampling design in which the target population is first divided into clusters; in this case, the clusters will comprise the villages/communities/groups in the target areas. In the first stage, clusters are randomly selected from a list of all clusters using the probability proportional to size (PPS) method, so that larger clusters have a proportionally higher chance of selection. In the second stage, a sample of households is randomly selected within each of the sampled clusters.
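The first-stage selection can be sketched as a systematic PPS draw over the cluster frame. The village names and household counts below are hypothetical; the real frame will come from census data and community records:

```python
import random

def select_clusters_pps(frame, n_clusters, seed=None):
    """Systematic PPS draw: clusters with more households have a
    proportionally higher chance of selection. `frame` is a list of
    (cluster_name, household_count) pairs. A cluster larger than the
    sampling interval can legitimately be drawn more than once."""
    rng = random.Random(seed)
    total = sum(size for _, size in frame)
    step = total / n_clusters                 # sampling interval
    start = rng.uniform(0, step)              # random start in [0, step)
    targets = [start + i * step for i in range(n_clusters)]
    selected, cum, t = [], 0.0, 0
    for name, size in frame:
        cum += size                           # walk the cumulative sizes
        while t < n_clusters and targets[t] < cum:
            selected.append(name)             # this cluster covers the target
            t += 1
    return selected

# Hypothetical frame: village name -> number of households
frame = [("Village A", 420), ("Village B", 180), ("Village C", 75),
         ("Village D", 310), ("Village E", 95)]
clusters = select_clusters_pps(frame, n_clusters=3, seed=42)
```

In the second stage, a fixed number of households would then be drawn at random within each selected cluster, e.g. with `rng.sample(household_ids, k)`.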
3.6.3 Sampling Frame
To develop the sampling frames, the project will consider its targeting strategy and the different target populations for the various interventions, given that this is a multi-sector response. The survey design will require multiple sampling frames to organize the target groups that received a similar set of interventions. The sampling frame for the Population-Based Survey will be developed based on census data and community records to appropriately reflect the target population. The sampling frame for the Participant-Based Survey will be developed based on the participant register to appropriately reflect the target population. The sampling unit is the household, where a knowledgeable adult will answer the questions. The sampling frame will include the following key elements collected at registration:
3.6.4 Sample Size Calculation
To derive the sample size, the purpose of the evaluation (comparative), the key indicators of interest, and the sampling methodology (in this case, two-stage cluster sampling) will be considered. For the evaluation survey, the appropriate sample size for comparing indicator values at two points in time (at the start of the activity and after the activity is completed) will be calculated using a two-stage cluster sample design. The sample size will be estimated for each key indicator and adjusted based on the proportion of households in the sub-set.
Using the Feed the Future Sample Size Calculator, WV calculated sample sizes for the key project indicators for comparing values at two points in time and found that the sample size was largest for the Food Consumption Score (FCS). Additionally, these calculations meet the minimum number of respondents recommended by BHA for indicators expressed as a proportion: 388 respondents for Simple Random Sampling and Systematic Random Sampling, and 776 respondents for PPS (cluster) sampling.
Calculating Sample Size for Comparing Indicators (Endline) Expressed as a Proportion Using a Two-Stage Cluster Sample
INDICATOR: Percent of households with a poor FCS (two-stage cluster sampling):
P1,est (estimated baseline proportion) – 50% (0.5)
P2,est (estimated endline proportion) – 40% (0.4)
z(1−α) (95% confidence) – 1.64
z(1−β) (80% power) – 0.80
D,est (design effect) – 2
n,initial – 776
Non-response adjustment – 10%
n,final – 854
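The figures above can be reproduced with a short calculation. The sketch below uses the standard two-sample comparison-of-proportions formula with a design effect; it is a stand-in for the Feed the Future Sample Size Calculator, whose exact implementation may differ, and it uses the conventional z-values of 1.645 (95% confidence, one-sided) and 0.84 (80% power, where the table's 0.80 appears to be a typo). Because the formula yields fewer than the 776 respondents recommended for cluster designs, the recommended minimum binds, and the 10% non-response adjustment then gives the final 854:

```python
import math

def sample_size_two_proportions(p1, p2, z_alpha=1.645, z_beta=0.84, deff=2):
    """Households needed to detect a change from p1 to p2, inflated by
    the design effect (deff) for two-stage cluster sampling."""
    pbar = (p1 + p2) / 2
    num = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(deff * num / (p2 - p1) ** 2)

n_formula = sample_size_two_proportions(0.5, 0.4)   # FCS indicator: 50% -> 40%
n_initial = max(n_formula, 776)                     # BHA minimum for cluster designs
n_final = math.ceil(n_initial * 1.10)               # 10% non-response adjustment
```

With p1 = 0.5 and p2 = 0.4 the formula gives about 610 households, so n_initial is taken as the 776 minimum and n_final comes out at 854, matching the table.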
3.7 Qualitative Data Methods
Physical observations, Focus Group Discussions (FGDs) and Key Informant Interviews (KIIs) will be carried out with beneficiaries and stakeholders to understand the extent to which the project achieved the intended objectives and addressed community needs. The FGD guides will contain a checklist of questions generated from the main project objectives and activities. The survey team working together with project staff will select the participants of the FGDs based on the focus of the study. Mothers with children under five will be purposively included for the FGD.
The focus groups will target key stakeholders, including beneficiaries, comprising FSL-targeted community members, pregnant and lactating women (PLW), CHWs, WASH committees, protection committees, facility staff, camp leaders, youth clubs, and children’s clubs, among others. Taking COVID-19 protocols into consideration, every FGD will have the standard 6–10 participants within the project areas. For purposes of plural investigation, the exercise will be conducted with a broad range of representation within the community to enable triangulation of findings and incorporate wide-ranging perspectives.
Key Informant Interviews (KIIs) will be conducted with a wide range of stakeholders, including WV project staff; government officials from the Ministry of Agriculture, the Ministry of Livestock, Forestry and Rangelands, MOH, MoEWR, and the Ministry of Gender; health facility in-charges; and other implementing partners working within the project area.
Note: The survey team will also gather information on the impacts of COVID-19 as a cross-cutting theme by deliberately incorporating key questions related to the pandemic into the questionnaires. The key questions will be developed by the survey team.
3.8 Documents review
The key documents to be reviewed for the final evaluation are as follows:
3.9 Recruitment and Training of Enumerators
Enumerators with previous research experience will be recruited. About 30 enumerators (50% women) will be recruited for the quantitative household survey, while the qualitative data collection, including key informant interviews, focus group discussions, and review of records, will be facilitated by the partner organization staff. Before field data collection, the team will undergo 3 days of comprehensive training on data collection techniques, mastery of the methodology, and the data collection tools, including a pre-test of the tools. During the training, the field team will also be briefed on the objectives of the evaluation, how to identify the appropriate respondents at various levels, and how to fill in the questionnaire appropriately. Emphasis will be put on research ethics, accuracy, and completeness, among others. Quantitative data collection is expected to take 5 days, with each enumerator administering 7 to 8 questionnaires on a daily basis.
3.10 Data Collection Tools
A blend of several data collection methods will be used. This is aimed at triangulating and authenticating the data collected, as indicated in the study design, and will also help overcome the intrinsic biases that emerge from the application of a single method. The following are the key data collection methods and tools that will be employed in the evaluation:
Household questionnaires will be developed, coded, and deployed on smartphones with Global Positioning System (GPS) functionality enabled to support geo-referencing of survey locations, spatial analysis, and visualization of the evaluation figures in the form of maps and graphs for easy interpretation. The mobile data collection tools will be deployed on the Open Data Kit (ODK) platform. ODK is a free and open-source set of tools that helps author and manage the mobile field data collection process.
3.11 Pre-test, Quality Assurance and Control
Quality assurance will be an integral component of the entire survey process. Data quality checks will be done at four levels: design of the tools, pre-testing, sampling, and data collection and analysis. Besides extensive training and pre-testing of the survey questions, quality assurance will, among other measures, entail appropriate preparation and orientation of the data collection team to ensure that they are sufficiently familiar with the survey processes and the tools used. It will also entail adequate support supervision by partner organization technical staff at every stage of the survey, with an emphasis on quality data collection. Data collection supervisors in each district will conduct daily de-briefings with the enumerators. Any errors found will be discussed with the enumerators and guidance provided before proceeding with further fieldwork the next day. This procedure will help to effectively identify and rectify mistakes in the recording of responses. Routine validation of the data will be done on a daily basis by the database administrator, along with the setting of constraints and skips in the ODK platform so that only relevant questions are filled in.
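As an illustration of the constraints and skips mentioned above, an ODK/Kobo questionnaire authored in XLSForm can enforce valid ranges and skip logic directly in the form definition. The question names and ranges below are hypothetical:

```
survey sheet (XLSForm):

type              name       label                        constraint        relevant
integer           hh_size    How many people live here?   . > 0 and . <= 30
select_one yn     child_u5   Is there a child under 5?
integer           num_u5     How many children under 5?   . <= ${hh_size}   ${child_u5} = 'yes'
```

A `constraint` rejects out-of-range answers at entry time, while `relevant` hides the question unless its condition holds, so enumerators can only fill in applicable questions.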
3.12 Data Analysis and Presentation
Upon completion of the data collection process, the survey data will be downloaded from ODK Aggregate, and data processing and analysis will be undertaken by designing a tabulation guide for each questionnaire to establish the indicator values. In particular, descriptive statistics will be run in SPSS to examine the status of the different project indicators, with results disaggregated by gender and beneficiary category. Inferential statistics involving t-tests will also be run to assess the significance of changes in the performance of the different project indicators. Content analysis will be used to establish relationships in the qualitative data obtained from key informant interviews, focus group discussions, and unstructured interviews. In addition, a GIS solution using ArcGIS software will be applied to perform spatial analysis and visualization of key project indicators. The data will therefore be presented in the form of graphs, tables, and maps for easy interpretation.
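To illustrate the inferential step, the baseline-versus-endline comparison of an indicator can be sketched as a Welch two-sample t-test. This is a plain-Python illustration of the statistic SPSS computes (the sample values are hypothetical; in practice SPSS also returns the p-value):

```python
import math
from statistics import mean, variance

def welch_t(sample1, sample2):
    """Welch's two-sample t statistic and degrees of freedom
    (does not assume equal variances between the two rounds)."""
    m1, m2 = mean(sample1), mean(sample2)
    v1, v2 = variance(sample1), variance(sample2)  # sample variances (n-1)
    n1, n2 = len(sample1), len(sample2)
    se2 = v1 / n1 + v2 / n2                        # squared standard error
    t = (m1 - m2) / math.sqrt(se2)
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

# Hypothetical baseline vs endline values of an indicator
baseline = [1, 2, 3, 4, 5]
endline = [2, 3, 4, 5, 6]
t, df = welch_t(baseline, endline)
```

The resulting t statistic is then compared against the t-distribution with df degrees of freedom to judge the significance of the change.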
3.13 Validation of Results
Data validation will be conducted with the involvement of all the stakeholders including BHA partner organizations, line ministries and community representatives. A validation workshop will be organized to present and discuss the evaluation findings to generate feedback and validate any issues identified for quality improvement. The team will use the feedback provided in the compilation of the final report that will be shared with all the stakeholders.
3.14 Data Quality Control
All data quality control measures will be adhered to during the survey, including reviewing the study tools, translating the tools into local languages, standardizing the training (pre-testing and ensuring that the enumerators are familiar with local terminology), reviewing evidence against the agreed evidence parameters, using the GPS functionality in FieldTask to geo-reference the data, regular supervision, and data cleaning. The administered tools will be checked regularly for correctness, completeness, and consistency. After entry, the data will be cross-checked to ensure the accuracy of the information obtained from the field, then compared and validated. During analysis, validation will be done by comparing the emerging information with secondary data to ensure that any outliers are addressed.
3.15 Ethical Considerations
The following ethical considerations will be adhered to during the survey process:
3.16 Limitations
Security in most of the targeted districts is volatile, with the likelihood of disruptions that may restrict enumerators from accessing some field locations, hence affecting the quality of the evaluation. This will be mitigated by working together with the local line ministry staff from inception to the end, as this will empower them to adequately supervise the survey and update the sectors on a daily basis.
4. DELIVERABLES
The final evaluation Report:
5. RESPONSIBILITY
The evaluation will be conducted externally by a consultant in collaboration with the World Vision M&E team. World Vision will hire a qualified external evaluation consultant through a competitive process to conduct a quality end-of-project evaluation. The external evaluators are not involved in the implementation process, to ensure that the evaluation is fully impartial and independent in the design of the study, field data collection, supervision, and report writing. Impartiality contributes to the trustworthiness of the evaluation and prevents bias in the assessment design, data analysis, reporting, conclusions, and actionable recommendations. In addition, independence gives validity to the evaluation process and reduces the possible conflict of interest that might emerge when project implementers are tasked to evaluate and rate the performance of their own projects.
However, the BHA M&E team will provide the necessary support to the evaluator. The BHA Monitoring, Evaluation, Accountability and Learning (MEAL) Manager will be responsible for the overall coordination of all the evaluation tasks. In addition, the Program Manager, the DQA Manager, the World Vision Somaliland Regional Operations Managers, and the Design, Monitoring and Evaluation Manager will provide the necessary technical and operational support required throughout the evaluation process. The M&E officers and M&E coordinators will be tasked to supervise the enumerators and ensure that good-quality data is collected from the field.
Roles and Responsibilities
Consultant’s Roles and Responsibilities
The consultant shall conduct desk reviews of relevant project documents such as proposals, assessments, project budgets, monitoring and assessment reports, and WV guidelines. In addition to the desk review, the consultant shall prepare and submit an inception report with a detailed analysis plan that sets out the conceptual framework to be used in the evaluation, the key evaluation questions including the methodology to be used, work plans and schedules for both quantitative and qualitative aspects of the assignment for review and feedback and approval by WV. Finally, the consultant will prepare and submit the evaluation report to WV. The consultant shall carry out field visits to selected sites; conduct household surveys, interviews and/or focus group discussions and Key informant interviews with local partners, key stakeholders, households, and herder groups
World Vision’s Roles and Responsibilities
Estimated Final Evaluation Survey Schedule and Deliverables
A. Summary of Key Activities, Expected Delivery Date and Responsible for Delivery
A. Develop inception report detailing the evaluation survey work plan, analysis matrix, comments to the evaluation concept note/protocol, list of documentation, and formulation of evaluation survey data collection tools etc. – 15th March 2024 – Consultant
B. Review, approval, and validation of the inception report and evaluation survey data collection tools – 20th of March 2024 – World Vision
C. Coding of the finalized tools – 23rd of March 2024 – Consultant
D. Training of enumerators – 25th of March 2024 – Consultant
E. Field data collection – 30th of March 2024 – Consultant
F. Data cleaning and analysis – 3rd of April 2024 – Consultant
G. First draft of the final evaluation report – 13th of April 2024 – Consultant
H. Review of draft final evaluation report – 15th of April 2024 – World Vision
I. Finalize the evaluation report and dissemination – 20th of April 2024 – Consultant
J. Incorporation of feedback received – 25th of April 2024 – Consultant
K. Submit Final Evaluation Report to BHA – 30th of April 2024 – World Vision
6. LOGISTICS
7. EVALUATION REPORT STRUCTURE
Title and Opening Pages (front matter) – should provide the following basic information:
Table of Contents – including boxes, figures, tables, and annexes with page references.
List of acronyms and abbreviations
Executive Summary
A stand-alone section of two to three pages that should:
Introduction
This section will:
Description of the Intervention
This section will provide the basis for report users to understand the logic and assess the merits of the evaluation methodology and understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. In particular, the section will:
Evaluation Scope and Objectives
This section of the report will provide an explanation of the evaluation’s scope, primary objectives and main questions.
Evaluation Approach and Methods
This section will describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description will help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations. The description of methodology will include discussion of each of the following:
Findings and Conclusions
This section will present the evaluation findings based on the analysis and the conclusions drawn from the findings. In particular:
Findings: This section will present findings as statements of fact that are based on analysis of the data. The evaluation findings will be structured around the evaluation criteria and questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results will be explained, as well as factors affecting the achievement of intended results. The assumptions or risks in the project design that subsequently affected implementation will also be discussed.
Conclusions: This section will be comprehensive and balanced and highlight the strengths, weaknesses and outcomes of the intervention. The conclusion section will be substantiated by the evidence and logically connected to the evaluation findings. The conclusion will also respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making.
Recommendations: The evaluation will seek to provide very practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations will be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. This shall also address sustainability of the initiative and comment on the adequacy of the project exit strategy.
Lessons Learned
The report will include a discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (intervention, context, outcomes, or even the evaluation methods) that applies to a similar context. Concise lessons based on the specific evidence presented in the report will be included in the evaluation report.
Report Annexes
The Annex section will include the following to provide the report reader with supplemental background and methodological details that enhance the credibility of the report.
8. BIDS EVALUATION PROCESS AND REQUIREMENTS
The selection of the consultancy firm will be made based on cumulative analysis (i.e., mandatory requirements and technical qualifications), as follows:
NB: Bidders who fail to provide the mandatory requirements will not qualify for the next stage (Technical Evaluation).
II. Technical Evaluation
The consultant must have proven expertise and experience in social research, with a special focus on agriculture, agricultural economics, health and nutrition, and development studies, and in baseline studies, end-of-project evaluations, midterm evaluations, and impact assessments, and must be able to implement the final project evaluation in Somalia following the required procedures. Proof of this is to be provided by submitting the following together with the application:
Proposal Contents
Proposals from consultants should include the following information (at a minimum):
III. Financial Evaluation
Clarification of Bidding Document
A prospective bidder making an inquiry relating to the tender document may notify WVS in writing at somalia_procurement@wvi.org. WVS will only respond to requests for clarification received no later than 24th November 2023.
All interested bidders are requested to submit their proposals in English by email to somo_supplychain@wvi.org on or before 30th November 2023. Proposals should be submitted in three distinct/separate attachments, namely: Mandatory Requirements, Technical Proposal, and Financial Proposal. (Bidders who combine the technical and financial proposals shall be disqualified.)
EMAIL TITLE SHOULD BE: END OF PROJECT EVALUATION – MULTISECTOR EMERGENCY RESPONSE PROJECT
Bids received after the deadline shall not be considered.