Request for Proposals _ BHA Endline Evaluation

  • Salary:
    negotiable
  • Job type:
    consultancy
  • Category:
    Monitoring and Evaluation
  • Deadline:
    26/11/2024

WORLD VISION SOMALIA

END OF PROJECT EVALUATION _ BHA MULTISECTOR EMERGENCY RESPONSE PROJECT

Statement of Work

1.0 INTRODUCTION

1.1 Project Summary

Project name(s)

Multisector Emergency Response Project with primary focus on WASH, Health & Nutrition, Agriculture and Economic Recovery and Protection Support

Project goal

To restore the wellbeing of vulnerable communities affected by the aftermath of severe flooding, drought, and locusts, including IDPs and host communities in Somalia and Somaliland.

Project Purposes

Purpose 1: Disaster-affected communities have increased access to food through their own production and the protection of productive assets.

  1. BHA Sector Name – Food Assistance: WV will work to increase access to food for disaster-affected households through conditional assistance (cash-for-work activities) and unconditional assistance to food-insecure households.
  2. BHA Sector Name – Agriculture: Disaster-affected communities have increased access to food through improved agricultural practices, higher quality seed, and quality livestock.
  3. BHA Sector Name – Economic Recovery and Market Systems: Disaster-affected households have diversified and expanded livelihoods through village savings and loan associations.
  4. BHA Sector Name – Multipurpose Cash Assistance: Assist communities affected by emergencies with short-term, unconditional cash assistance aimed at meeting their immediate needs.

Purpose 2: The health and nutrition of the targeted community is improved through enhancing healthcare services and prevention and treatment of malnutrition.

  • BHA Sector Name – Health: Communities receive increased access to primary health care services, focusing on disease treatment and prevention, public health emergency response, mother and child health services, and health systems strengthening.
  • BHA Sector Name – Nutrition: Parents and caregivers receive the knowledge and care needed to treat their malnourished children through access to SAM and MAM care.

Purpose 3: Households and communities have improved access to clean water, sanitation, and critical WASH items, as well as increased adoption of positive hygiene and environmental health practices.

  • BHA Sector Name – Water, Sanitation, and Hygiene: WV proposes a holistic WASH approach to target the same communities with multiple interventions, working toward addressing immediate WASH needs, while also contributing to sustainable community WASH improvements.

Target beneficiaries

The number of people targeted is 556,454 (of whom 223,197 are internally displaced persons).

Project locations

a. Jubaland: Gedo Region – Luuq & Dollow districts

b. South West State:

Bay Region: Berdale, Qansadhere and Burhakaba districts

Bakool Region: Hudur & Wajid districts

c. Somaliland:

Waqooyi Galbeed: Gebilay and Berbera

Togdheer: Burao & Odweyne

Implementing Partners by Sector and Location

Partner Name, Sectors Covered and Geographic Locations

  1. ARD – Food Assistance and WASH – Berdale and Wajid
  2. CeRID – Food Assistance, WASH, ERMS and Agriculture – Luuq and Qansadhere
  3. MARDO – ERMS – Huddur
  4. SRDA – WASH and Agriculture – Burhakaba

Project Start Date: July 2021

Project End Date: April 2023

Evaluation Type: Summative Evaluation

Evaluation Purpose

World Vision Somalia will conduct an evaluation of the project to identify its impact and to determine whether the project achieved the intended outcomes with respect to the related indicators.

The evaluation should document evidence of impact and highlight and strengthen best practices emerging from the project. World Vision would also like the evaluation to assess the appropriateness of the approaches/project models used in the program, as well as the potential sustainability markers that have been achieved through the project’s implementation.

This study will serve both as an end-of-project (summative) evaluation, in which project outcome results will be compared against the desired targets to assess impact, and as an annual outcome review, whose findings will be used to establish outcome indicator targets for the cost extension.

Methodology

The evaluation study will adopt a mix of quantitative and qualitative techniques to compare project outcomes against the desired targets. It will also adopt population-based and participant-based surveys to collect data for the end-of-project evaluation and measure project performance. The methods are summarized below:

Quantitative:

  • Household surveys

Qualitative:

  • Focus group discussion (FGD), Key Informant Interviews (KII)
  • Document Reviews and Case studies

Evaluation Duration

15 March – 30 April 2024

Available Project Documents: Project proposal, baseline reports, M&E plan, ITT, Detailed Implementation Plan (DIP), semi-annual reports

2.0 PROJECT DESCRIPTION

World Vision and four local partners are implementing the Multisector Emergency Response Project (MERP) in Somalia and Somaliland with funding support from the United States Agency for International Development (USAID) through the Bureau for Humanitarian Assistance (BHA). The partners, African Relief and Development (ARD), Centre for Research and Integrated Development (CeRID), Mandher Relief and Development Organization (MARDO), and Somali Relief and Development Action (SRDA), work in difficult-to-access areas in South West State (Bakool, Bay) and Jubaland (Luuq district of Gedo region), while WV is a direct implementer in Somaliland (Woqooyi Galbeed and Togdheer regions) and Doolow district of Jubaland State. The action is also undertaken together with selected governmental line ministries (Ministry of Health and Ministry of Agriculture), local authorities, community-level partners, and additional relevant stakeholders across the project districts. The goal of the project is to restore the wellbeing of vulnerable communities affected by the aftermath of severe flooding, drought, and locusts, including IDPs and host communities in Somalia and Somaliland. The number of people targeted is 556,454. Of those, 223,197 are internally displaced persons affected by recurrent drought and insecurity.

WV proposes a performance evaluation that will measure the impact and effectiveness of the program in the target locations with the communities and the provincial authorities. WV also proposes a performance evaluation to ensure that the project met its intended goals and outcomes and how it can be strengthened in the future. The study will be carried out externally through a participatory approach involving partner organizations, line ministries, and beneficiary communities. The evaluation results will help the key stakeholders measure the level of project success concerning service delivery to the project beneficiaries. More importantly, this end of project evaluation will generate recommendations and lessons learnt for ongoing and upcoming World Vision projects.

2.1 Geographic Scope and Target Population

The project targets 11 districts, including Berbera and Gabiley districts of the Woqooyi Galbeed region; Burco and Odweyne districts of the Togdheer region; Wajid and Huddur districts of the Bakool region; Luuq and Doolow districts of the Gedo region; and Qansaxdhere and Berdale districts of the Bay region. The evaluation will take place in all the targeted districts. In addition, this survey will assess the progress made since inception and the overall achievement of project outcomes.

2.2 Context

This evaluation is to be done in relation to the context, taking into consideration the changes that took place during the project implementation period (COVID-19, political tension/elections in Somalia, droughts, flooding, and insecurity). The survey team will also analyze how these changes in context affected project implementation and impact, including the effectiveness of the adaptation efforts made by the implementing partners, communities, and other stakeholders.

2.3 Community Level Partners

The project end-line evaluation will involve all stakeholders who were involved in the project design and engaged in implementation, such as:

  • Community groups and committees involved in project implementation (WASH, H&N and Protection, FSL).
  • Implementing partners-African Relief and Development (ARD), Centre for Research and Integrated Development (CeRID), Mandher Relief and Development Organization (MARDO), and Somali Relief and Development Action (SRDA)
  • The relevant line ministries (Ministry of Health, Ministry of Gender and Social Affairs, Ministry of Water and Mineral Resources, Ministry of Agriculture and Livestock)
  • Local Administration offices in the project implementation districts

3. METHODOLOGY

This section provides a detailed description of the end-line evaluation process. It focuses on the evaluation design, data collection and analysis and reporting process.

3.1. Scope of the Evaluation

This summative evaluation will provide a detailed analysis of achievements against the anticipated objectives (baseline values) and measure the up-to-date performance of the project. It aims to assess the relevance, efficiency, effectiveness, and impact of the project’s interventions, as well as their prospects for future sustainability.

3.2 Evaluation Type

This summative evaluation is aimed at assessing the impact and effectiveness of the program in the target locations with the communities and the local authorities. WV also proposes a performance evaluation to verify that the project met its intended goal of restoring the wellbeing of vulnerable communities affected by severe flooding, drought, and locusts, including IDPs and host communities in Somalia and Somaliland, achieved its sectoral outcomes, and to identify how it can be strengthened in the future. World Vision Somalia will hire an external consultant through a competitive process to carry out a quality end-of-project evaluation.

3.3 Evaluation Purpose and Objectives

World Vision Somalia will conduct this evaluation of the project to identify the impact of the project and to assess the relevance of the project logic, and effectiveness of the project models to realize the project outcomes. It further aims to assess the effectiveness of the integration model of the BHA project. World Vision would also like the evaluation to assess the appropriateness of the approaches used in the program, as well as the potential sustainability markers that have been achieved through the project’s implementation. The end-line evaluation will also help to draw key lessons learnt and the best practices to the project stakeholders.

The objectives of the evaluation are to:

  1. Establish the extent to which the project achieved its outcomes among target populations (specifically vulnerable groups, including PLWs, children under 5, and women-headed households) and determine the impact or potential impact of the project;
  2. Determine how the project involved and benefited the community throughout the planning, design, implementation, and monitoring processes, and its appropriateness in addressing the causes of and responding to community needs;
  3. Document sustainability practices (mechanisms, plans, structures), identify key sustainability recommendations, lessons learned, good practices, and any innovative ways that have contributed to the attainment of the project objectives, which can be used to inform future programs.

3.4 Evaluation Questions

a) Relevance:

1. To what extent has the intervention appropriately assisted the affected population?
2. How has management adapted the project design or implementation based on monitoring information and feedback from the target population?
3. How equitably has the project benefited women, men, boys and girls, IDPs, the host community, and special interest groups such as persons with disabilities?

b) Effectiveness:

  1. To what extent have the activity’s interventions adhered to planned implementation – schedules, participant targeting, resource transfer composition/quantities, inputs and service delivery, and outputs – and achieved intended goals, purposes and outcomes?

c) Efficiency:

  1. Have project resources (inputs) resulted in expected results?

d) Sustainability:

  1. How likely is it that the benefits of the project will endure over time after its completion?
  2. What sustainability drivers did the project engage?
  3. How appropriate are these drivers for moving towards sustainability in the project?
  4. What other drivers or mechanisms could programs consider for future programming?

e) Impact:

  1. What changes – expected and unexpected, positive and negative- were experienced by the targeted beneficiaries and other stakeholders?
  2. What factors appear to facilitate or inhibit these changes and how?
  3. What have been the main themes of change resulting from the project intervention?

It is expected that the evaluation will provide lessons learned, areas of improvement, and recommendations for similar programs.

3.5 Study Design

The project will employ a mixed-methods design for the final evaluation. A quantitative survey will be used to produce indicator values and collect demographic information about the beneficiaries. A qualitative component using Key Informant Interviews (KIIs), Focus Group Discussions (FGDs), and observation will provide insights into the context and help to explain the reasons for the changes observed.

3.6 Data Collection Methods

3.6.1 Quantitative Data Collection

The quantitative Population-Based Survey will be administered to a probability sample of participants and households in the target areas. The sample size, sampling frame, and data collection tools will be designed so that the data can be statistically compared to test for differences. The questionnaire will be designed and administered using Kobo Toolbox/FieldTask.

3.6.2 Sampling Design

The evaluation will adopt a two-stage cluster sampling design in which the target population is first divided into clusters; in this case the clusters will comprise the villages/communities/groups in the target areas. These clusters are randomly selected from a list of all clusters using the probability proportional to size (PPS) method. In the second stage, a sample of households is randomly selected from each of the sampled clusters.
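As an illustration, first-stage selection of clusters with probability proportional to size (the conventional placement of PPS in a two-stage design) is often done with the systematic cumulative-total method. A minimal sketch, assuming a frame that lists each cluster with its household count; the village names and sizes below are hypothetical:

```python
import random

def pps_systematic(clusters, n_select, seed=None):
    """Systematic PPS selection: larger clusters are proportionally
    more likely to be drawn. `clusters` is a list of (name, size)."""
    rng = random.Random(seed)
    total = sum(size for _, size in clusters)   # total households in frame
    interval = total / n_select                 # sampling interval
    start = rng.uniform(0, interval)            # random start in [0, interval)
    points = [start + k * interval for k in range(n_select)]
    selected, cumulative, i = [], 0, 0
    for name, size in clusters:
        cumulative += size
        # a cluster is selected once for every sampling point it covers
        while i < len(points) and points[i] <= cumulative:
            selected.append(name)
            i += 1
    return selected

# Hypothetical sampling frame: (village, number of households)
frame = [("Village A", 120), ("Village B", 480),
         ("Village C", 250), ("Village D", 150)]
print(pps_systematic(frame, n_select=2, seed=1))
```

In the second stage, a fixed number of households would then be drawn at random within each selected cluster.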

3.6.3 Sampling Frame

To develop the sampling frames, the project will consider its targeting strategy and the different target populations for the various interventions, given that this is a multi-sector response. The survey design will require multiple sampling frames to organize the target groups that received a similar set of interventions. The sampling frame for the Population-Based Survey will be developed based on census data and community records to appropriately reflect the target population. The sampling frame for the Participant-Based Survey will be developed based on the participant register to appropriately reflect the target population. The sampling unit is the household, with a knowledgeable adult answering the questions. The sampling frame will include the following key elements collected at registration:

  • Unique household identification number
  • Household contact information (including name, physical location, primary and secondary phone number)
  • Household characteristics (gender composition, size, primary and secondary livelihood activities)
  • Intervention(s) to be received/or received
  • Participant target criteria met

3.6.4 Sample Size Calculation

To derive the sample size, the purpose of the evaluation (comparative), the key indicators of interest, and the sampling methodology (in this case, two-stage cluster sampling) will be considered. For the evaluation survey, the appropriate sample size for comparing indicator values at two points in time, at the start of the activity and after its completion, will be calculated using a two-stage cluster sample design. The sample size will be estimated for each key indicator and adjusted based on the proportion of households in the sub-set.

Using the Feed the Future Sample Size Calculator, WV calculated sample sizes for the key project indicators for comparing values at two points in time and found that the required sample size was largest for the Food Consumption Score (FCS). Additionally, these calculations meet the minimum numbers of respondents recommended by BHA for indicators expressed as a proportion: 388 under Simple Random Sampling or Systematic Random Sampling, and 776 under PPS.

Calculating the endline sample size for comparing indicators expressed as a proportion, using a two-stage cluster sample

INDICATOR: Percent of households with a poor FCS score (two-stage sampling):

  • P1 (estimated baseline value): 50% (0.5)
  • P2 (estimated endline value): 40% (0.4)
  • z(1−α) (95% significance level): 1.64
  • z(1−β) (80% power): 0.84
  • D (design effect): 2
  • n (initial sample size): 776
  • Non-response adjustment: 10%
  • n (final sample size): 854
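Figures like these follow the standard two-proportion sample-size formula, inflated by the design effect and a 10% non-response allowance. A minimal sketch, assuming the conventional z-values (1.64 for 95% significance, 0.84 for 80% power) and ceiling rounding:

```python
import math

def n_two_proportions(p1, p2, z_alpha=1.64, z_beta=0.84, deff=2.0):
    """Sample size per survey round for detecting a change from p1 to p2,
    inflated by the design effect for two-stage cluster sampling."""
    core = ((z_alpha + z_beta) ** 2
            * (p1 * (1 - p1) + p2 * (1 - p2))
            / (p1 - p2) ** 2)
    return math.ceil(deff * core)

def adjust_nonresponse(n, rate=0.10):
    """Inflate the initial sample size to allow for expected non-response."""
    return math.ceil(n * (1 + rate))

print(n_two_proportions(0.5, 0.4))  # raw formula result for these inputs
print(adjust_nonresponse(776))      # 776 + 10% non-response -> 854
```

Note that the SOW adopts an initial n of 776, the BHA-recommended minimum for PPS designs, which is larger than the raw formula result for these inputs; the 10% adjustment of 776 yields the final sample of 854.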

3.7 Qualitative Data Methods

Physical observations, Focus Group Discussions (FGDs) and Key Informant Interviews (KIIs) will be carried out with beneficiaries and stakeholders to understand the extent to which the project achieved the intended objectives and addressed community needs. The FGD guides will contain a checklist of questions generated from the main project objectives and activities. The survey team working together with project staff will select the participants of the FGDs based on the focus of the study. Mothers with children under five will be purposively included for the FGD.

The focus groups will target key stakeholders and beneficiaries, comprising FSL-targeted community members, pregnant and lactating women (PLW), CHWs, WASH committees, protection committees, facility staff, camp leaders, youth clubs, and children’s clubs, among others. Taking COVID-19 protocols into consideration, every FGD will have a standard 6-10 participants within the project areas. For purposes of plural investigation, the exercise will be conducted with a broad range of representation within the community to enable triangulation of findings and incorporate wide-ranging perspectives.

Key Informant Interviews (KIIs) will be conducted with a wide range of stakeholders including; WV Project staff, Government officials from the Ministry of Agriculture, Ministry of Livestock, Forestry and Rangelands, MOH, MoEWR & Min of Gender, Health facility in-charges and other implementing partners working within the project area.

Note: The survey team will also gather information on the impacts of COVID-19 as a cross-cutting theme by deliberately incorporating key questions related to the pandemic. These questions will be developed by the survey team.

3.8 Documents review

The key documents to be reviewed for the final evaluation are as follows:

  • Proposal narrative
  • Project design document (needs assessment and concept notes)
  • Project progress reports semi-annual
  • M&E plan, IPTT and PIRS
  • Project financial reports
  • PDM Reports
  • Third party monitoring reports
  • Beneficiary databases
  • Baseline Reports
  • BHA indicator handbook
  • Any district level secondary data and other relevant documents and reports.

3.9 Recruitment and Training of Enumerators

Enumerators with previous research experience will be recruited. About 30 enumerators (50% women) will be recruited for the quantitative household survey, while the qualitative data collection (key informant interviews, focus group discussions, and review of records) will be facilitated by the partner organization staff. The data collection team will undergo 3 days of comprehensive training on data collection techniques, mastery of the methodology, and the data collection tools before field data collection and pre-testing of the tools. During the training, the field team will also be briefed on the objectives of the evaluation, how to identify the appropriate respondents at various levels, and how to fill in the questionnaire appropriately. Emphasis will be put on research ethics, accuracy, and completeness, among others. Quantitative data collection is expected to take 5 days, with each enumerator administering 7 to 8 questionnaires per day.

3.10 Data Collection Tools

A blend of several data collection methods will be used, aimed at triangulating and authenticating the data collected, as indicated in the study design. This also helps overcome the intrinsic biases that arise from relying on a single method. The following key data collection methods and tools will be employed in the evaluation:

  • Household survey questionnaire
  • Document review checklist
  • Key informant interview and focus group discussion guides.

Household questionnaires will be developed, coded, and deployed on smartphones with the Global Positioning System (GPS) enabled to support geo-referencing of survey locations, as well as spatial analysis and visualization of the evaluation figures in the form of maps and graphs for easy interpretation. The mobile data collection tools will be deployed on the Open Data Kit (ODK) platform. ODK is a free and open-source set of tools that helps author and manage the mobile field data collection process.

3.11 Pre-test, Quality Assurance and Control

Quality assurance will be an integral component of the entire survey process. Data quality checks will be done at every stage: design of tools, pre-testing, sampling, data collection, and analysis. Besides extensive training and pre-testing of survey questions, quality assurance will, among other measures, entail appropriate preparation and orientation of the data collection team to ensure that they are sufficiently familiar with the survey processes and the tools used. It will also entail adequate supportive supervision by partner organization technical staff at every stage of the survey, with an emphasis on quality data collection. Data collection supervisors in each district will conduct daily de-briefings with the enumerators. Any errors found will be discussed with the enumerators and guidance provided before proceeding with further fieldwork the next day. This procedure will help identify mistakes in the recording of responses and rectify them promptly. Routine validation of data will be done on a daily basis by the database administrator, alongside the setting of constraints and skips in the ODK platform so that only relevant questions are filled in.

3.12 Data Analysis and Presentation

Upon completion of the data collection process, survey data will be downloaded from ODK Aggregate, and data processing and analysis will be undertaken by designing a tabulation guide for each questionnaire to establish the indicator values. In particular, descriptive statistics will be run in SPSS to examine the status of the different project indicators, with results disaggregated by gender and beneficiary category. Inferential statistics involving t-tests will also be run to assess the significance of changes in the performance of the different project indicators. Content analysis will be used to establish relationships in the qualitative data obtained from key informant interviews, focus group discussions, and unstructured interviews. In addition, a GIS solution using ArcGIS software will be applied to perform spatial analysis and visualization of key project indicators. The data will be presented in the form of graphs, tables, and maps for easy interpretation.
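For indicators expressed as a proportion, the baseline-versus-endline significance check described above can be sketched as a two-proportion z-test, a close stdlib-only relative of the tests run in SPSS. The numbers below are illustrative, not project data:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-sample z-test for the difference between two proportions
    (e.g. baseline vs endline share of households with poor FCS)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)        # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Illustrative: 50% poor FCS at baseline vs 40% at endline, n = 854 each round
z = two_proportion_z(0.50, 854, 0.40, 854)
print(round(z, 2))  # |z| > 1.96 indicates significance at the 5% level
```

With samples of this size, a 10-percentage-point change is comfortably detectable, which is consistent with the sample-size assumptions in section 3.6.4.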

3.13 Validation of Results

Data validation will be conducted with the involvement of all the stakeholders including BHA partner organizations, line ministries and community representatives. A validation workshop will be organized to present and discuss the evaluation findings to generate feedback and validate any issues identified for quality improvement. The team will use the feedback provided in the compilation of the final report that will be shared with all the stakeholders.

3.14 Data Quality Control

All data quality control measures will be adhered to during the survey, including review of the study tools, translation of the tools into local languages, standardization of the training (pre-testing and ensuring that the enumerators are familiar with local terminology), review of evidence against bond evidence parameters, use of the GPS functionality in FieldTask to geo-reference the data, regular supervision, and data cleaning. The administered tools will be checked regularly for correctness, completeness, and consistency. After entry, the data will be cross-checked to ensure the accuracy of the information obtained from the field, then compared and validated. During analysis, validation will be done by comparing the emerging information with secondary data to ensure that any outliers are addressed.

3.15 Ethical Considerations

The following ethical considerations will be adhered to during the survey process:

  • The rights and privacy of individuals
  • Voluntary nature of participation – and the rights of individuals to withdraw partially or completely from the process
  • Consent and possible deception of participants
  • Maintenance of the confidentiality of data provided by individuals or identifiable participants and their anonymity
  • Reactions of participants to the ways in which researchers seek to collect data.
  • Effects on participants of the way in which data is analyzed and reported.
  • Behavior and objectivity of the surveyor.

3.16 Limitations

Security in most of the targeted districts is volatile with the likelihood of disruptions that may restrict enumerators from accessing some field locations, hence affecting the quality of the evaluation. This will be mitigated by working together with our local line ministry staff right from the inception to the end, as this will empower them to adequately supervise the survey and update sectors on a daily basis.

4. DELIVERABLES

The final evaluation Report:

  • Findings should be disaggregated by Sectors and also provide the cumulative results by district and project.
  • Recommendations need to be supported by a specific set of findings.
  • Evaluation findings should be presented as analysed facts, evidence and data and not based on anecdotes, hearsay or the compilation of people’s opinions.
  • Findings should be specific, concise and supported by strong quantitative or qualitative evidence.
  • Include recommendations for improved future programming and implementation
  • No more than 40 pages (without annexes)
  • Includes photos and quotes from key stakeholders such as beneficiaries and project facilities during FGDs & KII
  • Including, but not limited to, sections on context, sampling, methodology, and findings (including a table showing end-results per indicator/activity)
  • Disaggregation of beneficiaries – Male/Female, IDPs/Host community levels at a minimum.
  • Annexes with the following information:
      • List of enumerators
      • All data collection tools and methodologies
      • Findings on the integration of gender, disability, resilience, child protection, COVID-19 and engagement with local administration and religious leaders

5. RESPONSIBILITY

The evaluation will be conducted externally by a consultant in collaboration with the World Vision M&E team. World Vision will hire a qualified external evaluation consultant through a competitive process to conduct a quality end-of-project evaluation. The external evaluators are not involved in the implementation process, which ensures that the evaluation is fully impartial and independent in study design, field data collection, supervision, and report writing. Impartiality contributes to the trustworthiness of the evaluation and prevents bias in the assessment design, data analysis, report writing, and the drawing of conclusions and actionable recommendations. In addition, independence gives validity to the evaluation process and reduces the conflict of interest that might arise when project implementers are asked to evaluate and rate the performance of their own projects.

However, the BHA M&E team will provide the necessary support to the evaluator. The BHA Monitoring and Evaluation Manager will be responsible for the overall coordination of all the evaluation tasks. In addition, the Program Manager, the DQA Manager, the World Vision Somaliland Regional Operations Managers, and the Design Monitoring and Evaluation Manager will provide the necessary technical and operational support required throughout the evaluation process. The M&E officers and M&E coordinators will be tasked with supervising the enumerators and ensuring that good-quality data is collected from the field.

Roles and Responsibilities

Consultant’s Roles and Responsibilities

The consultant shall conduct a desk review of relevant project documents such as proposals, assessments, project budgets, monitoring and assessment reports, and WV guidelines. In addition to the desk review, the consultant shall prepare and submit an inception report with a detailed analysis plan that sets out the conceptual framework to be used in the evaluation and the key evaluation questions, including the methodology to be used and work plans and schedules for both the quantitative and qualitative aspects of the assignment, for review, feedback, and approval by WV. The consultant shall carry out field visits to selected sites and conduct household surveys, focus group discussions, and key informant interviews with local partners, key stakeholders, households, and herder groups. Finally, the consultant will prepare and submit the evaluation report to WV.

  • Conduct document review
  • Hire and train the data collection team, if necessary
  • The consultant shall prepare materials for training and train enumerators and shall design and pre-test data collection tools
  • Lead and supervise the data collection and achieve quality
  • Interview randomly selected respondents during the evaluation.
  • Conduct entry and exit meetings (debriefing) with WV staff and key stakeholders
  • Submit a draft evaluation report and finalize it based on the feedback from World Vision Somalia.
  • Submit the Final Evaluation report to WV

World Vision’s Roles and Responsibilities

  • Set-up an evaluation team
  • Conduct briefing and inception report review
  • Mobilize participants/stakeholders for project entry and exit meetings
  • Review and approve the evaluation tools and methodology
  • Provide all the necessary support to the external consultant to ensure timely completion and compliance with the international survey standards
  • Provide all the required technology, security or requested facilitation and coordination required to achieve objectives of the evaluation
  • Assist in organizing meetings with stakeholders sampled to participate in the evaluation
  • Help in recruiting data collection team if requested by the consultant
  • Process the payment for the consultant upon completion of the assignment.

Estimated Final Evaluation Survey Schedule and Deliverables

A. Summary of Key Activities, Expected Delivery Date and Responsible for Delivery

A. Develop inception report detailing the evaluation survey work plan, analysis matrix, comments to the evaluation concept note/protocol, list of documentation, and formulation of evaluation survey data collection tools etc. – 15th March 2024 – Consultant

B. Review, approval, and validation of the inception report and evaluation survey data collection tools – 20th of March 2024 – World Vision

C**.** Coding of the finalized tools – 23rd of March 2024 – Consultant

D. Training of enumerators – 25th of March 2024 – Consultant

E. Field data collection – 30th of March 2024 – Consultant

F. Data cleaning and analysis – 03rd of April 2024 – Consultant

G. First draft of the final evaluation report – 13th of April 2024 – Consultant

H. Review of draft final evaluation report – 15th of April 2024 – World Vision

I. Finalize the evaluation report and dissemination – 20th of April 2024 – Consultant

J. Incorporation of feedback received – 25th of April 2024 – Consultant

K.Submit Final Evaluation Report to BHA – 30th April 2024 – World Vision

6. LOGISTICS

  • BHA Project Management will fund, onboard, and sign agreements with the enumerators for the assignment (covering costs related to enumerators’ training and incentives during data collection)
  • The evaluator will train the enumerators on the data collection tools and supervise them during data collection
  • The BHA Project Management team will hire vehicles for data collection
  • Ensure printing/copying materials are readily available for the survey team
  • World Vision Somalia will provide logistics and accommodation for the consultants

7. EVALUATION REPORT STRUCTURE

Title and Opening Pages (front matter) should provide the following basic information:

  1. Name of the project evaluated
  2. Time frame of the evaluation and date of the report
  3. Project location (districts, regions, states, and country)
  4. USAID logo as well as those of partner organizations
  5. Acknowledgments

Table of Contents, including boxes, figures, tables, and annexes with page references.

List of acronyms and abbreviations

Executive Summary

A stand-alone section of two to three pages that should:

  • Briefly describe the intervention (the project(s)) that was evaluated.
  • Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses.
  • Describe key aspects of the evaluation approach and methods.
  • Summarize the key findings, conclusions, and recommendations.

Introduction

This section will:

  • Provide a brief explanation of why the evaluation was conducted, why the intervention is being evaluated at this point in time, and why it addressed the questions it did.
  • Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why and how they are expected to use the evaluation results.
  • Identify the intervention (the project(s)) that was evaluated.
  • Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the evaluation and satisfy the information needs of the report’s intended users.

Description of the Intervention

This section will provide the basis for report users to understand the logic and assess the merits of the evaluation methodology and understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. In particular, the section will:

  • Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address.
  • Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy.
  • Link the intervention to the durable solutions framework
  • Identify any significant changes (plans, strategies, logical frameworks) that have occurred over time and explain the implications of those changes for the evaluation
  • Identify and describe the key partners involved in the implementation and their roles.
  • Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component.
  • Indicate the total resources, including human resources and budgets.
  • Describe the context of the social, political, economic, and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes.
  • Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

Evaluation Scope and Objectives

This section of the report will provide an explanation of the evaluation’s scope, primary objectives and main questions.

  • Evaluation scope – define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs, or outcomes were and were not assessed.
  • Evaluation criteria – define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the evaluation.
  • Evaluation questions – the evaluation questions define the information that the evaluation will generate. The report will detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.
  • Evaluation objectives – spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions.

Evaluation Approach and Methods

This section will describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description will help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations. The description of methodology will include discussion of each of the following:

  • Data sources – sources of information (documents reviewed and stakeholders), the rationale for their selection, and how the information obtained addressed the evaluation questions.
  • Sample and sampling frame – the sample size and characteristics; the sample selection criteria; the process for selecting the sample (e.g., random, purposive); and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results.
  • Data collection procedures and instruments – methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source, and evidence of their reliability and validity.
  • Performance standards – the standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales).
  • Stakeholder engagement – stakeholders’ engagement in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results.
  • Major limitations of the methodology – major limitations of the methodology shall be identified and openly discussed as to their implications for the evaluation, as well as the steps taken to mitigate them.
  • Data analysis – procedures used to analyse the data collected to answer the evaluation questions. This will detail the various steps and stages of analysis to be carried out, including the steps taken to confirm the accuracy of the data and the results. The report will discuss the appropriateness of the analysis to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings are interpreted and conclusions drawn.
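To illustrate the sample-size reasoning the methodology section is expected to document, the sketch below applies Cochran’s formula with a finite-population correction. This is a hypothetical example only: the confidence level, expected proportion, margin of error, and population figure are assumptions for illustration, not values specified in this SOW, and the consultant’s inception report would justify its own parameters.

```python
import math

def cochran_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Cochran's formula for an effectively infinite population.

    z: z-score for the desired confidence level (1.96 ~ 95%)
    p: expected proportion (0.5 is the most conservative choice)
    e: desired margin of error
    """
    return math.ceil((z ** 2) * p * (1 - p) / (e ** 2))

def finite_population_correction(n0: int, population: int) -> int:
    """Adjust the infinite-population sample size for a finite population."""
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Illustrative values: 95% confidence, +/-5% margin, 2,000 target households (assumed)
n0 = cochran_sample_size()                          # -> 385
n = finite_population_correction(n0, population=2000)  # -> 323
print(n0, n)
```

In practice the inception report would also state any design effect for cluster sampling and a non-response inflation factor, both of which increase the figure above.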

Findings and Conclusions

This section will present the evaluation findings based on the analysis and conclusions drawn from the findings. In particular,

Findings: This section will present findings as statements of fact that are based on analysis of the data. The evaluation findings will be structured around the evaluation criteria and questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results will be explained, as well as factors affecting the achievement of intended results. The assumptions or risks in the project design that subsequently affected implementation will also be discussed.

Conclusions: This section will be comprehensive and balanced and highlight the strengths, weaknesses and outcomes of the intervention. The conclusion section will be substantiated by the evidence and logically connected to the evaluation findings. The conclusion will also respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making.

Recommendations: The evaluation will seek to provide very practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations will be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. This shall also address sustainability of the initiative and comment on the adequacy of the project exit strategy.

Lessons Learned

The report will include a discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (intervention, context, outcomes, or even the evaluation methods) that applies to similar contexts. Concise lessons based on the specific evidence presented in the report will be included in the evaluation report.

Report Annexes

The Annex section will include the following to provide the report reader with supplemental background and methodological details that enhance the credibility of the report:

  • ToR for the evaluation
  • Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.) as appropriate
  • List of individuals or groups interviewed or consulted and sites visited
  • List of supporting documents reviewed
  • Project results map or results framework
  • Updated Indicator value table
  • Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators.

8. BIDS EVALUATION PROCESS AND REQUIREMENTS

The selection of the consultancy firm will be made based on cumulative analysis (i.e., mandatory requirements and technical qualifications), as follows:

  1. Mandatory Evaluation Requirements
    1. Provide a certified copy of the business registration (company/organization)
    2. Provide a certified copy of the tax registration
    3. Provide information on the ownership structure – the names of the directors/owners of the company (company/organization)
    4. Provide references including names and contact information from previous clients who can be contacted regarding relevant experience (At least three similar assignments in a similar context)
    5. Successful bidders will be required to sign the World Vision Supplier Code of Conduct form

NB: Bidders who fail to provide the mandatory requirements will not qualify for the next stage (Technical Evaluation)

2. Technical Evaluation

Qualifications

The consultant must have proven expertise and experience in social research with a special focus on agriculture, agricultural economics, health and nutrition, development studies, baselines, end-of-project evaluations, midterm evaluations, and impact assessments, and must be able to implement the final project evaluation in Somalia following the required procedures. Proof of this is to be provided by submitting, together with the application:

  • An overview of relevant works
  • Working samples
  • Contact details for references
  • The proposed consultant’s/research team’s CVs

Requirements in detail:

  • In-depth knowledge of Somalia and its regions, including government and community-level service delivery structures, and an understanding of the local context and the political and security environment. Consultants with extensive working experience or studies conducted in the Horn of Africa, especially Somalia/Somaliland, will have an added advantage. CVs, assignments, and contracts will be reviewed to check consultants’ suitability for the end-of-project evaluation.
  • The lead consultant should have a Master’s or Ph.D. degree in Agriculture, Agricultural Economics, Health and Nutrition, Development Studies, Peace and Security, Monitoring and Evaluation, or International Studies, and 10–15 years of consultancy experience. Interested consultants and firms must submit reports and previous contracts to ascertain their suitability for the evaluation at hand.
  • Technically sound experience in end-of-project evaluations, baselines, and studies in the Horn of Africa or Somalia. Consultants are expected to submit reports of past end-of-project evaluations, baselines, and assessments along with their application package.
  • Extensive experience in multiple sectors, including emergency programming, food security, livelihoods, cash transfer programs, and health and nutrition. Consultants should be able to demonstrate a good track record of evaluating cash transfer, WASH, and health and nutrition programs in their technical proposals. Relevant reports and contracts proving the consultant’s experience in the specified thematic areas should be submitted along with the application.
  • Strong written, communication, and interpersonal skills in English, with substantial experience in training and managing multicultural teams.
  • Proven skills in research, monitoring, and evaluation
  • Proven experience in conducting qualitative, quantitative, and mixed-methods evaluation studies. Consultants will be required to submit evaluation, assessment, and baseline reports with strong quantitative and qualitative methodologies.
  • Computer proficiency with excellent MS Office knowledge (Word, Excel, PowerPoint), SPSS, and STATA. Submitted reports by consultants will be reviewed to check if the consultants have excellent knowledge of data analysis software.
  • Experience in undertaking baselines/evaluations in any Horn of Africa country, especially Somalia and Somaliland, will be an added advantage. Consultants are required to submit evaluation and baseline reports done in Somalia or the Horn of Africa region.
  • Excellent analytical and report-writing skills.
  • Excellent written and spoken English.
  • Excellent time management skills.
  • Ability to work well both independently and in a team.
  • Excellent conflict-sensitive approaches and ability to work in highly sensitive environments

Proposal Contents

Proposals from consultants should include the following information (at a minimum):

  • Technical Proposal with clear methodology, including types of Monitoring & Verification tools and analysis
  • CVs of key consultant(s) attached to the technical proposal
  • Proposed timeline/Work plan
  • At least 3 References including names and contact information (at least three similar assignments in a similar context are also required)

3. Financial Evaluation

  • A financial proposal with a detailed breakdown of costs (which shall include professional fees and operational budget) quoted in USD. The applicable tax amount must be clearly stipulated and separated from the base costs.
  • Payment Terms
  • Credit Period

Clarification of Bidding Document

A prospective bidder making an inquiry relating to the tender document may notify WVS in writing at somalia_procurement@wvi.org. WVS will only respond to requests for clarification received no later than 24th November 2023.

How to apply

All interested bidders are requested to submit their proposals in English by email to somo_supplychain@wvi.org on or before 30th November 2023. Proposals should be submitted as three distinct/separate attachments, namely: Mandatory Requirements, Technical Proposal, and Financial Proposal. (Bidders who combine the technical and financial proposals shall be disqualified.)

EMAIL TITLE SHOULD BE: – END OF PROJECT EVALUATION- MULTISECTOR EMERGENCY RESPONSE PROJECT

Bids received after deadline shall not be considered.