USAID-BHA Performance Evaluation - Consultant

International Rescue Committee

Performance Evaluation: Empower and Inform Project

SCOPE OF WORK

Organizational and program background

The BHA-funded Empower and Inform project aims to foster accountable, client-centric, and quality humanitarian assistance by empowering crisis-affected people with information to make informed decisions in times of crisis and enabling their participation in shaping the humanitarian assistance they receive. The project is implemented through two pillars, described below: Responsive Information in Emergencies (RISE) and Empower to Enable (E2E).

Under RISE, the project focuses on addressing critical gaps in responsive information and its alignment with communication and community engagement strategies. It is developing a harmonized toolkit that brings together best practices and operational resources for responsive online and offline information services with RCCE for application in emergency settings, while also developing and piloting approaches to meaningful inclusion in responsive information and to facilitating leadership by community networks and actors. The Toolkit was developed following a formative learning exercise and is being piloted in partnership with the IRC Nigeria and Myanmar country programs and the RISE Advisory Group and Technical Team. The Toolkit will undergo a revision and finalization process based on the feedback received during the pilot. The purpose is to ensure the toolkit's relevance to frontline staff from diverse organizations and its effectiveness in supporting frontline staff to design and deliver context-appropriate responsive information services to people affected by crisis.

Under Empower to Enable (E2E), the project focuses on the empowerment of frontline staff to enhance client participation through the E2E Toolkit, a set of tools developed in consultation with the IRC Iraq and Democratic Republic of Congo (DRC) country programs and the E2E Project Advisory Committee. The toolkit comprises eight tools that address the enablers, barriers, and strategic changes outlined in the E2E Learning Report. The E2E toolkit was tested during a five-month pilot program by IRC Iraq and DRC and select local partners, and will undergo a revision and finalization process based on the feedback received during the pilot. This will ensure the toolkit's relevance to diverse organizations and its effectiveness in empowering frontline staff and facilitating meaningful client participation.

Objectives of the Evaluation

The main purpose of the evaluation is to assess the effectiveness and relevance of the Empower & Inform project, including the perceived value, quality, and appropriateness of the newly developed RISE and E2E project toolkits. The evaluation will use a mixed methods approach – including triangulation with existing project monitoring and pilot data – to identify successes, challenges, and lessons related to these lines of inquiry, along with related conclusions and recommendations. Evaluation results will be used to inform IRC and BHA's approaches to the design, development, and implementation of the RISE, E2E, and similar toolkits in the future.

Specifically, the evaluation should be able to:

  • Assess the extent to which the Empower & Inform project was effective in meeting its intended activities and outcomes;
  • Assess the relevance and perceived value of the RISE and E2E toolkits to frontline staff;
  • Identify best practices and lessons learned through toolkit development/implementation and provide concrete recommendations;
  • Assess and identify critical internal and external factors that have contributed to, affected, or impeded the project’s achievements and how IRC has managed these factors.

Evaluation Questions

The evaluation should explore the program's relevance and effectiveness, including successes, challenges, and lessons learned. At minimum, the following evaluation questions should be answered:

  • To what extent was the Empower & Inform project effective in meeting its intended activities and targets?
      • To what extent were project activities completed as proposed?
      • To what extent were project targets (including outputs and outcomes) met?
  • To what extent were the RISE and E2E toolkits relevant and valuable for frontline staff?
      • In what ways did the RISE toolkit strengthen the capacity of frontline staff to deliver inclusive and responsive information services?
      • In what ways did the E2E Toolkit empower frontline staff to enhance client participation?
      • How appropriate and relevant are the current RISE and E2E toolkits to IRC country program implementation contexts?
  • What lessons and best practices were learned through the RISE and E2E toolkit development and implementation process?
      • What are the biggest successes or contributions of the project/these toolkits?
      • What were the biggest challenges or gaps related to the project/these toolkits?
      • What recommendations do stakeholders have for designing and structuring similar projects to be more effective in the future?
  • What internal and external factors critically contributed to, affected, or impeded the project’s achievements?
      • How did the IRC manage these factors?

Intended Users

The intended users for the evaluation will be:

  • The Empower & Inform program team and advisory committee(s)
  • All interested parties in IRC, including, for RISE, Country Programs, the Senior Leadership Group, the Technical Team, Violence Prevention and Response Unit (VPRU) Technical Advisors, and the Emergencies and Humanitarian Action Unit (EHAU); and, for E2E, Country Programs and the Partnerships and Measurement Units.
  • Representatives of BHA as the donor that has funded the project.

The IRC will also share key evaluation findings through relevant inter-agency IRC Signpost partners; through blog posts and/or articles on public platforms where relevant; and with the frontline staff, teams, and partners involved in the piloting of the toolkits.

SCOPE OF CONSULTANCY

The Consultant will design an appropriate evaluation methodology based on their understanding of the expectations of the terms of reference and evaluation questions, as well as initial consultations with the project team. The Consultant should propose and adopt methodologies that combine qualitative and quantitative techniques, including at minimum (i) a desk review of existing project materials and resources, (ii) key informant interviews with key project stakeholders and/or clients, and (iii) triangulation with existing project monitoring and pilot data.

The Consultant is expected to propose their methods for the evaluation, which should include but not be limited to:

  • Drafting data collection tools, with a plan for feedback on tools to be used in implementation contexts;
  • Developing the data collection methodology, including sampling;
  • Developing the quantitative and qualitative data analysis plan;
  • Plan for validation of findings with key stakeholders;
  • Production of the evaluation report using USAID evaluation format.

The Consultant should submit a detailed evaluation inception report (described below) that includes data collection instruments, clear roles and responsibilities, timeframe, data analysis process, validation approach, and report writing, as well as an evaluation matrix. All data collected must be disaggregated as relevant and feasible according to the USAID requirements, including but not limited to by sex, age, disability, and location.

Evaluation Roles and Responsibilities

IRC will:

  • Facilitate engagement with IRC country program colleagues, clients, and/or other relevant stakeholders, as required
  • Cover data collection costs such as enumerators’ allowances, vehicle rental, airtime for enumerators, and enumerators’ training costs, as required
  • Provide all necessary program documents and relevant monitoring data
  • Review the Consultant's proposal, tools and evaluation report

The Consultant will:

  • Be responsible for all aspects of the entire evaluation process, including evaluation preparation, data collection, analysis, and report writing.
  • Be responsible for paying any tax or other fees related to this assignment.
  • Be responsible for their working tools such as computer and data analysis software.

Reporting Arrangements

The Consultant will report to the RISE MEAL focal point, in close collaboration with the Evaluation Committee, comprising the Response Information Specialist and Senior Technical Advisor and the Client Responsiveness Project Manager and Specialist. The Consultant will also collaborate with the respective advisory committees and IRC country program colleagues, as appropriate.

Duration of assignment

The consultancy should not last more than 40 working days, excluding weekends.

DELIVERABLES

  • Inception Report: The Consultant shall be expected to produce an inception report upon commencement of the assignment. The inception report will detail the agreed methodologies to be employed during the evaluation. The report should also include the finalized activity plan and a structural outline of the final evaluation report, and should be shared with and approved by IRC before data collection and analysis begin. The inception report should also contain:
      • A detailed methodology for the evaluation implementation, including sampling;
      • The indicators and monitoring data that fall within the scope of the program review;
      • Draft data collection tools (qualitative and quantitative);
      • A workplan that sets out the preparatory activities, specific deliverables, and timeline related to the program review, along with a budget for the data collection and analysis activities.
  • Facilitated Validation Session: The Consultant will facilitate a validation workshop preceding the final report, where the evaluation’s preliminary findings, conclusions, and recommendations will be presented to the IRC project team. The Consultant will incorporate comments and feedback from the validation workshop into the final draft of the evaluation report.
  • Evaluation Report: The report should address the above consultancy objectives and contain an executive summary, acknowledgments, an introduction including the program summary and purpose of the evaluation, a detailed methodology (including limitations), key findings (covering both document review and primary research), lessons learned, evidence-based recommendations, conclusions, and annexes. Annexes should include, at a minimum: data collection tools, the evaluation dataset, any additional reference materials, and a list of key informants interviewed. A soft copy of the report should be shared with the Evaluation Committee and MEAL focal point, and the report should be no more than 30 pages, excluding the cover page and annexes.
  • Summary Report: A four-page report summarizing the evaluation purpose and background, evaluation questions, findings, lessons learned, conclusions, and recommendations.
  • Presentation: A PowerPoint presentation outlining the evaluation process, key findings, lessons learned, and key recommendations.

The deliverables above will be accompanied by regular communication and feedback from IRC. The report should be shared with the Evaluation Committee and MEAL focal point.
