Wednesday, April 08, 2015

Terms of Reference for Final Evaluation/End line Survey of the Integrated Recovery Program (IRP) in Pakistan - 2015





  1. Purpose:

****Time Frames****: All work must be completed and deliverables submitted by 31st July 2015.


****Primary Location****: Pakistan (District Swat and District Battagram)


****Providing services for****: Asia, International Development, International Operations


****Reporting to****: Research Management Committee


  2. Background Information

The Pakistan IRP is the recovery program following the 2010 mega-floods and focuses on four thematic areas: Health, Organizational Development (OD), Disaster Management (DM) and a Violence Prevention (VP) pilot. The program started in 2011 with a slow start-up of activities; after the signing of the Program Cooperation Agreement in 2012, implementation picked up and the project was launched in July 2012. While the IRP program in Pakistan will be closing, the Canadian Red Cross (CRC) Pakistan delegation will remain in place and continue supporting the Pakistan Red Crescent Society (PRCS) in a program aimed at strengthening the capacity of the PRCS National Headquarters (NHQ) and the Gilgit-Baltistan Provincial Headquarters (PHQ).


The IRP for Pakistan is designed to support the IFRC/PRCS Plan of Action 2011-12 and aims to provide support to PRCS branches in districts Swat and Battagram of KPK.


Drawing from the PRCS 2015 Strategy, the overall outcome for the 3-year PRCS-CRC program is:


‘To reduce mortality, morbidity, injuries and psychological and physical impacts from diseases, disasters and public health emergencies, including a pilot program in violence prevention in Swat, with priority given to women and children in Swat and Battagram Districts.’


The program has four major components:


The first is the Healthcare component, which encompasses curative and preventive health services. These services centre on Maternal, Newborn and Child Health (MNCH), including antenatal care, post-natal care and vaccinations, as well as the provision of Community-Based Health and First Aid (CBHFA) training to the community. This component further aims to support PRCS's Harmonized Action Plan 2011-12 and to strengthen the planning, monitoring, supervision and evaluation systems of PRCS health programming through its support for the review of the PRCS health policy and the development of an MNCH training manual and reporting format. It is intended to strengthen the capacity of district branches, local health systems and communities, along with their ability to work together to reduce public health risks and restore vulnerable communities' access to basic health care.


The second component of the program is the Disaster Management/DM/DRR component intended to develop the capacities of targeted district branches in disaster response and to undertake disaster risk reduction activities among vulnerable communities, reaching 10,500 families through integrated DM, Health and OD programming.


The third component of the program is Organisational Development/Capacity Building (OD/CB) which has a two-tiered approach:


  1. Directly support PRCS targeted branches in Battagram and Swat to build/strengthen their delivery capacity through integrated DM, Health & OD programming.

  2. Help to build and improve PRCS-NHQ coordination with its Provincial/ Branch offices and its volunteer management system as these are among the key challenges of the National Society.

The final components of the program are the Violence Prevention and Dengue Prevention pilot projects in district Swat, implemented through the health component of the project. These are community-based programs focusing on the prevention of violence against children in the district, and on dengue awareness and prevention in response to a dengue outbreak in Swat.


It should be noted that PRCS is the lead organization in all aspects of the program including: design, implementation, monitoring, evaluation and decision-making. CRC's role, as defined in the Project Cooperation Agreement (PCA), is that of technical and financial support to PRCS at the NHQ, PHQ, and district levels.


  3. Objectives

The Integrated Recovery Program (IRP) in Pakistan was implemented from April 2012 and completed on 31st March 2015. The evaluation/end-line survey of this program will be carried out to meet the learning requirements of the program staff and partners in Pakistan and Canada, as well as the accountability requirements of all key stakeholders (e.g., the Canadian Red Cross (CRC), the Pakistan Red Crescent Society (PRCS), and the MoH).


  4. Deliverables, Activities, Timelines

****4.1 Audience and use of findings****


Users: Management of CRC and PRCS in Pakistan and respective Headquarters; sectoral technical staff of PRCS and CRC in Pakistan and respective Headquarters.

Purpose/Intended Use:

Accountability:

  • To understand the achievements of the projects' outcomes and impact, and the reasons for the degree of achievement/results, in order to improve performance and accountability.

Learning:

  • To understand remaining unmet gaps in the community in order to guide the strategic direction of future similar PRCS programming.

  • To understand the strengths and weaknesses of the project management practices in order to continuously improve practices in future programming developed by the PRCS.

  • To understand the sustainability of the projects' outcomes at the community level and among the implementing partners. Lessons learned will inform future planning and, where possible, be incorporated to sustain results at all levels.

  • To learn about the steps and conditions for effective gender-sensitive health programming in Pakistan for replication in future programs.

Users: Evaluation and Knowledge Management Manager of CRC.

Purpose/Intended Use:

Learning:

  • This final evaluation/end-line will contribute to the sectoral meta-review as well as the periodic meta-review of all evaluations of CRC.


****4.2 Scope of this evaluation****

Scope of sector and time-frame of intervention: Maternal, Neonatal and Child Health (MNCH), as well as Violence Prevention (VP).


Scope of evaluation criteria: relevance, effectiveness, sustainability, impact, and gender.


Scope on stakeholders: staff and volunteers of implementing partners, CRC staff in Ottawa and in-country, government partners and local authorities (e.g., MoH).


Scope of program sites: sampling frame is population in all sites (for the end-line survey, see the survey protocol in the Annex 1).


****4.3 Evaluation Criteria and Method****

****Evaluation Criteria and Questions**** (with priority, respondents and data collection method)

****Relevance****

  • What are the priority health concerns of the targeted communities that are not addressed by the project? Method: qualitative evaluation. Respondents: end-beneficiaries, informal and formal community leaders, health staff personnel.

  • What is the level of satisfaction of the end-beneficiaries and implementing partners with the project personnel's responses to their inputs? Method: qualitative evaluation. Respondents: end-beneficiaries, staff or volunteers of implementing partners (e.g., MoH), project personnel.

  • Have the end-beneficiaries and implementing partners received adequate information about the project activities? Method: qualitative evaluation. Respondents: end-beneficiaries, staff or volunteers of implementing partners (e.g., MoH), project personnel.

****Effectiveness****

  • Are there significant differences in the status of indicators of immediate and intermediate outcomes at the community level, between baseline and end-line, for the MNCH and VP intervention and control groups? Method: end-line survey. Respondents: end-beneficiaries.

  • To what extent have the projects' activities contributed to the achievement of expected immediate and intermediate outcomes at the community level (changes in knowledge and behaviours)? Method: end-line survey (inferential statistical analysis of the dose-response questions in the end-line survey). Respondents: end-beneficiaries. Examples of dose-response indicators are the number of project activities participated in by respondents, and the number of meetings with staff or volunteers to talk about health issues (questions of this type will be included in the questionnaire).

  • What factors have influenced stronger achievement of outcomes in communities with strong and weak outcomes? Possible factors to explore: intensity of sectoral intervention, community mobilization practices, community characteristics (e.g., social capital, leadership, initial capacity). This is a priority, and data gathering should focus on end-beneficiaries. Method: mixed methods; qualitative data collection with end-beneficiaries to understand the pattern of findings of the quantitative survey.

  • From the implementing agencies' perspectives, what external and internal factors have hindered or facilitated achievement of the projects' outcomes? Possible factors to explore: project management practices, pertinent business support systems for the project (e.g., HR, procurement), etc. This is a priority, and data gathering should focus on PRCS and MoH staff at different levels. Method: qualitative evaluation. Respondents: end-beneficiaries, informal and formal community leaders, health staff personnel.

****Impact****

  • What are the unintended positive and negative impacts of the projects (e.g., openness to new ideas in the targeted communities)? Method: qualitative evaluation. Respondents: end-beneficiaries, informal and formal community leaders, project staff and volunteers.

****Sustainability at the community level****

  • Which outcomes (e.g., utilization of health services, health prevention practices) and activities (e.g., community committees) would the communities like to maintain, and do they have the resources (e.g., knowledge, skills and finances) to do so? This is a priority, and data gathering should focus on end-beneficiaries. Method: qualitative evaluation. Respondents: end-beneficiaries, informal and formal community leaders.

****Sustainability by implementing partners (PRCS, MoH)****

  • To what extent have the projects resulted in capacity building of the staff and volunteers of the implementing partners that can be replicated in a second phase or other projects? This is a priority, and data gathering should focus on partners' staff and volunteers. Method: qualitative evaluation. Respondents: staff and volunteers of the implementing partners.

****Gender****

  • What are the intended and unintended results of the implementation of the project's gender mainstreaming activities (e.g., provision of female doctors, sessions with women teachers, etc.)?

  • What are the facilitating and hindering factors to mainstreaming gender issues in this project? Method: qualitative evaluation. Respondents: end-beneficiaries, informal and formal community leaders, project staff or volunteers (to be confirmed).


4.4 Consultancy overview: End line Survey


4.4.1 Survey objective(s)


The objective of the end-line survey is to understand the changes resulting from the projects by comparing baseline and end-line data of the project.


4.4.2 Sampling frame(s) and coverage


The sampling design, sample size and final selection of household methodology will replicate the baseline survey, which has intervention and control groups.


4.4.3 Survey questionnaire(s)


The baseline survey questionnaire will be used. Additional questions to be included are:


  1. Questions for contribution/attribution analysis, which are based on two conceptual frameworks:

  a. Dose-response analysis: the idea behind the dose-response option is that causal inference is strengthened if the outcome condition improves as an increasing function of the amount of program participation (http://betterevaluation.org/evaluation-options/checking_dose_response_pa…). Examples of indicators are:

  • Number of project activities participated in by respondents in the past XX months. Question: Over the past XX months, have you participated in the following activities? (Provide a list of main project activities.)

  • Frequency of meetings with the project staff to talk about issues related to the project. Question: Over the past XX months, how many times have you met with XX staff or volunteers to discuss XX?

  b. End-beneficiaries' perception of the project's contribution. Examples of indicators are:

  • % of respondents who attribute positive changes in XX (e.g., health care utilization) to the project. Question: In your view (or compared to other organizations), to what extent has the Red Crescent program played a role in improving health care utilization in your village?

  • % of respondents who identify the project as the source of reliable information, support or services in the specific thematic areas of intervention (e.g., whom do you rely on for information on XX, followed by a list of organizations).

  • Participation in other programs. Question: Have you participated in programs from other NGOs or government-run programs?

The questionnaire will be translated into Urdu in order to be administered to the local populace. Pre-testing of the questionnaire will be conducted with randomly selected respondents in the local vicinity.
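During analysis, the dose-response items described above can be combined into a single exposure score per respondent. The following is a minimal sketch only; the item names, activity list, and frequency banding are invented for illustration and are not from the actual questionnaire:

```python
# Illustrative sketch: deriving a dose-response "exposure" score from
# survey items like those described above. All field names and the
# banding scheme are hypothetical placeholders.

ACTIVITIES = ["health_session", "cbhfa_training", "committee_meeting"]

def exposure_score(response: dict) -> int:
    """Count the listed project activities a respondent reports,
    plus a banded score for frequency of meetings with staff/volunteers."""
    activities = sum(1 for a in ACTIVITIES if response.get(a))
    meetings = response.get("meetings_with_staff", 0)
    # Band meeting frequency: 0 = none, 1 = 1-3 meetings, 2 = 4 or more
    meeting_band = 0 if meetings == 0 else (1 if meetings <= 3 else 2)
    return activities + meeting_band

respondent = {"health_session": True, "committee_meeting": True,
              "meetings_with_staff": 5}
print(exposure_score(respondent))  # 2 activities + band 2 = 4
```

The resulting ordinal score is what the later dose-response analysis would treat as the "level of involvement" variable.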


4.4.4 Data analysis


Guiding principles for finalization of the data analysis:


  • The statistical analysis of such a study will involve point estimation of indicators across both the intervention groups and the control group of villages at baseline, and an examination of the change in the indicators from baseline to end-line across these comparator groups. All data analysis is to be disaggregated by women and men.

  • The statistical analysis of the differences between the intervention and control groups should include controlling for confounding variables or covariates (e.g., background variables).

  • The data analysis should allow for the incorporation of sampling design in the variance estimation.

  • Statistical analysis for the contribution/attribution analysis will use dose response and end-beneficiaries' perception. Possible methods are:

  • Regression analysis: the effectiveness of the program can be examined by using the level of involvement as a measure of exposure to the program and testing its relationship with indicators of the outcomes (e.g., use of health care services) using regression analysis. A p-value < 0.05 indicates that the observed trend/association is unlikely to be due to chance.

  • Alternatively, the ordinal engagement variable can be analyzed using a chi-square test for trend. A p-value for trend < 0.05 indicates that the observed dose-response relationship is unlikely to be due to chance.
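As one possible form of the chi-square trend analysis mentioned above, a Cochran-Armitage test for trend checks whether an outcome proportion rises with the ordinal exposure level. The sketch below is illustrative only: the group counts are invented, and a real analysis would use the survey data and account for the sampling design in the variance estimation:

```python
# Illustrative sketch: Cochran-Armitage chi-square test for trend,
# testing whether an outcome (e.g., use of a health service) increases
# with the ordinal exposure/"dose" level. Example counts are invented.
import math

def trend_test(groups):
    """groups: list of (dose_score, n_respondents, n_with_outcome).
    Returns the two-sided p-value for a linear trend in proportions."""
    total_n = sum(n for _, n, _ in groups)
    total_r = sum(r for _, _, r in groups)
    p_bar = total_r / total_n                      # pooled proportion
    t = sum(x * (r - n * p_bar) for x, n, r in groups)
    sxx = sum(n * x * x for x, n, _ in groups)
    sx = sum(n * x for x, n, _ in groups)
    var = p_bar * (1 - p_bar) * (sxx - sx * sx / total_n)
    z = t / math.sqrt(var)
    return math.erfc(abs(z) / math.sqrt(2))        # two-sided normal p-value

# Invented example: outcome prevalence rising with exposure level 0..2
p = trend_test([(0, 100, 20), (1, 100, 35), (2, 100, 55)])
print(f"p-value for trend: {p:.4f}")  # small p: trend unlikely due to chance
```

A flat set of proportions across dose levels would yield a p-value near 1, i.e., no evidence of a dose-response relationship.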

4.5 Evaluation Management Structure


*Evaluation Management Team:* The evaluation management team will oversee the evaluation to ensure the evaluation team upholds the expected quality of the evaluation. It will make decisions on the final TOR, the inception report, and the final report.


Each representative is responsible for consulting relevant personnel within their respective organization and consolidating their inputs to provide to the consultants.


*In-Country Coordinator and Supervisor:* Once the selected consultants start implementing the evaluation in Pakistan, they are required to maintain coordination with, and report to, the CRC Country Representative/Program Manager, who will be responsible for:


a. Ensuring that the consultant/firm is provided with the specified documents, and answering day-to-day enquiries.


b. Facilitating the work of the consultant/firm with beneficiaries and other local stakeholders.


c. Monitoring the daily work of the consultant/firm and flagging concerns.


*Intellectual property:* All quantitative and qualitative data collected under this contract remain the property of CRCS and PRCS. Authorship of any publication will be discussed with CRC/PRCS during the contract. These issues will be outlined in the contract between CRCS and the consultants.


****4.6 Time Table****


The tasks of the external consultants, the deliverables and the timelines are summarized below:


Please visit the Canadian Red Cross Career website at the link below in order to view the timetable in its entirety:


http://www.redcross.ca/who-we-are/jobs/opportunities-in-canada/tor–fina…


****Deliverables:****


· The inception report [research questions, methodology (sampling, sample size, etc.), data collection methods and tools, a timeframe with firm dates for deliverables, and the travel and logistical arrangements for the team], prepared on the basis of a desk review of relevant project documents (3 days)


· Data collection and analysis; preparation and submission of the first draft (25 days)


· Presentation of preliminary findings to the MoH and CRC in Pakistan, and a separate presentation for CRC Ottawa by Skype (3 days)


· Final report after receiving comments, together with all raw, labelled data and the analysis submitted on USB/CD to CRC/PRCS (3 days)


The deadlines are negotiable as long as the final report is completed by 31st July 2015.


*Workshop for formulating follow-up actions after all the studies are completed:* The consultants will present the findings in-country to key decision-makers and relevant staff of CRCS in the country and its partners (e.g., Host National Society). A half-day or one-day workshop will be held with the evaluator to discuss the findings and construct recommendations/a workplan for future programming.


****4.7 Report Outline and Guideline on Provision and Adoption of Comments to the Report****


The final report will follow the outline below:


4.7.1 Structure of End-line Survey Report


The End-line Study Report shall be written in English and structured as follows:


  • Executive Summary (6 pages maximum)

  • Introduction/background (5 pages maximum)

  • Summary study methodology (3 pages maximum)

  • Study findings (30 pages maximum)

  • Conclusions and Recommendations (5 pages maximum)

Annexes


  • Terms of Reference

  • Name of the Consultant and their company where applicable

  • Detailed study methodology (data collection tools, sample design worksheet, area population statistics, etc.)

  • List of persons/organizations/facilities consulted

  • Literature and documentation reviewed

  • Geographical map of the area surveyed.

  • Other technical annexes where relevant (e.g. statistical analyses)

4.7.2 Structure of Final Evaluation Report


Executive Summary (a max of 8 pages): brief summary of the projects, the evaluation, and an overview conclusion by evaluation criteria, highlights of findings by evaluation criteria and highlights of recommendations.


****Main Body of the Report (no more than 60 pages)****


  1. Rationale and purpose of the evaluation

  2. Brief description of methodology of the evaluation – the details will go to the Annexes

  3. Findings by Evaluation Criteria and Questions (the findings will discuss consistency or inconsistency between findings from the survey and the qualitative data collection)

  4. Recommendations (with clear linkages to the findings)

****Annexes:****


  1. Terms of Reference

  2. Detailed information about the methodology, including the evaluation framework or matrix. The survey methodology should be described in line with international standards for survey reports.

  3. References and List of Documents Reviewed

  4. List of Respondents

  5. Data collection Tools (e.g., interview guide, etc.)

  6. Map of evaluation sites

  7. Brief Background of Evaluators and their contact details (a max of 2 pages), the contact details are needed if more clarifications are required from the consultants after the evaluation is completed.

The following process will be followed to ensure stakeholder input while maintaining the integrity and independence of the evaluation report:


  • Inaccuracies. Inaccuracies are factual errors supported by indisputable evidence, and should therefore be corrected in the evaluation report itself.

  • Clarifications. A clarification is additional, explanatory information to what the evaluators provided in the report. It is the evaluators’ decision whether to revise their report according to a clarification; if not, the evaluation management response team can decide whether to include the clarification in their management response.

  • Difference of opinion. A difference of opinion does not pertain to the findings (which are factual), but to the conclusions and/or recommendations. These may be expressed to the evaluators during the evaluation process. It is the evaluators' decision whether to revise their report according to a difference of opinion; if not, the evaluation management response team can decide whether to address the difference of opinion in their management response.

4.7.3 Evaluation Quality & Ethical Standards


The evaluators should take all reasonable steps to ensure that the evaluation is designed and conducted to respect and protect the rights and welfare of the people and communities involved and to ensure that the evaluation is technically accurate and reliable, is conducted in a transparent and impartial manner, and contributes to organizational learning and accountability. The evaluator will sign and adhere to the Canadian Red Cross Code of Conduct.


The evaluation standards are:


  1. Utility: Evaluations must be useful and used.

  2. Feasibility: Evaluations must be realistic, diplomatic, and managed in a sensible, cost effective manner.

  3. Ethics & Legality: Evaluations must be conducted in an ethical and legal manner, with particular regard for the welfare of those involved in and affected by the evaluation.

  4. Impartiality & Independence: Evaluations should be impartial, providing a comprehensive and unbiased assessment that takes into account the views of all stakeholders.

  5. Transparency: Evaluation activities should reflect an attitude of openness and transparency.

  6. Accuracy: Evaluations should be technically accurate, providing sufficient information about the data collection, analysis, and interpretation methods so that their worth or merit can be determined.

  7. Participation: Stakeholders should be consulted and meaningfully involved in the evaluation process when feasible and appropriate.

  8. Collaboration: Collaboration between key operating partners in the evaluation process improves the legitimacy and utility of the evaluation.

It is also expected that the evaluation will respect the seven Fundamental Principles of the Red Cross and Red Crescent: 1) humanity, 2) impartiality, 3) neutrality, 4) independence, 5) voluntary service, 6) unity, and 7) universality. Further information can be obtained about these Principles at: www.ifrc.org/what/values/principles/index.asp


****5.0 Qualifications and Experience****


Selection of the external end-term assessment consultant/team will be based on the following qualifications:


****Requirements for the Evaluation Team****


  • Having completed baseline surveys of similar programs in Pakistan (KP)

  • Previous experience completing large-scale final evaluations in similar contexts

  • A mix of male and female team members

  • Strong knowledge of quantitative and qualitative evaluation/research methods

  • The team must include technical experts in public health, MNCH, organizational development, disaster management, gender and violence prevention

  • Good facilitation and diplomacy skills; high cultural awareness and sensitivity

  • Strong writing skills in English

  • Proficiency in Urdu; proficiency in Pashtu is an advantage



