TOR for Evaluation of Rwandan Girls’ Education and Advancement Programme (REAP) in Rwanda

Background to the GEC Programme and the Project
1. GEC programme background
The Department for International Development (DFID) manages the UK’s aid to poor countries and works to eliminate extreme poverty. DFID is working to reach the Millennium Development Goals (MDGs), the international targets agreed by the United Nations (UN) to halve world poverty by 2015. Progress on girls’ education is critical to the achievement of these targets.
DFID is refocusing its efforts on girls’ education through the Girls’ Education Challenge (GEC), with the ambition that this will have a catalytic effect on other international partners. The GEC was open to competitive bids from
non-state organisations to fund programmes that focus on getting girls into
primary and lower secondary education, keeping them there and making sure they
learn. It is expected that £355 million is available in total to support the
GEC up to March 2016. This support should enable at least 660,000 marginalised
girls to complete a full six-year cycle of primary school or 1 million
marginalised girls to complete three years of junior secondary school.
2. Project context
Health
Poverty Action, known as Health Unlimited prior to 2010, works
in 13 countries across Africa, Asia and Latin America (with more currently in
development).  We always prioritise the poorest and most
marginalised – those neglected by almost everyone else.  Health Poverty
Action has been operational in Rwanda since 1998 and started its programming
with the now highly popular Urunana radio soap opera.  Health
Poverty Action is currently implementing sexual and reproductive health, sexual
and gender based violence, water and sanitation, HIV, and youth income
generation projects. 
In 2012,
Health Poverty Action conducted a comprehensive assessment and stakeholder
consultation in Nyaruguru to explore the reasons why girls drop out at
secondary level.  It found that the main barriers to girls’ education at
this level are: cost; lack of opportunities for girls at the end of their
education; teenage pregnancy (about 20% of girls have started child bearing by
the age of 19); school environments that are not girl-friendly; and sexual
harassment in school. 35% of families withdraw their children due to the
inability to afford the user fees, books, uniforms and lunches, and the
opportunity cost of girl labour at home.  Families prefer to send boys
because they are perceived to have more income generating opportunities after
school than girls. Lack of opportunities for girls after school means there is
little incentive to complete.  With a ratio of one toilet to 60 students,
schools do not have safe and adequate sanitary and dormitory facilities for
girls and boys. Girls mostly stay home from school during menstruation. Girls
who are: orphans, affected by HIV, disabled, household heads, historically
marginalised (Batwa etc.) are particularly at risk of dropping out.
3. Project rationale
The
government and other NGOs are investing in improving the supply side of
education in Rwanda, for example with teacher training, new curricula, and
improving classroom structures.  Health Poverty Action therefore proposed
a project that would apply innovative approaches to increase demand for
girls’ education.  DFID agreed to fund the project.
4. Project aims and objectives
The
project’s expected impact is to improve life chances of marginalised
girls.  Its expected outcome is 8,983 marginalised girls in Rwanda able to
complete a full cycle of education and demonstrate learning.  Expected
outputs are:
  • Increased income for
    investment in schools and girls’ education.
  • Safe, sanitary and
    girl-friendly school environments.
  • Increased awareness among
    girls and parents about the importance of and barriers to girls’
    education.
  • Learning and operational
    research from the project contributing to and informing girls’ education
    models and their replication. 
5. Intended project outcomes:
The
project will benefit approximately 16,324 girls aged 6–19 in 28 primary and
secondary schools.  If the pilot is successful, it will be scaled up to
reach 37,843 girls (aged 7-12) in 86 primary schools and 6,017 girls (13-18) in
39 secondary schools across Nyaruguru District.  It can also be applied in
other regions of Rwanda.  The project’s community-led targeting process
will target the worst-off schools, and within those schools the girls
most in need will benefit from the income generating activities (IGA),
including girls who are orphans, affected by HIV, disabled, household heads,
from historically marginalised groups (such as the Batwa), and out-of-school girls. 
6. Project activities:
The
project will implement the following innovative methods:
  • “Education that Pays for
    Itself” model: Teach a Man to Fish (TAMTF) will pilot this model (which
    has been successful in Tanzania, Kenya, Uganda and 2 institutions in
    Rwanda).  Alongside traditional academic subjects the Financially
    Sustainable Schools will teach business and practical skills through
    running their own profit-making businesses which provide motivation and
    cover part of girls’ school costs.
  • Mother-Daughter Clubs
    (MDC):  This method has never been tested in Rwanda.  The teams
    will be trained on Income Generating Activities (IGA) (e.g. production of
    washable sanitary napkins); formed into cooperatives; and facilitated to
    discuss issues like teenage pregnancy and the importance of education.
  • ECOSAN composting toilets in
    schools:  Previously tested in Rwanda, these toilets turn human
    excreta into safe compost material.  The project will facilitate
    communities including women and men to construct school toilets, maintain
    them, and use the compost for income generating school gardens. 
    Changing facilities will also be constructed alongside the toilets for
    girls to wash and dry their reusable sanitary napkins.
  • Radio Soap Opera on Girls’
    Education:  Building on the experience of an ongoing health radio
    soap opera, the project will pilot a new version of the soap (using the
    same popular characters) to address issues related to girls’ education.
7. Project delivery roles and responsibilities:
The
project will be led by Health Poverty Action (HPA), a UK based NGO already
managing several DFID contracts, whose focus is on the links between health and
poverty. Partners will include: 1) Teach a Man to Fish – a UK based NGO
specialising in education, which invented the “Education That Pays For Itself”
methodology which has been piloted in Latin America, Tanzania, Kenya, Uganda
and two Rwandan institutions; 2) Urunana Development Communication (UDC) – a
local Rwandan NGO which, for more than 10 years, has been running Urunana, an
entertaining and educational radio soap opera that has successfully
disseminated health messages and led to behaviour change among the 70% of the
national population (around 7 million Rwandans) who listen to it; and 3) Nyaruguru
district authority who will support the project to coordinate with schools and
communities and other initiatives in the district.
8. Overview of the project budget and implementation timescales:
The
project runs from June 2013 to February 2016 with a budget of
£1,100,640.08.  Health Poverty Action is also adding £225,000 from other
sources.
Evaluation Objective
9. The project is seeking to
procure the services of an independent Evaluation Team to design, plan and
conduct the evaluation of the REAP project over the next three years, which is
funded through the GEC. The Evaluation Team will provide an independent and
rigorous evaluation and research function, designing and implementing a
framework which will assess the process of delivery, effectiveness, Value for
Money (VfM) and impact of the project and report the findings and lessons
learnt throughout the process.
Recipient of the Service
10. The recipient of the service is
Health Poverty Action.
Scope of Work
11. GEC Programme Evaluation Questions
1. Was
the GEC successfully designed and implemented?
1.1 Did
the multi-window challenge fund structure lead to: a) the funding of innovative
projects, b) increased funding of new partners, c) more effective and
efficiently implemented interventions, d) delivery of results?
1.2 Did
the fund management model ensure the aims and objectives of the GEC were
achieved more effectively and efficiently than alternative models of delivery?
1.3 Did
the selection process ensure that projects funded were aligned with the GEC
aims and could deliver results against objectives in the time available?
1.4 What
lessons can be learned from the Recipient experience of the
application/selection/implementation process?
2. What
impact did the GEC funding have on the retention and learning of marginalised
girls? Was the GEC good VfM?
2.1 How
successful was the programme in targeting marginalised girls? Which projects
were most successful?
2.2 What
was the impact of the programme on enrolment and the retention of girls in
school? Which projects were most successful?
2.3 What
was the impact of the programme on the learning outcomes of marginalised girls
and other Recipients? How did this vary by project?
2.4 What
was the impact on school level indicators of performance such as teacher
absence or pupil-teacher ratio? 
2.5 Did
the impact of the GEC represent good VfM? How did this vary by project?
2.6 Did
the benefits of GEC outweigh the costs of investment?
3. What
works to increase the enrolment, retention and learning of marginalised girls?
3.1 What
impact did the GEC have on barriers to educating girls at the individual,
community, institutional level?
3.2 Which
approaches were most successful in increasing retention and learning of
marginalised girls?
3.3 Are
interventions to increase the enrolment, retention and learning of girls more
effective in a private school setting or in government provided places?
3.4
Gender inequalities theme: How did girls’ experience of gender inequality
interact with educational opportunity?
3.5
Dimension of marginalisation theme: How did girls’ experience of
marginalisation interact with educational opportunity?
3.6
Conflict theme: How did conflict affect and shape the educational access and experience
of girls? 
4. How
sustainable were the activities funded by the GEC and was the programme
successful in leveraging additional interest and investment?
4.1 How
successful was the GEC in leveraging funds?
4.2 Did
Recipients engage with other education programme leaders in country to ensure
complementarities and minimise overlap?
4.3 To
what extent was GEC successful in leveraging change in practice and improving
impact and lesson learning on policy and programme interventions to improve
retention and learning outcomes of marginalised girls?
4.4 Are
funded projects sustainable? Have they made a sustainable impact?
12. Project Evaluation Questions
5. 
What differences can be seen within Treatment A and Treatment B groups, and
what does this say about the different activities conducted in each of those
groups?
6. How
much income have schools and Mother-Daughter Clubs been able to generate, and
how are they spending this money?
7. How
has the project’s approach negatively or positively affected girls’ and
household income and nutrition?
13. Overall evaluation approach
The
Evaluation Team will be required to design, plan and conduct the evaluation
process for the project and produce the reports required to demonstrate the
effectiveness, impact and VfM of the project.
14. Evaluation approach
The
Evaluation Team will be required to develop an evaluation approach that enables
the Project Management Team to report to DFID, partners and stakeholders
against the following overarching questions as a minimum:
  • Process – Was the project
    successfully designed and implemented?
  • Impact – What impact did the
    project have on the retention and learning of marginalised girls?
  • VfM – Was the project good
    VfM?
  • Effectiveness – What worked
    to increase the enrolment, retention and learning of marginalised girls?
  • Sustainability – How
    sustainable were the activities funded by the GEC and was the project
    successful in leveraging additional interest and investment?
15. Monitoring strategy
The
Evaluation Team will be required to support the Project Management Team to
design, establish and implement a comprehensive monitoring strategy including a
data collection strategy to support the implementation of the evaluation.
Technical support should include guidance concerning the project monitoring
processes required to satisfy the Payment by Results process.
16. Cohort tracking
Monitoring and evaluation (M&E), learning moments, participatory research, and
operational research will take place throughout the project.  A
quasi-experimental technique will be used to divide the target schools and
areas into three cohorts:  Treatment Group A (14 schools) will receive the
complete set of interventions; Treatment Group B (14 schools) will receive all
interventions except for the toilet and changing room construction; and the
Control Group (14 schools) will receive none of the project interventions,
except that students may hear the radio programme, which will be broadcast
across the country and cannot be avoided.
The
Evaluation Team will be required to develop the project’s methodology for
tracking a cohort as an integral part of their impact evaluation strategy to
enable an accurate and useful comparison of the effects on those who have
benefited from a project with those who have not by:
  • Enabling us to define a
    group that has a consistent meaning, which makes it possible to match and
    compare change over time;
  • Allowing us to precisely
    link outcomes (or effects) back to earlier circumstances, which helps to
    allow (and ‘control’) for differences in baseline circumstances when
    attempting to evaluate the impact that is attributable to a particular
    project;
  • Providing greater insights
    over the longer term into the causal effects of projects; and
  • Providing a useful means of
    tracking outcomes for a ‘later’ cohort for the purpose of evaluating
    sustainability effects.
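The cohort-tracking aims above can be illustrated with a minimal sketch. It is not the required methodology: the girl IDs and test scores below are entirely hypothetical, and a simple difference-in-differences between a tracked treatment cohort and a tracked control cohort stands in for the full impact evaluation design.

```python
# Minimal sketch of cohort tracking with a difference-in-differences
# comparison. All IDs, scores and group assignments are illustrative.

def mean(values):
    return sum(values) / len(values)

def did(baseline, endline, treat_ids, control_ids):
    """Average change in the treated cohort minus average change in controls.

    baseline / endline: dicts mapping girl ID -> test score.
    Only girls observed in BOTH rounds are included (cohort tracking),
    which is what lets outcomes be linked back to baseline circumstances.
    """
    def change(ids):
        tracked = [i for i in ids if i in baseline and i in endline]
        return mean([endline[i] - baseline[i] for i in tracked])

    return change(treat_ids) - change(control_ids)

# Hypothetical scores for four tracked girls.
baseline = {"g1": 40, "g2": 45, "g3": 42, "g4": 44}
endline = {"g1": 55, "g2": 58, "g3": 46, "g4": 47}
effect = did(baseline, endline, treat_ids={"g1", "g2"}, control_ids={"g3", "g4"})
print(effect)  # treated girls gained 14 points on average, controls 3.5
```

In practice the matching would be done on stable identifiers collected at every survey round, so that attrition (girls lost between rounds) is visible rather than silently dropped.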
17. Inception report
The Evaluation Team will be required to produce an Inception report that includes:
  • M&E strategy, including monitoring strategy, sampling framework (cohort
    tracking strategy), evaluation design and research methodology (including
    key evaluation questions);
  • Risk management plan;
  • Quality assurance plan;
  • Research ethics plan;
  • Baseline study design;
  • Roles and responsibilities;
  • Stakeholder engagement and
    communication plan; and
  • Detailed M&E work plan.
18. Baseline Study
The
Evaluation Team will be required to design and implement a baseline study as an
integrated part of the overall M&E strategy and plan for the project.
19. Impact evaluation
The Evaluation Team will be
required to design and implement a project impact evaluation involving
quantitative and qualitative primary research of beneficiaries and
non-beneficiaries as a control or comparison group for the purpose of
establishing a counterfactual, against which the attributable impacts of the
projects can be evaluated.
Existing Information Sources
20. Sources of information
In the
first instance, bidders should refer to the DFID GEC website: www.dfid.gov.uk/Work-with-us/Funding-opportunities/Not-for-profit-organisations/Girls-Education-Challenge
for general information concerning the Girls’ Education Challenge.
Bidders
should refer to the following GEC programme documentation that includes:
  • DFID GEC Business Case;
  • GEC programme logframe;
  • GEC Programme Evaluation
    Questions;
  • GEC Evaluation Strategy;
  • Grant Recipient Handbook.
Bidders
should refer to the following GEC project documentation that includes:
  • Project logframe;
  • Project Full Application as
    included in the Accountable Grant Arrangement;
  • Evaluation Manager feedback
    on the M&E components of the Project Full Application;
  • Health Poverty Action’s
    organisational M&E database.
Bidders
should also refer to relevant country data and information that is currently
available, as required, to prepare the proposal.
Evaluation Approach
21. Project evaluation approach
In
developing the project evaluation approach the Evaluation Team will need to
consider the following:
  • the programme evaluation
    objectives and evaluation questions and the project’s relationship to
    these;
  • the complexity and clarity
    of the project logframe, design, evaluation questions and the
    measurability of the intended outcomes and the effect this has on its
    evaluability;
  • the requirements of the GEC
    Programme Evaluation Strategy and the planned approach to evaluation and
    data collection;
  • availability and quality of
    existing evidence and data sources; and 
  • the proportionate amount of
    time and resources that should be allocated to evaluation given the type
    of project interventions, operational context and the reporting
    requirements of the GEC.
22. Programme Evaluation Strategy
The
Evaluation Team will be required to develop an evaluation approach that is
complementary to the programme evaluation approach. Essentially, the GEC
Programme Evaluation Strategy proposes a representative population household
survey to capture the prevalence of different risk factors and reading and
numeracy skills at the baseline stage amongst the general population, an
assessment of levels of exposure to the project’s intervention and changes in
intermediary outcomes (e.g. attitudes) at the midline stage and assessment of
the outcomes achieved at the endline stage.
23. Programme quantitative research
The
programme approach to quantitative primary research involves:
  • Probability sampling of
    communities and households within the relevant areas;
  • Baseline measurements of
    risk factors (including barriers to education) and reading / numeracy
    ability;
  • Tracking of individual
    families through repeat visits, with the capability to link across
    measurements and to other sources of data and to process data where
    appropriate;
  • Midline measurement of
    receipt and exposure to the project intervention, based on direct survey
    questions and checks to project and school records and findings of
    community surveys; an assessment of intermediary outcomes such as changes
    in attitudes / expectations; and
  • Endline measurement on the
    range of GEC outcomes, again through direct survey and checks to project
    and school records. This may be an opportunity to assess other outcomes of
    interest, such as predictors of other longer term outcomes (impacts) such
    as self-concept, health behaviour awareness and life skills and
    effectiveness.
  • Alongside this
    population-level evaluation, the programme evaluation strategy seeks to
    triangulate the primary data collection with quantitative and qualitative
    data collected at the project level.
24. Programme qualitative research
Qualitative
research has a vital role to play both in delivering independent evidence of
project effects and effectiveness, and in providing evidence about the
causal relationships involved. Whilst project participants are unlikely to be
able to objectively judge causal relationships at the aggregate level, they
offer unique knowledge of reasons for change at the individual level, as well
as an understanding of the diverse influences which cannot be identified
through quantitative research. Qualitative research provides a clearer
understanding of why projects work or do not work in expected and unexpected
ways.
25. Project evaluation approach
The
Evaluation Team will be required to develop an evaluation approach that is
capable of rigorously evaluating the performance, process of delivery,
effectiveness, VfM, impact and sustainability of the project. To answer the
GEC Evaluation Questions with regard to these aspects of the project,
Recipients will need to design a range of approaches using a mixture of
quantitative and qualitative research methods.
26. Project impact evaluation
Bidders
are required to outline their approach to evaluating the impact of the project.
This should include consideration of the most rigorous approach to establishing
a counterfactual that enables comparison of the outcomes achieved by a target
group who were affected by a project intervention with the outcomes achieved by
a group of people who are similar in every way to the target group, except that
they have not in any way been exposed to or affected by the project
intervention i.e. a control or comparison group. Careful consideration should
be given to the use of experimental or quasi-experimental methods for this purpose.
27. Project cohort tracking
The
project is required to track a cohort of the population – defined as a group of
individuals who progress through life (community or school) together. Cohort
tracking is an important tool for monitoring and evaluating changes in outcomes
for a particular population. In order to identify and assess the impact of the
project on the project’s target group it is necessary to track and measure the
changes experienced by those who are not exposed to the project’s activities in
order to compare and contrast these effects.
Project Research Methodology
28. Quantitative research
The
Evaluation Team will be required to develop quantitative research methodologies
that enable an assessment of the attributable impacts of the project based on
evidence that the target population have realised improved educational outcomes
as a result of the project’s outputs. Although aggregate evidence from a
representative selection of schools is important for some outputs and for some
outcomes, such as retention, ultimately what is important is to study a
representative selection of target girls’ experiences and outcomes. This
involves identifying a representative sample of the target population and
showing that their educational outcomes are (positively) related to project
activity.
29. Project sampling framework
The
Evaluation Team will be required to design sampling frameworks for both
qualitative and quantitative samples that should be of a sufficient size and
representativeness to allow:
  • reasonable levels of
    certainty that the findings are representative for the target population;
  • reasonable ability to
    generalise the intervention’s effectiveness to similar contexts;
  • reasonable ability to
    generalise the insights into what works and why for similar contexts; and
  • for the quantitative
    sampling framework, a design that is shared with the Evaluation Manager
    for the purpose of facilitating joint implementation as appropriate.
Bidders will be required to propose the best possible sampling framework. 
However, one possible sampling framework for the project is presented below:
Around
216,000 community members in Nyaruguru (approximately 80% of the District
population) will be reached by community level girls’ education promotion
activities. 7,000,000 Rwandans nationally will listen to girls’ education
messages on UDC’s popular ‘Urunana’ radio show which is broadcast nationwide
and listened to by 74% of the national population, including the Nyaruguru
community members mentioned above. 
The
sampling units of the quantitative survey will be the project target groups and
beneficiaries which include:
  1. Households who have children
    benefiting from school businesses and Mother-Daughter Clubs (MDC): 
    These Clubs will be tracked for Indicator 1.3 (% increase in targeted
    households reporting increased income) and Output 3 indicators related to
    “increased awareness among girls and parents about the importance of and
    barriers to girls’ education.”  Treatment Group A and B of these
    target households will be compared against households in the Control Group
    schools for these indicators.  The household respondents will include
    parents; and girls of school going age. 
  2. All girls in the targeted
    schools will be tracked for Outcome indicators such as the number of them
    who stay in school, and number who have increased their learning (through
    testing) and Indicator 2.1 (% pupils in Nyaruguru have composting toilets
    with hand washing facilities at their school).  Treatment Group A and
    B of the schools will be compared against the Control Group schools for
    these indicators.  In the case of school testing, not every one of
    the over 15,000 girls will be tested by the project every year. 
    Rather, a sample of pupils in 5 classes in each school (totalling 270
    girls per school) will be tested. The pupils will be drawn from grades
    P1, P3, P5, SS1 and SS5, and will undertake the EGRA and EGMA tests every year of the
    project, with comparisons made across treatment Groups A and B as well as
    the Control Group.  For the % with access to composting toilets and
    hand washing facilities, this will be calculated among all the pupils in
    Treatment Groups A and B and the Control Group. 
For the
qualitative research methods, other targets / key stakeholders will be
included:
  • Health Poverty Action and
    other partners’ management staff.
  • The direct staff of the
    project.
  • DFID,  the Fund Manager
    and the Evaluation Manager.
  • Ministry of Education staff
    at national, district, sector, cell, and school levels.
  • Ministry of Gender and
    Family Promotion officials at national and district levels.
  • Energy, Water and Sanitation
    Authority staff.
  • Nyaruguru District Mayor and
    other authorities.
  • Project Steering Committee.
  • Project Evaluation Steering
    Group.
  • Parent Teacher Associations
    involved in the project.
  • Other teachers of the
    school.
  • Mother-Daughter Clubs
    involved in the project.
  • Those involved in building
    the school toilets, change rooms, and water facilities.
  • Boys.
The sample size computed for the households who have children benefiting from
school businesses and Mother-Daughter Clubs (MDC) will be based on the formula
shown in Box 1 and the three statistical assumptions detailed beneath.
Where:
n = Sample size required
N = Total population size
d = Precision level
Z = Number of standard deviation units of the sampling distribution corresponding to the chosen confidence level

Assumed values of the variables in the above formula are:
N = total population size
d = precision level, which will be 0.05
z = 1.96, corresponding to a 95% confidence level
Thus, the sample size for Treatment Groups A and B and the Control Group
altogether is calculated based on the above formula.  Accordingly, it will be
xxxx households. Adding an assumed 10% non-response rate to this total results
in a final sample size of xxxx.
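As an illustration of the calculation, the sketch below assumes a Cochran-type finite-population formula consistent with the variables listed (n, N, d, z); the formula actually specified in Box 1 may differ, and both the population size used and the assumed proportion p = 0.5 (the most conservative choice) are placeholders, not project figures.

```python
import math

def sample_size(N, d=0.05, z=1.96, p=0.5, nonresponse=0.10):
    """Sketch of a finite-population sample size calculation.

    N: total population size; d: precision level (0.05 in the ToR);
    z: standard-normal value for the confidence level (1.96 for 95%);
    p: assumed proportion (0.5 maximises the required sample).
    A 10% non-response inflation is applied at the end, as in the ToR.
    """
    n0 = (z ** 2) * p * (1 - p) / d ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / N)            # finite-population correction
    return math.ceil(n * (1 + nonresponse))

print(sample_size(N=10_000))  # placeholder N; prints 407
```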
In order to determine the estimated sample size for each Treatment and Control
Group, the computed sample size will be allocated to each group in proportion
to its population size. On this basis, the sample size for Treatment Group A
will be xxx (xx% of the total computed sample size); Treatment Group B will be
xxx (xx% of the total computed sample size); and the Control Group will be xxx
(xx% of the total computed sample size).
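The proportional allocation described above can be sketched as follows; the group population figures below are hypothetical placeholders, since the actual figures are not stated here.

```python
def allocate(total_sample, group_populations):
    """Split a total sample across groups in proportion to group size.

    Uses largest-remainder rounding so the allocations sum exactly
    to the total sample size.
    """
    total_pop = sum(group_populations.values())
    shares = {g: total_sample * pop / total_pop
              for g, pop in group_populations.items()}
    alloc = {g: int(s) for g, s in shares.items()}
    leftover = total_sample - sum(alloc.values())
    # Hand the remaining units to the groups with the largest remainders.
    for g in sorted(shares, key=lambda g: shares[g] - alloc[g],
                    reverse=True)[:leftover]:
        alloc[g] += 1
    return alloc

# Hypothetical group populations for the three groups.
print(allocate(407, {"Treatment A": 5000, "Treatment B": 4800, "Control": 4700}))
```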
Moreover, an appropriate statistically significant sample for the project
target groups will be identified, based on the proportion of the population in
each of the project target and beneficiary groups. These will be
parents/guardians and girls.
The project will directly benefit XXXXX people, stratified into population
groups of parents/guardians and girls of school-going age.  Proportionally,
xx% are women with children, xx% are men with children and xx% are girls. This
proportion will then be used to compute how many samples from each group
(stratum) will be selected from the total sample size computed for each group
(Treatment A, Treatment B and Control). To ensure a sufficient number of
responses from each beneficiary group in each group, the sample taken from
each will be as indicated in the table below:
Beneficiary by group         Treatment A   Treatment B   Control
Parents                      xxx           xxx           xxx
Girls of school-going age    xxx           xxx           xxx
In order
to measure and compare the study variables at three points in time (baseline,
mid-term, final), the same sample size should be used each time.
The study units for the quantitative survey will be contacted at the household
level. Cluster sampling will be utilised in order to select survey households
from the Treatment and Control Groups. Thus, the sampling will use the
following procedure:
  1. The total sample size (households) computed for each sector will be
    allocated to each stratum in proportion to that stratum’s share of the
    total population.
  2. From each stratum one cell
    will be selected randomly. 
  3. From each randomly selected cell, the number of households (HHs)
    allotted to the stratum to which that cell belongs will be contacted, to
    attain the sample size desired for the sector.
  4. In each cell, two groups of people will be interviewed: parents and
    girls of school-going age. xx% of the xx HHs will be with parents and
    xx% with girls of school-going age.
  5. HHs in each cell will be selected by the research team by standing at
    the centre of the cell and spinning a bottle (or using another random
    technique) to determine the direction of movement. A random number will
    be chosen to identify the 1st household in each direction.  Based on the
    numbers per cell, the consultants will decide how many households to
    survey in a straight line, and the number of houses to skip between each
    one.
  6. One of the eligible respondents (parent or girl) in each selected
    household will be interviewed.  If an eligible respondent is not at home,
    he or she should be called if possible.  If the eligible person is not
    able to come, s/he will be recorded as “Not Present.”
  7. If an eligible respondent in
    the household is Not Present, the researchers will select the next
    household by counting five houses to the right.  If the border of the
    cell is met before the desired number of interviews is reached, the
    interviewer and/or supervisor should return to the centre of the cell and
    choose a new direction and new random start.
  8. Where the number of
    households in the randomly selected cells is found to be less than the
    required number, then the next cell from the same strata will be selected
    randomly and so on until the required number of households is reached.
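The random-walk selection in steps 5–7 can be simulated as a rough sketch. The cell size, household IDs and skip interval below are hypothetical; in the field, the bottle spin and house counting replace this simulation.

```python
import random

def select_households(households, n_required, skip=5, seed=None):
    """Sketch of random-walk household selection within one cell.

    households: ordered list of household IDs along one walking direction
    from the cell centre (the direction itself would come from the bottle
    spin). A random starting household is chosen, then every (skip+1)-th
    household is visited until enough are selected or the cell border
    (end of the list) is reached.
    """
    rng = random.Random(seed)
    start = rng.randrange(len(households))
    selected = []
    i = start
    while i < len(households) and len(selected) < n_required:
        selected.append(households[i])
        i += skip + 1  # skip `skip` houses between interviews
    return selected

cell = [f"HH{i:03d}" for i in range(60)]  # hypothetical cell of 60 households
print(select_households(cell, n_required=5, skip=5, seed=42))
```

A real implementation would also handle the “Not Present” replacement rule in step 7 (moving five houses to the right) and restarting from the centre with a new direction when the border is reached before the target is met.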
The sampling procedure used for the baseline survey will also be used at the
endline. Moreover, the same study cells that were randomly selected during the
baseline will be measured again in the mid-term and final surveys, so that
measurements at each point in time (before and after the project intervention)
can be compared across time.
Quantitative data collection at the school level will cover the entire target
beneficiary population of girls in order to calculate the project’s reach,
i.e. the number of marginalised girls who could have been directly and
indirectly affected by the project will be counted.  Retention numbers will be
obtained from school data and the Education Monitoring Information
System.  The cohort tracking data is required by the Evaluation Manager to
estimate (and generalise) what the actual effects (i.e. impacts) of the
project on the target population as a whole might be, on the basis of the
effects on the sample measured through the impact evaluation. 
However, educational testing to demonstrate learning will not be conducted
with every girl in every school in each year.  Rather, some of the grades
will be sampled annually to give an overall picture; testing every girl every
year would not be possible within the budget of this project.
The study
subjects for the qualitative study will be selected purposively. These
will include parents, girls of school-going age, and the other targets /
stakeholders named above.
Professional
Skills and Qualifications
30.
Qualifications
Bidders
are required to clearly identify and provide CVs for all those proposed in the
Evaluation Team, clearly stating their roles and responsibilities for this
evaluation.
The
proposed evaluation person / team should include the technical expertise and
practical experience required to deliver the scope of work and evaluation
outputs, in particular, with regards to:
  • Evaluation design: design
    and plan the evaluation approaches and research methodologies, including
    quantitative and qualitative research methods – the team should include
    skills and expertise required to design, plan and conduct impact
    evaluation, potentially using experimental or quasi-experimental
    techniques;
  • Information management:
    design and manage data and information systems capable of handling large
    datasets for M&E purposes;
  • Statistical analysis: a
    range of statistical modelling and analysis of impact data; highly
    proficient user of: SPSS or STATA; and qualitative data analysis software
    e.g. ATLAS.ti, NVivo or equivalent; and
  • VfM assessment of education
    projects: education economics expertise to conduct cost benefit analysis
    and cost effectiveness analysis as part of the assessment of the project’s
    VfM.
31.
Organisational experience
Bidders
should provide evidence of previous project experience for the provision of
similar evaluation services and the design and implementation of similar
evaluation activities required by this ToR.
Evaluation
Governance Arrangements
32.
Project Evaluation Steering Group
The
evaluation process will be guided by a dedicated Evaluation Steering Group from
the start to finish of the project.
33. Evaluation Steering Group
purpose
The Evaluation Steering Group will play an advisory role in the planning and
implementation of the evaluation. Its role is to:
  • advise on the terms of
    reference, scope and focus of the evaluation;
  • support the evaluation by
    facilitating access to the documentation and data required for the purpose
    of evaluation;
  • regularly assess and assure
    the quality of the design, research and deliverables;
  • provide a source of
    validation for the findings emerging from the evaluation; and
  • ensure that findings and
    lessons learnt are fed back to relevant audiences in order to maximise the
    utility of the evaluation process.
34.
Evaluation Steering Group Meetings
The
Evaluation Steering Group will meet regularly (timetable to be confirmed),
particularly at stages in the evaluation process when deliverables are
produced, including:
  • submission of the Inception
    Report;
  • submission of the Baseline
    Study Report;
  • submission of the
    participatory research on reasons why girls drop out;
  • submission of emerging
    findings from the evaluation fieldwork; and
  • submission of the Final
    Project Evaluation Report.
35.
Evaluation Steering Group composition: the steering group will include:
  • Health Poverty Action’s
    Project Coordinator for REAP
  • Health Poverty Action’s
    M&E Officer for REAP
  • A representative from Teach
    a Man to Fish
  • A representative of the
    Nyaruguru District Office
  • A representative of the
    Ministry of Education
  • An evaluation research
    specialist or academic (e.g. a professional statistician or economist).
36.
Day–to–day project management of the evaluation
will be the responsibility of
[Recipient to insert name and position of responsible person].
Recipient
note:
an
Evaluation Steering Group will often include project stakeholders and people
with subject-matter expertise in evaluation, such as:
  • representatives
    of the commissioning agency;
  • high-level
    project staff;
  • representatives
    of beneficiary organisations; and
  • representatives
    of the funding agency.
Evaluation
Steering Groups take time to organise and manage. Make sure that you factor in
this time as a cost and ensure that you have the right expertise available to
ensure effective management and participation of the group.
Deliverables
and Schedule
37. Project deliverables: the
main deliverables for this project are set out below. 
1.
Inception report
Setting
out the design of the M&E strategy and plan and associated planning,
logistics, quality assurance and risk management information.  A
detailed methodology and tools in English will be included in the
report.  The Evaluation Steering Committee as well as the project key
staff will review the report and tools and provide feedback, which the
Evaluation Team will need to incorporate and have approved by Health Poverty
Action before commencement of the work.
2.
Baseline study
Design,
conduct and submit a baseline study that describes the initial conditions
(before the start of the project) against which progress can be measured or
comparisons made to show the effects and impacts of the project in the final
project evaluation report. The main body of the baseline study report should
be no more than 30 pages excluding additional annexes. A final report
structure will be provided at the Inception meeting.  The aims of the
survey include:
a)     
Collecting baseline data against all indicators in the project’s logical
framework, disaggregated by the project’s three cohorts
b)     
Researching norms, values, beliefs and the community practices in relation to
Girls Education
c)     
Measuring and documenting current barriers to education at school and
household level in the focus areas with consideration to numbers, age, level
of education etc.
d)     
Identifying key motivators for safe Girl’s education environment (to inform
culturally appropriate messaging, understanding motivations, barriers to
actions, opportunities)
The
scope of the survey will entail a desk review, preparation of research
protocols and data collection tools, pretesting of questionnaires, training of
enumerators, data processing (data entry, verification and analysis),
coordinating report writing, a dissemination workshop and finalising the survey
report.  The specific key tasks expected for this
consultancy include:
a)    
Clarification on the purpose of the survey and develop a detailed work plan
for the period of the consultancy. 
It is
important to be absolutely clear about the purposes at the start, with the
consent of HPA and the government line ministries. Prior to the commencement
of the survey, the consultant is expected to develop a detailed work plan for
the whole exercise. This plan will be shared with HPA, the line ministries
and a few nominated sector partners.
b)    
Definition of the study population
It is
prudent to define exactly whom we are interested in studying. It is vital to
ensure that this definition corresponds to the purposes of the survey. The
study population includes the catchment areas of 14 Treatment A Group
schools, 14 Treatment B Group schools, and 14 Control Group Schools.
c)     
Sampling and estimating the sample size
In
order for the result to be representative of the target population, the
sample size will be determined by appropriate sampling technique and the
chosen level of precision agreed upon by HPA and the consultant. 
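The ToR leaves the sampling technique and precision level to be agreed between HPA and the consultant; as one common possibility, a sample size for estimating a proportion can be set with Cochran's formula, with a finite population correction for small catchments. The confidence level, precision and population figures below are illustrative assumptions, not project values.

```python
import math

# Cochran's sample-size formula for a proportion: n = z^2 * p(1-p) / e^2.
# p = 0.5 is the conservative (maximum-variance) choice.
def cochran_n(z=1.96, p=0.5, e=0.05):
    """Sample size at confidence z, expected proportion p, margin e."""
    return math.ceil(z**2 * p * (1 - p) / e**2)

def fpc_adjust(n0, population):
    """Finite population correction for a small catchment population."""
    return math.ceil(n0 / (1 + (n0 - 1) / population))

n0 = cochran_n()                 # 95% confidence, +/- 5 percentage points
print(n0)                        # 385
print(fpc_adjust(n0, 2000))      # 323 for a hypothetical catchment of 2,000
```

In practice the agreed design (e.g. stratification by treatment group, clustering by cell) would further adjust these figures through a design effect.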
d)    
Data collection and analysis
The
data for this survey will be collected through desk review, questionnaires,
key informant interviews and focus group discussions. It is important to
ensure that all enumerators follow the same interview protocol: all
interviewers should adopt the same approach to explaining the survey,
phrasing particular questions, and recording the responses, which will
minimise bias. The results will be recorded and analysed using a
statistical package such as the Statistical Package for the Social Sciences
(SPSS) or similar. The consultant is expected to train enumerators on
data collection and data entry and, if required, train key enumerators in the
use of the software.  Double data entry should be applied, with the
necessary checks to ensure accurate entries. Data should
be tabulated into dummy tables and analysed per variable and / or
across multiple variables. Findings should be thoroughly analysed, looking
closely at both the qualitative and quantitative data / information
collected.  A stakeholder workshop will be organised to validate and
present the preliminary findings.
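The double-entry check mentioned above amounts to comparing two independent entries of the same questionnaires field by field and flagging every disagreement for re-checking against the paper form. A minimal sketch (record identifiers and field names below are hypothetical):

```python
# Double data entry verification: two independently keyed datasets are
# compared; any mismatch is returned for resolution against the paper form.
def compare_entries(entry1, entry2):
    """Return (record_id, field, value1, value2) for every mismatch."""
    mismatches = []
    for rec_id, record1 in entry1.items():
        record2 = entry2.get(rec_id, {})
        for field, value1 in record1.items():
            value2 = record2.get(field)
            if value1 != value2:
                mismatches.append((rec_id, field, value1, value2))
    return mismatches

first_pass  = {"HH001": {"girls_enrolled": 2, "head_age": 41}}
second_pass = {"HH001": {"girls_enrolled": 3, "head_age": 41}}
print(compare_entries(first_pass, second_pass))
# [('HH001', 'girls_enrolled', 2, 3)]
```

Statistical packages offer built-in equivalents, but the principle is the same: no record enters analysis until both entries agree.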
3.
Operational research and value for money analysis on pilots
The
project will pilot 4 innovative activities:  1) “Education that Pays for
Itself”; 2) MDCs; 3) using school-based ECOSAN toilets to boost production in
school gardens; and 4) a radio soap opera on girls’ education. 
Together, these make up an innovative, holistic, multi-sector integrated
programme to tackle gaps in Rwanda’s education system that other actors are not
filling.  This holistic approach is in itself innovative, and also
unique because it will be implemented by a wide range of actors not normally
working together on girls’ education, e.g. government ministries related to
education, gender, and water and sanitation; NGOs focusing on health,
education, and radio; private-sector craftsmen; PTAs; village microfinance
institutions; and community structures, e.g. CDCs.
However,
some may argue that the full-scale intervention may not be replicable in all
schools because of the infrastructure elements (construction of ECOSAN toilets
and changing rooms), which some schools cannot afford without external support /
funding.  The project will therefore have two different intervention
cohorts:  Treatment Group A (14 schools / catchment areas with all the
activities, including the initial investment in toilet and changing room
construction) and Treatment Group B (14 schools / catchment areas with all
activities but no toilets or changing rooms), as well as a representative
control group.  This methodology will assess whether the activities of
the project are successful with and without an initial capital investment in
toilet and changing room facilities.  It will isolate the activities and
measure their individual and combined impact, as well as enabling assessment
of the value for money (VfM) of each part and of the whole.  
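Because the same cells are measured at baseline and endline for both treatment arms and the control group, one possible analysis (not prescribed by this ToR) is a difference-in-differences comparison: the change in each treatment group net of the change in the control group, with the gap between arms A and B attributable to the capital investment. The figures below are hypothetical placeholders.

```python
# Difference-in-differences sketch: change in the treatment group minus
# change in the control group over the same period.
def did(treat_base, treat_end, ctrl_base, ctrl_end):
    """Net effect = (treatment change) - (control change)."""
    return (treat_end - treat_base) - (ctrl_end - ctrl_base)

# Hypothetical retention rates (%) for illustration only.
effect_a = did(70.0, 85.0, 68.0, 72.0)  # full package incl. toilets
effect_b = did(71.0, 82.0, 68.0, 72.0)  # package without toilets
print(effect_a, effect_b, effect_a - effect_b)  # 11.0 7.0 4.0
```

Here the 4-point gap between the two arms would be the illustrative contribution of the toilet and changing room investment on top of the shared activities.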
The
Evaluation Team will propose and carry out an ongoing operational research
approach which may include plans for data gathering, possibly with occasional
“learning moments” in addition to the mid-term and final evaluation mentioned
below to compare Treatment Group A, B and the Control Group. 
They
will also conduct operational research on the individual innovative pilots
and the overall holistic approach in order to determine which of the
innovative pilots are most clearly yielding measurable results and whether
the holistic package creates greater impact through synergies (i.e. the
efficiency and effectiveness of the intervention in terms of VfM).  Each
of the pilots has been budgeted separately, and will be analysed individually
and as a whole during the VfM assessments at mid-term and endline. 
These VfM assessments will examine the economy, efficiency and effectiveness
of each pilot and the holistic package.  Using approaches proposed by
Coffey International Development and the Bond Effectiveness Programme, the
project will include a measurement and comparative assessment approach to VfM
assessment, and a management approach focusing on the extent to which key
management processes and resource allocation decisions result in efficient
delivery of higher value inputs, the efficient conversion of these into
activities and outputs, and ultimately the contribution of each of the outputs
and the project as a whole to the 11,900 marginalised girls in Rwanda.
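Since each pilot is budgeted separately, the efficiency leg of the VfM assessment can be sketched as a simple cost-effectiveness ratio per pilot; the costs, pilot names as keys, and outcome figures below are hypothetical placeholders, not project data.

```python
# Cost-effectiveness sketch for the VfM comparison of separately budgeted
# pilots: cost divided by the outcome it produces (here, girls reached).
def cost_per_outcome(cost, outcome_units):
    return cost / outcome_units

# Hypothetical budgets (GBP) and reach figures for illustration only.
pilots = {
    "Education that Pays for Itself": (40000, 800),
    "ECOSAN school gardens":          (25000, 400),
}
for name, (cost, girls_reached) in pilots.items():
    print(f"{name}: {cost_per_outcome(cost, girls_reached):.2f} per girl")
```

The same ratio computed for the holistic package as a whole, against the sum of the pilot budgets, would indicate whether the claimed synergies translate into better value than the pilots delivered separately.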
4.
Analysis of educational testing
In
addition to the government educational testing, the project will implement
its own annual testing.  The project testing will take place among a
sample of grade levels (the same every year to ensure comparability) as
testing every girl annually is not affordable.  The following tests will
take place:
  • P1
    – testing by the project
  • P3
    – testing by the project
  • P5
    – testing by the project
  • P6
    – standardised government testing, but results will
    be captured and used in the project’s data collection system and
    analysed alongside the others
  • S1
    – testing by the project
  • S3
    – standardised government testing, but results will be
    captured and used in the project’s data collection system and analysed
    alongside the others
  • S5
    – testing by the project
  • S6
    – standardised government testing, but results will
    be captured and used in the project’s data collection system and
    analysed alongside the others
The
tests will take place in each of the 14 Treatment A, 14 Treatment B, and 14
Control Group schools.  In each grade level, one class per school will
be tested.  The testing will be organised by project staff and carried
out with the strong involvement of the Ministry of Education, headmasters and
teachers. The Evaluation Team’s role in this exercise will be to review the
tools and protocols to ensure ease of data entry and analysis.  The
project’s staff will ensure marking and data entry of test results, while the
Evaluation Team will conduct the detailed analysis as part of the M&E
system.
5.
Midline project evaluation report
Design,
conduct and submit a midline evaluation report that assesses the
effectiveness, impact and VfM of the project at the midline point. This
includes a follow-up survey to track progress on indicators against the baseline
(additional funding will be available for this survey if required). The main
body of the report should be no more than 30 pages excluding additional
annexes. A final report structure will be provided at the Inception
meeting.  The detailed methodology for the midline evaluation will be
elaborated in the inception report and refined based on learning from the
baseline.
6.
Final project evaluation report
1. Inception Phase
  • Inception Meeting held – once consultant appointed (indicative date: 1st August 2013)
  • Literature / document review & data gathering completed – Bidder to complete
  • Review of project’s theory of change, impact logic and evaluability completed – Bidder to complete
  • Stakeholder consultation completed – Bidder to complete
  • Sampling framework for primary research for baseline completed – Bidder to complete
  • Design of data collection strategy, including cohort tracking design, completed – Bidder to complete
  • Design of primary research instruments for baseline completed – Bidder to complete
  • Draft Inception Report (including design of baseline study) submitted for review and comments by Project Manager and Project Partners – Bidder to complete
  • Presentation to Evaluation Steering Group – Bidder to complete
  • Review complete and comments returned to supplier – Bidder to complete
  • Final Inception Report submitted – 14th August 2013 (indicative date to be confirmed once consultant appointed)
2. Baseline Study Phase
  • Baseline research starts – Bidder to complete
  • Baseline research completed – Bidder to complete
  • Draft Baseline Study Report submitted for review – Bidder to complete
  • Presentation to Evaluation Steering Group – Bidder to complete
  • Review by Project Management and stakeholders completed / comments provided to supplier – Bidder to complete
  • Supplier addresses comments and revises Baseline Study Report – Bidder to complete
  • Final Baseline Study Report submitted – 11th September 2013
3. Midline Evaluation Phase
  • 3.1 Start of Design Review Phase – Bidder to complete
  • Preliminary review of project information and data completed – Bidder to complete
  • Review of evaluation design and research methods completed – Bidder to complete
  • Stakeholder consultation completed – Bidder to complete
  • Revisions to evaluation design and research methods completed – Bidder to complete
  • Review of sampling framework for primary research completed – Bidder to complete
  • Review of primary research instruments completed – Bidder to complete
  • Draft Research Design Report submitted for review – Bidder to complete
  • Draft Research Design Report reviewed by Project Manager, Evaluation Steering Group, etc. and comments returned to supplier – Bidder to complete
  • Final Research Design Report submitted – Bidder to complete
  • 3.2 Start of Research Phase – Bidder to complete
  • Analysis of financial and monitoring data completed – Bidder to complete
  • Analysis of cohort tracking data completed – Bidder to complete
  • Primary quantitative research starts (e.g. household surveys, school surveys) – Bidder to complete
  • Primary quantitative research ends – Bidder to complete
  • Primary qualitative research starts (e.g. EGRA, EGMA, focus groups, workshops, semi-structured interviews of stakeholders / partners) – Bidder to complete
  • Primary qualitative research ends
  • Data verification, cleaning and validation completed – Bidder to complete
  • 3.3 Start of Analysis Phase – Bidder to complete
  • Analysis of data and results completed – Bidder to complete
  • Draft Interim Report submitted – Bidder to complete
  • Presentation to Evaluation Steering Group – Bidder to complete
  • Draft Interim (Emerging Findings) Report reviewed by Project Manager, Evaluation Steering Group, etc. and comments returned to supplier – Bidder to complete
  • Final Interim (Emerging Findings) Report submitted – Bidder to complete
  • 3.4 Start of Reporting Phase – Bidder to complete
  • Draft Midline Project Evaluation Report submitted – Bidder to complete
  • Presentation to Evaluation Steering Group – Bidder to complete
  • Draft Midline Project Evaluation Report reviewed by Project Manager, Evaluation Steering Group, etc. and comments returned to supplier – Bidder to complete
  • Midline Project Evaluation Report submitted – Bidder to complete
  • Midline Project Evaluation Report agreed – 31 January 2015
4. Final Project Evaluation Phase
  • 4.1 Start of Design Review Phase – Bidder to complete
  • Preliminary review of project information and data completed – Bidder to complete
  • Review of evaluation design and research methods completed – Bidder to complete
  • Stakeholder consultation completed – Bidder to complete
  • Revisions to evaluation design and research methods completed – Bidder to complete
  • Review of sampling framework for primary research completed – Bidder to complete
  • Review of primary research instruments completed – Bidder to complete
  • Draft Research Design Report submitted for review – Bidder to complete
  • Draft Research Design Report reviewed by Project Manager, Evaluation Steering Group, etc. and comments returned to supplier – Bidder to complete
  • Final Research Design Report submitted – Bidder to complete
  • 4.2 Start of Research Phase – Bidder to complete
  • Analysis of financial and monitoring data completed – Bidder to complete
  • Analysis of cohort tracking data completed – Bidder to complete
  • Primary quantitative research starts (e.g. household surveys, school surveys) – Bidder to complete
  • Primary quantitative research ends – Bidder to complete
  • Primary qualitative research starts (e.g. EGRA, EGMA, focus groups, workshops, semi-structured interviews of stakeholders / partners) – Bidder to complete
  • Primary qualitative research ends
  • Data verification, cleaning and validation completed – Bidder to complete
  • 4.3 Start of Analysis Phase – Bidder to complete
  • Analysis of data and results completed – Bidder to complete
  • Draft Interim Report submitted – Bidder to complete
  • Presentation to Evaluation Steering Group – Bidder to complete
  • Draft Interim (Emerging Findings) Report reviewed by Project Manager, Evaluation Steering Group, etc. and comments returned to supplier – Bidder to complete
  • Final Interim (Emerging Findings) Report submitted – Bidder to complete
  • 4.4 Start of Reporting Phase – Bidder to complete
  • Draft Final Project Evaluation Report submitted – Bidder to complete
  • Presentation to Evaluation Steering Group – Bidder to complete
  • Draft Final Project Evaluation Report reviewed by Project Manager, Evaluation Steering Group, etc. and comments returned to supplier – Bidder to complete
  • Final Project Evaluation Report submitted – Bidder to complete
  • Final Project Evaluation Report agreed – 31 January 2016
Design,
conduct and submit a final project evaluation report that assesses the
effectiveness, impact and VfM of the project. This will include a follow-up
survey to track progress on indicators against the baseline and mid-term
(additional funding will be available for this survey). The main body of the
final report should be no more than 30 pages excluding additional annexes. A
final report structure will be provided at the Inception meeting. The detailed
methodology for the final project evaluation will be elaborated in the
inception report and refined based on learning from the baseline.
All
reports, as well as the raw data, should be submitted in English. The Evaluation
Team will be required to give face-to-face presentations in-country of all
deliverables as an integral part of the submission process.  The
Evaluation Team will be expected to provide a fully cleaned dataset in
SPSS file format with full cross-tabulations of the results: 10–12 sets
of cross-tabulations are expected, breaking down the results for all
questions and including appropriate statistical tests so that significant
differences can easily be identified.  In addition, two personal case
studies about girls who have benefited from the project, including photos, will
be provided with the baseline, mid-term and final evaluation reports.
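The statistical tests expected alongside the cross-tabulations would in practice come from SPSS, but the underlying calculation can be sketched with the standard library: a Pearson chi-square statistic on a 2x2 contingency table. The counts below are hypothetical, for illustration only.

```python
# Pearson chi-square statistic for a contingency table, as would accompany
# a cross-tabulation to flag significant differences between groups.
def chi_square(table):
    """Sum of (observed - expected)^2 / expected over all cells."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: girls still enrolled vs dropped out, by group.
table = [[90, 10],   # treatment
         [70, 30]]   # control
print(round(chi_square(table), 2))  # 12.5
```

Comparing the statistic against the chi-square distribution (1 degree of freedom for a 2x2 table; the 5% critical value is 3.84) shows whether the difference between groups is statistically significant.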
38.
Detailed work plan
Bidders
are required to provide a detailed work plan incorporating all relevant tasks
and milestones from start to finish of the evaluation study.
39.
Project milestones
Bidders
are required to include in their detailed work plans the milestones set out
below.
Reporting
and Contracting Arrangements
40.
Contact point
The
Evaluation Team will be expected to identify a Project Director and Project
Manager for communication and reporting purposes. At the Inception meeting the
Evaluation Team Project Manager will be expected to submit a full contact list
of all those involved in the evaluation.
41.
Participation in Evaluation Steering Group
The
Evaluation Team will be expected to report to the Evaluation Steering
Group and attend all meetings as agreed with the Project Evaluation Manager.
The Team will be required to submit to the Project Evaluation Manager bi-weekly
progress reports (by email) during the study periods, summarising activities
/ tasks completed to date (per cent achieved), time spent, etc.
Budget
42. Estimated budget
The
estimated budget for this work is GBP 22,110. This budget is inclusive of all
costs, covering team member costs, travel, research costs and any other costs
associated with the completion of the work (e.g. printing, photocopying, materials,
stationery and travel / per diem costs for project beneficiaries participating
in the work). Bidders are required to organise and fund their own duty of care
arrangements as required.
43. Detailed bid budget
Bidders
are required to provide a fully costed proposal in the form of a price schedule
that as a minimum should include:
  • Sub-total of fees for the
    delivery of any task or deliverable;
  • Sub-total for number of days
    per partner organisation (as applicable);
  • Expenses and overheads
    broken down by the project cost categories [Recipient to provide]; and
  • Total costs before and after
    any taxes that are applicable.
Bidders
are required to provide a payment schedule on the basis of milestone payments
for the successful delivery of each deliverable.
Applications
should be addressed via e-mail to
hparwanda@gmail.com not later than 22nd
of July 2013 at 5:00 pm. Only shortlisted candidates will be contacted.