Catch Up was successfully implemented and will be scaled up to approximately 1,800 schools across Zambia. The Catch Up monitoring system relied on government officials at the school, zone, and district level. Additional mentors from VVOB – education for development were provided to strengthen the mentoring and monitoring system. This case study was drawn from observations and process monitoring conducted by Innovations for Poverty Action (IPA) Zambia and J-PAL Africa during the Catch Up pilot. The monitoring system and tools implemented during the Catch Up pilot in Zambia serve as examples and should not be replicated directly in other contexts. The current tools are being re-evaluated and improved for scale-up. When setting up monitoring systems, governments and implementers should carefully consider their specific contexts and constraints.
Zambia’s School Monitoring Structure
Senior Teachers (1 per school)
Senior Teachers provide a mentoring role in schools.
Zonal In-Service Coordinators (ZICs, 1 per ~10 schools)
ZICs provide a cluster of schools with monitoring and mentoring support.
District Resource Centre Coordinators (DRCCs, 1 per ~100 schools)
DRCCs provide mentoring and teacher training to schools in the district.
Provincial Resource Centre Coordinators (PRCCs, 1 per ~1,000 schools)
PRCCs coordinate districts and devise teacher support strategies.
1. Monitoring Team
The first step in setting up a monitoring system was to identify a team of people who could regularly observe, monitor, and offer support to Catch Up teachers. When building a monitoring team, Zambia’s Ministry of General Education tapped into existing monitoring systems. To ensure accountability, monitors from outside the school were trained in TaRL practices and assigned to schools. With large distances between schools and limited transportation options, it was difficult for mentors to move between schools to observe classes on a weekly basis. Therefore, to ensure that teachers received regular support and monitoring, a multi-layered monitoring system was created. This consisted of mentors drawn from inside and outside of the school:
Senior Teachers
Based in schools, responsible for giving ongoing support to teachers and for collecting data every second week.
Zonal In-Service Coordinators (ZICs)
Responsible for supporting a zone of 5 schools.
District Resource Centre Coordinators (DRCCs)
Responsible for supporting a district of 20 schools during the pilot.
VVOB Coordinators
In addition to the already existing government teams, a support system was provided by VVOB – education for development. VVOB coordinators were responsible for visiting all schools in the district, observing Catch Up classes on a monthly basis, supporting teachers, and collecting information.
ZICs, DRCCs, and VVOB coordinators usually did not inform schools before their visits, to ensure that they observed regular Catch Up classes rather than lessons prepared specifically for their visits. The Ministry of General Education also involved parents through a parent monitoring structure; parents visited schools on a weekly basis to observe classes.
During the Catch Up pilot, mentor/monitors played a dual role: as mentors, they were responsible for supporting and guiding teachers in implementing TaRL; as monitors, they collected data at each stage of TaRL implementation, aggregating and reviewing it to inform further TaRL design. Mentors were expected to actively engage in activities during classroom visits and to provide actionable feedback to teachers after the class.
2. Monitoring Tools
With a team identified, tools were created to collect two types of data during Catch Up implementation: assessment data and classroom observation data.
Catch Up mentoring teams increased the number of visits during the assessment period to ensure that assessment was taking place and to help teachers implement it correctly. Assessment data then flowed, albeit slowly, from the school level up through the zonal, district, and national levels.
Classroom observation forms included information on ability-based grouping; use of level-appropriate materials and activities; use of Catch Up activities and materials; and the level of child engagement during the class. Paper-based monitoring tools were used in the field because most schools lacked internet, computers, and smartphones. ZICs then used Excel spreadsheets to aggregate the data collected.
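The aggregation step that ZICs carried out in Excel can be sketched in code. This is a minimal illustration only: the record fields (`school`, `grouped_by_level`, `engagement`) and the sample data are hypothetical assumptions, not the actual Catch Up observation form.

```python
from collections import defaultdict

# Hypothetical digitised classroom observation records; the field names
# are illustrative assumptions, not the actual Catch Up form fields.
observations = [
    {"school": "School A", "grouped_by_level": True,  "engagement": "high"},
    {"school": "School A", "grouped_by_level": True,  "engagement": "medium"},
    {"school": "School B", "grouped_by_level": False, "engagement": "low"},
    {"school": "School B", "grouped_by_level": True,  "engagement": "high"},
]

def zone_summary(records):
    """Aggregate observations per school: number of visits and the
    percentage of observed classes where children were grouped by level."""
    totals = defaultdict(lambda: {"visits": 0, "grouped": 0})
    for rec in records:
        entry = totals[rec["school"]]
        entry["visits"] += 1
        entry["grouped"] += int(rec["grouped_by_level"])
    return {
        school: {
            "visits": t["visits"],
            "pct_grouped": round(100 * t["grouped"] / t["visits"]),
        }
        for school, t in totals.items()
    }

summary = zone_summary(observations)
```

A summary of this shape is roughly what a zone report might hold in spreadsheet form: one row per school, with visit counts and the share of observed classes showing each practice.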
3. Monitoring Process
In order to properly mentor and monitor teachers, each group of mentors and monitors was equipped with TaRL expertise and data collection tools. Mentors then regularly observed Catch Up classes and gave feedback through one-on-one sessions with teachers. They wrote regular reports and attended review meetings to share TaRL challenges and successes.
Training
Trainings focused on creating a solid understanding of TaRL methodology through interactive sessions and classroom practice. High-quality training in TaRL methodology, along with classroom practice, helps to ensure that mentors can identify when TaRL practices are properly applied, recognise problems, and accurately evaluate teacher performance. Mentors were introduced to the monitoring tools and trained in giving feedback, collecting and aggregating data, and writing reports. At trainings, mentors were given a clear idea of the information they were responsible for collecting, how frequently they should collect it, and the specific tools they should use. Setting clear expectations early on helped mentors to adequately monitor TaRL implementation.
Classroom Observation And Data Collection
Mentors completed classroom observation forms for each of the classes visited. Where possible, they addressed mistakes or challenges they saw in the field, actively coaching teachers and demonstrating Catch Up methods in the classroom when appropriate. ZICs, supported by VVOB mentors, compiled all observation forms at the monthly zonal review meeting to aggregate the data and create zonal reports.
Monthly Zonal Review Meetings
Regular mentor meetings ensured that there was adequate space and time for discussing programme challenges. Once a month, all mentors within a zone attended a zonal review meeting, during which senior teachers shared aggregated school data and mentors discussed their observations. Mentors brainstormed solutions and actively worked to improve Catch Up implementation in the zone. When problems could not be addressed at the zone level, they were taken to a district level and addressed by the District Education Office. ZICs and VVOB jointly aggregated the data and wrote zonal performance reports.
District-Level Data Collection
ZICs submitted electronic copies of their zone summary sheets to the District Education Office (DEO) every month. The DRCC and corresponding VVOB coordinator jointly wrote a monthly report about district-level Catch Up performance. The DEO was responsible for reviewing the information and deciding if any action should be taken at the district level.
Challenges And Lessons Learned
Design the pilot with scale in mind.
It is often possible for officials to focus more attention on a small number of schools during a pilot programme, although it may not be possible for them to continue to provide this much attention per school as a programme grows to scale. Teams should consider this when designing both the pilot and the scale-up. Piloting in a relatively small number of schools may allow local actors to work out the kinks and make better plans for the future, but teams should be realistic about how well pilot systems will work when programmes go to scale.
Choose appropriate data collection tools for the context.
Since many schools in the Catch Up pilot districts had unreliable cell phone signal and no access to Internet, paper tools were used. The data was digitised when aggregated at the zone or district level, but this process proved challenging.
Collect and share assessment data as quickly as possible in order to allow officials to provide support to schools.
We recommend having data collected and shared within the first 10 days of the programme.
Consider whether data collected is useful for decision-making to improve teaching or the overall TaRL model.
An early version of the classroom observation tool required yes/no answers and, when used, did not show much variation in answers, with teachers scoring well in almost all areas. This made it difficult for ZICs and VVOB coordinators to know which schools to target with additional support. In response, the tool was adjusted to include more multiple-choice questions, resulting in a wider range of responses. When working with government monitoring systems, aim to include only the minimum necessary questions; early versions of the Catch Up tool included several questions which proved burdensome for the government system.
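The effect described above can be sketched with toy numbers. Everything here is hypothetical: the schools, scores, 0–3 scale, and thresholds are assumptions for illustration, not the actual Catch Up tool or its data.

```python
# Hypothetical scores for three schools on one observation item.
# Yes/no items were scored 0/1; almost every class "passed", so schools
# looked identical and none stood out as needing support.
yes_no_scores = {"School A": [1, 1, 1], "School B": [1, 1, 1], "School C": [1, 1, 0]}

# A multiple-choice item scored on an assumed 0-3 scale
# (e.g. "not at all" .. "consistently") spreads schools out.
multiple_choice_scores = {"School A": [3, 3, 2], "School B": [1, 2, 1], "School C": [0, 1, 1]}

def flag_for_support(scores, threshold):
    """Return schools whose average score falls below the threshold,
    i.e. candidates for additional mentoring visits."""
    return sorted(
        school for school, values in scores.items()
        if sum(values) / len(values) < threshold
    )

# With yes/no scoring, no school falls below a 0.5 cut-off, while the
# 0-3 scale separates out the schools that need extra support.
low_yes_no = flag_for_support(yes_no_scores, threshold=0.5)
low_multi = flag_for_support(multiple_choice_scores, threshold=1.5)
```

On these toy numbers the yes/no item flags no school at all, while the graded item surfaces two schools for additional visits, which is the kind of variation the revised tool was intended to capture.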
Set mentoring and monitoring expectations early, particularly if mentor/monitors play a dual role.
Sometimes, mentor/monitors succeeded in their monitoring role (i.e. observing the teacher, filling in forms, etc.) but failed at their mentoring role (i.e. helping groups of children, demonstrating activities, and actively supporting the teacher in the classroom). This was a missed opportunity to use the mentors' TaRL expertise to advise and support teachers in the classroom, since supportive mentorship, which teachers find helpful and empowering rather than punitive, is a key principle of the TaRL approach. This was addressed in part by advising mentors to complete the forms after observing the class, leaving them free to engage with the class during the visit.
Carefully consider possible scheduling conflicts and have a backup plan in place.
In one district, Catch Up took place during the school holidays, while some senior teachers were attending professional development courses. To counteract this, ZICs and DRCCs were asked to make additional visits, which they did successfully. In some districts, sports and co-curricular activities prevented children from taking part in Catch Up activities on some days, therefore reducing the number of classroom observations that mentors could make.
Guide mentors in using data to improve implementation.
In some cases, it proved difficult to use data to inform specific actions, indicating that mentors need more guidance on using data effectively. In creating monitoring and aggregation tools, implementers should consider how the information will be used at every level and create tools and report templates which point towards action steps.
Support mentors to improve the monitoring process.
Process monitoring of the Catch Up pilot found a lack of monitoring and reporting on the performance of mentors themselves. Having a monitoring and feedback system in place for mentors could help them to more effectively mentor and monitor teachers. For example, more senior mentors could observe classes with senior teachers and provide feedback after the visit.
Create user-friendly data aggregation tools and processes.
Paper monitoring tools were used correctly but were difficult and time-consuming to convert into an aggregated electronic form for proper analysis and review at more central levels of government. This was mitigated in part with the help of VVOB coordinators, who were responsible for supporting the aggregation process. In the scale-up, this challenge is being addressed by creating a simpler, paper-based aggregation process, whose results can then be entered into a simple spreadsheet. A broader lesson learned is to simplify the data collection and aggregation process as much as possible to ensure that accurate, useful data is collected efficiently.
Set clear expectations for creating reports.
Although district- and zone-level officials were responsible for completing reports, in some cases the VVOB coordinator completed the report instead. To ensure that DRCCs and ZICs complete reports, this expectation could be made clearer at trainings and re-emphasised during implementation, so that mentors see reporting as a core responsibility.