We aim to get children on track in their social and emotional health and in their academics. For academics, this means reading, writing and maths. Every year, we collect young people’s end-of-year academic grades from our local partner schools to:

● Identify young people for our programme;
● Plan support for the young people we are working with;
● Measure the impact of our two-year programme on the young people who have completed it.

The challenge

WLZ has continued to expand into more schools and, as the graph below shows, this has meant more and more young people’s data to analyse: 62 young people in 2017, growing to just under 1,000 in 2021. In addition, the variety of assessment types we receive from our partner schools makes the analysis complex. And because we were setting up more new cohorts in schools than ever before this September (39), we needed to complete the analysis by the end of August, something we had never achieved before. As we have continued to grow, we have needed to optimise our processes and carry out our analysis more efficiently.

This year we have been working with a team of consultants at Analysis Group to do this!

The solution

Over the summer term, the Analysis Group pro bono team helped us design a new tool in R (a programming language for statistical computing) to better automate our academic analysis. The tool works in the following way:

The process

1. Raw data

Read in student-level reading, writing and maths grades collected from schools.
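
To give a flavour of how this step might look in R, here is a minimal sketch. The file name, column names and grade format are hypothetical placeholders rather than the actual data we receive from schools.

library(readr)

# Hypothetical example: read one combined file of student-level grades,
# with one row per young person per subject.
raw_grades <- read_csv(
  "data/raw/endpoint_grades_2021.csv",
  col_types = cols(
    student_id = col_character(),
    school     = col_character(),
    subject    = col_character(),  # "reading", "writing" or "maths"
    grade      = col_character()   # grade label, which varies by assessment type
  )
)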

2. Intermediate data

Clean the raw data and calculate which young people (a simplified sketch of this step follows the list):

● Are presenting below age-related expectations in their academics;
● Have met our expectations;
● Are on track to reach age-related expectations by the end of their setting;
● Have moved out of risk and are now operating at or above the expected level.
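
Continuing the sketch above, this step could look something like the following. The grade-to-score lookup and the two flags shown are simplified illustrations: the real tool handles many different assessment types and also uses baseline and midpoint data to work out who is on track and who has moved out of risk.

library(dplyr)
library(tidyr)

# Hypothetical lookup mapping each school's grade labels onto a common
# numeric scale (2 = working at age-related expectations).
grade_scale <- tibble::tibble(
  grade = c("below", "expected", "above"),
  score = c(1, 2, 3)
)

intermediate <- raw_grades %>%
  left_join(grade_scale, by = "grade") %>%
  pivot_wider(id_cols = c(student_id, school),
              names_from = subject, values_from = score) %>%
  mutate(
    below_expectations = reading < 2 | writing < 2 | maths < 2,
    met_expectations   = reading >= 2 & writing >= 2 & maths >= 2
  )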

3. Final output

Aggregate the student-level data and produce summary statistics.
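
To finish the sketch, the student-level flags can be aggregated into school-level summary statistics; the measures shown here are illustrative rather than our actual reporting format.

library(dplyr)

# Aggregate the student-level flags from the previous step into a
# school-level summary (illustrative measures only).
summary_stats <- intermediate %>%
  group_by(school) %>%
  summarise(
    n_young_people = n(),
    pct_below      = 100 * mean(below_expectations, na.rm = TRUE),
    pct_met        = 100 * mean(met_expectations, na.rm = TRUE),
    .groups = "drop"
  )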

The results

In 2020, we analysed the impact data for 35 schools and processed 3,236 young people’s data to identify children for the next two-year programme, and we were able to report on our results internally by mid-September. This year, our team of three analysed midpoint and endpoint data from our 37 local partner schools, as well as 9,558 young people’s data to help identify young people for the next programme. Even with this significantly greater workload, we reported on our impact internally by 23 August. This wouldn’t have been possible without the Analysis Group pro bono project.

The results from the young people who finished our two-year programme in July 2021 are published here, and our delivery team are already using the young people’s midpoint data to plan support for their final year of the programme.

Thank you to Analysis Group for working with us to optimise our academic analysis, and also to The Lightbulb Trust for putting us in touch with them.