Doing our homework: how we’re improving our estimates of education output through the pandemic
Teachers, students and parents have had a difficult year: school closures, remote instruction and the need to ‘home-school’ have all made providing lessons to students challenging. Here Philip Wales talks about how our measurement has responded to keep pace with fast-changing schools policies during the pandemic, and how the improvements we are now introducing affect our figures.
As has been widely discussed, ONS’s estimates of the output of public services measure what is delivered by those services: we weight together the number of primary and secondary school pupils receiving education in the classroom to estimate education output in the National Accounts. Although this differs from the practice of some other countries, this approach is in line with best international standards.
However, as the pandemic struck our previous approach of counting students in the classroom couldn’t capture education output anymore: classrooms were largely empty, schools were closed, and yet teachers were still hard at work instructing students at home. Our measurement needed to adapt to include remote learners.
Initially, we used new sources such as the Department for Education’s high-frequency attendance data and information from Teacher Tapp, an innovative app that surveys teachers on what sort of work they are doing.
These sources allowed us to estimate how much remote learning was being provided. Our initial estimate was that remote learning was a partial substitute for in-school teaching: we estimated that each remote learning secondary school student was equivalent to around two-thirds of an ‘attending student’. Each remote learning primary school student was worth about 40% of an attending student – reflecting the larger role of parents in teaching younger children.
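The weighting described above can be sketched in a few lines of code. This is only an illustration of the arithmetic, not ONS’s production system: the remote-learning weights (two-thirds for secondary, 40% for primary) come from the article, while the pupil counts and function names are entirely hypothetical.

```python
# Remote-learning weights from the article; an attending pupil counts as 1.0.
REMOTE_WEIGHTS = {"primary": 0.40, "secondary": 2 / 3}

def weighted_output(attending: dict, remote: dict) -> float:
    """Return an attendance-equivalent pupil count: pupils in the classroom
    count fully, remote learners count at the phase-specific weight."""
    total = 0.0
    for phase, weight in REMOTE_WEIGHTS.items():
        total += attending.get(phase, 0) + weight * remote.get(phase, 0)
    return total

# Hypothetical lockdown week: most pupils learning at home.
attending = {"primary": 400_000, "secondary": 300_000}
remote = {"primary": 4_000_000, "secondary": 2_700_000}

print(weighted_output(attending, remote))  # → 4100000.0 attendance-equivalent pupils
```

In this made-up example, four million remote primary pupils contribute the same measured output as 1.6 million attending pupils, which is why widespread remote learning shows up as a fall in education output even though teaching continued.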
At the time, we made clear that these early estimates were more likely than usual to be revised. Both the return to schools in September and their renewed closures in January created new measurement difficulties. To capture these events, we improved the questions we asked teachers in the latter half of last year, asking directly how much material was being provided to remote learners.
We have now used these revised questions to improve our estimates of the value of remote learning for the first few months of the pandemic and to ensure a consistent approach throughout 2020. Our revised estimates reduce the value attributed to each remote-learning student in those early weeks of the pandemic; that value then rises gently through the rest of 2020 as teachers, parents and children acclimatised to the ‘new normal’.
These new education estimates mean GDP fell by 0.5 percentage points more than originally estimated in the second quarter of 2020, before recovering by 0.4 percentage points more in the third quarter.
These improved estimates paint a better, more consistent picture of the challenges faced by the education sector in the first months of the pandemic, and provide a firm basis for measurement in 2021. This will be particularly important as we start to think about how any ‘catch up’ provision might be captured in our estimates of education output in the coming months.
We will keep all of our methods and sources under review to ensure we can provide the best estimates possible of how the pandemic has affected our society and economy.