Does missing one week of school lead to lower grades?
(5 October 2016)
According to the UK’s Department for Education (DfE), “missing the equivalent of just one week a year from school can mean a child is significantly less likely to achieve good GCSE grades”. The DfE has made this claim for over a year in an effort to dissuade parents from taking their children on holiday during term-time, when flights and hotels are cheaper. The argument seems persuasive; parents will not want to gamble with their children’s educational prospects just to save some money on a holiday. But is it valid?
A DfE analysis, published in February 2015, confirmed a clear association between pupil absence from school and subsequent lower attainment at Key Stage 2 (the end of primary education at age 11) and Key Stage 4 (the end of secondary education at age 16). Nicky Morgan, the Secretary of State for Education at the time, released a summary to the press, stating that: “The myth that pulling a child out of school for a holiday is harmless to their education has been busted by this research.”
However, the DfE had simply found a correlation between school absence and educational attainment, which does not necessarily imply a causal link between term-time holidays and attainment, as the Education Secretary had claimed. The report was “based on all absences for whatever reason”, of which the most serious would be related to long-term illness, family emergency or exclusion for bad behaviour – the sort of factors that might offer an alternative explanation for a child’s poor performance.
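To see how such a non-causal correlation can arise, consider a minimal simulation (all numbers are invented for illustration): a hidden factor such as long-term illness increases absence and lowers attainment, while absence itself has no direct effect on attainment at all – yet the two variables still correlate strongly.

```python
import random

random.seed(1)

# Hypothetical model: 'illness' drives BOTH absence and attainment;
# absence has no direct effect on attainment in this simulation.
n = 10_000
pairs = []
for _ in range(n):
    illness = random.random()                              # latent confounder
    absence = 5 * illness + random.gauss(0, 1)             # days absent
    attainment = 60 - 20 * illness + random.gauss(0, 5)    # exam score
    pairs.append((absence, attainment))

# Pearson correlation between absence and attainment
xs, ys = zip(*pairs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in pairs) / n
sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
r = cov / (sx * sy)
print(round(r, 2))  # strongly negative, despite no causal link in the model
```

The correlation here is entirely an artefact of the shared cause; an analysis that pools “all absences for whatever reason” cannot, on its own, distinguish this situation from a genuine causal effect.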
Working with the Education Media Centre, I challenged these claims at the time of release, and the DfE has since updated its analysis. The new figures, published in March 2016, look at the different reasons for absence and other pertinent factors, and the outcomes are modelled using logistic regression. In many ways this is an improvement on the 2015 report, but it misuses significance tests with population data, still uses the word ‘effect’ – implying a ‘causal effect’ – to describe the results throughout, and fails to acknowledge that absent pupils may differ from their peers in many ways not covered by the available data. Because of this, the link between absence and attainment should not be described as causal. It may be causal, but many alternative explanations have to be ruled out first.
To evaluate the effect that absence itself has on attainment, it would be best to conduct a randomised controlled trial (RCT), keeping one group of children off school for a period of time and comparing their performance to another group that continued their schooling – but this is neither ethical nor feasible. The closest practical alternative to an RCT would be a regression discontinuity study, which has already been used to assess the impact of missing a whole year of school. In England, August-born children go to school a year earlier than those born in September, so a study by Hans Luyten compared attainment for these two groups of children.
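The logic of the regression discontinuity design can be sketched in a few lines. The idea, in a Luyten-style study, is that pupils born just either side of the 1 September cutoff are essentially identical in every respect except that one group has had a whole extra year of schooling, so the jump in scores at the cutoff estimates the effect of that year. The simulation below is purely illustrative, with an invented 8-point effect:

```python
import random

random.seed(2)

# Hypothetical regression-discontinuity sketch: pupils born just before
# 1 September start school a year earlier than those born just after.
# Scores follow a smooth age trend plus an invented 8-point jump for
# the extra school year.
def simulated_score(days_from_cutoff):
    extra_year = days_from_cutoff < 0          # born before 1 September
    trend = -0.01 * days_from_cutoff          # smooth age effect
    return 50 + trend + (8 if extra_year else 0) + random.gauss(0, 3)

window = 30  # days either side of the cutoff
before = [simulated_score(-d) for d in range(1, window + 1)]  # extra year
after = [simulated_score(d) for d in range(1, window + 1)]    # started later

jump = sum(before) / window - sum(after) / window
print(round(jump, 1))  # recovers roughly the simulated 8-point effect
```

Because children have no control over which side of the cutoff they are born, the comparison is free of the self-selection that contaminates the DfE’s absence data.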
Clearly, attending school for an extra year does make some difference to attainment. However, the impact on attainment of not attending school for a year, as quoted in the Luyten study, is proportionately much less than that claimed by the DfE in its interpretation of the ‘effect’ of absences (that last only weeks, or sometimes days). This strongly suggests that any impact on attainment that the DfE is picking up is only minimally related to the absence itself, and has more to do with the kind of systematic differences that already exist between those pupils who are likely to miss school and those who are not.
None of these criticisms of the DfE study are intended to condone absence from school. But the DfE, the current Secretary of State, and the headteachers who have commented on this new report in the press are wrong to claim knowledge that absence is a root cause of low attainment.
Originally published by Significance, the official magazine and website of both the Royal Statistical Society (RSS) and the American Statistical Association (ASA).