When AI Meets Schools: A Cautionary Tale on Attendance Targets
Recently, the government announced that every school in England will be given AI-generated pupil attendance targets. The stated aim is to raise attendance rates, which remain below pre-pandemic levels. Whilst everyone will surely commend the goal of getting children into school, the implementation raises serious concerns about how AI is being applied in education.
Schools already work tirelessly to improve attendance. Teachers, leaders, and support staff move heaven and earth every day to remove barriers and encourage pupils to attend. Yet many of the factors that influence attendance, from family circumstances and health to socio-economic challenges and geography, are entirely beyond a school's direct control.
Issuing individual AI-generated targets to schools risks misreading the realities on the ground. By plucking numbers from a model and imposing them without sufficient human insight or contextual understanding, the system runs the danger of adding pressure rather than support. I've read that the models will take 'comparable schools' into account when setting these targets, but even schools separated by a handful of streets can face vastly different social, economic, and community challenges. Attendance patterns are shaped by nuanced, local factors that no algorithm can fully capture.
At the same time, schools across the country are beginning to explore how AI can be used in ways that are ethical, fair, transparent, and socially responsible. From teaching and learning to operational administration, there are opportunities for AI to support decision-making, reduce workload, and identify patterns, but only when framed by human experience and professional judgment.
This is where I feel the new policy of AI-generated attendance targets falls short.
AI should complement the narrative, not control it.
If we get this balance right, AI can be a force for good: reducing unnecessary administrative burden, highlighting trends that might otherwise be missed, and supporting better-informed decisions. Used instead as a blunt instrument, imposing targets without real context, it risks undermining the very goals it is meant to achieve.