Reading The Signs...
Understanding Expectations Around AI Use

As AI continues to shape the education landscape, many schools, colleges and MATs are still working out how to introduce it responsibly. While the use of AI is growing, the official expectations around its adoption remain broad and evolving.

There is currently no formal requirement for UK education settings to use AI tools. However, with Ofsted beginning to outline how AI may be considered during inspections, it’s important that schools are prepared to demonstrate both responsible implementation and clear thinking.

How Ofsted is approaching AI during inspections

At the time of writing, Ofsted does not assess AI use as a standalone element of an inspection. However, recent guidance sets out how inspectors may encounter and evaluate the use of AI during broader inspections. This guidance is informed in part by Ofsted’s research with early adopters of AI in education settings.

Here are three key points schools should understand:

1. AI will not be inspected in isolation

There is no dedicated section of the inspection focused on AI. Inspectors won’t actively seek out evidence of AI use, and AI will not be referenced in reports unless it plays a crucial role in wider inspection decisions.

Instead, AI use will be considered within existing frameworks—such as safeguarding, curriculum planning or data privacy. For example, if AI is used in lesson planning or student support, inspectors may explore how it aligns with the setting’s broader strategic goals or safeguarding practices.


2. AI use must serve the best interests of learners

Where AI is in use, inspectors will expect it to be used in a way that benefits students and aligns with the setting’s mission and values. They may consider:

  • How AI tools are used, and the impact on students and staff
  • Whether the school has made thoughtful, evidence-based decisions around AI use
  • How settings respond to the misuse of AI by students, staff or parents
  • How inappropriate or harmful content is managed, where relevant

3. Ofsted’s position is still developing

This is not a final stance. Ofsted acknowledges that it does not yet have enough evidence to define what “good” looks like in terms of AI use for inspection purposes. As more case studies and practical insights emerge, this position will continue to evolve.

For now, settings are encouraged to use AI responsibly, document their decision-making processes, and stay alert to updated guidance.


Example Questions Inspectors May Ask

While not exhaustive, the following list reflects Ofsted’s current thinking.
Schools, colleges and MATs should be prepared to respond to questions such as:

  • What AI tools or platforms are currently in use across your setting?
  • How does your use of AI align with your wider strategic or pedagogical goals?
  • What AI-specific policies do you have in place? Have you updated existing policies accordingly?
  • Have you carried out Data Protection Impact Assessments (DPIAs) for tools processing personal data?
  • How do you assess the impact of AI on both staff and learners?
  • What safeguards are in place to protect students from potentially harmful or inappropriate AI-generated content?
  • What informed your decision to implement AI in specific areas (e.g. marking, admin, lesson planning)?
  • If AI has been used in sensitive contexts - for example, summarising a child protection conference - how is the accuracy and confidentiality of that information ensured?