While education settings have flexibility in shaping their AI approach, they still need to abide by their legal and statutory obligations, including Keeping Children Safe in Education (KCSIE) and data protection laws.
An effective AI policy should do more than list what’s allowed and what’s not. It should also consider:
Ethical use | Does the AI policy support learning without encouraging shortcuts, bias, or misuse?
Staff and student training | Do our staff and students understand when, why and how to use AI tools safely and effectively?
Safeguarding implications | What risks could AI pose to student safety or wellbeing?
Policy alignment | Does the AI policy align with related policies, such as the staff code of conduct and student acceptable use policies?
Not all AI tools are created equal. They vary in purpose, functionality and risk - and the landscape is constantly shifting.
That’s why policies should clearly set out which AI tools are permitted, who may use them, and for what purposes.
For example: are tools like ChatGPT only allowed for teacher use?
The more specific the policy, the easier it will be to enforce and review.
Where AI use is permitted, it’s vital to set clear boundaries on what can and cannot be shared with these tools.
Intellectual property | Lesson plans, homework, and essays remain the property of the person who created them.
Personal data | Staff and student data must not be shared with AI tools unless clear consent has been obtained.
Terms and age restrictions | Every AI tool comes with its own terms of use. These should be reviewed and incorporated into your policy - especially any age restrictions.
Establishing a clear AI policy may feel complex, but taking action now ensures your setting is prepared, protected, and positioned to respond confidently as AI continues to evolve.
Start with a basic foundation and view it as an evolving policy that can be updated as new information or technology is introduced.
The following steps may help:
STEP 1: Discuss policy development as an SLT and get DSLs involved early
STEP 2: Review your setting’s current AI usage (both official and unofficial)
STEP 3: Collaborate with IT staff to understand any cybersecurity risks and training needs
STEP 4: Stay informed of updates from the DfE and other trusted sources
Once your policy is in place, it’s essential to make sure staff and students understand it - along with the reasons behind it and the consequences of not following it.
Responsibility for formulating an AI policy or strategy shouldn’t rest with one individual. Given the fast-moving nature of the technology, it’s important to involve all relevant stakeholders in the creation and ongoing development of your AI policy. This ensures shared understanding, collective ownership, and consistent application across your setting.
Appointing an AI champion
Many early adopters of AI in schools, colleges and MATs have found it valuable to appoint an “AI champion” - a member of staff who can support colleagues in building their understanding of AI and help drive confident, informed adoption. This doesn’t need to be an IT specialist; it could be any staff member with a strong interest in or knowledge of AI.