Best Practice Update

How Ofsted looks at AI during inspection and regulation

Ofsted has published guidance setting out how it looks at AI during inspection and regulation.
This article looks at the data protection aspects of that guidance:

Evaluating the specific risks of AI

  • Data protection: many applications of AI use large amounts of data, which can include personal data. When appropriate, inspectors may ask questions similar to those they ask about any other instances of the collection, storage and processing of personal data.
  • Bias and discrimination: another defining characteristic of AI is that it can make decisions or create outputs based on data without direct human intervention. This introduces a risk that the AI will perpetuate bias and discrimination that is present in the data that it processes or has been trained on. Despite the new context, the risk of bias and discrimination, and measures taken by a provider to mitigate this, are already part of what inspectors consider when collecting evidence.

Inspectors may ask what steps the provider has taken to make sure their use of AI properly considers these risks. Were those risks considered when introducing the AI application? Do they have any assurance processes to identify emerging risks? Any evaluation that Ofsted makes is about the provider’s decision-making, what they have considered, and the impacts on children and learners, not about the tool itself.

Our advice is:

🤖 Ensure you understand what AI is

🤖 Understand how you are using AI and explain to staff how they should use it

🤖 Train staff to write prompts that do not include personal data

🤖 Ensure you have assessed the risks of any apps that use AI

🤖 Ensure you have an AI policy

🤖 Ensure you include information about the use of AI in your privacy notices, i.e. how you process data using AI

🤖 Check existing applications to see if AI has been included as part of an update

DPE customers should:

🤖 Contact us if they need help with assessing AI apps

🤖 Review our AI Best Practice Area

The full guidance can be reviewed here: How Ofsted looks at AI during inspection and regulation
