Best Practice Update

Article image: facial recognition on a student in a classroom, an ICO reprimand in computer text, and the Data Protection Education logo.

ICO Reprimands a School

The ICO has issued a reprimand to a school that failed to carry out due diligence when introducing facial recognition technology (FRT).

Chelmer Valley High School, in Chelmsford, Essex, introduced facial recognition technology to take cashless canteen payments from students.

Facial recognition technology processes biometric data to uniquely identify people and is likely to result in high data protection risks.

To use it legally and responsibly, organisations must have a data protection impact assessment (DPIA) in place to identify and manage the higher risks that may arise from processing sensitive data.

Chelmer Valley High School, which has around 1,200 pupils aged 11-18, failed to carry out a DPIA before starting to use the FRT.

The school had relied on assumed consent for facial recognition, except where parents or carers had opted children out of the processing. In the ICO's view, the parental opt-out deprived students of the ability to exercise their rights and freedoms in relation to the processing.

The school did not seek advice from its DPO about the introduction of the facial recognition technology, nor did it consult parents, carers or students before the processing began.

Chelmer Valley High School therefore failed to complete a DPIA where it was legally required to do so. This failing meant that no prior assessment was made of the risks to data subjects, no consideration was given to lawfully managing consent, and students at the school were left unable to properly exercise their rights and freedoms.

The full ICO article can be read here: Chelmer Valley High School Reprimand.

Whenever you are thinking of using a new system that might process large amounts of personal data or sensitive data, we recommend carrying out some third-party due diligence. The best place to start is our Supplier Due Diligence Best Practice Area and our Due Diligence Step by Step article.

An organisation should have a biometric data policy in place when processing biometric data: see the DPE Model Biometric Data Policy (178 KB).

If you need further help and advice, book in with your DPE school consultant for an online consultation or email us.

Here's a quick reminder you can send to staff to let them know how important supplier due diligence is:


When considering new systems that process personal data, does everyone in your organisation understand what due diligence should take place?


If the answer is yes, you have ticked off an important item on the supplier due diligence checklist.

For further help and guidance, and access to the full checklist, please contact us.




If not, remember that your organisation should have completed some due diligence on each supplier or third party that you share personal data with, especially when large amounts of personal data or sensitive personal data are involved in the processing. Contact your data protection lead or DPO for further advice.


 


If you're not sure, try asking the data protection lead in your organisation or your SLT digital lead, or contact your DPO. If you have a login to our portal, review the Supplier Due Diligence Best Practice Area.

We can provide help and guidance with data protection compliance, cyber security standards and records management, including the full checklist and best practice; please contact us.


Article image generated using Microsoft Copilot AI and Canva.
