Lesson Overview
| Lesson Details | |
| --- | --- |
| Duration | 75 minutes |
| Key Stage | College (Post-16) |
| Subject Links | PSHE, Health & Social Care, Public Services, Criminology |
| Resources Needed | Pupil handouts, quiz, presentation slides |
Learning Objectives
- Critically analyse the Online Safety Act 2023 and its implications for platforms and users
- Understand the emerging threat landscape including AI-generated imagery and deepfakes
- Evaluate professional and ethical responsibilities around online safety in education and care settings
- Apply a safeguarding lens to online harm scenarios involving young people
- Understand the regulatory landscape and where accountability lies
Key Information
- The Internet Watch Foundation removed over 275,000 URLs hosting child sexual abuse material in 2024
- AI-generated CSAM now constitutes a growing proportion of IWF referrals
- Ofcom estimates 40% of children aged 8-17 have encountered harmful content online in the past year
- The Online Safety Act 2023 introduced duties affecting over 25,000 platforms operating in the UK
- Social media companies spent over £8m lobbying against elements of the Online Safety Act
Legal Framework
- Online Safety Act 2023 — categorised platforms, duty of care, safety by design, child user duties
- Online Safety Act 2023, Part 10 (communications offences) — false and threatening communications, cyberflashing (new s.66A of the Sexual Offences Act 2003)
- Protection of Children Act 1978 — indecent images of children, including AI-generated images that appear photographic (pseudo-photographs)
- UK GDPR / Data Protection Act 2018 — children's data rights, minimum age for consent to online services (13 in the UK), right to erasure
- Children's Code (Age Appropriate Design Code) — privacy by default for under-18s
- Proposed mandatory reporting duty (Children's Wellbeing and Schools Bill) — expected to apply to online harm disclosures in educational settings once in force
Lesson Plan
10 mins The Online Safety Act: What Changed?
Categorisation, duties, Ofcom enforcement. What platforms must do for child users. Where the Act is strong and where critics say it falls short.
12 mins AI and the New Harm Landscape
AI-generated CSAM, deepfakes, synthetic grooming. What does this mean for child protection? How should professional practice adapt?
12 mins Digital Rights and Children
UK GDPR, Children's Code, right to erasure, data exploitation by platforms. What rights do young people have and how can professionals support them?
10 mins Professional Safeguarding in Online Contexts
When a child discloses online abuse: what not to do (look at content, ask for details, tell them to delete). What to do. Referral pathway through CEOP.
6 mins Accountability and Activism
Who is responsible for online safety? Platforms, parents, schools, regulators, young people themselves. What needs to change and who should drive it?
10 mins Case Studies
Apply the OSA 2023 and safeguarding frameworks to three complex scenarios.
⚠️ Safeguarding Considerations
- College professionals delivering this should be trained in the CEOP referral pathway
- Cases of AI-generated imagery depicting real pupils have been documented in UK schools and colleges — treat this as a live local risk, not an abstract one
- Participants may disclose current online harm during or after this session
- Once in force, the mandatory reporting duty will apply to online harm disclosures in educational settings
If a pupil makes a disclosure during this session, follow your institution's safeguarding procedures and refer to your Designated Safeguarding Lead (DSL) immediately.
Key Messages
- The Online Safety Act 2023 is the most significant shift in UK platform liability to date
- AI-generated CSAM is illegal under the same law as real imagery — and is a rapidly growing threat
- Children have significant digital rights under UK GDPR and the Children's Code — most don't know them
- Professional responsibility around online harm is codified in law — knowing the referral pathways is not optional
- Platform design is a public health issue — safety by design saves children's lives
Support Resources
| Organisation | Contact | Purpose |
| --- | --- | --- |
| Childline | 0800 1111 | 24/7 support for young people |
| Crimestoppers | 0800 555 111 | 100% anonymous reporting |
| CEOP | ceop.police.uk | Report online exploitation |
| NSPCC | 0808 800 5000 | Child protection advice |
| Emergency | 999 | Immediate danger |