In this lesson, pupils explored why apps and social media platforms have age ratings and how these rules are designed to protect young people.
They looked at popular apps such as YouTube, TikTok, Snapchat, and Roblox, and learned that while these can be fun and creative, they can also expose users to adult content, peer pressure, and harmful material when used below the recommended age.
Pupils discussed how algorithms can quickly move from innocent videos or games to inappropriate or extreme content, and how this can affect mental health, body image, and self-esteem. They also learned how to check app ratings and how to report content that makes them uncomfortable.
Studies from the Royal Society for Public Health link high social media use to increased anxiety, poor sleep, and negative body image in young people.
The NSPCC reports that more than half of children aged 11 to 13 have seen harmful or upsetting content online.
Research by Ofcom (2024) found that many children use apps rated for older users, which increases their risk of encountering adult content, grooming attempts, or cyberbullying.
Practical tips:
Check age ratings together. Look at the App Store or PEGI rating before downloading any app or game.
Use parental settings. Set restrictions on app downloads and enable “Restricted Mode” or “Family Pairing” on platforms like YouTube and TikTok.
Keep conversations open. Ask your child which apps they use and what they enjoy about them. Encourage honesty if they see something worrying.
Model healthy use. Show your child how you use social media mindfully and take screen breaks.
Remind them of privacy. Encourage strong privacy settings, avoiding location sharing, and never posting personal details.
Report and block. If they see harmful or inappropriate content, use the in-app report tools and talk to a trusted adult.
Internet Matters – Age-Appropriate Apps Guide: www.internetmatters.org/resources/what-age-can-my-child-start-social-networking/
Childnet – Social Media Top Tips: www.childnet.com/resources/parents-and-carers/
NSPCC – Online Safety for Apps and Games: www.nspcc.org.uk/keeping-children-safe/online-safety/
CEOP – Report Online Exploitation: www.ceop.police.uk
App guide (minimum age and key risks):
YouTube
13+ (YouTube main app) / No minimum for YouTube Kids (parent-controlled)
Unrestricted videos; comment sections and the recommendation algorithm can surface inappropriate content.
TikTok
13+
Content can include explicit language, mature themes, and addictive scrolling.
Snapchat
13+
Disappearing messages, Snap Maps (location sharing), and pressure from streaks.
Instagram
13+
Image comparison, body-image pressure, and exposure to adult content.
WhatsApp
16+ (UK and EU law)
Encrypted private/group chats; sharing of images and location.
Discord
13+
Voice and text chat; servers may contain adult or harmful content.
Roblox
PEGI 7 (contains mild violence) / 13+ (chat features)
User-generated games; risk of inappropriate chats or content.
Minecraft
PEGI 7
Generally safe; online servers may include chat and user-created mods.
BeReal
13+
Location sharing by default; pressure to post daily photos.
X (formerly Twitter)
13+
Unfiltered content feed; potential exposure to violent or adult material.
Facebook
13+
Data sharing, targeted ads, and access to groups with mixed audiences.
Twitch
13+
Live streaming — unpredictable language and behaviour from streamers.
Reddit
13+
Community forums with little age filtering; explicit content possible.
Pinterest
13+
Safer visual platform but includes some adult themes in searches.
Kick (newer streaming app)
18+
Minimal moderation; gambling and explicit content common.
Telegram
17+
Encrypted messaging; some channels host illegal or adult material.
Threads
13+
Linked to Instagram; same data-collection and exposure risks.
Among Us
PEGI 7 / ESRB 10+
In-game chat can include strangers; suitable for younger players with chat restricted.
Fortnite
PEGI 12
Online chat and in-game purchases; can be addictive.
Call of Duty Mobile / Warzone
PEGI 18
Violence and online chat; not appropriate for younger users.