Social Media Safety for Teens: A Practical Dad's Guide
Navigate TikTok, Instagram, Snapchat and Discord with your teenager. Real guidance on privacy settings, age verification workarounds, and how to stay connected without constant surveillance.
Your teenager lives in a world you didn't grow up in. Their friendships exist in group chats. Their interests are shaped by algorithms. Their social standing is measured in followers, streaks, and likes.
You can't opt out of this world on their behalf—and you probably shouldn't try. Social media is how young people connect, learn, and build identity now. The question isn't whether they'll use it, but how they'll use it safely.
This guide covers the major platforms your teenager is probably on (or wants to be), what the risks actually are, and practical settings that help without turning you into Big Brother.
What the Research Says
Before diving into specific platforms, let's establish what we actually know about social media and teenagers.
The Numbers
Ofcom's 2024-2025 research found:
- Over half of all under-13s use social media despite age restrictions
- 38% of 5-7 year-olds use social media platforms
- Half of 8-17 year-olds now use AI tools like ChatGPT
- 19% of teenagers follow fitness programmes online (up from 14%)
The Risks (In Proportion)
Social media isn't automatically harmful. The risks are real but usually manageable:
Common concerns:
- Seeing inappropriate content (violence, sexual content, misinformation)
- Contact from strangers
- Cyberbullying from peers
- Privacy breaches from oversharing
- Time displacement (social media instead of sleep, exercise, in-person socialising)
- Comparison and self-esteem impacts (particularly around body image)
Less common but serious:
- Grooming by adults
- Involvement in harmful communities (self-harm, eating disorders, extremism)
- Sextortion (being coerced to share images, then blackmailed)
The first category is almost inevitable; your job is to help them handle it. The second category requires vigilance and open communication.
Platform-by-Platform Guide
TikTok
Minimum age: 13 (under-13s can use a limited-features version)
What it is: Short-form video platform driven by algorithm recommendations
Why they love it: Endless entertainment, creative content, trends, humour
What to know: The algorithm is powerful and can create "rabbit holes" into specific content types. TikTok has faced criticism for promoting content related to self-harm, eating disorders, and dangerous challenges.
Safety settings:
- Family Pairing: Link your TikTok account to theirs
- Go to Settings > Family Pairing
- Scan the QR code from your device
- Configure restrictions:
- Restricted Mode: Filters mature content
- Screen Time Management: Set daily limits
- Direct Messages: "No one" or "Friends only"
- Comments: Friends only or off
- Private account: Make their account private
Conversations to have:
- Not everything on TikTok is true or good advice
- Trends and challenges aren't always safe
- The algorithm learns what you engage with—including negative content
Instagram
Minimum age: 13
What it is: Photo and video sharing, Stories, Reels, DMs
Why they love it: Visual self-expression, following friends and celebrities, group messaging
What to know: Strong appearance/comparison culture. DMs can be used for inappropriate contact. Instagram now defaults to private for under-16s.
Safety settings:
- Instagram Supervision: Link accounts via Family Centre
- Go to Settings > Supervision
- Send an invitation to their account
- Configure:
- Private account (on by default for under-16s)
- Activity status: Off (hides when they're online)
- Message controls: "People you follow" only
- Story sharing: Friends only
- Comments: From followers or people you follow only
Conversations to have:
- Posted content is essentially permanent
- Followers aren't the same as friends
- It's okay to unfollow accounts that make you feel bad
Snapchat
Minimum age: 13
What it is: Disappearing photo/video messages, Stories, group chats, location sharing
Why they love it: Feels more casual and private than Instagram, streaks with friends, group chats
What to know: "Disappearing" messages can be screenshotted. The Snap Map shows location by default. Stories can be more public than teens realise.
Safety settings:
- Family Center: Link accounts
- Go to Settings > Family Center
- Connect with your child's account
- Configure:
- Who can contact me: "My Friends" only
- Who can view my Story: "My Friends" only
- See Me on Snap Map: "Ghost Mode" (hidden) or "My Friends"
- My AI: Can be disabled in settings
Conversations to have:
- Screenshots make "disappearing" messages not disappear
- Location sharing is powerful—Ghost Mode is often wise
- Streaks aren't worth losing sleep over (literally)
YouTube
Minimum age: 13 (YouTube Kids for younger children)
What it is: Video hosting and recommendation platform
Why they love it: Everything—tutorials, gaming, music, vloggers, learning
What to know: Rabbit holes can lead anywhere. Comments sections can be toxic. YouTube is often how young people learn about the world.
Safety settings:
- Family Link (under 13): Full parental controls
- Restricted Mode (13+):
- Go to Settings > General > Restricted Mode
- Note: Not foolproof, and can be turned off
- Supervised accounts (teens): More limited parental oversight
Conversations to have:
- Not all YouTubers are reliable sources
- Comments sections are often toxic—it's okay to not read them
- The algorithm will keep showing you what you engage with
Discord
Minimum age: 13
What it is: Text, voice, and video chat in server communities
Why they love it: Gaming communities, interest-based servers, voice chat with friends
What to know: Public servers can expose them to adult content and strangers. Much harder to monitor than other platforms.
Safety settings:
- Privacy settings:
- Settings > Privacy & Safety
- Safe Direct Messaging: "Keep me safe" (scans messages for explicit content)
- Who can send friend requests: "Friends of Friends" or "Server Members"
- Server settings:
- Review which servers they've joined
- Discuss what servers are appropriate
Conversations to have:
- Public servers aren't the same as private chats with friends
- Voice chat with strangers requires caution
- Some servers have adult content despite age restrictions
BeReal
Minimum age: 13
What it is: Once-daily authentic photo sharing (front and back camera simultaneously)
Why they love it: Less curated, more genuine than Instagram
What to know: Lower risk than most platforms, but it still has location sharing and discovery features.
Safety settings:
- Private account
- Location sharing off
- Discovery: Off
WhatsApp
Minimum age: 16 (often ignored)
What it is: Messaging with end-to-end encryption
Why they love it: Everyone uses it, group chats for school and friends
What to know: Group chats can have many members, including people your teen doesn't know well. End-to-end encryption means you can't monitor messages.
Safety settings:
- Who can see my profile info: "My Contacts"
- Who can add me to groups: "My Contacts"
- Live location: Check they understand when this is shared
The Age Verification Problem
Let's address the elephant in the room: age verification on social media is essentially an honour system.
Your 11-year-old can sign up for Instagram by entering a false birth year. Platforms know this. Parents know this. The government knows this. And yet here we are.
What you can do:
- If your child is under 13, talk honestly about why age limits exist
- If they're already on platforms underage, work with the reality rather than pretending otherwise
- Use family pairing and supervision features—even if they signed up with a false age, these can often still be enabled
- Focus on safety practices rather than policing age compliance
The Online Safety Act is bringing changes, but effective age verification remains elusive.
Monitoring vs Trust: Finding the Balance
There's no consensus among parents (or experts) about how much to monitor teenage social media use.
The Arguments
For monitoring:
- They're still developing judgment
- Risks are real and can escalate
- Knowing you might check encourages caution
- It's no different from knowing who their friends are offline
For privacy:
- Excessive monitoring damages trust
- They need to develop self-regulation
- They'll find workarounds if determined
- Teenagers need space for identity development
Practical Middle Ground
Rather than all-or-nothing, consider:
- Be transparent. If you're checking their social media, tell them. Secret surveillance damages relationships.
- Start tight, loosen gradually. More oversight at 13, more privacy at 16; demonstrated trust earns freedom.
- Focus on patterns, not content. Are they stressed after being online? Withdrawing from offline life? Up late scrolling? These patterns matter more than individual posts.
- Keep conversations open. They should feel able to tell you about problems without fear of losing their devices.
- Model good behaviour. Your own phone use is visible to them.
Conversations That Actually Work
The best protection is a teenager who talks to you. That requires ongoing conversations, not one big "social media safety talk."
Making It Normal
- Ask about what they're seeing, without judgment
- Show interest in creators they follow
- Share things you've seen (appropriate content) and discuss them
- Ask for their help understanding platforms you don't use
Warning Signs to Address
- They seem distressed after using phones
- Secretive about who they're talking to
- Receiving messages late at night
- Sudden changes in friendship groups
- Declining offline activities
If They Disclose a Problem
- Thank them for telling you
- Stay calm, even if you're worried or angry
- Don't immediately take away devices (this discourages future disclosure)
- Work together on next steps
- Report content or accounts if appropriate
Reporting and Getting Help
On Platforms
All major platforms have reporting features:
- Long-press or tap on content > Report
- Block users who are inappropriate
- Download data before reporting if needed for evidence
External Help
- CEOP (Child Exploitation and Online Protection): For suspected grooming or abuse. Report at ceop.police.uk
- Internet Watch Foundation: For child sexual abuse material online
- Childline: 0800 1111 (for young people)
- Report Remove: Tool for young people to report sexual images of themselves online
The Online Safety Act
The UK's Online Safety Act (passed 2023, being implemented) will require platforms to:
- Prevent children from accessing harmful content
- Enforce age limits more rigorously
- Give parents more tools for oversight
- Be more accountable for what's on their platforms
Implementation is ongoing, so expect platform features to change over the coming years.
The Long View
Social media isn't going away, and your teenager will be navigating it as an adult without you looking over their shoulder.
Your job isn't to protect them from all risk—it's to help them develop the judgment to handle risk themselves. That means:
- Teaching critical thinking about content
- Discussing healthy online relationships
- Modelling good digital behaviour
- Keeping communication open
- Stepping back gradually as they demonstrate responsibility
The goal is a young adult who knows how to use social media in ways that add to their life rather than detract from it. That's built over years of conversations, not through a single set of controls.
Concerned about how much time they're spending online? Read our guide to managing screen time with teenagers.