This toolkit aims to provide a set of digital safety resources designed to support women, children, parents, and educators. It includes guidance on recognising online risks, building digital resilience, and responding effectively to harmful or suspicious online activity.
Develop toolkit content: safety guidelines, threat recognition, response steps
Create age-specific and audience-specific modules (children, women, parents, educators)
Platform‑Specific Abuse Types and Recommended Prevention and Safeguarding Measures
This section examines four major platforms (Snapchat, Instagram, Facebook, WhatsApp) across three high-risk harm categories affecting women and children:
Cyberbullying: insults, harassment, humiliation, threats.
Grooming / exploitation: sexual communication, manipulation, coercion of minors.
Image-based abuse: unsolicited sexual images, non-consensual sharing, sextortion.
Snapchat
Common Abuse Types
Grooming and child sexual exploitation – Nearly half of online grooming cases in the UK involve Snapchat.
Image-based abuse / cyberflashing – High prevalence due to ephemeral photo/video messaging.
Cyberbullying – About 31–33% of teens report bullying on Snapchat.
Sextortion – Threats to share intimate content for coercion or exploitation.
Safeguarding Measures
Parental guidance & monitoring: Encourage children to only accept known contacts; enable “My Friends” privacy setting for Snaps.
Education programs: Teach children to recognise grooming tactics and avoid sharing explicit images.
Reporting tools: Use Snapchat’s in-app reporting for harassment or suspicious contacts.
Ephemeral content caution: Explain that “disappearing” messages can still be screenshotted or recorded.
AI moderation: Platforms should continue deploying automated detection of sexual content and harassment targeting minors (a toy illustration follows this list).
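To make the idea of automated detection concrete, here is a minimal, hypothetical Python sketch of rule-based message screening. The patterns, function names, and threshold are invented for illustration; real platforms rely on trained classifiers plus human review, and this toy heuristic is nothing like Snapchat's actual system.

```python
import re

# Invented patterns loosely modelled on documented grooming tactics:
# secrecy requests, age probing, image solicitation, platform-switching.
RISK_PATTERNS = [
    r"don'?t tell (your )?(mum|mom|dad|parents|anyone)",
    r"how old are you",
    r"send (me )?(a )?(pic|photo|picture)",
    r"our (little )?secret",
    r"let'?s (talk|chat) on (whatsapp|telegram)",
]

def risk_score(message: str) -> int:
    """Count how many risk patterns a message matches."""
    text = message.lower()
    return sum(1 for pattern in RISK_PATTERNS if re.search(pattern, text))

def flag_for_review(message: str, threshold: int = 1) -> bool:
    """Queue a message for human moderator review if it matches enough patterns."""
    return risk_score(message) >= threshold

if __name__ == "__main__":
    for msg in ["hey how old are you? it's our secret",
                "see you at practice tomorrow"]:
        print(flag_for_review(msg), "-", msg)
```

Keyword rules like these generate false positives and miss coded language, which is why the measures above pair detection with education and reporting rather than treating it as sufficient on its own.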
Instagram
Common Abuse Types
Cyberbullying – Roughly 29.8% of online bullying incidents among teens occur on Instagram.
Harassment and misogyny – Women disproportionately targeted with abusive comments or DMs.
Grooming / sexual messages – 6% of online grooming cases in the UK involve Instagram.
Safeguarding Measures
Private accounts & comment filters: Encourage strict privacy settings and use the “Restrict” feature to limit abusive interactions.
Education on reporting: Show users how to report abuse or harassment; in-app reports are quick and anonymous.
Parental awareness: Discuss online conduct and warning signs of grooming or harassment.
Mental health support: Provide access to counsellors if cyberbullying occurs.
Facebook
Common Abuse Types
Cyberbullying – About 26.2% of reported teen online bullying incidents occur on Facebook.
Harassment / stalking – Women often targeted for gender-based abuse.
Grooming – Accounts for about 10% of known online grooming cases.
Safeguarding Measures
Privacy settings: Limit friend requests and visibility of personal posts.
Reporting & blocking tools: Use the reporting system for harassment, threatening messages, or impersonation.
Digital literacy: Educate users about phishing links, fake profiles, and online manipulation (a few telltale link red flags are sketched after this list).
Community guidelines enforcement: Platforms should prioritize rapid action against repeat offenders.
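As a concrete companion to the digital-literacy point above, here is a small, hypothetical Python sketch of link red flags that educators can demonstrate. The domain list and heuristics are illustrative only; real phishing detection relies on reputation services and far richer signals.

```python
import re
from urllib.parse import urlparse

# Hypothetical allow-list of domains the user actually uses.
KNOWN_DOMAINS = {"facebook.com", "instagram.com", "whatsapp.com"}

def looks_suspicious(url: str) -> list[str]:
    """Return simple red flags for a link; an empty list means none found."""
    flags = []
    host = urlparse(url).hostname or ""
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        flags.append("raw IP address instead of a domain name")
    if host.count(".") >= 3:
        flags.append("unusually deep subdomain chain")
    for known in KNOWN_DOMAINS:
        base = known.split(".")[0]
        if base in host and not host.endswith(known):
            flags.append(f"lookalike of {known}")
    return flags

if __name__ == "__main__":
    for link in ["https://facebook.com.login-verify.example.net/reset",
                 "https://www.facebook.com/settings"]:
        print(link, "->", looks_suspicious(link) or "no flags")
```

The lookalike case is the key teaching point: a familiar brand name appearing anywhere in a link is not proof the link actually leads to that site.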
WhatsApp
Common Abuse Types
Grooming / sexual messaging – Around 12% of online grooming cases involve WhatsApp.
Cyberbullying in groups – Less frequent than on public social media but occurs via group chats.
Harassment – Often in the form of repeated messages, threats, or coercion.
Safeguarding Measures
Privacy settings: Set profile visibility and “Who can add me to groups” to “My Contacts”.
Education: Teach children and women not to share personal info or images with unknown contacts.
Block/report: Immediate use of blocking features for harassing contacts.
End-to-end encryption caution: Even though messages are encrypted, screenshots or content sharing can still occur.
Snapchat
What stands out
Highest grooming and exploitation exposure (48%)
Highest image-based abuse exposure (58%)
Highest cyberbullying among youth (32%)
Why
Ephemeral messaging creates a false sense of safety
Private, one-to-one interactions dominate
Heavy use by minors
Easy escalation from chat to images
Insight
Snapchat is not just a bullying platform. It is a primary grooming and sexual exploitation channel for children, especially girls.
Instagram
What stands out
Cyberbullying almost as high as Snapchat (30%)
High image-based abuse (34%)
Public comments + private DMs amplify harm
Why
Visibility and appearance-based culture
Harassment often tied to gender, body image, sexuality
Abuse often happens publicly, increasing psychological harm
Insight
Instagram abuse is often socially performative, meaning attackers seek attention, dominance, or humiliation rather than secrecy.
Facebook
What stands out
Cyberbullying still significant (26%)
Grooming and stalking via Messenger and groups
Older teens and adult women more affected
Why
Real-identity profiles enable persistent targeting
Groups and comments fuel mob harassment
Stalking often extends across platforms
Insight
Facebook abuse is often persistent and identity-linked, making it harder for victims to escape.
WhatsApp
What stands out
Lowest cyberbullying rate (9%)
Grooming and sexual messaging still present (12%)
Abuse often hidden inside private groups
Why
Encrypted messaging reduces visibility
Abuse happens in trusted-looking group contexts
Often used after initial contact on another platform
Insight
WhatsApp is rarely where abuse starts, but often where it continues or escalates.
Cross-platform patterns
Girls are disproportionately affected by grooming, sexual messaging, and image-based abuse
Women experience higher rates of harassment, stalking, and threats
Many victims do not report due to shame, fear, or belief nothing will change
Private messaging platforms hide abuse from adults and authorities
Platform-specific recommendations
Snapchat
Default minors to “friends only” messaging
Stronger detection of sexualised language in chats
Friction before adding unknown contacts
Instagram
Automatic comment filtering for misogynistic abuse (a minimal sketch follows this list)
Default DM restrictions for teen accounts
Faster action against repeat harassers
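A minimal sketch of how deny-list comment filtering works in principle, in the spirit of (but not identical to) Instagram's Hidden Words setting. The word list and the obfuscation handling below are invented placeholders; production lists are curated, multilingual, and user-extensible.

```python
# Placeholder deny list for illustration only.
HIDDEN_WORDS = {"ugly", "loser", "slut"}

def normalise(token: str) -> str:
    """Undo common character swaps, lowercase, and strip non-letters."""
    swaps = str.maketrans({"0": "o", "1": "l", "3": "e", "@": "a", "$": "s"})
    return "".join(ch for ch in token.translate(swaps).lower() if ch.isalpha())

def should_hide(comment: str, hidden_words: set[str] = HIDDEN_WORDS) -> bool:
    """Hide the comment if any normalised token is on the deny list."""
    return any(normalise(token) in hidden_words for token in comment.split())

if __name__ == "__main__":
    for comment in ["great recipe!", "you are so ug1y", "what a l0ser"]:
        print(should_hide(comment), "-", comment)
```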
Facebook
Stronger anti-stalking enforcement
Cross-account harassment detection (one possible signal is sketched after this list)
Better protection for women facing coordinated abuse
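One signal platforms can use for cross-account harassment detection is near-duplicate messaging: the same abusive text sent to one target from several accounts, which often indicates ban evasion or a coordinated pile-on. A hypothetical Python sketch of that single signal; the account names and similarity threshold are invented.

```python
from difflib import SequenceMatcher
from itertools import combinations

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """True if two messages are near-duplicates (simple ratio heuristic)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def coordinated_senders(reports: list[tuple[str, str]]) -> set[str]:
    """Given (sender_id, message) pairs reported by one target, return
    senders whose text near-duplicates another account's message."""
    flagged: set[str] = set()
    for (s1, m1), (s2, m2) in combinations(reports, 2):
        if s1 != s2 and similar(m1, m2):
            flagged.update({s1, s2})
    return flagged

if __name__ == "__main__":
    reports = [
        ("acct_a", "nobody wants you here, delete your account"),
        ("acct_b", "Nobody wants you here. Delete your account!"),
        ("acct_c", "loved your recipe post"),
    ]
    print(coordinated_senders(reports))  # expect {'acct_a', 'acct_b'}
```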
WhatsApp
Limits on unknown group additions
In-chat reporting for sexual or threatening content
Education about screenshot risks
For Children and Teens
Never share explicit images or personal info with strangers.
Report and block abusive users immediately.
Maintain open communication with parents/guardians about online experiences.
For Women
Use privacy settings aggressively to control who can contact you.
Keep evidence of harassment (screenshots) before reporting.
Seek support from trusted networks, legal authorities, or online safety organisations.
For Parents and Educators
Talk about grooming tactics, not just “stranger danger”
Emphasise that disappearing messages are not safe
Encourage reporting without punishment or blame
Teach consent and digital boundaries early
Make cyber abuse reporting confidential and trauma-informed
Address gender-based harassment explicitly
For Platforms
Enhance AI moderation to detect harassment, grooming, and image-based abuse.
Strengthen age verification and parental control features.
Conduct outreach programs to educate users on online risks.
For Policy Makers & Law Enforcement
Promote public awareness campaigns about online abuse.
Establish dedicated units to investigate cyberbullying, grooming, and sexual exploitation.
Ensure fast reporting and takedown mechanisms for harmful content targeting women and children.
Treat online abuse as real-world harm
Mandate platform transparency on abuse statistics
Invest in child-focused cybercrime units
Practical Step-by-Step Guide to immediate response, evidence preservation, reporting, and long-term safety
Stop interaction immediately: Don’t respond to threatening messages or friend requests from strangers.
Block the abuser: All major platforms (Snapchat, Instagram, WhatsApp, Facebook) have blocking features.
Tell a trusted adult: a parent, teacher, or school counsellor; don't handle it alone.
Avoid sharing personal info or images: Even with someone familiar, avoid sending explicit photos or sensitive data.
Do not engage with the harasser: Responses can escalate threats.
Restrict communication: Block accounts, restrict messaging, or set comments to private.
Adjust privacy settings: Make profiles private, limit who can see posts, or who can contact you.
Stay in control of your digital footprint: Consider deleting or limiting sensitive posts or past content.
Take screenshots: Include timestamps, usernames, and the message or content itself.
Save files securely: Keep copies offline and in a safe location (a hashing sketch for verifying files later follows this list).
Note context: Record the date, time, platform, and any witnesses if abuse occurs in groups or forums.
Do not delete the original message until authorities advise: Deleting could affect investigations.
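For those comfortable with a little scripting, recording a SHA-256 hash of each saved screenshot makes it possible to demonstrate later that the file has not been altered since it was captured. A minimal Python sketch; the file and log names are illustrative, not prescribed.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_evidence(paths: list[str], log_file: str = "evidence_log.json") -> None:
    """Append the SHA-256 hash and capture time of each evidence file to a log.
    Keeping the log with the offline copies lets anyone re-hash a file later
    and confirm it is byte-for-byte unchanged."""
    log = Path(log_file)
    entries = json.loads(log.read_text()) if log.exists() else []
    for path in paths:
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        entries.append({
            "file": path,
            "sha256": digest,
            "logged_at": datetime.now(timezone.utc).isoformat(),
        })
    log.write_text(json.dumps(entries, indent=2))

# Hypothetical usage; the filename is an invented example:
# record_evidence(["screenshot_threat_2024-05-01.png"])
```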
Use built-in reporting tools for harassment, bullying, sexual content, or threats.
Snapchat, Instagram, WhatsApp, and Facebook all allow anonymous reports.
Platforms often escalate serious abuse, such as child sexual exploitation material, to law enforcement.
Children: Parents should report grooming, sexual threats, or stalking to the police immediately.
Women: Threats, stalking, or harassment can be reported to local police; save the evidence first.
In India: Cyber Crime Police Stations handle online abuse; use the National Cyber Crime Reporting Portal (cybercrime.gov.in).
In the UK: Report cyberstalking, online harassment, or threatening messages to the police (101, or 999 in an emergency); Action Fraud handles online fraud and financially motivated cybercrime.
Childline (India, UK) – confidential child support: 1098 (India) / 0800 1111 (UK)
NGOs: NSPCC (UK), Plan International, Internet Matters
Women’s helplines for domestic or online abuse
Abuse can cause anxiety, fear, and trauma; seek help from counsellors or therapists.
Schools or workplaces may have digital safety officers or support groups.
Peer support or moderated online groups for survivors can help recovery.
Regularly review privacy settings across all platforms.
Use two-factor authentication to prevent account hijacking (how time-based codes work is sketched at the end of this list).
Report repeat offenders every time; repeated reports can lead to platform enforcement or legal action.
Learn to identify phishing, grooming, and online manipulation tactics.
Avoid posting personally identifiable information (school, location, phone number) online.
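For readers curious why authenticator-app codes block account hijacking even when a password leaks: each code is derived from a shared secret plus the current time, so a stolen password alone is useless. Below is a minimal sketch of the standard TOTP algorithm (RFC 6238); the secret shown is a made-up example, and a real secret should never be placed in code.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (RFC 6238, SHA-1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval        # current 30-second window
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Made-up demo secret; real ones are issued during a platform's 2FA setup.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because the code changes every 30 seconds and depends on a secret the attacker does not hold, intercepting one code or one password is not enough to take over the account.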