At ScreenLog, operated by App Sense Limited, we are deeply committed to the safety, well-being, and protection of all users on our platform — especially children. We recognize the unique vulnerabilities of minors in digital environments and have established robust measures to prevent abuse, exploitation, and any form of harmful conduct.
This Child Safety Policy outlines our approach to preventing Child Sexual Abuse and Exploitation (CSAE), our reporting obligations, our content moderation practices, our compliance with applicable laws, and the ways users can help us keep ScreenLog a safe and respectful space for everyone.
We are dedicated to creating and maintaining a platform that is safe for all users, including children. ScreenLog does not permit any form of content or activity that exploits or endangers children, and we enforce strict standards to detect, prevent, and act upon such incidents.
We apply a zero-tolerance policy to any content or behavior involving:
Child Sexual Abuse Material (CSAM)
Child grooming or solicitation
Exploitative communication or images involving minors
Harassment, bullying, or manipulation of underage users
Any violation of these standards leads to immediate account suspension or removal and, in certain cases, referral to law enforcement.
We actively monitor user-generated content through automated systems and manual review to detect policy violations. In addition, ScreenLog empowers users to report content or behavior that appears to threaten the safety of children or violates our guidelines.
Users can report:
Inappropriate conversations or messaging involving minors
Suspicious accounts or impersonation
Sharing of explicit images or media
Bullying, harassment, or abuse
All reports are reviewed promptly by our trained moderation team. Based on the severity of the violation, we take one or more of the following actions:
Content removal
Temporary or permanent user bans
Device or IP bans
Referral to appropriate legal authorities
App Sense Limited complies with all applicable laws and regulations regarding the protection of children, including data privacy, age restrictions, and mandatory reporting obligations.
When we become aware of confirmed CSAE material or behavior, we promptly report such cases to relevant local, national, or international authorities, including law enforcement or child protection agencies. We work in full cooperation with these entities to assist with investigations.
We also comply with international frameworks and collaborate with trusted child safety organizations where applicable.
While ScreenLog is not directed at or marketed to children, we recognize that minors may access the app. We have therefore implemented technical and design safeguards, such as:
Content filters and moderation protocols
Account restrictions and reporting tools
Limited data collection in line with children’s privacy regulations (e.g., COPPA, GDPR-K)
Age-gating and user education to prevent exposure to harmful material
All employees, moderators, and support staff involved in content review or user safety are trained on:
Identifying CSAE content and behavior
Managing user reports with sensitivity and urgency
Legal obligations for reporting and documentation
This Child Safety Policy is reviewed and updated regularly to ensure alignment with new regulations, industry standards, and emerging risks in the digital space.
We encourage all users to actively contribute to a safe environment on ScreenLog by:
Reporting harmful behavior or content through in-app tools
Avoiding engagement with suspicious accounts
Educating children about safe online practices
Reaching out to us if you have any concerns or questions
If you have any questions about this Child Safety Policy or wish to report an issue that threatens child safety on our platform, please contact us at:
📧 appsenselimited@gmail.com
We take all concerns seriously and handle them with discretion, urgency, and care.