Last Updated: December 2024
1. Introduction and Commitment
Lumi places the highest priority on protecting children and young people from harm, exploitation, and abuse in all forms. We enforce a strict zero-tolerance policy for any content, behavior, or activity that puts minors at risk. Our commitment extends beyond content moderation to include proactive measures, swift enforcement actions, and collaboration with child safety organizations and law enforcement. Every member of the Lumi community shares the responsibility to create and maintain a safe environment where young users can engage without fear of exploitation or abuse.
This policy outlines Lumi's comprehensive approach to protecting minors and details our strict enforcement procedures. We comply fully with all applicable laws and regulations regarding the prevention and reporting of child sexual abuse and exploitation (CSAE). Any violation of this policy results in immediate and permanent account termination without exception, and all instances of CSAE material are reported to the relevant authorities. These measures reflect our unwavering commitment to maintaining a zero-tolerance environment for any activity that puts minors at risk.
2. Age Requirements and Verification
Users must be at least 18 years old to access or use Lumi.
We employ advanced technical detection methods to analyze profile photos and videos on a frame-by-frame basis to identify and block minors from accessing the platform. This automated screening is supplemented by human moderation to ensure accuracy. Users are required to verify their age during registration, and any attempt to falsify age information to gain platform access is strictly prohibited. Discovery of underage users or age misrepresentation will result in immediate and permanent account termination.
Lumi reserves the right to request additional age verification at any time and may restrict or suspend access pending successful verification. Our age verification process is regularly updated to maintain effectiveness and adapt to emerging technologies and risks.
3. Prohibited Content
You explicitly acknowledge and agree that any form of CSAE, including but not limited to the sexualization of minors or the depiction of sexual content involving minors, is strictly prohibited on our platform. This policy applies to all users and encompasses any attempt to solicit, share, distribute, or promote such content, regardless of its form or intent.
3.1 Child Sexual Abuse and Exploitation
We employ both automated and manual monitoring systems to detect and prevent CSAE. Any attempts to bypass or circumvent these detection systems will result in immediate action. The following activities are expressly prohibited and will be considered violations:
A. Sharing, creating, or distributing content related to child sexual abuse or exploitation.
B. Engaging in any activity that sexualizes minors, including sexualized discussions, roleplaying, or depicting minors in sexualized attire.
C. Depicting child nudity or sexualization in any medium, including artwork, drawings, or animations.
3.2 Other Prohibited Content Related to Minors
Harmful Acts Involving Minors:
1) Threatening, encouraging, or engaging in physical harm toward minors.
2) Engaging in psychological manipulation, coercion, or abuse of minors.
3) Promoting or depicting dangerous behaviors involving minors.
4) Advocating for or depicting child neglect, trafficking, or exploitation.
Underage Presence and Account Restrictions:
1) The presence of minors in live streams, videos, or any content on the platform is strictly prohibited.
2) Creating, managing, or facilitating accounts on behalf of minors is not allowed.
3) Any attempt to falsify age or bypass platform age restrictions is prohibited.
By using our platform, you agree to comply with these rules and understand the consequences of violating them.
4. Reporting and Enforcement
In our commitment to maintaining a safe environment for all users, we have established comprehensive measures for reporting and addressing instances of CSAE on our platform. This section outlines the various channels available for reporting CSAE content, including our in-app user reporting mechanism and the assistance provided by dedicated organizations worldwide.
4.1 In-App Reporting
Our platform provides multiple channels for reporting potential violations of our child safety policies:
A. An in-app flagging system for immediate reporting of concerning content
B. A direct reporting button available on all user profiles and content
C. An email channel (help@parameapp.com) for detailed reports
We strongly encourage all users to report any content or behavior that may violate our child safety policies. Your vigilance plays a crucial role in maintaining a safe environment for our community.
4.2 External Reporting Channels
Below, we provide a list of reputable organizations and resources where individuals can seek support, report CSAE incidents, and access additional information:
a) International: INHOPE Association
b) North America, Australia, New Zealand: National Center for Missing & Exploited Children (NCMEC)
c) South America and Other Regions: International Centre for Missing & Exploited Children
d) Europe: Law Enforcement Reporting Channels for Child Sexual Coercion and Extortion
e) France: l'Association des Fournisseurs d'Accès et de Services Internet (AFA)
f) Germany: jugendschutz.net
g) Hong Kong: Family Conflict and Sexual Violence Policy Unit (Hong Kong Police Force)
h) India: Childline 1098 India
i) Japan: Internet Hotline Center Japan
j) South Korea: Korea Communications Standards Commission
k) United States: National Center for Missing & Exploited Children (NCMEC)
5. Content Moderation
5.1 Review Process
Our multi-layered moderation system combines automated technology with human expertise to ensure rapid detection and response to potential violations:
Initial Screening:
a) Automated pattern recognition systems scan all uploaded content
b) AI-powered image and video analysis technology
c) Real-time filtering of known prohibited content
Human Review:
a) A dedicated team of trained content moderators
b) A prioritized review queue for child safety reports
c) Specialized training in identifying CSAE content
Response Protocol:
a) Immediate content removal upon violation confirmation
b) Account suspension pending investigation
c) Evidence preservation for law enforcement when necessary
d) Partnership with industry-leading content moderation services for continuous improvement
All reports are treated with the highest priority and undergo thorough review within our moderation system.
5.2 Response Times
Lumi maintains a dedicated content moderation team that prioritizes reports involving potential harm to minors. All such reports are treated with the highest urgency and processed within 24 hours of receipt. Our team works around the clock to ensure swift action on potential violations, with most cases reviewed and addressed significantly sooner than the maximum response time.
6. Prevention and Education
6.1 Policy Enforcement
1) All violations of our Child Safety Policy result in immediate account termination
2) We preserve evidence and cooperate with law enforcement agencies as required by law
3) Our moderation team maintains detailed records of all enforcement actions
6.2 Community Education
We maintain platform safety through ongoing user education:
1) A prominently displayed Child Safety Policy across the platform
2) Clear content guidelines during upload and sharing processes
3) In-app notifications about responsible platform usage
6.3 Staff Training
All Lumi staff members undergo comprehensive child safety training as part of their onboarding process and receive regular updates throughout their employment. This mandatory training covers identification of potential risks, proper handling of reports, trauma-informed response procedures, and current legal requirements.
Our content moderation team receives additional specialized training on detecting and addressing child exploitation, collaborating with law enforcement, and implementing preventive measures. Training materials and procedures are regularly updated to reflect emerging threats, new regulations, and best practices in online child safety.