SocialNest – Standards Against Child Sexual Abuse and Exploitation (CSAE)
Last updated: 12 August 2025
Overview
SocialNest has a zero-tolerance policy for child sexual abuse material (CSAM), child sexual exploitation, grooming, and any behavior that harms or endangers minors. We are committed to preventing such content on our platform and to cooperating with law enforcement and recognized organizations to protect children.
Prohibited content
The following are expressly prohibited on SocialNest:
Any visual, written, or audio depiction of sexual abuse or exploitation of a minor.
Sexualized depictions of minors, including digitally created, altered, or drawn images that depict sexual activity or otherwise portray a minor in a sexualized manner.
Grooming, sexual solicitation, or sexual communication directed at or involving a minor.
Links, instructions, or requests for CSAM or any material that facilitates abuse or exploitation.
Reporting, review & removal
In-app reports: Users should use the in-app "Report" feature on any post, message, or profile that raises concerns. Reports are reviewed by our safety team 24/7.
Email reports: You may also email incidents to support@socialnestapp.com. Please include URLs, screenshots (if safe to share), usernames, and timestamps when possible.
Immediate removal: Confirmed CSAE content will be removed immediately.
Preservation of evidence: When content is reported and suspected to be CSAE, we preserve relevant logs and metadata for potential law enforcement investigations, in accordance with applicable law.
Cooperation with law enforcement & external organizations
SocialNest cooperates fully with law enforcement, regulatory authorities, and recognized child protection organizations. When required by law or where we reasonably believe an offense has occurred, we will disclose account and content information to competent authorities.
Where applicable, we report verified CSAM to the National Center for Missing & Exploited Children (NCMEC) or equivalent national bodies.
Detection & prevention
We use a combination of automated detection (hash-matching, pattern detection) and trained human reviewers to identify potential CSAE.
Our systems are updated regularly to improve detection accuracy and reduce false positives.
We limit access to reports and preserved data to authorized personnel only, and provide training for staff handling CSAE matters.
User responsibilities & enforcement
Users must not upload, share, solicit, or request CSAM or engage in grooming behavior.
Accounts found to be involved in CSAE will be permanently banned and reported to law enforcement.
We may retain certain data for investigation, safety, and legal compliance purposes in accordance with our Privacy Policy and applicable law.
Confidentiality & safety of reporters
We take steps to protect the identity of reporters and victims. Information is shared only with authorized investigators or as legally required. If you are a victim or are reporting on behalf of a victim, we encourage you to contact local law enforcement as well.
Legal compliance
SocialNest complies with applicable laws and regulations regarding child protection and online safety, including laws in the jurisdictions where we operate. This includes cooperation with national reporting mechanisms and observance of data retention and disclosure rules when required.
How we handle reports (step-by-step)
1. Receive the report via the in-app tool or email.
2. Automatic filters and triage escalate high-risk reports for immediate human review.
3. Safety reviewers assess the report and remove confirmed content.
4. Preserve logs and information needed for investigations, and notify law enforcement or child protection organizations where required.
5. Take enforcement action on user accounts (warnings, suspension, permanent ban) as appropriate.
Contact & reporting
If you need to report CSAE or have questions about this policy, contact us at support@socialnestapp.com.
If the situation is an emergency or involves an ongoing risk to a child, please contact your local law enforcement immediately in addition to contacting SocialNest.