🛡️ How Does Currently Moderate Content?

Written by Mitesh Shethwala
Updated over a month ago

At Currently, we’re committed to maintaining a space that is authentic, safe, and respectful — where people can share their real-time lives without fear of abuse, harassment, or harmful content.

To protect this environment, we use a combination of automated systems, community reporting, and human moderation to monitor and manage content. Here’s how it works:


🔍 1. Community Reporting (Your Voice Matters)

  • Anyone can report a moment, comment, message, or user

  • Tap the ‘⋮’ icon on the content → Select “Report” → Choose a reason

  • Reports are confidential and reviewed promptly by our moderation team

Common report reasons include hate speech, harassment, nudity, spam, misinformation, and impersonation.


🤖 2. Automated Detection Systems

We use machine learning tools and filters to:

  • Detect nudity, explicit content, or graphic violence

  • Flag spammy behavior or bots

  • Identify accounts that break rules repeatedly

These systems operate 24/7 and can take immediate action in some cases (e.g., removing extreme content or limiting reach).


👁️ 3. Human Moderation Team

Every flagged or reported moment is reviewed by our trained moderation team, who:

  • Evaluate context and intent

  • Decide whether to issue a warning, restrict an account, remove content, or ban a user

  • Apply community guidelines fairly and transparently

For repeat or serious violations, accounts may be suspended or permanently banned.


⚖️ 4. Content Guidelines We Enforce

Currently does not allow:

  • 🚫 Nudity or sexually explicit content

  • 🚫 Hate speech or discrimination

  • 🚫 Bullying, threats, or harassment

  • 🚫 Violence or harmful behavior

  • 🚫 Spam, fake accounts, or misleading content

  • 🚫 Invasion of privacy (e.g., sharing others' personal info without consent)


🛠️ 5. What Happens After You Report?

  • You report content → It’s immediately flagged for review

  • We investigate → Our moderation team reviews the report and the content

  • Action is taken → We may remove the content, issue warnings, or suspend the account

  • You stay informed → In some cases, you’ll be notified when action is taken


✅ Summary

  • Community Reporting → Empowers users to flag unsafe or inappropriate content

  • Automated Filters → Catch rule-breaking behavior before it spreads

  • Human Moderators → Carefully review reports and apply enforcement fairly

  • Content Guidelines → Define what’s acceptable and what’s not


🙌 Help Keep Currently Real & Safe

If you see something that doesn’t belong here, report it immediately.
Your actions help us protect the platform for everyone.

Need help reporting or want to appeal a moderation decision?
Visit the Help Center or contact Support — we’re here to listen and take action.
