Content Moderation Lead (Mumbai Based)

PINKVILLA
Full-time
On-site
Mumbai City, Maharashtra, India
Manager

Location: Mumbai

Experience: 4–8 years (flexible based on fit)

Employment Type: Full-time



About the Role

We are looking for a Content Moderation Lead to own and scale our app's Trust & Safety and Content Integrity function. You will ensure the platform remains safe, positive, inclusive, and policy-compliant while supporting rapid community growth across creators, power users, and student communities.

You will be responsible for building moderation systems, defining policies, leading a moderation team/vendor, managing escalations, and improving moderation quality and turnaround times.


Key Responsibilities

1) Moderation Strategy & Execution

  • Own end-to-end moderation across all content formats (posts, comments, messages, profiles, media, live interactions, etc.).
  • Ensure fast and accurate review of content to maintain a safe user experience.
  • Set up scalable moderation workflows for high-volume growth.

2) Policy & Guidelines Development

  • Create and maintain platform policies and community guidelines, including:
      • hate/harassment/bullying
      • explicit or adult content
      • violence/self-harm
      • misinformation/spam/scams
      • impersonation/fake profiles
      • copyright/IP violations
  • Define enforcement actions: warnings, removals, temporary bans, permanent bans, appeals.

3) Escalations & Critical Incident Handling

  • Handle high-risk escalations and sensitive cases with urgency and discretion.
  • Coordinate with internal stakeholders (Product, Legal, Support, Growth) on complex issues.
  • Create incident playbooks for crisis situations (viral abuse, brigading, spam attacks, etc.).

4) Moderation Operations & Team Leadership

  • Build, train, and manage a moderation team (in-house and/or vendor).
  • Create SOPs, QA processes, and training modules for consistent decision-making.
  • Run regular audits and coaching sessions to improve accuracy and reduce errors.
  • Manage shift planning and coverage for peak hours/events.

5) Tooling, Automation & Process Improvement

  • Work with Product/Tech to build moderation tooling such as:
      • reporting and flagging systems
      • keyword and pattern detection
      • auto-hiding and queue prioritization
      • creator verification / risk scoring
  • Improve moderation speed and efficiency through automation and better workflows.
  • Build an escalation ladder and routing system (what gets reviewed first, by whom, and why).

6) User Reporting, Appeals & Transparency

  • Own user reporting flows and ensure users feel heard and protected.
  • Design appeal workflows and resolution SLAs.
  • Ensure communication templates are clear, respectful, and consistent.

7) Quality Control & Metrics Reporting

  • Track and report key moderation metrics, such as:
      • average review turnaround time (TAT)
      • accuracy and QA pass rate
      • false positives / false negatives
      • number of reports received vs. actioned
      • repeat offenders and ban effectiveness
      • top violation categories and trends
  • Share weekly/monthly insights with leadership to improve platform safety.

8) Cross-Functional Collaboration

  • Work closely with:
      • Community team (to protect healthy engagement)
      • Product/Engineering (to improve tooling and reduce abuse)
      • Legal/Compliance (where required)
  • Support launches, campus drives, and creator campaigns by ensuring moderation readiness.


Other Tasks

  • Build a Trust & Safety roadmap for the next 3–6 months
  • Create a moderation playbook for creators, student communities, and ambassadors
  • Run proactive abuse prevention programs (spam detection, account verification)
  • Create category-specific policies (college groups, creator content, trending formats)
  • Build “high-risk event readiness” (fests, campaigns, viral moments)


Skills & Qualifications

  • 4–8 years of experience in content moderation, trust & safety, risk ops, compliance, or community operations
  • Strong understanding of online abuse patterns and platform safety
  • Ability to make consistent judgment calls and manage sensitive situations
  • Experience managing moderation teams/vendors is a strong plus
  • Comfortable working with data, dashboards, and reporting
  • Strong documentation skills (SOPs, training guides, policy writing)


Preferred Candidate Traits

  • High ownership and calm under pressure
  • Strong ethical judgment and fairness
  • Process-driven, detail-oriented, and highly organized
  • Comfortable operating in fast-moving, high-growth environments
  • Good communication skills (especially for escalations and internal coordination)