
Content Integrity Auditor

Pareto.AI
Full-time
Remote
United States
Analyst

About us

At Pareto.AI, we’re on a mission to enable top talent around the world to participate in the development of cutting-edge AI models.

In the coming years, AI models will transform how we work and create thousands of new AI training jobs for skilled talent worldwide. We’ve joined forces with top AI and crowdsourcing researchers at Anthropic, Character.AI, Imbue, Stanford, and the University of Pennsylvania to build a fair and ethical platform where AI developers collaborate with domain experts to train bespoke AI models.

Context

As the volume of task submissions grows, so does the need for systematic evaluation of originality and authenticity. To safeguard data quality and ensure that outputs meet client and project standards, we are creating a Content Integrity Auditor role under the QA Lead. This position will detect and flag templated, AI-generated, or otherwise non-original content, keeping our datasets high-quality, diverse, and aligned with intended use cases.

Role Overview

The Content Integrity Auditor is a full-time role reporting to the QA Lead. The position focuses on evaluating task submissions for signs of templating, AI-generated content, or plagiarism, and on recommending remediation and prevention measures. The auditor will play a key role in maintaining originality standards, refining detection methods, and collaborating with stakeholders to uphold content integrity at scale.

Core Responsibilities

  • Review and evaluate task submissions for non-original patterns, including templated responses, AI-generated outputs, and plagiarized material.

  • Document and standardize guidelines for detecting non-original or low-value content.

  • Collaborate with the QA Lead to refine detection frameworks and integrate them into QA workflows.

  • Identify recurring sources of non-original content and propose corrective actions.

  • Provide feedback and guidance to project teams and external contributors to reinforce originality standards.

  • Support training initiatives by helping define clear criteria and examples of acceptable vs. non-original submissions.

  • Prepare periodic reports summarizing findings, trends, and risks in non-original content.

  • Offer flexible support on related quality tasks as needed to address immediate project priorities.

Expected Impact

  • Increased originality and diversity of task submissions.

  • Reduced presence of templated, AI-generated, or plagiarized content in datasets.

  • Clear, standardized processes for detecting and addressing non-original material.

  • Scalable auditing practices that adapt to higher project volumes and evolving risks.

  • Improved trust and alignment with client expectations around authenticity and quality.

Required Qualifications

  • Prior experience in content moderation, QA, or auditing roles focused on detecting non-compliant, low-quality, or policy-violating submissions.

  • Experience with AI/ML datasets or annotation, particularly in evaluating text quality.

  • Understanding of large language models (LLMs), generative AI, and common indicators of AI-generated content.

  • Strong documentation and communication skills to ensure standards are applied consistently across teams.

Preferred Qualifications

  • Familiarity with plagiarism detection, copyright compliance, or originality review in academic or professional contexts.

  • Background in auditing, copyediting, or fact-checking with an emphasis on consistency and originality.

  • Ability to maintain exceptionally high attention to detail and consistency over prolonged periods, even when reviewing repetitive or large volumes of content.

  • Analytical or research experience involving pattern recognition, anomaly detection, or data quality assurance.