
Trust & Safety Analyst

Accenture
Full-time
On-site
Hyderabad, Telangana, India
Analyst
Skill required: User-Generated Content Moderation - Content Moderation

Designation: Trust & Safety Analyst

Qualifications: Any Graduation

Years of Experience: 3 to 5 years

About Accenture

Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com.

What would you do?

Content moderation is meaningful work that helps keep the internet safe. It can also be challenging at times. In this role, individuals may be directly or inadvertently exposed to potentially objectionable and sensitive content (e.g., graphic, violent, sexual, or egregious material), so content moderators need strong resilience and coping skills. We care for the health and well-being of our people and provide the support and resources needed to perform the role's responsibilities. Active participation in Accenture's well-being support program, designed specifically for the Trust & Safety community, builds valuable skills that promote individual and collective well-being.

Key activities include:
- Identifying spam content so that users receive genuine search results.
- Helping victims remove their explicit videos and photographs from global sites.
- Removing personally identifiable information reported by users from search results.
- Addressing ad blocking by improving ad experiences across the web.
- Reviewing photos, videos, and text-based content and judging whether it violates the client's terms of service. The content may be sensitive in nature.
- Ensuring every piece of content that violates the client's terms of service is accurately identified and flagged for action in a timely manner.

You will also:
1. Review video content across all workflows for policy violations to ensure consistent enforcement.
2. Develop a deep understanding of policies and guidelines, and how to interpret them in order to enforce them in both standard and non-standard situations.
3. Apply policy and community guidelines to make informed decisions that balance user safety and platform integrity.

What are we looking for?

We are seeking highly analytical and detail-oriented individuals to join a specialized team focused on misinformation detection and assessment of GenAI-manipulated media. This cross-functional role plays a vital part in maintaining platform integrity, reducing user exposure to harmful or deceptive content, and supporting compliance with evolving global standards. As an Altered and Synthetic Content Assessment Analyst, you will assess whether content, particularly video, has been synthetically altered using generative AI or contains misleading narratives. Your structured evaluations will feed into the final decision-making processes led by the client's full-time content policy teams.

Minimum skills:
- 2+ years' experience in content moderation, investigative journalism, or media analysis preferred.
- Familiarity with social listening tools, content evaluation platforms, or basic NLP/data analysis tools is a plus.
- Exceptional verbal and written skills, with the ability to collaborate effectively across global teams and present to senior stakeholders.
- Strong written analysis and documentation skills; able to clearly summarize complex narratives and technical findings.
- Excellent critical thinking, attention to detail, and the ability to apply nuanced policy in ambiguous situations.

Good-to-have skills:
- Domain knowledge: strong understanding of content moderation.
- Creator economy knowledge: experience with influencer marketing, creator partnerships, or branded content platforms.
- Problem-solving mindset: track record of identifying process inefficiencies and implementing scalable solutions.

Roles and Responsibilities:

"? Analyze emerging and existing content for misleading or deceptive elements in alignment and manipulated media policies. ? Conduct contextual verification by evaluating people, events, and circumstances depicted in the content using internal tools and limited open-source checks. ? Identify high-risk narratives (e.g., health misinformation, election manipulation, deepfakes) based on severity, spread potential, and platform impact. ? Identifying altered/fabricated video content and highlighting content. ? Use structured internal tools and workflows to submit findings; your role is advisory, not enforcement based. ? Conduct plausibility reviews to help determine whether individuals, speech, or events in the video are potentially altered or misrepresented. ? Submit detailed, policy-aligned assessments with clear rationale and supporting context for review to clients. ? Track patterns and trends across flagged content to support broader strategy and detection tooling. ? Maintain consistent quality and adhere to established benchmarks for accuracy, neutrality, and timeliness."