About the team
Our Operations team moderates millions of videos daily at a global level to ensure the content is safe for our users. As the Quality team, we review and audit content moderated by our moderators, handle appeals, and provide feedback and support on policy questions.
This role may be exposed to harmful content as part of the core role, as part of a project, in response to escalation requests, or by chance.
Some content viewed may violate our community guidelines, including but not limited to bullying, hate speech, child abuse, sexual assault, torture, bestiality, self-harm, suicide, and murder.
The content that the Content Quality Assurance - English Speaking, Malaysia - Trust and Safety role interacts with includes images, video, and text related to everyday life, but it can also include (but is not limited to) bullying; hate speech; child safety; depictions of harm to self and others; and harm to animals.
Responsibilities
- You will perform daily sampling tasks for audit purposes;
- You will perform thorough data analysis, report problems, handle necessary escalations, and provide weekly feedback on moderation quality;
- You will provide feedback and analysis on Policy updates to identify areas of improvement and potential gaps;
- You will cooperate with moderation teams and the POC for the Singapore market to ensure moderation quality is up to standard;
- You will participate in the development of quality inspection standards and continuously optimize the process and system platform.