Attracting and hiring top talent in content moderation
With the proliferation of social media and other forms of online communication, the amount of user-generated content (UGC) continues to grow across the globe.
But UGC is not always funny cat videos, five-star restaurant reviews or photos of your friend’s perfectly cooked steak. At times, UGC can be inappropriate, misleading, disturbing and even illegal. And, according to a TELUS Digital survey, the amount of inappropriate and inaccurate UGC is on the rise, with more than half of respondents (54%) saying there is more of it now than before the start of the pandemic in March 2020.
The explosion of digital technology has provided unique opportunities for businesses to interact with consumers in new and engaging ways. At the same time, brands shouldn’t overlook their responsibility to create and maintain digital experiences that are truthful, welcoming and safe. But the sheer amount of content that needs to be reviewed is immense, which makes content moderation (the monitoring and screening of UGC) challenging for brands. For example, in 2020, Statista reported that more than 500 hours of video were uploaded to YouTube every minute. That equates to approximately 720,000 hours of video every day. And according to Facebook, one billion stories are shared daily across its family of apps.
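As a quick sanity check on that scale, the daily figure follows directly from the per-minute rate:

```python
# Back-of-the-envelope check of the YouTube figure cited above:
# 500 hours of video uploaded per minute, scaled to a full day.
hours_per_minute = 500
minutes_per_day = 60 * 24  # 1,440 minutes in a day

hours_per_day = hours_per_minute * minutes_per_day
print(f"{hours_per_day:,} hours of video uploaded per day")  # 720,000
```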
Developing a content moderation strategy that meets this challenge head-on is critical for businesses, as exposure to deepfakes, spam, toxic content and bad actors negatively impacts user experiences and can damage a brand’s reputation. In fact, according to the aforementioned TELUS Digital survey, 45% of Americans said they quickly lose trust in a brand if exposed to toxic or fake UGC on its channels, and more than 40% of respondents said they would go so far as to disengage from a brand’s community after as little as one exposure.
Today, brands are turning to technology and a specialized group of people to protect users and provide enhanced customer experiences (CX).
The human component of content moderation
Moderating content in the online world is not a simple task. To be effective, a content moderation strategy must combine the best of both humans and technology. Even with the help of artificial intelligence (AI) and machine learning (ML) algorithms, the need for human content moderation remains extensive.
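To make that division of labor concrete, here is a minimal sketch of a common human-in-the-loop pattern: a model assigns each piece of UGC a violation score, near-certain cases are actioned automatically and ambiguous cases are routed to a human moderator. The thresholds, labels and scoring model below are illustrative assumptions, not a description of any particular platform’s system.

```python
from dataclasses import dataclass

# Illustrative thresholds; real systems tune these per policy area (hypothetical values).
AUTO_REMOVE_THRESHOLD = 0.95   # model is near-certain the content violates policy
AUTO_APPROVE_THRESHOLD = 0.05  # model is near-certain the content is benign

@dataclass
class ModerationResult:
    action: str             # "remove", "approve" or "human_review"
    violation_score: float  # model's 0-1 estimate that the content violates policy

def route_content(violation_score: float) -> ModerationResult:
    """Route one piece of UGC based on a (hypothetical) model's violation score."""
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return ModerationResult("remove", violation_score)
    if violation_score <= AUTO_APPROVE_THRESHOLD:
        return ModerationResult("approve", violation_score)
    # The ambiguous middle is exactly where human judgment remains essential.
    return ModerationResult("human_review", violation_score)

print(route_content(0.99).action)  # remove
print(route_content(0.50).action)  # human_review
```

The wider the band between the two thresholds, the more content lands with human moderators, which is one reason demand for these roles remains extensive.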
Thankfully, there are people who are proud to fill these roles. Often touted as the heroes of the online world — and affectionately known as ‘digital first responders’ at TELUS Digital — content moderators are the first line of defense against explicit content and fraudulent activities. “Content moderators are motivated by the fact that they are doing more than just a job, they are doing something beneficial for the community,” said Karen Muniz, regional director of talent acquisition at TELUS Digital. “They realize the importance of content moderation, are extremely proud of the work they do and are thankful that they are protecting others from being exposed to, what can sometimes be, disturbing content.”
Best practices for attracting and hiring content moderators
When it comes to hiring content moderators, it’s not just a numbers game. A successful talent acquisition strategy should focus on finding the right people to fill vacancies.
To ensure they are providing safe, trusted platforms for their customers, brands can focus on the following recruitment and hiring best practices specific to content moderation.
Create a multi-channel application sourcing strategy
Today, there is a plethora of resources available for sourcing high-quality talent. Brands should not only utilize traditional channels like virtual job fairs, job boards or a dedicated career portal on their website, but also focus their efforts on the platforms that have gained the most traction in the talent search space: social media. According to CareerArc’s 2021 Future of Recruiting Study, social and professional networks are the most used resource to recruit talent today, topping employee referrals, job boards and job ads. The study found that 72% of employers use Facebook to attract and recruit talent and 96% use LinkedIn. And it’s not just employers: 86% of job seekers are using social media in their job search.
Sourcing talent on social media delivers benefits that extend beyond simply providing a gateway to a wider pool of candidates, says Muniz. “Candidates who are familiar with social media, its purpose, content and technology are better able to understand the role of a content moderator and perform within the expectations of the position.”
In addition to the aforementioned platforms, nurturing regional partnerships is another talent sourcing strategy that deserves attention, says Muniz. “Candidates are very interested in employers who are visible in the community,” she adds. “Partnering with local organizations provides an opportunity for employers to increase their talent pipeline and to showcase their company culture.”
Set realistic job expectations when hiring
Content moderators have to review raw UGC, some of which can be disturbing. Setting very specific role expectations that outline the types of content moderators might see, starting in the interview process and extending through new hire training, is essential.
As part of setting realistic expectations for applicants, it’s also important to showcase the resources that will be available to them as part of the job. “Candidates need to know that the employer they choose will have their best interests in mind as a valued team member, and they will not be treated as just a number,” Muniz explains. “This is especially true for content moderator jobs where sound mental health is an important component of the role. Employers have a responsibility to provide resources that protect the mental health of employees, ensuring that they are able to continue thriving in both their professional and personal lives.”
Robust well-being programs should enable and empower content moderators to perform well and enjoy their work, while maintaining their mental and physical health. Effective well-being programs should include the following characteristics:
- Physical infrastructure such as quiet/relaxation spaces and games areas.
- Emotional support including dedicated benefits, wellness coordinators, individualized and group sessions and activities, and mental health assistance facilitated by licensed medical providers.
- Operational practices such as: break policies that ensure adequate time for moderators to de-stress and decompress; workflow rotations that continuously move moderators among content types and sensitivity levels (a minimal rotation sketch follows this list); team building activities; and continuous training focused on stress management and resilience.
- An appropriate feedback structure that collects moderator input at various stages of their tenure. “At TELUS Digital, our wellness surveys help us understand what we may not know about the mental health of our team members and we use that information to continually evolve our overall wellness program,” said Muniz. “Exit interviews influence our recruitment strategies, as they help us identify ways we can improve our screening practices to assess our candidates more effectively, provide insight into information that could be added to the onboarding process to better prepare candidates and more.”
- Post-separation support for content moderators who have left the organization. This includes, but is not limited to, around-the-clock mental health and wellness services for up to a year after employment and extended health insurance coverage for those who require additional psychological support.
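One simple way to picture the workflow rotations referenced in the operational practices above is a staggered round-robin schedule that cycles each moderator through queues of differing sensitivity, so no one sits with high-severity content for long stretches. The queue names and shift structure here are purely hypothetical.

```python
# Hypothetical queues, ordered roughly from lowest to highest sensitivity.
QUEUES = ["spam", "misinformation", "harassment", "graphic_content"]

def rotation_for(moderator_index: int, num_shifts: int) -> list[str]:
    """Round-robin rotation: each shift, the moderator moves to the next queue.
    Offsetting by moderator_index staggers the team so every queue stays covered."""
    return [QUEUES[(moderator_index + shift) % len(QUEUES)] for shift in range(num_shifts)]

for mod in range(3):
    print(f"moderator {mod}: {rotation_for(mod, 4)}")
# moderator 0: ['spam', 'misinformation', 'harassment', 'graphic_content']
# moderator 1: ['misinformation', 'harassment', 'graphic_content', 'spam']
# moderator 2: ['harassment', 'graphic_content', 'spam', 'misinformation']
```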
Hire candidates with the right set of skills
Because of the challenging nature of content moderation work, hiring team members with the right temperament, skills and experience contributes to creating a resilient team.
In addition to problem solving, decision making and critical thinking skills, digital experience is a key competency for content moderators. Moderators need to be proficient with the online platforms they’ll be moderating and able to effectively use the digital tools provided to them to handle large volumes of UGC.
A culturally diverse worldview is also something recruiters should be looking for in their candidates, says Muniz. “One of the main responsibilities of content moderators is having the ability to be objective when making decisions. Hiring individuals that are able to understand different personal, social and cultural perspectives fosters an objective environment in content moderation.”
But above all, says Muniz, emotional intelligence (EQ) is the most important characteristic to look for in content moderators. EQ, the ability to perceive, control and evaluate emotions, underpins the sound coping mechanisms, emotion regulation and stress management skills moderators need to handle the content they are reviewing.
To ensure candidates are the right fit for a position, various psychological evaluations can be administered. For example, a Personal Profile Analysis (PPA) can help classify the behavioral preferences of applicants, and screening processes can be structured to evaluate how closely an applicant’s behavioral traits correlate with the traits identified as essential for the job. Word association exercises can also be used to evaluate whether a candidate has any strong or negative feelings about a topic, or displays any inherent biases that could affect their ability to make objective judgments about the content they will review on a day-to-day basis.
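To illustrate the trait-matching idea, the sketch below scores how closely a hypothetical candidate’s behavioral trait scores correlate with a target profile for the role. The trait names, scores and use of Pearson correlation are assumptions made for this example; a real PPA is a proprietary instrument with its own scoring methodology.

```python
from statistics import correlation  # Pearson's r; available in Python 3.10+

# Hypothetical trait scores on a 0-10 scale; not actual PPA dimensions.
ROLE_PROFILE = {"resilience": 9, "objectivity": 8, "attention_to_detail": 8, "empathy": 7}

def profile_fit(candidate: dict[str, float]) -> float:
    """Correlate a candidate's trait scores with the target profile, trait by trait."""
    traits = list(ROLE_PROFILE)
    return correlation([ROLE_PROFILE[t] for t in traits],
                       [candidate[t] for t in traits])

candidate = {"resilience": 8, "objectivity": 9, "attention_to_detail": 7, "empathy": 6}
print(f"profile fit: {profile_fit(candidate):.2f}")  # ~0.63 for this example
```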
The need for content moderation outsourcing
From improving user experiences to driving growth and revenue, achieving regulatory compliance and protecting a brand’s reputation, the business benefits of content moderation are many. Brands have a responsibility to create digital experiences that are safe for their customers, and a workplace environment that protects the mental and physical well-being of their digital first responders. This task can feel monumental, but business leaders don’t have to tackle it alone: brands can leverage the expertise of a qualified trust and safety partner to take existing content moderation operations to the next level, or to build them from the ground up.
If you are interested in discussing how to launch or enhance your own content moderation and/or trust & safety programs, reach out to our team of experts.