Opinion | Why Twitter’s firing of content moderators is a sign of worse to come

  • With essential workers such as content moderators cut, hate speech has quickly increased on the social media platform
  • This does not bode well for a platform that already had content moderation problems, and which might descend into chaos

A view of Twitter’s office in New York on November 18. Amid mass firings and resignations since Elon Musk took control, the social media company’s future is in doubt. Photo: EPA-EFE
News cycles dedicated to Elon Musk’s troubled takeover of Twitter and his subsequent mission to slash the workforce have dominated discussions in the tech space. Since Musk took control of the platform late last month, more than half of Twitter’s 7,500 global employees have lost their jobs or resigned in response to new policies introduced early this month.
In addition, an estimated 4,400 out of Twitter’s 5,500 contractors were fired. While most of these contract workers focused on aspects such as engineering, real estate, and marketing, some of them were involved in the crucial work of content moderation.

Content moderators are among the most essential workers at any major social media platform: not just Twitter, but also companies such as Facebook and TikTok. They are responsible for removing inappropriate or graphic content, such as videos of suicide, hate speech, violence, pornography and disinformation, and for reviewing content reported for various violations of company policy.

In this sense, moderators are an invisible army essential for protecting Twitterati and other social media users from the depths of human depravity. Given the real-world impact of social media, content moderators also play a vital role in maintaining peace and reducing hate speech and hate crimes.

It is their job to pull disinformation during election cycles and to remove videos posted by terrorist groups such as Islamic State, for example. But the violent nature of the videos reviewed has resulted in content moderators experiencing symptoms of post-traumatic stress disorder or feelings of isolation, and thus a high turnover rate (some employees quit after about a year).

In an early sign of the effects of the content moderation staff cull, hate speech directed at various racial, religious and sexual minorities has increased. For example, the number of tweets using a racial slur directed at African-Americans rose by 500 per cent in the first 12 hours after Musk’s takeover. There has also been an uptick in anti-Semitic hate speech on the platform.
