With more than 2 billion monthly active users, Facebook appears to be succeeding in its mission to “bring the world closer together.” But as anyone who has spent time casually scrolling through a news feed knows, not everything posted on the social network is positive and unifying. Along with heated political arguments and unflattering pictures of friends, users can also post violent or offensive content that is far too extreme to be allowed on the site. Fortunately, users can report these posts to Facebook’s moderators, who determine whether or not the content is fit to stay online.
All told, the site’s team of 7,500 content moderators sorts through more than 10 million potentially rule-breaking posts every week. This huge amount of offensive content can take a psychological toll on employees as they confront the worst of humanity day after day. As one former moderator put it: “There was literally nothing enjoyable about the job. You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see.” This anonymous staffer certainly isn’t alone in their fierce criticism of Facebook’s content moderation. In fact, a recent lawsuit filed against the social network claims the plaintiff developed post-traumatic stress disorder while working as a moderator for the site.
The complaint says that the employee worked at Facebook for nine months and encountered thousands of violent images during that time. As a result, her PTSD can be triggered “when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises or is startled.” The suit also claims that the company did not provide sufficient mental health services or training methods for content moderators. “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job,” said lawyer Korey Nelson, who is currently seeking class-action status for the suit.
Facebook stands by its methods, though, and plans to fight the lawsuit. “We recognize that this work can often be difficult,” said Facebook’s director of corporate communications Bertie Thompson. “That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”
- What could Facebook do to make its content moderation strategy less stressful for employees?
- Should Facebook depend more on automated programs that can search for offensive content without the help of humans? What would be the advantages and disadvantages of this system?
Sources: Sandra E. Garcia, “Ex-Content Moderator Sues Facebook, Saying Violent Images Caused Her PTSD,” The New York Times, September 25, 2018; Helen Holmes, “A Former Content Moderator Is Suing Facebook Because Her Job Gave Her PTSD,” Observer, September 24, 2018.