The panic attacks began after Chloe watched a man die.
She has spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a “process executive.”
For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.
The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.
Returning to her seat, Chloe feels an overwhelming urge to sob. Another trainee has gone up to review the next post, but Chloe cannot focus. She leaves the room, and begins to cry so hard that she has trouble breathing.
No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for the 15,000 content reviewers around the world, today is just another day at the office.