Let’s be honest here; the internet is a swamp filled with danger at every turn. It’s perilous for children to navigate, and it’s only gotten worse over the years. The situation doesn’t come as a surprise to many of us. Life is messy, malicious people exist, and they are doing harm all across the internet. Companies like TikTok, Facebook, Twitter, MeWe, and other platforms are attempting to battle back by hiring moderators to review content that could be harmful.
But doing this exposes real human beings to graphic and violent images that could stick with them forever. One content moderator for TikTok is suing the company for psychological trauma caused by constant exposure to violent videos, including footage of assault, beheadings, and suicide.
For as long as 12 hours each day, Candie Frazier and other moderators reviewed “extreme and graphic violence,” including videos of “genocide in Myanmar, mass shootings, children being raped, and animals being mutilated” in an effort to filter out such content from being viewed by TikTok users, according to the lawsuit. The legal action was filed in federal court in California last week against TikTok and its parent company, ByteDance.
Frazier developed “significant psychological trauma including anxiety, depression, and posttraumatic stress disorder” as a result of her exposure to the videos, according to the lawsuit, which is seeking class-action status. The legal challenge, which alleges that TikTok violated California labor law by failing to provide a “safe work environment,” requests compensation for moderators who were exposed to the material. It also asks that TikTok and ByteDance provide mental health support and treatment to former and current moderators.

— The Washington Post
Frazier was not a direct employee of the company; instead, she worked for Telus International, which outsources workers to various larger firms.
A TikTok spokesperson declined to address the lawsuit directly, writing in a statement that the company does “not comment on ongoing litigation.” The statement said that at TikTok, “we strive to promote a caring working environment for our employees and our contractors.” It added that it will “continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”
A spokesperson for Telus International, which is not a defendant in the suit, said in a statement that the company was “proud of the valuable work our teams perform to support a positive online environment,” adding that the company has a “robust resiliency and mental health program in place.”

— The Washington Post
The internet has given us a lot of good, but with that good also comes the bad, and for now, real people are working to filter that bad before it reaches our eyes and ears. That work comes at a cost: moderating social media is not for the faint of heart, and it could easily traumatize someone who isn’t prepared for the evil they’re going to encounter.
What do you think? Is this lawsuit warranted? Who’s responsible for the mental well-being of social media moderators? Please share your thoughts on any of the social media pages listed below. You can also comment on our MeWe page by joining the MeWe social network.
Last Updated on December 28, 2021.