Policing of objectionable online content triggers PTSD among content moderators
September 14th, 2017
Filed under: Mental Disorder, Mental Health, PTSD by Rachael

Today, life without social media is unimaginable. It has become an essential part of life for people around the world as an interactive, educational, informational and entertainment platform. It comes in many forms, such as blogs, forums, podcasts, photo sharing, social bookmarking, widgets and video. Social networking sites (SNSs) first emerged in the 1990s and have since captured the imagination of the world. Today, social media is a multibillion-dollar industry.

SNSs allow users to upload and share photos and videos and to interact with friends and family who may be located thousands of miles away in various parts of the world. Some of the most popular SNSs are Facebook, Twitter, LinkedIn, Google+, YouTube, Instagram, Pinterest, Tumblr, Snapchat, reddit and Flickr.

With this barrage of information, the need for content moderation has become more important than ever. Content moderators play a crucial role in expunging offensive material. Since SNSs are accessed by people of all ages, from children to the elderly, any exposure to abhorrent content can wreak havoc. To shield users from such exposure, content moderators bear the brunt of objectionable posts that may be spiteful, racist, sexist or criminal in nature.

Key responsibilities of content moderators

While people surf websites and sign into the many SNSs, a large number of content moderators silently toil away in the background to separate appropriate content from inappropriate content. They try to retain some semblance of sanity and normalcy in the content people view online.

This process shields people from indescribable cases of sexual assault, horrific brutality, child abuse and pornography. Without content moderators, users’ stroll through the World Wide Web would be full of audiovisual shocks, as they would stumble upon objectionable material in the form of photos, videos and text.

The job of a content moderator ranges from examining content (images, videos and text) to assessing whether it violates the prevailing company policies. With the proliferation of SNSs, users have been empowered to upload their own images and videos. Though SNSs have been successful in raising the concerns of the deprived, they have become a platform for depraved minds too.

Travails of being a content moderator

Monitoring and moderating content on the web may protect the world from viewing outrageous material, but it has a disastrous effect on the mental health of the people who must view these images and videos as part of their job. The job is a double-edged sword: it may protect the world, but it increases the risk of developing post-traumatic stress disorder (PTSD) among content moderators.

Recently, two Microsoft employees, Henry Soto and Greg Blauert, filed a complaint against the company for the trauma they endured while working as content moderators. The plaintiffs stated that the psychological impact of their jobs grew to such an extent that even seeing their own children triggered stress. Their symptoms were also triggered by content involving adults who came across as “potential abusers.”

They were no longer able to look at any “child-related content” on computers and had constant “fears for the safety of children they met.” Soto initially divulged to psychiatrists that he was suffering from sleep disturbances, nightmares, anxiety and disturbing mental images. These symptoms were later followed by visual hallucinations, panic attacks in public, isolation and depression.

The constant exposure to inhuman videos and images turns many in this line of work intensely paranoid, as they continually witness proof of seemingly infinite human depravity. They start to distrust the people around them and suspect the worst in them.

Finding solutions to make the job less strenuous

People in the content moderation profession often experience burnout, typically within three to five months on the job. Measures that companies can take to make the job less stressful include mandatory rotations out of the program, more frequent breaks, meetings with a trained psychologist, a spousal wellness program and remedies to reduce the impact of continuously viewing toxic images.

If you or a loved one is struggling with an anxiety disorder, it is imperative to seek professional help. The Anxiety Treatment Advisors of Colorado offers a variety of evidence-based treatment plans. Call our 24/7 helpline at 866-891-2539 to connect with the best anxiety disorder treatment centers in Colorado. Alternatively, you can chat online with our experts to access information pertaining to PTSD treatment in Colorado.
