YouTube Limits The Working Hours For Content Moderators
MobileAppDaily

YouTube Puts a Limit on How Long Content Moderators View Disturbing Videos

Contract-based content moderators don't get the same health benefits as Google employees

Published: 14th March, 2018 | Akash Singh Chauhan

YouTube's New Update

YouTube has taken an initiative to improve conditions for its contract-based content moderators. YouTube CEO Susan Wojcicki announced that the company is limiting the time moderators spend watching disturbing videos to four hours per day. During a Q&A session in Austin, Wojcicki spoke about the difficulties that social media and other content-sharing platforms face in curating content.

To ensure that no disturbing videos slip through onto the platform, YouTube hires content moderators. These moderators work part-time for the company, watching videos to restrict those that are not suitable for users. Tech companies use a combination of algorithms and human moderation to remove illegal content from their platforms, which requires content moderators to manually review uploaded content and filter out potentially harmful material.

YouTube uses its Content ID system to monitor videos uploaded to the platform. The system automatically restricts content that violates YouTube's policies, such as copyrighted television, film, and music. However, videos containing violence, suicide, murder, and other disturbing elements need to be moderated by humans. This forces content moderators to watch disturbing videos for long stretches, which can affect their mental health. "This is a real issue and I myself have spent a lot of time looking at this content over the past year. It is really hard," said Wojcicki. So YouTube's announcement sets a new rule limiting moderators to four hours of disturbing content per day.

Social media platforms like Facebook, Twitter, and YouTube have recently been criticized for failing to curb fake news, hoaxes, and violent content. Facebook and Twitter are also facing accusations of being exploited in Russia's interference in the 2016 US Presidential election. YouTube recently announced it would hire 10,000 more people to review content that falls outside the Content ID system's reach, which suggests the workload for content moderators is only going to grow. Google did not reveal how many hours moderators had to watch disturbing content before this four-hour-per-day limit was introduced.

Akash Singh Chauhan

Akash Singh Chauhan is a senior writer at MobileAppDaily who mainly covers the latest happenings and tweaks in mobile app technology. As an engineering graduate, he has always been drawn to technology and tries to discover new trends in the tech world. Beyond tech news, he also never misses a single episode of 'Dragon Ball'.
