Contract-based content moderators don't get health benefits like Google employees do
YouTube has taken a step to improve conditions for contract-based content moderators. YouTube CEO Susan Wojcicki announced that the company is limiting the time moderators spend watching disturbing videos to four hours per day. During a Q&A session in Austin, Wojcicki discussed the difficulties that social media and content-sharing platforms face in curating content.
To make sure that no disturbing video slips through onto the platform, YouTube hires content moderators. These moderators work part-time for the company, watching videos and restricting those that are harmful to users. Tech companies use a combination of algorithms and human moderation to remove illegal content from their platforms. This requires content moderators to manually review uploaded or flagged content and filter out anything potentially harmful.
YouTube uses its Content ID system to monitor videos uploaded to the platform. The system automatically restricts content that violates YouTube's policies, such as copyrighted television, film, and music. However, videos containing violence, suicide, murder, and other disturbing elements must be moderated by humans. This forces content moderators to watch disturbing videos for long stretches, which can affect their mental health. “This is a real issue and I myself have spent a lot of time looking at this content over the past year. It is really hard,” said Wojcicki. YouTube's announcement therefore sets a new rule limiting moderators to four hours of disturbing video per day.
Social media platforms like Facebook, Twitter, and YouTube have recently been criticized for failing to curb fake news, hoaxes, and violent content. Facebook and Twitter are also facing accusations of being exploited in Russia's interference in the 2016 US Presidential election. YouTube recently announced that it will hire 10,000 more people to address the limitations of the Content ID system, which means even more workers will be exposed to this kind of content. Google did not reveal how many hours moderators had to watch disturbing content before this four-hour-per-day limit was introduced.
She is a content marketer with more than five years of experience in IoT, blockchain, web, and mobile development. Over those years she has closely followed app development, and she now writes about existing and upcoming mobile app technologies. At heart, she is like a ballet dancer.