First Online Safety Act Revealed For Child Abuse Prevention

Date: November 09, 2023

The growing number of children on online social media platforms has left them dangerously exposed to abuse and grooming, risks the latest Online Safety Act guidance aims to prevent.

Ofcom, the UK's communications regulator, has released its first Online Safety Act guidance targeting child sexual abuse material (CSAM). It responds to the fact that many children use social media platforms unsupervised, are highly vulnerable to unsolicited messages, and can easily fall prey to abuse. About 1 in 10 children on social media has received a nude image or other adult content in their private message inbox. Even those who do not fall prey to the senders can suffer long-term trauma from viewing such content, which may dramatically hinder their development.

The new regulations set by Ofcom require compliance from nearly 100,000 services, many of which operate outside the UK. The draft regulations have been sent to leading tech companies for review, with responses requested in the form of objections or suggestions. Once finalized, every tech platform with children in its audience will need to comply, but that is not happening anytime soon.

What Are The New Regulations?

To tackle the spread of illegal content online, especially among children, Ofcom has issued roughly 1,500 pages of guidance covering many aspects. The regulations below are the most powerful, and the toughest to implement.

  • Scanning of all media sent to children aged 11-17 for potential CSAM content and blurring it automatically. 
  • Removing adult stranger users’ accounts from friend suggestions.
  • Controlling the exposure of adult content to children.
  • Setting content moderation to its strictest defaults for all underage users, relaxable only with an adult's explicit approval.
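To make the last two points concrete, here is a minimal sketch of what age-based default settings might look like in code. This is purely illustrative: the class, field names, and age threshold are assumptions for the example, not anything specified in Ofcom's guidance.

```python
# Hypothetical sketch of age-based safety defaults; names and the
# ADULT_AGE threshold are illustrative assumptions, not Ofcom's spec.
from dataclasses import dataclass

ADULT_AGE = 18  # assumed age of majority for this example

@dataclass
class SafetySettings:
    scan_and_blur_media: bool      # scan incoming media and blur suspected CSAM
    suggest_adult_strangers: bool  # show unknown adults in friend suggestions
    moderation_level: str          # "standard" or "highest"

def default_settings(age: int, adult_approved_relaxation: bool = False) -> SafetySettings:
    """Strictest protections by default for under-18s, relaxable only by an adult."""
    if age < ADULT_AGE and not adult_approved_relaxation:
        return SafetySettings(
            scan_and_blur_media=True,
            suggest_adult_strangers=False,
            moderation_level="highest",
        )
    return SafetySettings(
        scan_and_blur_media=False,
        suggest_adult_strangers=True,
        moderation_level="standard",
    )
```

The key design point the guidance implies is that the safe configuration is the default, and any weakening of it requires a deliberate adult action rather than a child's choice.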

When Will It Switch On?

The government has urged all tech players to comply with the new regulation guidelines, but the new rules will not take effect before 2025. The companies, however, have responded that they already enforce these rules through advanced algorithms, and that the government is merely codifying existing mechanisms.

However, certain conditions, if met, could compel about 20,000 small online businesses to comply with the Online Safety Act. The tech giants are also concerned about the privacy of users' content and personal information: if platforms gain the legal authority to scan private encrypted messages, the very purpose of encryption is defeated, which could trigger a major backlash against compliant platforms. The rules still need considerable fine-tuning before they become fixed requirements that the world's tech leaders can follow to help prevent child abuse online.

By Arpit Dubey

Arpit is a dreamer, a wanderer, and a tech nerd who loves to jot down tech musings and updates. With a logical mind, he is always chasing sunrises and tech advancements while secretly preparing for the robot uprising.
