Proposed measures to improve children’s online safety


INSIGHT
Published: 8 May 2024

As the UK’s online safety regulator, Ofcom has published a package of proposed measures that social media and other online services must take to improve children’s safety when they’re online.

 

In this article, Ofcom explains some of the main measures and the difference it expects them to make. Whether you are a parent, carer or someone working with children, this can help you understand what is being done to help children in the UK live safer lives online.

 

Protecting children is a priority

Protecting children so they can enjoy the benefits of being online, without experiencing the potentially serious harms that exist in the online world, is a priority for Ofcom.

 

Ofcom is taking action – setting out proposed steps online services would need to take to keep children safer online, as part of their duties under the Online Safety Act.

 

Under the Act, social media apps, search services and other online services must prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography. They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.

 

What will companies have to do to protect children online?

First, online services must establish whether children are likely to access their site – or part of it. Second, if children are likely to access it, the company must carry out a further assessment to identify the risks their service poses to children, including risks that come from the design of the service, its functionalities and its algorithms. They then need to introduce various safety measures to mitigate these risks.

 

The consultation proposes more than 40 safety measures that services would need to take – all aimed at making sure children enjoy safer screen time when they are online. These include:

 

  • Robust age checks – the draft Codes expect services to know which of their users are children so they can protect them from harmful content. In practice, this means that all services which don’t ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or should age-restrict parts of it for adults-only access.
  • Safer algorithms – under the proposals, any service that recommends personalised content to users and is at high risk of hosting harmful content must design its algorithms to filter out the most harmful content from children’s feeds, and to downrank other harmful content. Children must also be able to provide negative feedback, so the algorithm can learn what content they don’t want to see.
  • Effective moderation – all services, such as social media apps and search services, must have content moderation systems and processes that take quick action on harmful content. Large search services should use a ‘safe search’ setting for children, which cannot be turned off and must filter out the most harmful content. Other, broader measures require services to have clear policies on what kind of content is allowed and how content is prioritised for review, and require content moderation teams to be well resourced and trained.
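To make the ‘safer algorithms’ idea above concrete, here is a minimal, hypothetical sketch in Python. It is not Ofcom’s specification or any real platform’s code: the harm labels, scores, threshold and feedback weight are all invented for illustration, assuming a simple feed where items carry a relevance score and a harm classification.

```python
# Illustrative sketch only: 'primary'/'other'/'none' harm labels, scores
# and the downranking penalty are invented assumptions, not prescribed
# by the Online Safety Act or Ofcom's draft Codes.

def rank_feed_for_child(items):
    """Return a child's feed with the most harmful items removed and
    other harmful items downranked.

    Each item is a dict with a relevance 'score' and a 'harm' label:
    'primary' (most harmful) is filtered out entirely, 'other' is
    downranked, and 'none' is left untouched.
    """
    DOWNRANK_PENALTY = 0.5  # hypothetical multiplier for 'other' harm

    ranked = []
    for item in items:
        if item["harm"] == "primary":
            continue  # filter out the most harmful content
        score = item["score"]
        if item["harm"] == "other":
            score *= DOWNRANK_PENALTY  # push harmful content down the feed
        ranked.append({**item, "score": score})
    # Highest adjusted relevance first
    return sorted(ranked, key=lambda it: it["score"], reverse=True)


def record_negative_feedback(profile, item, weight=0.1):
    """Model the 'negative feedback' requirement: lower the child's
    affinity for the item's topic so similar content is recommended
    less often (again, a purely hypothetical mechanism)."""
    topic = item.get("topic", "unknown")
    profile[topic] = profile.get(topic, 0.0) - weight
    return profile
```

The point of the sketch is the two distinct treatments the proposal describes: removal for the most harmful categories, and score reduction (rather than removal) for other harmful content, with user feedback feeding back into future ranking.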

 

What difference will these measures make?

Ofcom believes these measures will improve children’s online experiences in a number of ways. For example:

 

  • Children will not normally be able to access pornography.
  • Children will be protected from seeing, and being recommended, potentially harmful content.
  • Children will not be added to group chats without their consent.
  • It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.

 

The consultation follows proposals Ofcom has already published for how children should be protected from illegal content and activity such as grooming, child sexual exploitation and abuse, as well as how children should be prevented from accessing pornographic content.

 

Next steps

The consultation is open until 17 July, and Ofcom welcomes any feedback on the proposals. Ofcom expects to finalise the proposals and publish its final statement and documents in spring next year.

 

Source: Office of Communications (Ofcom)

 

About Ofcom

Ofcom is the regulator for the communications services that we use and rely on each day.

 

About us

LS Consultancy are experts in Marketing and Compliance, and work with a range of firms to assist with improving their documents, processes and systems to mitigate any risk.

 

We provide cost-effective and timely bespoke copy advice and copy development services to make sure all your advertising and campaigns are compliant, clear and suitable for their purpose.

 

Our range of innovative solutions can be tailored to suit your unique requirements, no matter whether you’re currently working from home, or are continuing to go into the office. Our services can be deployed individually or combined to form a broader solution to release your energies and focus on your clients.

 

Contact us today for a chat or send us an email to find out how we can support you in meeting your current and future challenges with confidence.

 

Explore our full range today.

 

Need A Regulatory Marketing Compliance Consultant? A Bit More About Us

 

Contact us

 

Why not download our FREE brochures? Click here.

 

Call us today on 020 8087 2377 or send us an email.

 

FOLLOW US

Connect with us via social media and drop us a message from there. We’d love to hear from you and discuss how we can help.

 

Facebook | Instagram | LinkedIn | X (formerly Twitter) | YouTube

 
