Protecting people from hate and terrorism under the online safety regime


INSIGHT
Published
22 February 2024

The use of online services to incite and radicalise vulnerable people, including children, towards hate and violence poses a major risk. It can have horrific consequences and, in the severest of cases, can lead to mass murder, often targeting minorities and protected groups.

 

On 15 March 2019, a far-right terrorist killed 51 Muslim worshippers at two mosques in Christchurch, New Zealand. The tragic attack was livestreamed for 17 minutes and viewed over 4,000 times. The UK Government has recently taken steps to proscribe a number of far-right groups as terrorist organisations, highlighting the increasing threat from far-right terrorism. ISIS propaganda online was also instrumental in convincing Britons to leave the UK for Raqqa, Syria, as well as inspiring terrorist attacks in several countries.

 

Ofcom's report into the racially motivated Buffalo attack of May 2022 in the US further emphasised the global and multi-platform nature of the risk to UK internet users. It showed that terrorist and hate groups form part of borderless networks, jumping between online services to incite others, mobilise support or plan attacks.

 

The Buffalo attacker said he was radicalised on the imageboard 4chan, stored his diary and manifesto on Discord, and livestreamed the attack on Twitch. Although the livestream lasted only two minutes, copies of the footage were disseminated globally across multiple platforms, exposing users of online services in the UK to trauma and to an increased risk of hate, violence and terrorism being incited against them.

 

Evidence is key to Ofcom’s work in this area

Building an evidence base and scaling teams in this important harms area has been, and will continue to be, vital for Ofcom. Ofcom has commissioned two reports from the Institute for Strategic Dialogue (ISD) to build its understanding of user experiences of online terrorism and incitement to violence and hate. It has also used formal information-gathering powers against the UK's video-sharing platforms (VSPs) to put the systems and processes of services such as TikTok, BitChute, Twitch, Vimeo and Snapchat under the microscope. A recent report into VSPs' terms and conditions, including those for terrorism and incitement to hatred and violence, concluded that many adults would struggle to understand them – children even less so.

 

In November 2023, Ofcom published draft proposals setting out the steps it expects relevant online platforms to take to assess and mitigate the risk of illegal content such as hate and terrorism; these proposals will be developed and elaborated over time. They include:

 

  • Set clear and accessible terms and conditions that explain how users will be protected from illegal terrorist and hateful content.
  • Assess the risk of terrorist and hateful content being disseminated, and take steps to mitigate the risks identified.
  • Design content moderation systems to swiftly take down illegal terrorist, violence-inciting and hateful content. Prioritisation policies for content moderation should factor in the severity of content and its potential to go viral.
  • Resource and train content moderation teams adequately to deal with hateful and terrorist content, including meeting increases in demand caused by external events such as crises and conflicts.
  • When making changes to recommender systems, test them to assess the impact those changes will have on the dissemination of illegal hateful and terrorist content.
  • Provide user reporting and complaints processes for illegal terrorist and hateful content on all services, and make them easy to find, access and use.
  • Remove accounts where there are reasonable grounds to infer they are run by, or on behalf of, a terrorist organisation proscribed by the UK Government.
  • For search services, ensure content moderation systems result in illegal terrorist or hateful content being de-indexed or de-prioritised.
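To make the prioritisation point concrete, here is a purely illustrative sketch (not drawn from Ofcom's proposals) of a moderation queue that orders flagged content by a score combining severity with a virality signal. The category names, weights and view-rate signal are all hypothetical assumptions.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical severity weights for flagged categories
# (illustrative values only, not from any Ofcom guidance).
SEVERITY = {"terrorism": 3.0, "incitement_to_violence": 2.5, "hate": 2.0}

@dataclass(order=True)
class Report:
    priority: float
    content_id: str = field(compare=False)

def priority_score(category: str, views_per_minute: float) -> float:
    """Combine category severity with a virality signal; higher = review sooner."""
    return SEVERITY.get(category, 1.0) * (1.0 + views_per_minute / 100.0)

class ModerationQueue:
    def __init__(self):
        self._heap = []

    def add(self, content_id: str, category: str, views_per_minute: float):
        score = priority_score(category, views_per_minute)
        # heapq is a min-heap, so negate the score to pop the highest first.
        heapq.heappush(self._heap, Report(-score, content_id))

    def next_for_review(self) -> str:
        return heapq.heappop(self._heap).content_id

queue = ModerationQueue()
queue.add("post-a", "hate", views_per_minute=10.0)
queue.add("post-b", "terrorism", views_per_minute=500.0)  # severe and going viral
queue.add("post-c", "other", views_per_minute=50.0)
print(queue.next_for_review())  # prints "post-b"
```

In this sketch a severe item that is spreading fast jumps the queue ahead of older, lower-severity reports, which is the behaviour the prioritisation bullet describes.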

 

Responding to emerging issues

Following the onset of the crisis in Israel and Gaza in October 2023, Ofcom reached out to several key civil society organisations, independent research organisations and international regulatory counterparts through the Global Online Safety Regulators Network. Ofcom sought to understand the scale of illegal and harmful content on online services relating to the crisis.

 

These organisations shared concerns about reductions in trust and safety teams and the knock-on effect on online services' ability to cope with the scale of harmful content being posted, in particular spikes in anti-Muslim and antisemitic hatred. Some expressed worries about whether terms and conditions were clear and accessible, especially in relation to illegal terrorist and hateful content, and whether they were being swiftly enforced. Ofcom also wrote to regulated video-sharing platforms about the increased risk of their users encountering harmful content stemming from the crisis in Israel and Gaza, and the need to protect users from such content.

 

Developing regulation in respect of illegal hateful and terrorist content remains an ongoing priority for Ofcom. This will be particularly challenging as emerging technologies, such as generative AI, evolve at pace, and Ofcom strives to make sure its proposals remain effective and current. Ofcom will also be stepping up engagement with specific regulated services to better understand, assess and improve their systems pertaining to illegal hate and terrorism.

 

Lastly, international engagement will be a key area of work for Ofcom, particularly with other regulators and cross-industry initiatives such as the Christchurch Call, Tech Against Terrorism, the Global Internet Forum to Counter Terrorism and the EU Internet Forum. Ofcom is keen to identify opportunities for collaboration and partnership to better protect UK users from a fast-changing, global and multi-platform harms area.

 

Source: Office of Communications (Ofcom)

 

About Ofcom

Ofcom is the regulator for the communications services that we use and rely on each day.

 

 

About us

LS Consultancy are experts in Marketing and Compliance, and work with a range of firms to improve their documents, processes and systems and mitigate risk.

 

We provide cost-effective and timely bespoke copy advice and copy development services to make sure all your advertising and campaigns are compliant, clear and suitable for their purpose.

 

Our range of innovative solutions can be tailored to suit your unique requirements, no matter whether you’re currently working from home, or are continuing to go into the office. Our services can be deployed individually or combined to form a broader solution to release your energies and focus on your clients.

 

Contact us today for a chat or send us an email to find out how we can support you in meeting your current and future challenges with confidence.

 

Explore our full range today.

 

Need A Regulatory Marketing Compliance Consultant? A Bit More About Us

 

Contact us

 

Why Not Download our FREE Brochures! Click here.

 

Call Us Today on 020 8087 2377 or send us an email.

 

FOLLOW US

Connect with us via social media and drop us a message from there. We’d love to hear from you and discuss how we can help.

 

Facebook | Instagram | LinkedIn | X (formerly Twitter) | YouTube

 
