- Regulator to kick-start 100-day plan after Bill passes to get online safety regime up and running.
Tech firms should start preparing now for new online safety rules, the Office of Communications (Ofcom) says, as they set out detailed plans for implementing the new laws.
The UK is preparing to become among the first countries in the world to introduce comprehensive new laws aimed at making online users safer, while preserving freedom of expression. The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms – as well as other services that people use to share content online.
Ofcom expects the Online Safety Bill to pass by early 2023 at the latest, with powers coming into force two months later.
Immediate action once powers kick in
Within the first 100 days of the powers taking effect, Ofcom will focus on getting the ‘first phase’ of the new regulation up and running – protecting users from illegal content harms, including child sexual exploitation and abuse, and terrorist content. They will set out:
- a draft Code of Practice on illegal content harms explaining how services can comply with their duties to tackle them; and
- draft guidance on how Ofcom expect services to assess the risk of individuals coming across illegal content on their services and associated harms.
To help companies identify and understand the risks their users may face, Ofcom will also publish a sector-wide risk assessment. This will include risk profiles for different kinds of services that fall in scope of the regime. They will also consult on their draft enforcement guidelines, transparency reporting and record-keeping guidance.
Ofcom will consult publicly on all these documents and expect to finalise them in spring 2024. Within three months of that guidance being finalised, companies must have completed their risk assessments related to illegal content, and be ready to comply with their duties in this area from mid-2024 once the Code of Practice has been laid in Parliament.
Ofcom is ready and able to evolve their timelines and plans, should the timing or substance of the Bill change.
Early engagement with high-risk services
As well as expecting tech firms to engage as they consult, Ofcom will also identify high-risk services for closer supervision. The companies that run these sites or apps must be ready – as soon as Ofcom's first set of powers comes into force in early 2023 – to explain their existing safety systems to the regulator and, importantly, how they plan to develop them.
Ofcom will expect companies to be open about the risks they face and the steps they are taking to address them. They will want to know how companies have evaluated those measures, and what more they might consider doing to keep users safe. Ofcom will also seek to understand users' attitudes to those services, and consider evidence from civil-society organisations, researchers and expert bodies.
Where Ofcom consider that a platform is not taking appropriate steps to protect users from significant harm, they will be able to use a range of investigation and enforcement powers.
Action following secondary legislation
Some elements of the online safety regime depend on secondary legislation – for example, the definition of priority content that is harmful to children, and priority content that is legal but harmful to adults. So duties in these areas will come into effect later and timings will be subject to change, depending on when secondary legislation passes.
Ofcom will move quickly to publish draft Codes of Practice and guidance on these areas shortly after secondary legislation passes. Once again, they will consult publicly on these before finalising them.
Mark Bunting, Ofcom’s Online Safety Policy Director, said: “We’ll move quickly once the Bill passes to put these ground-breaking laws into practice. Tech firms must be ready to meet our deadlines and comply with their new duties. That work should start now, and companies needn’t wait for the new laws to make their sites and apps safer for users.”
Maintaining momentum this year
Ofcom’s preparations to take on its new role are continuing apace. They are calling for evidence on the ‘first phase’ areas identified for consultation: the risk of harm from illegal content; the tools available to services to manage this risk; child access assessments; and transparency requirements. Ofcom would like to hear from companies that are likely to fall within the scope of the regime, as well as other groups and organisations with expertise in this area.
In the immediate months ahead, Ofcom will build on work already underway by:
- ramping up engagement with tech firms, large and small;
- publishing its first report on how video-sharing platforms such as TikTok, Snapchat, Twitch and OnlyFans are working to tackle harm;
- undertaking and publishing research on the drivers and prevalence of some of the most serious online harms in scope of the Bill, as well as technical research on how these might be mitigated;
- further developing skills and operational capabilities, building on the expertise Ofcom has already brought in from the technology industry, academia and the third sector; and
- continuing to work with other regulators through the Digital Regulation Cooperation Forum to ensure a joined-up approach between online safety and other regimes.
What the new laws will mean
This is novel regulation and so it is also important to understand what the Online Safety Bill does – and does not – require.
The focus of the Bill is not on Ofcom moderating individual pieces of content, but on the tech companies assessing risks of harm to their users and putting in place systems and processes to keep them safer online.
As well as setting Codes of Practice and giving guidance on compliance, Ofcom will have powers to demand information from tech companies on how they deal with harms and to take enforcement action when they fail to comply with their duties. The Bill will also ensure the tech companies are more transparent and can be held to account for their actions.
It’s also important to recognise that:
- Ofcom will not censor online content. The Bill does not give Ofcom powers to moderate or respond to individuals’ complaints about individual pieces of content. The Government recognises – and Ofcom agrees – that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, Ofcom will tackle the causes by ensuring companies design their services with safety in mind from the start.
- Tech firms must minimise harm, within reason. Ofcom will examine whether companies are doing enough to protect their users from illegal content and content that is harmful to children, while recognising that no service in which users freely communicate and share content can be entirely risk-free. Under the draft laws, the duties placed on in-scope online services are limited by what is proportionate and technically feasible.
- Services can host content that is legal but harmful to adults, but must have clear service terms. Under the Bill, services with the highest reach – known as ‘Category 1 services’ – must assess risks associated with certain types of legal content that may be harmful to adults. They must have clear terms of service or community guidelines explaining how they handle it, and apply these consistently. They must also provide tools that empower users to reduce their likelihood of encountering this content. But they will not be required to block or remove legal content unless they choose to.
- This plan is based on the current understanding of the Bill as it stands, and the likely timing for passage of legislation (including secondary legislation) under the Bill. At the time of publication, the Bill has passed Committee stage in the House of Commons and is subject to amendment as it passes through the rest of the Parliamentary process. Consequently, the timelines and requirements described are provisional and may change. Ofcom will continue to look for opportunities to bring forward implementation as the legislative timetable becomes clearer (including the likely timing of relevant secondary legislation), and will provide a further update on implementation plans if they change significantly.
- All services in scope of the Bill have a duty to protect users from illegal content. They must assess, among other things, the risk of individuals coming across illegal content on their platforms, and how that risk is affected by the design of their service. Tech firms must also establish whether children, in significant numbers, can access any part of their service. Companies must put in place measures to mitigate and manage the risks of illegal content and, if they’re likely to be accessed by children, material which is harmful to children, as well as allowing their users to report content and complain. Providers of pornographic material have a standalone duty in the Bill to ensure that children cannot normally access their services. This duty is separate from the requirement on user-to-user and search services to conduct a children’s access assessment and from the duties on those services which are likely to be accessed by children to take steps to protect children from harmful content – which would include user-generated pornographic content or pornographic content in search results. To ensure consistency in Ofcom’s approach to regulating pornographic content across the board – whether published by users or companies – Ofcom are currently expecting to consult on guidance and Codes covering the protection of children from pornographic material together in autumn 2023, after secondary legislation has been passed.
- Ofcom will identify services that will be the subject of this focused engagement through sector risk assessment and other relevant information, and notify them in advance of the engagement beginning.
- The duties regarding content that is legal but harmful to adults only apply to so-called Category 1 services, the largest and highest-risk services. These services will also be required to produce transparency reports, as will the biggest search services (‘Category 2a’) and other smaller but potentially risky services (‘Category 2b’). The Government anticipates around 30-40 services will be in one of these three special categories, so most services will not be subject to these duties.
- Ofcom’s current expectation is that they will set out draft Codes of Practice and risk guidance on protecting children from legal harms, as well as a sector-wide risk assessment, in autumn 2023. They will consult publicly and expect to finalise them within a year, at which point firms should expect to be ready to comply with these duties. Ofcom expect to set out draft Codes of Practice and risk guidance on protecting adults from legal harms in early 2024. Once again, they will consult publicly and expect to finalise them within a year, at which point companies should expect to be ready to comply with these duties.