Online safety regulator Ofcom has published its first set of codes and guidance under the Online Safety Act, setting out the duties tech firms must comply with regarding illegal harms.
The landmark safety laws will require platforms to put a range of safety measures in place which Ofcom says will better protect users – including improved moderation, built-in safety tools, clear ways to report harmful content and clear accountability among senior staff for safety compliance.
Here is a closer look at Ofcom’s announcement and the wider legislation.
– What is the Online Safety Act?
Passed in late 2023, the Online Safety Act is the UK’s first major legislation to regulate social media, search engine, messaging, gaming, dating, pornography and file-sharing platforms.
At its core, the Act places a range of new safety duties on sites, which will compel them to protect users from illegal and other harmful content.
Sites will be expected to do this by putting robust safety features in place to prevent such content appearing in the first place, and by acting swiftly to remove it when it does.
The new duties will be set out in a range of codes of practice and other guidance published by Ofcom over the coming months, with each one focusing on a specific content area.
The Act gives Ofcom the power to fine firms that fail to meet these duties – potentially up to billions of pounds for the largest sites – and, in the most serious cases, to seek a court order blocking access to a site in the UK.
– What has Ofcom published now?
The regulator has released its first codes of practice, which specifically focus on illegal harms online.
This is content such as that linked to terrorism, hate, fraud, child sexual abuse and assisting or encouraging suicide, Ofcom says.
The codes are designed to help platforms comply with the new rules by setting out best practice on the measures and structures they should have in place by the time the duties are expected to come into force in three months’ time.
The largest platforms will be expected to do the most to protect users, in particular children.
The codes of practice outline that sites should have senior staff accountability for safety, have strong moderation and reporting tools in place, as well as robust measures to protect children from abuse and exploitation online.
The first set of codes also calls for measures to tackle pathways to online grooming, automated tools to detect child sexual abuse material, and steps to protect women and girls, identify fraud and remove terrorist accounts.
🚨 A major milestone in online safety.

Sites and apps must now start taking action to protect people from illegal harm online.

Key measures include:
🔒 Senior accountability for safety
🛠️ Improved content moderation and reporting systems
🛡️ Protecting children from abuse

— Ofcom (@Ofcom) December 16, 2024
– What has been the response?
While many have welcomed steps being taken to better regulate social media, some campaigners have expressed their disappointment at Ofcom’s approach, arguing it has not been forceful enough.
The Molly Rose Foundation, which was set up by the family of Molly Russell, the 14-year-old who ended her life in 2017 after viewing suicide content on social media, said it was “astonished and disappointed” with Ofcom’s codes, adding there is not “one single targeted measure” for social media sites to “tackle suicide and self-harm material that meets the criminal threshold”.
Maria Neophytou, acting chief executive at the National Society for the Prevention of Cruelty to Children, said the charity was “deeply concerned that some of the largest services will not be required to take down the most egregious forms of illegal content, including child sexual abuse material”.
She said Ofcom’s proposals will “at best lock in the inertia to act and at worst create a loophole which means services can evade tackling abuse in private messaging without fear of enforcement”.
– What has Ofcom said about the codes?
Dame Melanie Dawes, Ofcom chief executive, said the introduction of the Online Safety Act meant that sites would no longer be “unregulated, unaccountable and unwilling to prioritise people’s safety over profits”.
“The safety spotlight is now firmly on tech firms and it’s time for them to act,” she said.
“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”
Ofcom has said it will continue to roll out further codes and proposals in 2025, including more on the response to child sexual abuse material.
– What happens next?
Tech firms now have until March to start putting Ofcom’s proposals into place on their sites, ensuring they comply with these aspects of the Online Safety Act as it begins to come into force.
Meanwhile, Ofcom has said it will continue to publish more codes of practice in the early months of next year on a range of other harms included in the Act, including guidance for pornography publishers expected in January, guidance on protecting women and girls in February, and details on additional protection for children around harmful content promoting suicide, self-harm and eating disorders in April.