On 26 October 2023 the Government’s Online Safety Bill received Royal Assent, becoming the Online Safety Act 2023. The legislation appoints Ofcom as the regulator of “certain internet services” and imposes on technology companies a ‘duty of care’ towards their users.
During the passage of the Bill, English PEN campaigned alongside colleagues at Scottish PEN and other civil liberties NGOs in calling for the Government to rethink its proposals. Instead of imposing an ill-defined duty of care, we argued that a duty to protect established human rights would oblige those companies to keep their users safe from harm, while also protecting free speech and privacy.
The Government agreed to some major changes to the legislation, most notably the removal of the ‘legal but harmful’ concept for adults. However, the ‘duty of care’ remains central to the new Act, along with several other provisions that we consider ill-judged and a threat to human rights.
Will the Codes Chill Free Speech?
A central concern is that regulation of this nature will chill legitimate freedom of expression. In November 2023, Ofcom published a consultation document, ‘Protecting people from illegal harms online’, that runs to more than 1,700 pages. Compliance with the regulatory regime will be an onerous task even for the tech giants, and we fear that smaller companies (who may be the most innovative, and who may serve niche and minority groups) will find the regulatory burden too great. The effect will be that many companies simply stop hosting user-generated content. Those that continue to operate social networks will ‘err on the side of caution’ and over-censor their users. Speech that is legal offline will become impossible to post online. The chill will grow, and online discourse will become less diverse as a result.
Ofcom must be mindful of this chilling effect when it publishes the guidance and codes of practice demanded by the Online Safety Act. The test for these codes should be whether technology companies can understand precisely how different kinds of content must be moderated, without recourse to expensive regulatory lawyers. Ordinary users should be able to read the codes and understand whether the content they wish to post online will be censored. The worst possible outcome would be codes that are vague and overbroad: rules that are poorly worded and inconsistently policed squeeze the space for political dissent and artistic expression.
An additional concern is that the nebulous concept of a ‘duty of care’ will prove impossible to police. A predictable outcome is that the technology companies will find they cannot deliver the kind of content moderation that the Bill’s supporters said was possible. We fear that, when the new regime fails to deliver the hoped-for outcomes, there will be calls for ever-tighter regulation. Freedom of expression will be further eroded, while the people we hope to protect remain vulnerable.
End-to-End Encryption
In recent years the major technology companies have all implemented End-to-End Encryption (E2EE) on private messaging services such as WhatsApp. Messages are encrypted on the sender’s device and decrypted on the recipient’s device. The company running the service cannot decrypt or read the messages it delivers.
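The principle can be shown in a few lines of code. The sketch below uses Python and the `cryptography` package, with a single Diffie-Hellman key exchange standing in for the far more sophisticated Signal protocol that services like WhatsApp actually use; it is an illustration of the structure, not any vendor’s implementation. The point is that the private keys never leave the two devices, so the relay server only ever handles ciphertext.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each user generates a key pair on their own device.
# Only the public halves are ever sent to the server.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides derive the same shared secret from their own private key
# and the other party's public key (Diffie-Hellman key agreement).
shared = alice_private.exchange(bob_private.public_key())
assert shared == bob_private.exchange(alice_private.public_key())

# Derive a symmetric session key from the shared secret.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"e2ee-demo"
).derive(shared)

# The sender encrypts on their own device; the server relays only ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"Hello, Bob", None)

# Only the recipient, holding the same session key, can decrypt.
assert AESGCM(session_key).decrypt(nonce, ciphertext, None) == b"Hello, Bob"
```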
Throughout the passage of the Bill, campaigners expressed concern at the presence of a ‘spy clause’ that threatened to undermine E2EE. This clause has unfortunately found its way into the Act. Section 121 gives Ofcom the power to require tech companies to use an “accredited technology” to scan user-generated content. Buried in sub-section 121(2)(a)(iii) is a power to require that companies scan content “whether communicated publicly or privately.”
Regardless of the precise form of the technology, its use on private messaging would be incompatible with E2EE. Since the service provider never sees the plaintext of an end-to-end encrypted message, the only place scanning can happen is on the user’s device, before encryption. That amounts to building a ‘back door’ into the apps, allowing the companies to inspect every message sent via their networks. And if the back door exists, it will eventually be exploited by organised crime and rights-abusing regimes.
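A hypothetical sketch makes the structural problem concrete. The function names and the hash-matching approach below are our own illustrative assumptions, not any company’s real code (real proposals typically involve perceptual hashing of images rather than exact hashes); what matters is where the scanner has to sit.

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical database of flagged-content hashes (empty placeholder here).
BLOCKLIST_HASHES: set[str] = set()


def report_match(plaintext: bytes) -> None:
    """Stub for whatever reporting the 'accredited technology' would trigger."""
    print("match reported")


def send_message(session_key: bytes, plaintext: bytes) -> bytes:
    # The scanner has to run here, *before* encryption, because once the
    # message is encrypted nobody but the recipient can read it. Whoever
    # controls this code path can read every message the user sends: the
    # E2EE guarantee no longer holds against them.
    if hashlib.sha256(plaintext).hexdigest() in BLOCKLIST_HASHES:
        report_match(plaintext)
    nonce = os.urandom(12)
    return nonce + AESGCM(session_key).encrypt(nonce, plaintext, None)
```

Once such a code path ships in the client, the argument above follows: anyone able to alter the blocklist or the reporting hook, whether a government or a criminal who compromises the update channel, inherits the same access to users’ private messages.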
Some companies have threatened to withdraw their messaging apps from the UK if Ofcom insists that this technology be deployed. This would be a disaster for the British economy and cause huge disruption to the way people conduct their business and their cultural life. Ofcom must resist the temptation to exercise its powers under section 121.
The International Example
English PEN is part of an international network of PEN centres. Our approach to freedom of expression is informed by the experiences of fellow writers around the world, many of whom live and write under oppressive regimes. One lesson we have learned is that authoritarian regimes deliberately exploit ambiguous laws to persecute their critics.
We fear that the example set by the Online Safety Act will give authoritarians the ‘cover’ they need to enact ambiguous internet regulation in their own countries. Their laws may follow the same structure as the Online Safety Act, but impose a broader conception of ‘harm prevention’ onto social media users posting in their country. Political dissent and the discussion of alternative lifestyles may be suppressed under the guise of preventing ‘harm.’
The Online Safety Act 2023 is therefore a missed opportunity. The United Kingdom parliament could have produced ‘world leading’ legislation which strengthened human rights and offered social media users some protections against the commodification of their personal data and their emotions. Instead, the new regulatory system will catalyse the erosion of freedom of expression and privacy around the world, while failing to properly protect users in the UK.