The UK has published an Online Harms White Paper, setting out its proposals for new online safety laws. Like the Irish Government’s proposals (discussed here), the UK proposals aim to make online platforms more responsible for the safety of their users, especially children and other vulnerable groups. The new laws will apply to any company that allows users to share or discover user-generated content, or to interact with each other online, including social media platforms, file hosting sites, public discussion forums, messaging services, and search engines. The 12-week consultation period on the new laws runs until 1 July 2019.
The UK consultation paper seeks views on a number of issues including:
- the online services falling within the remit of the regulatory framework;
- options for appointing an independent regulator responsible for enforcing the new framework;
- the regulatory body’s enforcement powers;
- potential redress mechanisms for online users; and
- measures to ensure regulation is targeted and proportionate for the industry.
The proposals include a new statutory ‘duty of care’ requiring companies to take more responsibility for the safety of their users, and to tackle illegal and harmful content or activity on their services. Compliance with this duty of care will be overseen and enforced by an independent regulator, which will issue codes of practice setting out how companies can fulfil their new legal duty.
The Government is consulting on the regulator’s suite of enforcement powers, which may include the power to levy substantial fines, block access to non-compliant sites, and impose liability on individual members of senior management. The regulator will also have the power to require annual transparency reports from companies in scope, outlining the prevalence of harmful content on their platforms and the countermeasures they are taking to address it. These reports will be published online by the regulator, so that users and parents can make informed decisions about internet use.
The new laws aim to tackle a range of harms, including incitement to violence and violent content, encouragement of suicide, disinformation, cyberbullying, and children accessing inappropriate material. There will also be stringent requirements on companies to ensure terrorist and child sexual exploitation and abuse (CSEA) content is not disseminated online.
The consultation paper sets out high-level expectations of companies, including some specific expectations in relation to certain harms. For the most serious online offending, such as CSEA and terrorism, the Government expects companies to go much further and demonstrate the steps taken to combat the dissemination of associated content and illegal behaviours.
The Government has stated that the new laws “will increase the responsibility of online services in a way that is compatible with the EU’s e-Commerce Directive, which limits their liability for illegal content until they have knowledge of its existence, and have failed to remove it from their services in good time”. However, it remains to be seen how this hosting exemption can be reconciled with the new laws, insofar as the regulator may potentially levy substantial fines on companies for failing to monitor and prevent certain categories of harmful content on their platforms.