NEWS

The UK government’s online harms white paper: implications for freedom of expression
Parliament must be fully involved in shaping the government’s proposals for online regulation as the proposals have the potential to cause large-scale impacts on freedom of expression and other rights.
03 Jun 19

Some of the most popular social media apps. The UK government is deliberating whether to regulate them. Credit: Jason Howie/Flickr


Recommendations

  • Parliament must be fully involved in shaping the government’s proposals for online regulation as the proposals have the potential to cause large-scale impacts on freedom of expression and other rights.
  • The proposed duty of care needs to be limited and defined in a way that addresses the risk that it will create a strong incentive for companies and others to censor legal content, especially if combined with fines and personal liability for senior managers.
  • It is important to widen the focus from harms and what individual users do online to the structural and systemic issues in the architecture of the online world. For example, much greater transparency is needed about how algorithms influence what a user sees.
  • The government is aiming to work with other countries to build international consensus behind the proposals in the white paper. This makes it particularly important that the UK’s plans for online regulation meet international human rights standards. Parliament should ensure that the proposals are scrutinised for compatibility with the UK’s international obligations.
  • More scrutiny is needed regarding the implications of the proposals for media freedom, as “harmful” news stories risk being caught.

 

Introduction

The proposals in the government’s online harms white paper risk damaging freedom of expression in the UK, and abroad if other countries follow the UK’s example.

  • A proposed new statutory duty of care to tackle online “harms”, combined with substantial fines and possibly even personal criminal liability for senior managers, would create a strong incentive for companies to remove content.
  • The “harms” are not clearly defined but include activities and materials that are legal.
  • Even the smallest companies and non-profit organisations are covered, as are public discussion forums and file sharing sites.

The proposals come less than two months after the widely criticised Counter-Terrorism and Border Security Act 2019. The act contains severe limitations on freedom of expression and access to information online (see Index report for more information).

 

The duty of care: a strong incentive to censor online content

The proposed new statutory duty of care to tackle online harms, combined with the possibility of substantial fines and even personal criminal liability for senior managers, risks creating a strong incentive to restrict and remove online content.

Will Perrin and Lorna Woods, who developed the online duty of care concept, envisage that the duty would be implemented by applying the “precautionary principle”, which would allow a future regulator to “act on emerging evidence”.

Guidance by the UK Interdepartmental Liaison Group on Risk Assessment (UK-ILGRA) states:

“The purpose of the Precautionary Principle is to create an impetus to take a decision notwithstanding scientific uncertainty about the nature and extent of the risk, i.e. to avoid ‘paralysis by analysis’ by removing excuses for inaction on the grounds of scientific uncertainty.”

The guidance makes sense when addressing issues such as environmental pollution, but applying it in a context where freedom of expression is at stake risks legitimising censorship – a very dangerous step to take.

 

Not just large companies

The duty of care would cover organisations of all sizes: social media companies, public discussion forums, retailers that allow users to review products online, non-profit organisations (for example, Index on Censorship), file sharing sites and cloud hosting providers. A blog with comments would be included, as would shared Google documents.

The proposed new regulator is supposed to take a “proportionate” approach that takes into account companies’ size and capacity, but it is unclear what this would mean in practice.

 

Censoring legal “harms”

The white paper lists a wide range of harms, for example terrorist content, extremist content, child sexual exploitation, organised immigration crime, modern slavery, content illegally uploaded from prisons, cyberbullying, disinformation, coercive behaviour, intimidation, under-18s using dating apps and excessive screen time.

The harms are divided into three groups: harms with a clear definition; harms with a less clear definition; and underage exposure to legal content. Activities and materials that are not illegal are explicitly included. This would create a double standard: activities and materials that are legal offline would effectively become illegal online.

The focus on the catch-all term of “harms” tends to oversimplify the issues. For example, the recent Ofcom and Information Commissioner’s Office study Online Nation found that 61% of adults had had a potentially harmful experience online in the last 12 months. However, this included “mildly annoying” experiences. Not all harms need a legislative response.

 

A new regulator

The white paper proposes the establishment of an independent regulator for online safety, which could be a new or existing body. It mentions the possibility of an existing regulator, possibly Ofcom, taking on the role for an interim period to allow time to establish a new regulatory body.

The future regulator would face a daunting task: defining what companies (and presumably also others covered by the proposed duty of care) would need to do to fulfil the duty of care, establishing a “transparency, trust and accountability framework” to assess compliance, and taking enforcement action as needed.

The regulator would be expected to develop codes of practice setting out in detail what companies need to do to fulfil the duty of care. If a company chose not to follow a particular code it would need to justify how its own approach meets the same standard as the code. The government would have the power to direct the regulator in relation to codes of practice on terrorist content and child sexual exploitation and abuse.

 

Enforcement

The new enforcement powers outlined in the white paper will include substantial fines. The government is inviting consultation responses on a list of possible further enforcement measures. These include disruption of business activities (for example, forcing third-party companies to withdraw services), ISP blocking (making a platform inaccessible from the UK) and creating a new liability for individual senior managers, which could involve personal liability for civil fines or could even extend to criminal liability.

 

Undermining media freedom

The proposals in the white paper pose a serious risk to media freedom. Culture Secretary Jeremy Wright has written to the Society of Editors in response to concerns, but many remain unconvinced.

As noted, the proposed duty of care would cover a very broad range of “harms”, including disinformation and violent content. Combined with fines and potentially even personal criminal liability, this would create a strong incentive for platforms to remove content proactively, including news stories that might be considered “harmful”.

Index has filed an official alert about the threat to media freedom with the Council of Europe’s Platform to Promote the Protection of Journalism and Safety of Journalists. Index and the Association of European Journalists (AEJ) have made a statement about the lack of detail in the UK’s reply to the alert. At the time of writing the UK has not provided a more detailed reply.

 

Censorship and monitoring

The European Union’s e-commerce directive is the basis for the current liability rules related to online content. The directive shields online platforms from liability for illegal content that users upload unless the platform is aware of the content. The directive also prohibits general monitoring of what people upload or transmit.

The white paper states that the government aims to increase this responsibility and will introduce specific monitoring requirements for some categories of illegal content. This comes close to dangerous censorship territory, and it is doubtful whether it would be compatible with the e-commerce directive.

Restrictions on freedom of expression and access to information are extremely serious measures and should be backed by strong evidence that they are necessary and will serve an important purpose. Under international law freedom of expression can only be restricted in certain limited circumstances for specific reasons. It is far from clear that the proposals set out in the white paper would meet international standards.

 

Freedom of expression – not a high priority

The white paper gives far too little attention to freedom of expression. The proposed regulator would have a specific legal obligation to pay due regard to innovation; when it comes to freedom of expression, the paper refers only to an obligation to protect users’ rights, “particularly rights to privacy and freedom of expression”.

It is surprising and disappointing that the white paper, which sets out measures with far-reaching potential to interfere with freedom of expression, does not contain a strong and unambiguous commitment to safeguarding this right.

 

Contact: Joy Hyvarinen, Head of Advocacy, [email protected]