On 9 March 2019 the House of Lords Select Committee on Communications published its report, Regulating in a digital world. The report is ambitious in scope, examining how the digital world as a whole should be regulated.

The question of whether regulation is needed has gained new prominence as a series of concerns has hit the headlines: harmful online content, abusive and threatening behaviour, cybercrime, misuse of data, and political misinformation and polarisation.

The role of the law is dealt with relatively briefly.

"11. The internet is not an unregulated 'Wild West', as it has sometimes been characterised. Criminal and civil law generally applies to activity on the internet in the same way as elsewhere. For example, section 1 of the Malicious Communications Act 1988 prohibits the sending of messages which are threatening or grossly offensive; it applies whether the message is through the post or through any form of electronic communication. There is also legislation which specifically targets online behaviour, such as the Computer Misuse Act 1990."

and

"16.The transnational nature of the internet poses problems in enforcing regulation, including conflicts of law, confusion about which jurisdiction applies and in seeking redress against foreign actors. But individual countries are not powerless in enforcing their own laws. Professor Derek McAuley and his colleagues at the Horizon Digital Economy Research Institute, University of Nottingham, explained how the General Data Protection Regulation (GDPR) identifies jurisdiction by focusing on where the impact of processing occurs, namely the location of the data subject: "So generally, it is the case that services targeted at specific jurisdictions through localisation, whether through language or tailored local content, and generating revenue from such localisation should be required to obey the regulation within that jurisdiction."

Obeying the law online

The obvious issue is how users of the internet can be compelled to obey the law, and whether platforms should bear some of the responsibility for their users' behaviour. The Select Committee thinks they should. Not discussed, but implicit in the findings, is the cost of using the traditional courts to combat unlawful behaviour online.

While the report is wide ranging, one aspect is illegal (as distinct from harmful) content. The examples given are "Terrorism-related, child sexual abuse material, threats of violence, infringement of intellectual property rights". The conclusion the Committee reaches, for everything from terrorism-related content to breach of copyright, is:

"193.Online platforms have developed new services which were not envisaged when the e-Commerce Directive was introduced. They now play a key role in curating content for users, going beyond the role of a simple hosting platform. As such, they can facilitate the propagation of illegal content online. 'Notice and takedown' is not an adequate model for content regulation. Case law has already developed on situations where the conditional exemption from liability under the e-Commerce Directive should not apply. Nevertheless, the directive may need to be revised or replaced to reflect better its original purpose."

So the current hands-off model might change. It is suggested that platforms should invest in their moderation systems so that content which breaks the law or community standards is removed more quickly, and that they should provide a fair means of challenging moderation decisions.

Anyone who has looked at moderation, takedowns and challenges to moderation decisions will know that this is a hugely complex area, particularly when the issues to be decided range from hate speech to unfair competition.

A duty of care online

The proposals, however, go beyond improving existing systems and processes. Instead a "duty of care" is recommended. The report noted that:

"In the offline world the owners of physical spaces owe a duty of care to visitors. In line with the parity principle which was considered in chapter 2, Professor Woods and Mr Perrin argue that owners of online services should also be required to "take reasonable measures to prevent harm". Professor Woods told us that this approach avoided "some of the questions about making platforms liable for the content of others". In particular, action against online service providers "should only be in respect of systemic failures" rather than individual instances of speech."

The obvious question arises: does this extend to harm caused by unlawful activity, such as trademark infringement? If a platform facilitates trademark infringement by hosting infringing content, is it breaching the duty of care? If so, it is at risk of enforcement action by Ofcom should the Committee's recommendations be followed.

This note cannot do justice to the breadth and depth of the Committee's work. But the recommendations should give any business that hosts content cause for concern. If the intention is that the website owner becomes responsible for "harm", then the way in which websites are designed, operated and moderated will need to change. This is particularly so if the proposal for a Digital Authority is accepted.

A new Digital Authority

The proposal is that:

"243.The Digital Authority will co-ordinate regulators across different sectors and multiple Government department. We therefore recommend that it should report to the Cabinet Office and be overseen at the highest level."

We have discussed elsewhere the challenges of multiple regulators. What seems to be happening, however, is a move from a hands-off to a hands-on approach to control of the internet.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.