As part of its Digital Single Market strategy, the European Commission has identified the issue of combatting illegal content online as a key challenge for online platforms. It has now released guidance to online platforms containing a set of principles for tackling illegal content. The guidelines focus on three stages of the process: detecting illegal content; removing it; and preventing it from re-appearing.

The EU's Digital Single Market strategy ("DSM Strategy") was announced in May 2015. The European Commission's goal was to create a Digital Single Market across the EU, with significant initiatives focused on sectors such as e-commerce, media and entertainment, telecoms and the provision of online services. The aim is that actions taken as part of the DSM Strategy should break down regulatory barriers, enable online and digital entities to operate in a free and fair market, and move the market away from the fragmented and eclectic mix of laws and regulations that currently governs the online environment across the EU's 28 Member States. The Commission believes that the need to comply with multiple local laws across different Member States can be burdensome for businesses and can increase costs for consumers. We wrote earlier this year about the Commission's progress on its DSM Strategy initiatives.

One key part of the DSM Strategy targeted actions designed to combat illegal content online, including potential regulation of online platforms (an umbrella term describing a wide variety of online services). In 2016, the Commission decided not to introduce any new laws specifically to regulate online platforms' operations in Europe. However, in what the Commission sees as a quid pro quo for not legislating, it has pushed forward its plan to issue guidance to online platforms about illegal content. The Commission has now released that guidance, in the form of a non-binding communication to online platforms, containing a set of guidelines for tackling illegal content online (the "Guidelines"). The Guidelines focus on three stages of the process: detecting illegal content, removing it, and preventing it from re-appearing.

In the Guidelines, the Commission stresses once again that illegal content online is a key issue that needs to be tackled. It ties the need to remove such content not only to the protection of users and society at large but also to an economic benefit for the EU as a whole. In particular, the Commission argues that a failure to tackle illegal content online "can undermine citizens' trust and confidence in the digital environment and threaten the further economic development of platform ecosystems and the Digital Single Market". It is perhaps this aim to protect both society as a whole and the economic value of the Digital Single Market that informs the Commission's understanding of what "illegal content" really means, and the wide range of players required to address it.

What Are Online Platforms?

While high-profile news stories might suggest that the Commission is only targeting social media giants and search engines, the reality is that the term "online platform" covers a whole range of online services. According to the Commission, online platforms include: "online advertising platforms, marketplaces, search engines, social media and creative content outlets, application distribution platforms, communications services, payment systems, and platforms for the collaborative economy." So, as well as the obvious candidates to help in the fight against illegal content online, some unsuspecting sites – such as online marketplaces, and platforms which act as online repositories for sharing content or software (along with many more types of services) – are also in the Guidelines' crosshairs.

What Is Illegal Content?

The Commission makes clear that online platforms have a responsibility to protect their users and society at large, and to prevent criminals from exploiting their services. The most obvious examples appear in the context of terrorist material online or racist speech that incites violence and hatred. Interestingly, however, the Commission is at pains to stress that illegal content is an umbrella term that covers more than obvious, outwardly offensive material. It also includes things like IP infringement, illegal commercial practices, and defamatory online activities. This wide definition places more of an onus on online platforms, which have to engage more often with these "commercial" activities than obviously criminal ones.

A Question of Balance

The detection and removal of potentially illegal content has an obvious impact on the interests of platform users, particularly where automatic detection and filtering tools are used. In response, the Commission emphasizes the need to balance robust action against illegal content with safeguards for users' fundamental rights. This is a theme that runs through the Guidelines. For example, automatic detection and filtering tools should be used to detect illegal content, but only where there is a serious risk of harm or where content has already been identified as illegal. Similarly, online platforms should act quickly when removing illegal content, but should be transparent about their removal policies, and should have safeguards in place to prevent over-removal, such as an appeal process for accused users and repercussions for those who report illegal content in bad faith.

Liability Exemptions (Article 14 E-Commerce Directive)

Under Article 14 of the EU E-Commerce Directive, online platforms offering hosting services ("Service Providers") benefit from a liability exemption, which provides that they are not liable for information stored on their platforms at the request of third parties. This exemption has benefited many sites, shielding them from liability in respect of their users' uploads.

The exemption is only available as long as Service Providers do not play "an active role", i.e., they essentially act as a mere repository for user content. There has been a fear that greater involvement in the detection and removal of illegal content might be construed as an active role, causing the Article 14 exemption to fall away. Moreover, there has been some concern that the currently proposed new copyright directive seems to suggest that an active role might encompass a wider range of activities than had previously been thought.

However, the Commission stresses that, in its view, taking proactive measures to detect and remove illegal content online would not be considered an active role in the context of Article 14, and therefore Service Providers who implement such measures would still be able to benefit from the exemption. This is the case under both current legislation and the proposed new copyright directive.

The Commission also makes it clear that Article 14 requires Service Providers to "act expeditiously to remove or disable access to the content" when they become aware of it. The question then becomes: what measures must online platforms take to "become aware" of illegal content on their sites?

Detection and Notification of Illegal Content

The Commission identifies three main sources through which online platforms can obtain knowledge of illegal content:

  1. court orders or administrative decisions; 
  2. competent authorities, trusted flaggers, holders of IP rights, or users themselves; and 
  3. the online platform's own investigation or knowledge.

With regard to notices from other sources, the Commission's key guidance is that online platforms should have the right mechanisms in place both to communicate effectively with administrative bodies and to provide an adequate forum for users and rights holders to notify the platform about infringing content on its site. If the content is available to the general public, notification mechanisms should also be open to the general public (rather than only to members of the site, for example).

The Commission describes trusted flaggers as "specialised entities with specific expertise in identifying illegal content, and dedicated structures for detecting and identifying such content online", and suggests that notices from such entities should be prioritized and fast-tracked.

It is also important, according to the Guidelines, that online platforms do not simply rely on notification from other sources, but are proactive in identifying illegal content on their platforms, using and developing automatic detection and filtering technologies where there is a risk of serious harm. The Commission also points to Article 15 and Recital 47 of the E-Commerce Directive, which provide that while there is no general obligation for online platforms to monitor their users, this does not preclude them from having to comply with a specific request to do so.

Removing Illegal Content

On the question of what online platforms should actually do once illegal content is identified, the answer is generally straightforward: remove it as quickly as possible.

However, a contextual approach should be adopted. Where content is obviously illegal, it can be removed quickly. Where the question is more complex, as may be the case with a defamatory statement or a contested IP infringement, more consideration should be given to the decision, and questions can be referred to third parties or relevant competent authorities. One would assume that most online platforms will face these more complex issues more often than the straightforward ones.

As mentioned above, online platforms should be transparent in their removal policies and should provide accused users with a right to respond to any notices made about their content.

Preventing Re-Appearance of Illegal Content Online

The Guidelines suggest that online platforms' duties do not simply stop once illegal content is removed: rather, they should take steps to prevent that content from finding its way back online, for example by banning or suspending users who repeatedly upload illegal content of the same nature. Once again, automatic detection and filtering technologies should be used for content that has already been assessed as illegal, and platforms should keep any tools that they have in this regard up-to-date to keep pace with the increasing sophistication of criminals online.

Conclusion

The Guidelines, while they are non-binding and do not alter the current legislative framework, do suggest some onerous measures for online platforms to take when detecting and removing illegal content, particularly because the term "illegal content" can encapsulate a wide range of activity. These measures also require a balancing act between protecting society and respecting the fundamental rights of users. While it remains to be seen how concrete the Guidelines become, there seems to be a growing implication that online platforms should take more responsibility for the content on their sites and should be more proactive in managing that content.

Rayhaan Vankalwala, a trainee solicitor in Morrison & Foerster's London office, assisted in the preparation of this client alert.

Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP. All rights reserved