A harmonised and coherent approach to removing illegal content does not exist at present in the EU. Approaches currently differ depending on the Member State, the content category, or the type of online platform. A more aligned approach would make the fight against illegal content more effective. It would also benefit the development of the Digital Single Market and reduce the cost of compliance with a multitude of rules for online platforms, including for new entrants. It is important to stress that what constitutes “illegal” content is determined by specific legislation at EU level as well as by national law.
Online platforms may become aware of the existence of illegal content through a number of different channels. Such channels for notification include (i) court orders or administrative decisions; (ii) notices from competent authorities (e.g. law enforcement bodies), specialised “trusted flaggers”, intellectual property rights holders or ordinary users; or (iii) the platforms’ own investigations or knowledge.
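To make these channels concrete, the sketch below models a notice record covering the sources listed above. It is purely illustrative: the type and field names are assumptions made for the example, not anything prescribed by EU or national law.

```typescript
// Illustrative data model for a notice of potentially illegal content.
// The channel values mirror the sources listed in the text above.

type NoticeChannel =
  | "court_order"             // binding court order
  | "administrative_decision" // binding administrative decision
  | "competent_authority"     // e.g. a law enforcement referral
  | "trusted_flagger"
  | "rights_holder"           // e.g. an intellectual property rights holder
  | "ordinary_user"
  | "own_investigation";      // the platform's own investigations or knowledge

interface Notice {
  id: string;
  channel: NoticeChannel;
  contentUrl: string;   // where the notified content is located
  reason: string;       // why the notifier considers the content illegal
  receivedAt: Date;
  binding: boolean;     // true for court orders and administrative decisions
}
```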
In accordance with EU and/or national law, national courts and, in certain cases, competent authorities can issue binding orders or administrative decisions addressed to online platforms requiring them to remove or block access to illegal content. Online platforms should therefore have the necessary resources to understand the legal frameworks in which they operate. Online platforms should systematically enhance their cooperation with the competent authorities in the Member States, while Member States should ensure that courts can react effectively against illegal content online and that (cross-border) cooperation between authorities is strengthened.
Online platforms and law enforcement or other competent authorities should appoint effective points of contact in the EU and, where appropriate, define effective digital interfaces to facilitate their interaction. Platforms and law enforcement authorities are also encouraged to develop technical interfaces that allow them to cooperate more effectively throughout the entire content governance cycle. Cooperation with the technical community can also be beneficial in advancing towards effective and technically sound solutions to this challenge.
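As a purely hypothetical illustration of such a digital interface, the sketch below shows one possible contract for an authority-to-platform referral endpoint. The endpoint, payload shapes and field names are assumptions made for the example, not an agreed standard.

```typescript
// Hypothetical contract for a point-of-contact interface between a
// competent authority and a platform: the authority submits a referral
// and receives an acknowledgement with a stable case reference.

interface AuthorityReferral {
  authorityId: string;  // the authority's registered point of contact
  contentUrl: string;
  legalBasis: string;   // the provision under which the referral is made
  deadline?: Date;      // a response deadline, where one applies
}

interface ReferralAcknowledgement {
  caseReference: string;   // reference used by both sides for follow-up
  receivedAt: Date;
  platformContact: string; // the platform's designated point of contact
}

// e.g. POST /referrals returning a ReferralAcknowledgement
declare function submitReferral(
  referral: AuthorityReferral
): Promise<ReferralAcknowledgement>;
```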
The removal of illegal content online happens more quickly and reliably where online platforms put in place mechanisms that give a privileged channel to notice providers with particular expertise in notifying the presence of potentially illegal content on a platform’s website. These so-called “trusted flaggers” are specialised entities with specific expertise in identifying illegal content and dedicated structures for detecting and identifying such content online.
Compared to ordinary users, trusted flaggers can be expected to bring their expertise and to work to high quality standards, which should result in higher quality notices and faster take-downs. To ensure such quality and speed, criteria based notably on respect for fundamental rights and for democratic values could be agreed by the industry at EU level, under which a particular entity can be considered a trusted flagger. This could be done through self-regulatory mechanisms or within the EU standardisation framework, allowing sufficient flexibility to take account of content-specific characteristics and of the role of the trusted flagger.
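By way of illustration only, a platform might record trusted flagger accreditation against such agreed criteria along the following lines; the fields are assumptions drawn loosely from the text, not formally established criteria.

```typescript
// Illustrative record of trusted flagger accreditation. The criteria
// fields (expertise, quality standards, fundamental rights) paraphrase
// the text above; none of them is a formally agreed EU criterion.

interface TrustedFlagger {
  entityId: string;
  expertiseAreas: string[];            // e.g. ["terrorist content"]
  meetsQualityStandards: boolean;      // high quality notices expected
  respectsFundamentalRights: boolean;  // criterion suggested at EU level
  accreditedUnder: "self_regulation" | "eu_standardisation";
}
```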
In the effective fight against illegal content online, ordinary users should be empowered to signal illegal content to online platforms and to have confidence that justified notices will be considered and acted upon swiftly. Online platforms should establish an easily accessible and user-friendly mechanism that allows their users to notify content, hosted by the platform, which they consider to be illegal. Where the content is publicly available, such reporting mechanisms should also be open to the general public, without the need to sign in as a user. To improve the efficiency and accuracy of the assessment of potentially illegal content, such mechanisms should allow for easy notification by electronic means.
Users should normally not be obliged to identify themselves when reporting content they consider illegal, unless this information is required to determine the legality of the content (for example, to assert ownership of intellectual property rights (IPR)). This is especially important where a user’s safety could be at risk or where revealing one’s identity could have legal implications. Users who wish to remain anonymous vis-à-vis platforms should be encouraged to submit their notification via trusted flaggers, where these exist.
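The sketch below illustrates, under the assumptions of the two preceding paragraphs, what a public electronic reporting mechanism might validate: electronic submission, no sign-in, and identification kept optional except where the legality assessment requires it (the IPR example). All names are illustrative.

```typescript
// Illustrative validation for a public report of potentially illegal
// content: submission is electronic, no account is needed, and the
// reporter's identity is optional unless the notice type requires it.

interface UserReport {
  contentUrl: string;
  category: "ipr" | "other";                  // simplified for the sketch
  explanation: string;                        // why the user deems it illegal
  reporter?: { name: string; email: string }; // optional: anonymity allowed
}

function validateReport(report: UserReport): string[] {
  const errors: string[] = [];
  if (!report.contentUrl) errors.push("contentUrl is required");
  if (!report.explanation) errors.push("an explanation is required");
  // Identity is mandatory only where legality cannot be assessed without
  // it, e.g. asserting ownership of intellectual property rights.
  if (report.category === "ipr" && !report.reporter) {
    errors.push("an IPR notice requires the rights holder's identity");
  }
  return errors;
}
```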
Online platforms should provide a clear, easily understandable and sufficiently detailed explanation of their content policy in their terms of service. This policy should cover both illegal content and content which does not respect the platform’s terms of service. All restrictions on the kind of content permitted on a particular platform should be clearly stated and communicated to users. The explanation should also cover the procedures in place for contesting removal decisions, including those triggered by trusted flaggers.
Online platforms should publish transparency reports with sufficiently detailed information on the number and type of notices received, the actions taken, the time taken for processing, and the source of the notification. These reports should also include information on counter-notices, if any, and on the response given to them. The Commission encourages the publication of this information on a regular basis, and at least once per year.
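The data points listed above could be aggregated along the following lines; this is a minimal sketch, and the report shape is an assumption rather than a prescribed format.

```typescript
// Illustrative shape for a transparency report covering the data points
// named in the text: notices by source and type, actions taken,
// processing time, and counter-notice outcomes.

interface TransparencyReport {
  period: { from: Date; to: Date };          // at least once per year
  noticesBySource: Record<string, number>;   // source of the notification
  noticesByType: Record<string, number>;     // type of notices received
  actionsTaken: { removed: number; blocked: number; rejected: number };
  medianProcessingHours: number;             // time taken for processing
  counterNotices: { received: number; upheld: number; rejected: number };
}
```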
Online platforms should offer simple online counter-notice procedures. When a counter-notice is filed, online platforms should provide a reply, and where the decision is negative the reasons should be specified. Where they are available in the Member State concerned, platforms are encouraged to allow the use of out-of-court dispute settlement bodies to resolve disputes about counter-notices.
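A minimal sketch of such a procedure, under the assumptions of the paragraph above, might enforce that every counter-notice receives a reply and that a negative decision states its reasons; the names and shapes are illustrative only.

```typescript
// Illustrative reply to a counter-notice: a negative decision must give
// reasons, and out-of-court settlement is flagged where such a body
// exists in the Member State concerned.

interface CounterNoticeReply {
  counterNoticeId: string;
  outcome: "content_reinstated" | "removal_upheld";
  reasons?: string;               // mandatory when the removal is upheld
  outOfCourtBodyAvailable: boolean;
}

function replyToCounterNotice(reply: CounterNoticeReply): CounterNoticeReply {
  if (reply.outcome === "removal_upheld" && !reply.reasons) {
    throw new Error("a negative decision must specify its reasons");
  }
  return reply;
}
```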