Global Platform Liability Trends: Moving Forward


This is the latest post in a four-part blog series on global intermediary liability laws.

As this blog series has sought to show, heightened attention to issues such as hate speech, online harassment, disinformation and the amplification of terrorist content continues to push policymakers around the world to adopt stricter regulations for online speech, including more responsibilities for online intermediaries.

EFF has long championed efforts to promote freedom of expression and to create an environment conducive to innovation in a way that balances the needs of governments and other stakeholders. We recognize that there is a delicate balance to be struck between addressing the very real problem of platforms hosting and amplifying harmful content and activities, and providing platforms with sufficient protection that they have no incentive to remove users’ protected speech, thereby promoting freedom of expression.

As global efforts to change long-standing intermediary liability laws continue, we use a set of questions to guide our review of these proposals. We approach new platform regulation proposals with three main questions in mind: Are intermediary liability rules the problem? Will the proposed solution solve that problem? And can the inevitable side effects be mitigated?

We hope that policymakers will steer Internet policy in the right direction and affirm the important role of online intermediary immunity in promoting an environment conducive to users’ freedom of expression. Below we outline our recommendations on how to proceed.

Our recommendations

Online intermediaries should not be held responsible for user content

Intermediaries are essential pillars of the Internet architecture and fundamental drivers of free speech, as they enable people to share content with audiences on an unprecedented scale. Immunity from liability for third-party content plays a vital role in the success of online intermediaries. This is one of the fundamental principles that we believe should continue to underpin internet regulation: platforms should not be held responsible for the ideas, images, videos or speech that users post or share online.

Regulators should ensure that online intermediaries continue to benefit from broad exemptions from liability for content provided by users, as they are not involved in co-creating or modifying that content in a way that substantially contributes to its illegality. Any additional obligations must be proportionate and must not impede freedom of expression and innovation.

No mandatory content restriction without an order from a judicial authority

When governments choose to impose positive obligations on online platforms, it is crucial that any rules governing the liability of intermediaries are provided for by law and are precise, clear and accessible. These rules must follow due process and respect the principle that it should be for independent judicial authorities to assess the illegality of content and decide whether content should be restricted. More importantly, intermediaries should not be held responsible for choosing not to remove content simply because they received a private notification from a user. In jurisdictions where knowledge of illegal content is relevant to the liability of online intermediaries, regulators should follow the principle that actual knowledge of illegality is obtained by intermediaries only if they receive an order of a court or similar authority which acts with sufficient guarantees of independence, autonomy and impartiality.

No mandatory monitoring or filtering

The obligation for platforms to monitor what users share online has a chilling effect on the discourse of users, who modify their behavior and refrain from communicating freely if they know they are being actively observed. It also infringes on users’ right to privacy. Policymakers should therefore not impose on digital service providers the obligation to positively monitor their platforms or networks to detect illegal content that users post, transmit or store. Nor should there be a general obligation for platforms to actively monitor facts or circumstances indicating illegal activity by users. The use of automated filters that assess the legality of third-party content or prevent the (re)uploading of illegal content should never be mandatory, especially since filters are error-prone and tend to block too much legitimate material. Likewise, no liability should be based on an intermediary’s inability to detect illegal content, as this would incentivize platforms to screen, monitor and filter user speech.

Limit the scope of takedown orders

Recent cases have demonstrated the dangers of content takedown orders around the world. In Glawischnig-Piesczek v. Facebook, the Court of Justice of the EU ruled that a court in a member state can order platforms not only to remove defamatory content globally, but also to remove identical or “equivalent” content. This was a terrible result, as the content in question may be considered illegal in one state but clearly legal in many others. Additionally, by referring to “automated technologies” to detect similar language, the court opened the door to surveillance through filters, which are notoriously inaccurate and likely to over-block legitimate material.

Internet law reforms are an opportunity to recognize that the Internet is global and that global takedown orders are grossly unfair and infringe on users’ freedom. New rules should ensure that court orders – and in particular injunctions – are not used to superimpose the laws of one country on every other state in the world. Takedown orders should be limited to the content in question and based on the principles of necessity and proportionality in terms of geographical scope. Otherwise, one country’s government could dictate what residents of other countries can say, see or share online. This would lead to a “race to the bottom” toward an increasingly restrictive and fragmented global Internet. A commendable effort to limit the scope of takedown orders has been made in the proposed EU Digital Services Act, which provides that court orders must not exceed what is strictly necessary to achieve their objective and must respect the Charter of Fundamental Rights and general principles of international law.

Regulate processes rather than speech

Instead of holding platforms responsible for content shared by users or forcing platforms to analyze every piece of content uploaded to their servers, modern platform regulation should focus on setting standards for platform processes, such as changes to terms of service and algorithmic decision-making. Responsible governance, such as notifying users and explaining changes whenever platforms modify their terms of service, can help reduce the information asymmetry between users and powerful gatekeeper platforms. Users should be empowered to better understand how they can notify platforms of both problematic content and problematic takedown decisions, and should be informed of how content moderation works on major platforms. Privacy by default, greater transparency and procedural safeguards, such as due process and effective appeal mechanisms for removal or blocking decisions, can help ensure the protection of fundamental rights online.

Moving in the right direction

We strongly believe that applying burdensome liability clauses to intermediaries for content shared by their users undermines the right to freedom of expression. That doesn’t mean we shouldn’t consider proposals to reform existing regulatory regimes and introduce new elements into legislation that help address fundamental flaws in today’s online ecosystem.

For many users, being online means being locked into a few powerful platforms that track them across the web without their consent, with their ability to access and share information left at the mercy of the algorithmic decision-making systems that organize their online lives. Policymakers should put users back in control of their online experiences rather than empowering or even forcing the few large platforms that have monopolized the digital space to control expression and arbitrate access to content, knowledge, goods and services.

Adjustments to internet legislation provide policymakers with an opportunity to review existing rules and ensure the internet remains an open platform for freedom of expression. While the trend towards stricter liability for online intermediaries has appalled us, it has simultaneously reinvigorated our commitment to championing regulatory frameworks that promote freedom of expression and innovation.
