With its judgment of 3 October 2019 (preliminary ruling in Case C-18/18), the Court of Justice of the European Union provided some clarifications on the interpretation of Article 15(1) of Directive 2000/31/EC. That provision lays down the principle of provider neutrality by preventing Member States from imposing on internet service providers a general obligation to monitor the information they store or transmit, or an obligation actively to seek facts or circumstances indicating illegal activity. With this ruling, however, the Court clarified that a provider may be ordered to remove worldwide not only the specific information that a national authority has declared unlawful, but also other stored information with identical or equivalent content, to be searched for by means of automated techniques.
The dispute arose from a Facebook user’s sharing of an article from an online news magazine concerning Ms Glawischnig-Piesczek, a member of the Austrian Parliament. The sharing of the article had automatically generated a thumbnail of the post accompanied by a picture of the MP, followed by a comment which, according to her, harmed her reputation, since it suggested she had committed crimes without offering any proof.
Following Facebook’s refusal to delete the comment, the MP brought an action before the Court of Vienna which, after finding the comment offensive and defamatory, enjoined Facebook from publishing and disseminating those photographs accompanied by comments with “identical or equivalent content” to the comment declared unlawful. This decision was only partially upheld on appeal: the appellate court held that the injunction concerning the dissemination of content of equivalent meaning, in accordance with the principle of neutrality under Article 15, had to be limited to content brought to the provider’s attention, in any way and by anyone.
Following the appeal lodged by both parties before the Austrian Supreme Court, the latter referred questions to the Court of Justice for a preliminary ruling, asking whether Article 15(1) of Directive 2000/31 precludes national courts from ordering a hosting provider to remove both information declared illegal and other information with identical or equivalent content, whether in the Member State concerned or worldwide.
On the undisputed premise that Facebook Ireland Ltd is a hosting provider, the Court of Justice first recalled that, even in the absence of a general obligation of monitoring or active fact-finding, national courts may impose on a provider a monitoring obligation “in a specific case”, as contemplated by recital 47 of the Directive. Such specific cases arise, according to the Court, where a national authority has found the information stored by the hosting provider to be unlawful and has identified it specifically. In addition, since the harm caused by infringements committed through information society services spreads rapidly and over a wide geographical area, Article 18(1) of the Directive enables national courts to terminate “any” infringement, including by provisional measures. The Court therefore concluded that national courts may order providers to remove both the specific information declared unlawful and other stored information of identical content, without this amounting to a general obligation of monitoring or active fact-seeking incompatible with Article 15.
According to the Court, moreover, the removal or disabling order must also cover content conveying the same unlawful message, even if worded differently; otherwise, the order could easily be circumvented by slight formal changes. Such measures are consistent with Article 15, however, only to the extent that they do not impose disproportionate obligations on the provider. To this end, the Court specified that ordering a hosting provider to seek out and remove “equivalent” content is legitimate only where that content can be searched for and identified by automated techniques on the basis of specific elements set out in the order itself (e.g. the name of the person concerned, the circumstances in which the infringement was established, and content equivalent to that declared unlawful), provided that differences in wording do not oblige the provider to carry out an independent assessment of that content.
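To make the Court’s constraint concrete, the following is a minimal, purely illustrative sketch, not drawn from the ruling or from any provider’s actual systems, of how stored posts could be flagged as “identical” or “equivalent” using only elements a court order would have to specify in advance, so that no independent legal assessment by the provider is required. All names, texts and matching rules here are hypothetical.

```python
import re
import unicodedata


def normalize(text: str) -> str:
    """Lower-case, strip accents and collapse whitespace, so that
    trivial formal changes do not defeat the match."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    return re.sub(r"\s+", " ", text).strip().lower()


def classify(post: str, unlawful_text: str, elements: list[str]) -> str:
    """Classify a stored post against a (hypothetical) court order.

    Returns:
      "identical"  - the post reproduces the text declared unlawful;
      "equivalent" - the post contains every specific element listed
                     in the order (e.g. the person's name and the
                     allegation), so it conveys the same message in
                     different words;
      "no_match"   - otherwise.
    """
    p = normalize(post)
    if normalize(unlawful_text) in p:
        return "identical"
    if all(normalize(e) in p for e in elements):
        return "equivalent"
    return "no_match"
```

The point of the sketch is that both branches are purely mechanical string tests over elements fixed in the order itself; nothing in them asks the provider to judge whether a differently worded post is defamatory.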
Finally, the Court observed that Article 18(1) of the Directive does not provide for any geographical limitation on the measures that national courts may order, so that such measures may have worldwide scope. It is, of course, for national authorities to take due account of the applicable rules of international law.