In the case of Eva Glawischnig-Piesczek v Facebook Ireland Ltd (Case C-18/18), the Advocate General (AG) of the Court of Justice of the European Union (CJEU) was asked to clarify the scope of the obligation that may be imposed on a host provider to remove illegal information. Article 15(1) of the e-Commerce Directive 2000/31/EC (the Directive) prohibits Member States from imposing a general monitoring obligation on host providers, and the CJEU was asked to consider whether that provision precludes a court, in the context of an injunction to remove notified illegal content, from ordering a host provider to seek and identify identical or equivalent illegal content. The CJEU was also asked to consider the territorial scope of a removal obligation, and whether removal could be ordered on a worldwide basis.
In his Opinion, AG Szpunar concluded that a host provider may be ordered not only to remove notified illegal content, but also to seek and identify, among the information disseminated by any user of that platform, information ‘identical’ to that which has been characterised as illegal by a court. In addition, a host provider may be ordered to seek and identify information ‘equivalent’ to that characterised as illegal, but only among the information disseminated by the original user, and not by any user. The AG also considered that, since the Directive does not regulate the territorial scope of an obligation to remove information disseminated via a social network platform, it does not preclude a host provider from being ordered to remove such information on a worldwide basis. Whilst the AG’s Opinion is not binding on the CJEU, it will be of persuasive value.
Ms Eva Glawischnig-Piesczek (a member of the Austrian National Council, chair of ‘the Greens’ parliamentary party, and the party’s federal spokesperson) applied to the Austrian courts for an injunction ordering Facebook to bring an end to the publication of a defamatory comment. A Facebook user had shared, on their personal page, an article from an Austrian online news magazine, entitled ‘Greens: Minimum income for refugees should stay’. The user had also published, in connection with the article, a disparaging comment about Ms Glawischnig-Piesczek. The publication had the effect of creating a thumbnail of the magazine’s website, containing the title and a summary of the article, as well as a photograph of Ms Glawischnig-Piesczek.
As Facebook did not respond to her request for the relevant comment to be deleted, Ms Glawischnig-Piesczek sought an order requiring Facebook to cease publication and/or dissemination of photographs of herself, if the accompanying message disseminated the same allegations as the comment and/or ‘equivalent content’.
The court of first instance made the interlocutory order applied for, and Facebook disabled access in Austria to the content initially published. The Austrian Supreme Court subsequently considered that the statements at issue were intended to damage Ms Glawischnig-Piesczek’s reputation, to insult her, and to defame her. The Supreme Court asked the CJEU whether an injunction made against a host provider could be extended worldwide, to statements with identical wording and/or having equivalent content of which it is not aware. The CJEU was asked to interpret the Directive in that context.
The Directive provides a website host provider, such as Facebook, with an exemption from liability in respect of information posted on its servers by third parties, to the extent that it is not aware of the illegal nature of that information. Once it becomes aware of the illegality of such information, it is obliged to delete it or to block access to it.
The AG responded to the questions referred by the Austrian Supreme Court as follows:
(i) Can a host provider be ordered to remove information identical to the information characterised as illegal?
The Directive does not preclude a host provider from being ordered, in the context of an injunction, to seek and identify among the information disseminated by any users of that platform, information ‘identical’ to that which has been characterised as illegal by a court that issued that injunction. In the AG’s view, such an obligation does not require sophisticated techniques that might represent an extraordinary burden, nor an excessive breach of the right to freedom to conduct a business which a host provider enjoys under Article 16 of the EU Charter of Fundamental Rights.
(ii) Does the removal obligation extend to equivalent information?
A host provider may also be ordered to seek and identify ‘equivalent information’ to that characterised as illegal, but only among the information disseminated by the user who disseminated the original illegal information. In the AG’s view, an obligation to identify ‘equivalent information’ originating from any user would require the general monitoring of all information disseminated via a social network platform, and would not ensure a fair balance between the fundamental rights concerned. It would also require costly solutions to be developed by a host provider. The AG acknowledged that the reference to ‘equivalent information’ or to information ‘having equivalent content’ gives rise to difficulties of interpretation, but was satisfied that it could be inferred that ‘equivalent information’ meant information that ‘scarcely diverges’ from the original information, or situations in which ‘the message remains essentially unaltered’. The AG emphasised that a court adjudicating on the removal of ‘equivalent information’ must respect the principle of legal certainty and ensure that the effects of its injunction are clear, precise and foreseeable. In doing so, it must weigh up the fundamental rights involved and take account of the principle of proportionality.
(iii) Does the removal obligation apply worldwide or only in the relevant Member State?
Since the Directive does not regulate the territorial scope of an obligation to remove information disseminated via a social network platform, it does not preclude a host provider from being ordered to remove such information on a worldwide basis. The AG noted, however, that the implementation of a removal obligation should not go beyond what is necessary to achieve the protection of the injured person. Therefore, instead of removing the content, the court might, where appropriate, order that access to the information be disabled with the help of geo-blocking. The AG distinguished the present case from Google (Territorial scope of de-referencing) (Case C-507/17), on the basis that the latter concerned the Data Protection Directive 95/46/EC. In that case, the AG opined that a service provider may be required to delete search results on an EU-wide, rather than a global, basis.
(iv) Does the removal obligation apply as soon as the operator becomes aware of the illegal information?
In the AG’s view, the Directive does not preclude a host provider from being ordered to remove information ‘equivalent’ to the information characterised as illegal, provided it has been made aware of that equivalent information by the person concerned, by third parties or by another source, since in that case the removal obligation does not entail general monitoring of the information stored.
This is a significant opinion about the potential scope of a host provider’s obligation to remove illegal content, which could have far-reaching consequences. It remains to be seen whether the CJEU will follow the AG’s Opinion.