CJEU rules on platform liability under copyright law, safe harbours, and injunctions
Do platforms like YouTube and cyberlocker Uploaded directly perform copyright-restricted acts under Article 3 of the InfoSoc Directive? Under what conditions is the hosting safe harbour under Article 14(1) of the Ecommerce Directive available? What may be the requirements for injunctions under Article 8(3) of the InfoSoc Directive?
These are some of the questions that the German Federal Court of Justice had referred to the Court of
Justice of the European Union (CJEU) in two separate cases, which were subsequently joined: YouTube, C-682/18 and Cyando, C-683/18.
The Opinion of Advocate General (AG) Øe was issued a little less than a year ago [Katpost here].
This morning, the Grand Chamber of the CJEU delivered its much-awaited ruling. Here are a couple of immediate impressions:
- First, unlike the AG, the CJEU was cautious not to venture into any analysis of the regime under Article 17 of the DSM Directive vis-à-vis the InfoSoc Directive. Instead, it was adamant that “[t]he interpretations provided by the Court in answer to those questions do not concern the set of rules established by Article 17”;
- Secondly, it diverged from the AG Opinion in relation to some crucial points, including the issue of communication to the public (in respect of which the AG had been much more trenchant than the Court) and safe harbour availability.
Let’s look in more detail at how the Court reasoned.
Communication to the public: need for an indispensable and deliberate intervention
The Grand Chamber began its analysis by reviewing its earlier (and consolidated) case law on Article 3 of the InfoSoc Directive. It also highlighted the importance of both striking a fair balance between different rights and interests and conducting an individual(ized) assessment which considers several complementary criteria. Such criteria inter alia include the indispensable role played by the platform operator and the deliberate nature of its intervention, in a context in which – as is the case here – the initial uploads of unlawful content are initiated by users of the platform acting autonomously.
While the Court considered it evident that users in such cases do perform acts of communication to the public if their uploads can be accessed by an indeterminate number of persons, whether the same could be stated in relation to the platform operator was less clear.
All this said, the Court considered that, in line with Ziggo, the role of the platform operator in such a case qualifies as indispensable. However, the indispensability of one’s own intervention is not the only criterion to consider. In other words: it is necessary but not sufficient. In a situation like the one at hand, another key consideration is whether the platform operator acts deliberately, that is, with full knowledge of the consequences of its intervention.
To determine whether that is the case, the Court provided a number of (non-exhaustive and non-cumulative) factors for national courts, including the referring court, to consider. In particular, it is necessary to consider whether the platform operator:
- Refrains from implementing appropriate technological measures that can be expected from a diligent operator in the specific circumstances at issue to “counter credibly and effectively copyright infringements on that platform”;
- Participates in selecting protected content illegally communicated to the public;
- Provides tools specifically intended for the illegal sharing of protected content or knowingly promotes such sharing, which may be attested by the fact that that operator has adopted a financial model that encourages users of its platform illegally to communicate protected content to the public via that platform.
The mere circumstance that a platform operator has abstract knowledge that illegal content may be shared by its users is insufficient, and so is the circumstance that it operates on a for-profit basis. However, in either case, the assessment leads to a different outcome if a rightholder has provided a specific notification and the platform operator has refrained from acting expeditiously to remove or disable access to the content in question. As regards the profit-making intention, the Court refused to consider the GS Media presumption as having general applicability.
As expected, the CJEU refrained from providing a definite answer to the question whether YouTube and/or Uploaded directly perform copyright-restricted acts. However, it appeared inclined to consider that the former might not fall within the scope of application of Article 3 of the InfoSoc Directive while, depending on the resolution of some factual circumstances that are disputed between the parties to the national proceedings, things might be different for the latter. It will be for the referring court to make a determination in both cases.
Safe harbour availability
Turning to the questions on safe harbours, in line with Google France and L’Oréal, the Grand Chamber noted that the safe harbours are only available to intermediary service providers. Unlike what the AG stated (and what some scholars and the European Commission appear to think), this means that the safe harbours do not apply irrespective of the type of liability at hand: if the referring court, for instance, found the platform operators at issue in the background proceedings directly liable for copyright infringement, they would automatically be disqualified from the hosting safe harbour.
All this said, if a platform takes proactive steps to prevent infringing activities by users of its service, this should not automatically exclude the application of the hosting safe harbour. This conclusion is in line with earlier case law and the ‘Good Samaritan’ clause proposed in the draft Digital Services Act (DSA; Article 6).
Turning to the notion of ‘actual knowledge’ in Article 14(1)(a) and also considering Article 15 of the Ecommerce Directive, the CJEU found that “the illegality of the activity or information must be a matter of actual knowledge or must be apparent, that is to say, it must be specifically established or readily identifiable”. In this respect and among other things, the circumstance that a platform has implemented an automatic indexing system does not automatically entail that it also has specific knowledge of illegal activities.
Considering the (thorny) issue of the content of notices, the Court recalled the principles expressed in L’Oréal and held that a notice should contain “sufficient information to enable the operator of that platform to satisfy itself, without a detailed legal examination, that that communication is illegal and that removing that content is compatible with freedom of expression”.
Injunctions under Article 8(3) and German ‘interferer liability’
The final issue concerned a question of compatibility of the German ‘interferer liability’ with EU law. In other words: does Article 8(3) of the InfoSoc Directive preclude a situation in which the rightholder is not able to obtain an injunction against an intermediary whose services are used by a third party to infringe the rights of that rightholder unless that infringement has previously been notified to that intermediary and that infringement is repeated?
To this, the Court answered in the negative.
It noted that the InfoSoc Directive itself leaves the conditions for and modalities of injunctions to national legislatures. At the same time, national rules must comply with the InfoSoc Directive itself and the system established in Articles 12 to 15 of the Ecommerce Directive.
Comment
At first sight, today’s judgment appears rather inscrutable, if not altogether sphinx-like. The early reactions I have seen, likely prompted by the rather misleading title of the press release, have been to the effect that the CJEU has ruled that platforms like YouTube and Uploaded do not communicate to the public under Article 3. In turn, this would mean that Article 17 of the DSM Directive is a novel regime that does not at all ‘clarify’ the law (recital 64 of the DSM Directive), but rather changes it.
This interpretation is, in my view, incorrect.
First of all, the CJEU considered that, unlike what some commentators thought in the aftermath of Ziggo, it is not just ‘rogue’, piracy-focused platforms that perform copyright-restricted acts. In this sense, the operator of any online platform may be caught by Article 3 of the InfoSoc Directive, depending on the circumstances at issue and the individual assessment mandated by the Grand Chamber.
Secondly, unlike what happened at the hearing and what the AG discussed in his Opinion, the Court refrained from engaging with Article 17 of the DSM Directive. However, it follows from the point above that Article 3 of the InfoSoc Directive and Article 17 of the DSM Directive are not per se antithetical.
Thirdly, in line with Article 17, the Court concluded – correctly – that if a platform does directly perform copyright-restricted acts, then it is ineligible for the hosting safe harbour.
It is likely that today’s ruling will have far-reaching implications, including for the application of Article 17 of the DSM Directive, the Ecommerce Directive and, once adopted, the DSA. It is also likely that AG Øe will engage with it when he delivers his Opinion in the Polish challenge to Article 17 (C-401/19) next month.
In the meantime, The IPKat – alongside IFIM and BLACA – is organizing a rapid response webinar NEXT WEEK to discuss the implications of today’s ruling. You can sign up here.
[Originally published on The IPKat on 22 June 2021]