The value gap proposal in the JURI Committee Report as a consolidation of the existing framework (not the end of the internet)

Following a number of delays, a few days ago the JURI Committee (Legal Affairs) of the European Parliament finally adopted the text of the Report on the proposed Directive on copyright in the Digital Single Market (DSM Directive) as drafted by its Rapporteur, MEP Voss. 

This development, which would allow the European Parliament to begin trilogue negotiations (negotiations between the Council and the European Parliament to reach a compromise between the respective versions of the DSM Directive), follows the earlier vote in the Council of the European Union and, with it, the Council's agreed negotiating mandate [here].

This week (5 July) the plenary of the European Parliament will vote on whether the trilogue negotiations, on the basis of the mandate represented by the JURI Committee Report, may begin [see this helpful Politico infographic].

The text of the proposed DSM Directive has attracted significant attention since its release by the EU Commission in September 2016. Among other things, the provision in Article 13 (the value gap or transfer of value proposal) has been commented upon extensively. According to critics, this proposal – if adopted – would dramatically affect the functioning of the internet and introduce brand-new obligations for online actors (online content sharing service providers).

But would this actually be the case?

The structure and content of Article 13 in the JURI Committee Report

Compared to earlier versions of the value gap provision, the most notable elements of the JURI Committee version are probably those outlined below.

Definition of online content sharing service providers

Online content sharing service providers that would be subject to Article 13 obligations are defined (Recital 37a and Article 2) as ‘information society service providers one of the main purposes of which is to store and give access to the public or to stream copyright protected content uploaded / made available by its users and that optimise content, including amongst others promoting, displaying, tagging, curating, sequencing the uploaded works or other subject-matter, irrespective of the means used therefor, and therefore act in an active way.’

Non-commercial service providers (eg online encyclopaedias), providers which allow content to be uploaded with the authorization of all rightholders concerned, providers of private cloud services, open source software developing platforms, and online marketplaces whose main activity is online retail of physical goods are outside the scope of the definition of online content sharing service providers and, therefore, of Article 13 of the DSM Directive.

Nature and obligations of online content sharing service providers

The JURI Committee Report (Article 13 and Recitals 37-39c) states that providers within the definition above:

  • are deemed to make acts of communication to the public and are therefore responsible (and potentially liable) for user-uploaded content (UUC) made available through their services;
  • if they make acts of communication to the public, are not eligible for the safe harbour within Article 14 of the Ecommerce Directive. In any case, the safe harbour would not apply when a service provider plays an active role, including by optimizing the presentation of the uploaded works or subject-matter or promoting them, irrespective of the nature of the means used therefor;
  • are under an obligation to conclude ‘fair and appropriate’ licensing agreements with relevant rightholders (the latter being, however, under no obligation to issue licences); this obligation also applies to information society service providers that automatically reproduce or refer to significant amounts of copyright works and make them available to the public for the purpose of indexing and referencing (Article 13b) (a provision that appears to resemble a law adopted in France in 2016);
  • conclude licences that also cover any liability of users of the service for non-commercial UUC, in line with the terms of the relevant licence;
  • shall, in cooperation with rightholders, take ‘appropriate and proportionate’ measures to ensure the functioning of the licensing agreements concluded for use of relevant works on their services;
  • are under an obligation to prevent the availability of infringing content by adopting proportionate and effective measures – based on information provided by rightholders – while not preventing the availability of lawful UUC (this obligation also subsists when the safe harbour protection applies and in the absence of licensing agreements);
  • are under an obligation of transparency towards rightholders and users alike regarding the use and implementation of relevant measures.


Obligations of EU Member States

In the JURI Committee version EU Member States:

  • are under an obligation to ensure that the implementation of the measures referred to in Article 13: (i) is proportionate; (ii) strikes an appropriate balance between the different fundamental rights protected by the EU Charter of Fundamental Rights; and (iii) is in accordance with the prohibition of general monitoring within Article 15 of the Ecommerce Directive;
  • must ensure that: (i) providers put in place effective and expeditious complaints and redress mechanisms to prevent misuses of, or undue limitations to, the exercise of relevant exceptions and limitations (any complaint filed under such mechanisms is to be processed without undue delay); (ii) the measures adopted by online content sharing service providers to prevent the availability of infringing content comply with the GDPR and the Directive on privacy and electronic communications, and require neither the identification of individual users nor the processing of their personal data; (iii) users have access to judicial remedies to assert reliance on an exception or limitation; and (iv) authors and performers who do not opt to grant a non-exclusive usage right to all users free of charge receive fair and proportionate remuneration for the exploitation of their works, including online.


What the law already says

The version of the value gap proposal approved by the JURI Committee proceeds from the idea that online content sharing service providers make acts of communication to the public and are ineligible for the safe harbour protection in relation to their own copyright-relevant acts.

Commentaries on the JURI Committee Report have rapidly emerged [eg, here, here, here, here]. Criticisms have focused, in particular, on the following aspects:

  • Online content sharing service providers do not make acts of communication to the public;
  • Online content sharing service providers are eligible for the safe harbour protection;
  • Online content sharing service providers may not be required to implement filtering systems;
  • Article 13 of the DSM Directive would seriously impair freedom of expression/information, as well as data protection/privacy, thus breaching users’ fundamental rights.

The current legal framework – as developed at the level of both the Court of Justice of the European Union (CJEU) and national courts – seems already to have moved in the direction envisaged by the value gap proposal. It may indeed be the case that Article 13, even if adopted in the form proposed by the JURI Committee, would not represent a dramatic shift from the way in which the law has developed so far. As regards critics' legitimate and important concerns, guidance in fact appears to be already available under the existing framework.

Responsibility and liability for unauthorized acts of communication to the public

The right of communication to the public within Article 3(1) of the InfoSoc Directive has been progressively construed by the CJEU through its nearly 20 judgments on this point. In more recent cases the Court has focused, among other things, on the ‘indispensable intervention’ of the user/defendant, and has referred to the user's profit-making intention.

In Filmspeler [here], the Court held that an intervention enabling a direct link to be established between those who make available infringing works and users of such works “is quite different from the mere provision of physical facilities, referred to in recital 27 of [the InfoSoc Directive].” Such an intervention – made “with full knowledge of the consequences” of that conduct – facilitates access to unlicensed content that would otherwise be more difficult to locate, and triggers the liability of the person who makes it.

Consistently with this understanding, in its 2017 judgment in The Pirate Bay [analyzed in more detail here], the CJEU concluded that the operators of an online platform may be liable for unauthorized acts of communication to the public.

The undertaking by platform operators of indexing, categorization, deletion, or filtering activities – no matter how these are performed – rules out any assimilation to the mere provision of physical facilities. The making available and management of an online sharing platform must therefore be considered an act of communication for the purposes of Article 3(1) of the InfoSoc Directive.

Although the platform at issue in The Pirate Bay was (and still is) principally devoted to piracy, it appears questionable to hold that the outcome of that case would be applicable only to egregious scenarios like the one at stake there. Acts of communication may also be made by the operators of other, non-piracy-focused platforms giving access to UUC. National case law confirming this point has begun to emerge: a court in Austria (in the context of interim proceedings) recently ruled that YouTube makes acts of communication to the public, and the Federal Court of Justice in Germany is expected to rule in September on whether YouTube may be regarded as primarily responsible (and liable) for acts of communication to the public.

Overall, the trend – including at the judicial level – has been towards acknowledging that certain platforms may have moved away from the status of purely passive hosts.

Safe harbour protection

As for the unavailability of the safe harbour for hosting providers within Article 14 of the Ecommerce Directive to platforms liable for unauthorized acts of communication to the public, this should not come as a surprise (although, in its original proposal, the EU Commission suggested that the safe harbour could still be available to platforms making acts of communication).

The safe harbours in the Ecommerce Directive are only available to passive providers. This is clear from the language of the relevant provisions as well as CJEU case law, including Google France and eBay.

Furthermore, the Ecommerce Directive (Recital 44) explicitly excludes the applicability of the safe harbours to mere conduit and caching providers that commit direct infringements in collaboration with recipients of their services. It appears fair to assume that the same regime applies to hosting providers: the safe harbour in fact relates to the possible secondary liability of a hosting provider for third-party infringements, not to direct infringements committed by the provider itself (Recital 46 and Article 14(2)).

Although some scholars have suggested a different reading of the Ecommerce Directive (holding that its safe harbours would apply irrespective of the form of liability), the CJEU appears to have taken a different direction in The Pirate Bay. Unlike the Opinion of the Advocate General, the decision contains no reference to the Ecommerce Directive, and it envisages broader hypotheses of liability than the Opinion does.

The lack of any reference to the Ecommerce Directive in that judgment suggests that the exemptions/limitations from liability envisaged therein would be inapplicable in cases of primary infringement by platform operators. This is coherent with the idea that the insulation offered by the safe harbours is only available to information society service providers that act as mere intermediaries.

Filtering obligations

With regard to the suggestion that EU law prohibits the imposition of monitoring (filtering) obligations, this is true of general monitoring obligations only (as also confirmed in eBay and McFadden).

Some commentators have referred to the twin decisions in Scarlet and Netlog, arguing that the CJEU has clarified that an obligation like the one that would be imposed under Article 13 of the DSM Directive would be absolutely contrary to EU law.

It is suggested that this is an incorrect reading of those judgments, which are narrower in scope than has been claimed. What the CJEU found incompatible with EU law in those cases (see also the operative part of each decision) was only a filtering system imposed on a provider that would filter: (1) information stored on its servers by its service users; (2) indiscriminately in respect of all of those users; (3) as a preventative measure; (4) exclusively at the provider's expense; and (5) for an unlimited period, and which would be capable of identifying electronic files containing copyright material with a view to preventing those works from being made available to the public without a licence.

In The Pirate Bay the CJEU held that liability for unauthorized acts of communication to the public arises in cases of actual or constructive knowledge and – potentially – also in cases in which knowledge is presumed (in a GS Media sense). On this basis, operators of platforms with a profit-making intention would have an ex ante reasonable duty of care and would be subject to an ex post notice-and-takedown system, which would also include an obligation to prevent infringements of the same kind, eg re-uploads of the same content. This appears in line with eBay, in which the CJEU clarified the obligations of a ‘diligent economic operator’ and also held that an injunction against an intermediary may be aimed not just at repressing existing infringements but also at preventing new ones from occurring. National courts have also reached similar conclusions regarding the prevention of re-uploads of infringing content, eg in Germany and Italy.

This, in substance, is also what the language of the value gap proposal suggests, and it does not appear to be at odds with developments that have already occurred at the level of both CJEU and national case law.

Fundamental rights, including freedom of expression/information and data protection/privacy

Finally, some have argued that the value gap proposal would ban memes and GIFs and ‘censor’ the internet. Yet Article 13: (a) requires a balancing of different rights and interests to be undertaken; and (b) is without prejudice to available exceptions and limitations (which remain optional for Member States to introduce, irrespective of whether the value gap proposal is adopted).

Some have also suggested that automated filtering systems are not in a position to determine whether a certain use of a copyright work falls within an available exception. Whilst this might be true (although platforms that already have filtering systems in place remain flooded with parodies, quotations, reviews, etc), the proposal also makes clear that Member States must ensure that systems to prevent misuses of, or undue limitations to, the exercise of relevant exceptions and limitations are in place, together with complaint and redress mechanisms.

If there is a problem with copyright exceptions and limitations in the EU (and with the freedom to make and post GIFs and memes), it has existed for a long time, well before and independently of the release of the proposal for a DSM Directive [The IPKat discussed it here]. It is due to: (a) the fact that Article 5 of the InfoSoc Directive leaves Member States free to pick and choose which exceptions and limitations to transpose into their legal systems (the sole mandatory one being that for temporary copies); and (b) the fact that national transpositions of Article 5 exceptions and limitations have differed across the EU, with the result that – were it not for the work of the CJEU in this area – there would appear to be no real level playing field for copyright exceptions and limitations across the EU.

Finally, some critics have also argued that the obligations imposed on providers to prevent the availability of infringing content would be contrary to data protection/privacy principles and Article 8 of the EU Charter. This concern is difficult to grasp fully in abstracto, as the proposal expressly refers not only to respect for fundamental rights but also to the GDPR and the Directive on privacy and electronic communications.

Conclusion

While the proposal in Article 13 (read in light of Recitals 38 and 39) raises practical questions (eg different approaches will likely be required depending on the type of content at issue, rightholders will need to provide accurate information regarding their content in line with the eBay decision, and providers will need to make technical choices and adopt appropriate filtering systems), some of the concerns raised against it relate to issues that have already been addressed within the existing EU framework, as interpreted at the judicial level.

In this sense, the adoption of the value gap proposal would hardly signal a major departure from the law as it has developed under existing legislative instruments: rather, it would represent a consolidation, and possibly a clarification, thereof. While there is room to improve the text of Article 13 further, its main tenets do not appear to be at odds with EU law, including fundamental rights.

[Originally published on The IPKat on 2 July 2018]
