The drafting of the new EU Copyright Directive was never going to be an easy task. As has been seen over the years, policy discussions involving digital service providers and the intellectual property rights community are often polarizing, and middle ground can be difficult to find. However, the existing legal framework – which dates from 2001 – needed a refresh in order to take account of the new online environment, in which user-generated content is a key feature, while acknowledging the challenges that authors face and their need for fair remuneration.
Unfortunately, as is now so often the case in Brussels, the new law is being drafted with a small set of large Internet companies in mind. This blinkered approach to rule-making frequently results in unintended and negative consequences for other parts of the Internet ecosystem, and indeed for end users, many of whom are often unaware that such policies are being created.
Monitoring and Filtering User-Generated Content - A Flawed Approach
The draft copyright proposal has been undergoing EU Parliamentary and Council scrutiny since it was tabled by the European Commission in 2016, and it has been heavily criticised by civil society organisations, numerous industry associations, renowned academics and research institutions. Article 11 (the so-called “snippet tax”, under which Internet aggregators would be forced to pay publishers for displaying snippets of their articles online) and Article 13 have been the most contentious proposals. Under the latter, licensing arrangements with rights-holders are encouraged, and Internet platforms would no longer be able to avail themselves of safe harbour protections, instead being held legally responsible for any content that their users upload. To avoid such liability, platforms would have to turn to technological solutions such as upload filters, effectively requiring a general monitoring of the Internet. Furthermore, the proposal as currently drafted casts the net widely, covering any Internet platform that “optimises” content – most online services, in other words. For a more in-depth analysis of some of the challenges that the legal text presents, see this interview with one of the leading copyright experts in the European Parliament, German MEP Julia Reda.
The proposal has piqued the interest of the UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye, who recently contributed to the debate by writing a compelling nine-page letter. Kaye argues that Article 13 in particular places pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter at the point of upload. Such activity will, Kaye stresses, subject users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of those restrictions.
Organisations such as Engine have published reports highlighting the shortcomings of filtering, describing how “content filtering technologies are at best capable of simply identifying the contents of a file, not making the often complex determination as to whether the use of a particular file constitutes an infringement”. Many other notable Internet visionaries, including Tim Berners-Lee and Vint Cerf, have also shared their concerns, adding that “far from only affecting large American Internet platforms (who can well afford the costs of compliance), the burden of Article 13 will fall most heavily on their competitors, including European startups and SMEs”. Allied for Startups puts an even finer point on this, warning that having to develop costly filtering technology will do little to attract investors to Europe, and that for many companies the most responsible decision will be to move operations outside of the EU – sentiments also expressed in this report.
The text that was voted on in the JURI Committee of the European Parliament on 20th June is convoluted and littered with contradictory demands. Furthermore, as many have pointed out, the text is potentially in contravention of European law and the EU Charter of Fundamental Rights, and a number of EU Member States have previously raised legal questions during Council discussions. Considerable doubts remain, and there is still a chance to improve the draft.
The next steps in the legislative process are a Parliament Plenary vote (possibly as early as the first week of July), negotiations between the Parliament and the Council, and then final Parliamentary approval, all likely taking us up to the end of the year or the start of 2019. Once the text has been approved, Member States will then have to transpose the Directive into national law. It is imperative that efforts are quickly made to straighten out the definitions, remove the ambiguities and undo some of the damage that this proposal, as currently drafted, may do to the open Internet as we know it today.
Although not a content distribution platform, Cloudflare is passionate about freedom of expression and Internet innovation. We believe, as many do, that Internet filters have considerable limits, and that their widespread adoption will only serve to dampen start-up activity and stifle creativity. Furthermore, any policies that effectively strengthen the monopoly of large Internet providers, leaving smaller companies in the lurch and scrambling to comply, will only cement the gatekeeper position of the larger players.
Our Call to Action
Interested in getting involved in the debate and expressing your opinion to the relevant politicians? There are a variety of online tools you can use, such as Mozilla’s ChangeCopyright online tool, Vox Scientia’s webform and Save the Internet’s website. And this is our call to action:
- Further clarity on definitions and scope: exclude any service of a mere technical, automatic and passive nature from the scope of the Directive
- No filtering: remove any obligation or incentive for Internet platforms to implement content recognition technologies
- No monitoring: remove any obligation on Internet platforms to monitor the information which they transmit or store
- Due process: give users the ability to appeal any restrictive measures via redress mechanisms