SaveTheInternet

Copyright Reform (Urheberrechtsreform)

Characteristics

What:

EU Directive
 

Name:

Law to adapt copyright law to the requirements of the digital single market

(split into two laws: "Law I to..." and "Law II to...")
 

Aka:

Article 13, Article 17, Upload Filter, Censorship Machine, Memefilter
 
 

TL;DR:

The current draft bill for the national implementation of Article 17 stipulates that real-time upload filters are to be used on all platforms during the upload itself. Only once the filter has flagged an upload is the user given the opportunity to state their position and comment. All uploads already on the platform are also to be checked by filters and removed automatically when copyrighted material is detected. There is no way for users to mark already uploaded content as legal, nor is any human review on the part of the platforms provided for.

Quotes:

 
"This is almost certainly a general monitoring obligation, which the European Court of Justice has said violates fundamental rights." – Julia Reda

"We have to make sure that all the people who protested last year know that this legislative process is now coming up and that we have to take action now." – Julia Reda, in the same video (minute 13:29)

Dangers:

– Susceptibility to errors in technical filter mechanisms,
– Blocking of legal content (e.g. parodies),
– Restriction of freedom of information and expression,
– Abuse for censorship purposes
 

Actual Goal:

Protection of rights holders from illegal use of their works,
fair compensation
 

Current status:

Draft bills for implementation at national level in Germany,
public consultation by the BMJV (Federal Ministry of Justice and Consumer Protection) on 6 November 2020



The long history of the copyright reform

European copyright law was last reformed in 2001. Since then, the EU had failed to keep pace with new digital realities. Platforms offered their users the opportunity to express themselves creatively and to exchange information with each other, and an internet culture of free sharing and diverse interaction with content of all kinds emerged. Because of the free and sometimes illegal content available on the internet, the film and music industry felt compelled to demand a legal framework for the protection of their works. Newspaper publishers, for their part, saw themselves caught in a seemingly unstoppable downward trend. These problems gave rise to the controversial Articles 11 and 13.

It was only in February 2018 that the German coalition agreement between the governing parties CDU/CSU and SPD stipulated that so-called "upload filters" were not appropriate and were to be rejected. [1] Such filters would not only be able to delete unwanted content retroactively, but could also prevent the upload of such content on their own. Platforms like Facebook, Twitter and YouTube would have to censor a great deal of content! However, EU policy simply ignored this rejection at the federal level. The reform, pursued since 2016, still contains the demand for filters and was pushed at the European level, together with the controversial "ancillary copyright law", by the German CDU Member of the European Parliament Axel Voss of all people.



The adoption of the copyright law reform at EU-Level


The reform text emerged from the EU Legal Affairs Committee with a narrow majority and was rejected by the Parliament in its first draft. Improvements were demanded and many amendments, including good alternatives to filters, were put on the table. In the second vote, the Parliament decided in favor of the reform in principle, although the directive ended up with the harshest of the proposed versions. It should be noted that many parliamentarians supported the reform because of Articles 14-16, which are intended to strengthen the rights of journalists, including towards publishers and users.

After long trilogue negotiations, France and Germany reached a compromise on 4 February 2019 on the question of who should be forced to use upload filters. [2] However, this compromise can hardly be described as such, as it is based almost exclusively on the French demands and provides for wide-ranging filtering obligations for platforms.



Major problem of the EU directive:


In concrete terms, the "compromise" stipulates that profit-oriented platforms must fulfill the following conditions in order to be exempt from a general filtering obligation:


   1. The platform must be younger than 3 years and have an annual turnover of less than 10 million euros.

   2. The platform must have fewer than 5 million users per month.


If even one of these two conditions is not met, a platform would be forced to implement upload filters. In particular, the first criterion means that within at most three years every platform available in the EU would be subject to the filtering obligation, regardless of its size or whether it is addressed to the public.
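
Expressed as a simple decision rule, the exemption logic described above might look roughly like the following Python sketch (the function and parameter names are illustrative, not taken from any legal text):

    def exempt_from_filtering(age_in_years: float,
                              annual_turnover_eur: float,
                              monthly_users: int) -> bool:
        """True if a profit-oriented platform would be exempt from the
        general filtering obligation under the described "compromise"."""
        condition_1 = age_in_years < 3 and annual_turnover_eur < 10_000_000
        condition_2 = monthly_users < 5_000_000
        # If even one of the two conditions fails, upload filters become mandatory.
        return condition_1 and condition_2

Under this rule, every platform automatically loses the exemption at the latest when it turns three years old, which is exactly the problem described above.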

All platforms, whether or not they meet the criteria, must prove that they have made "best efforts" to obtain licenses from all rights holders whose content their users could possibly upload. In short: platform operators would have to acquire a license, at the stated price, for every copyrighted work ever created. The only alternative would be extensive self-censorship through upload filters, which in turn would be costly, technically flawed, and an attack on freedom of information and expression.


After the Council approved the reform adopted by the Parliament, including the disputed articles, on 15 April 2019, it is now up to the member states to transpose it into national law within the next two years.



The implementation of the directive in Germany


- Preparatory work through a "stakeholder dialogue": users, platforms and civil rights activists are to draw up a policy paper to make implementation easier for the member states.
- Drafting in two laws, the first without Art. 3 + 17; an urgent procedure is envisaged.
- The first is to be passed in an express procedure and then apply immediately, the second in June(?) 2021.
- The first is probably in the evaluation of the submitted statements; no information since January 2020 (except a statement by the Conference of Ministers of Education and Cultural Affairs).

Ancillary copyright:

128×128 pixels,

3 seconds of video.

Publisher participation.

Quality journalism is not encouraged.

Our statement to the BMJV here: https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Stellungnahmen/2020/Downloads/013120_StellungnahmeAnpassungdes_Urheberrechts_safetheinternet_DiskE.pdf?__blob=publicationFile&v=2

 


The implementation of Article 17 (formerly Article 13) in Germany


The biggest pitfall in the practical implementation of Article 17 is the question of how to operate an automated content filtering system that blocks only illegal uploads and lets legal uploads through. Several mechanisms for restricting fully automated upload filters are under consideration.


A first German discussion draft proposed so-called "pre-flagging". Here, the uploading user would be allowed to independently mark uploads that contain third-party works as legitimate. Upload filters could then only act on unmarked uploads and would filter out obviously wrongly marked content only above a higher threshold.
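
As a rough illustration of this logic (a minimal Python sketch, not an actual implementation; the threshold values and names are placeholders):

    def handle_upload_pre_flagging(pre_flagged_legitimate: bool,
                                   match_score: float,
                                   normal_threshold: float = 0.5,
                                   obvious_threshold: float = 0.95) -> str:
        """Sketch of the "pre-flagging" proposal: the uploader may mark an
        upload as legitimate; the filter acts normally on unmarked uploads
        and overrides a legitimacy flag only for obvious mismarkings."""
        if pre_flagged_legitimate:
            # Marked uploads are only blocked above a much higher threshold.
            return "block" if match_score >= obvious_threshold else "publish"
        # Unmarked uploads are subject to the regular filter threshold.
        return "block" if match_score >= normal_threshold else "publish"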

The Commission then published a counter-proposal, the so-called "Match and Flag". Here, an upload filter scans the upload in advance for third-party works and makes a pre-selection. If the upload is classified as "probably illegal", it is deleted. If the situation is unclear, the uploader is given the opportunity to declare that the use is legitimate. A human review is then initiated, and until a decision is made the upload remains online.
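
The "Match and Flag" procedure can be sketched as a small decision flow (again only an illustrative Python sketch; the category labels follow the description above, and the handling of an uploader who does not respond is an assumption):

    def handle_upload_match_and_flag(classification: str,
                                     uploader_claims_legitimate: bool) -> str:
        """Sketch of the Commission's "Match and Flag" proposal.
        classification is the filter's pre-selection:
        "no_match", "probably_illegal" or "unclear"."""
        if classification == "no_match":
            return "publish"
        if classification == "probably_illegal":
            return "delete"
        # Unclear case: the uploader may declare the use legitimate; the
        # upload then stays online until a human review has decided.
        if uploader_claims_legitimate:
            return "online_pending_human_review"
        # Assumption: an unclear upload without such a declaration is treated
        # like a detected infringement.
        return "delete"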

The most recent draft is the one from the BMJV, which is very similar to the Commission's earlier "Match and Flag" proposal. On closer inspection, however, the upload filters in this draft are the most powerful of all.


§8 of the draft provides that an upload filter checks the upload in real time during uploading and notifies the user if it detects third-party works. Only then can the user respond. A general "pre-flagging" by users is not to be possible. This is also in line with the wishes of the major platforms, which fear that pre-flagging would result in many unjustified markings and thus in considerable human effort to review uploads afterwards.
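
By contrast, the real-time procedure described for §8 would look roughly like this (an illustrative Python sketch only; what exactly happens after the user responds is not specified in the summary above and is left open here):

    def handle_upload_bmjv_realtime(third_party_work_detected: bool,
                                    user_responds_after_notice: bool) -> str:
        """Sketch of the BMJV draft's §8: the filter runs during the upload
        itself, and the user is only heard after a match has been detected.
        There is no general pre-flagging before the upload."""
        if not third_party_work_detected:
            return "publish"
        # The user is notified in real time and may only then comment.
        if user_responds_after_notice:
            return "further_procedure"  # placeholder: outcome not specified above
        return "block"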

If legislation were to require platforms to use automated real-time filters, this would be unreasonable for smaller platforms, as the corresponding technical solutions would be too costly and ultimately lead to a dependency on the large platforms.

Furthermore, it must be remembered that the copyright reform will also apply to all uploads that are already on the platform.

The Commission's draft provided that only "probably illegal" uploads may be removed automatically. In the case of unclear uploads, the user can comment or a human review takes place, and the upload remains online during this process.

In contrast, the latest ministerial draft (Referentenentwurf) plans to remove all detected uploads automatically and immediately. The distinction between "probably illegal" and "unclear" is no longer made, and human verification is no longer required. This means that content that has already been uploaded cannot be marked as legal at any point.
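
For content that is already online, the difference between the two drafts boils down to two very short rules (an illustrative Python sketch of the descriptions above):

    def existing_upload_commission_draft(classification: str) -> str:
        """Commission draft: only "probably_illegal" content is removed
        automatically; unclear cases stay online during comment/review."""
        if classification == "probably_illegal":
            return "remove"
        if classification == "unclear":
            return "online_pending_review"
        return "keep_online"

    def existing_upload_bmjv_draft(match_detected: bool) -> str:
        """BMJV draft: every detected match is removed immediately, with no
        "unclear" category, no human review and no legitimacy marking."""
        return "remove" if match_detected else "keep_online"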


And now?

Paul Keller, president and founding member of the COMMUNIA Association, which promotes the free use of and access to knowledge and cultural assets, recently formulated concrete ideas on how to make Article 17 fairer and more humane.

He argues that users should be able to mark any upload as legitimate even after it has been uploaded. Such uploads should then no longer be automatically deletable by filters.

He shares the considerations about a "probably illegal" standard, but says at the same time that the threshold for it must be as high as possible, because even such a filter can still block potentially legal content. In addition, the criteria behind this threshold must be transparent, and users must be able to challenge them in court.

Anything that does not reach this threshold must be protected from being blocked and must not be removed during the check.
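
Keller's suggestions can likewise be summarized as a small rule set (an illustrative Python sketch based on the points above; the threshold value is a placeholder):

    def handle_upload_keller_proposal(match_score: float,
                                      user_marked_legitimate: bool,
                                      probably_illegal_threshold: float = 0.99) -> str:
        """Sketch of Paul Keller's suggestions: post-hoc legitimacy marking by
        users, a very high "probably illegal" threshold, and protection from
        blocking for everything below that threshold."""
        if user_marked_legitimate:
            # Uploads marked as legitimate (even after upload) must not be
            # deleted automatically by filters.
            return "keep_online"
        if match_score >= probably_illegal_threshold:
            # Only clearly infringing matches may be blocked; the criteria must
            # be transparent and challengeable in court.
            return "block"
        # Below the threshold: protected from blocking and not removed while
        # any check is running.
        return "keep_online_during_review"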

The database of works that are in the public domain or open source should be publicly available, so that everyone can consult it and all platforms can use the same reference.

Ultimately, the use of filters should not be required nationally for all platforms, as it would be unreasonable for smaller platforms.


Link to the current draft of the BMJV: https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/RefE_Urheberrecht.pdf?__blob=publicationFile&v=7



Sources