SaveTheInternet


TERREG

Characteristics

 

What:

EU regulation

 

Name:

Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on the prevention of the distribution of terrorist content online
 

Aka:

TERREG, TOC, #Errorfilter, "upload filters on steroids"

 

 






Dangers:

 
1. Large platforms win, while small platforms die out
 
The new regulation requires platforms to delete "terrorist content" within one hour. So-called "trolls" could deliberately damage platforms by flooding them with terrorist content. Human moderation alone cannot keep up with such a deadline, which is why platforms would have to rely on upload filters. The most efficient upload filters are developed by the large platforms; smaller platforms have neither the money to develop these technologies themselves nor to buy them from the big players. Blogs, Wikipedia or forums of all kinds might have to cease operations because they cannot comply with the regulation.
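To make the term concrete: in its simplest form, an upload filter compares each upload against a database of known material. The Python sketch below is illustrative only; the blocklist entry and function names are invented, and real systems work with shared industry hash databases rather than a hard-coded set.

import hashlib

# Hypothetical blocklist of SHA-256 hashes of known terrorist material,
# similar in spirit to the hash databases shared between large platforms.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_blocked(upload: bytes) -> bool:
    """Reject an upload whose hash matches known terrorist content."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

print(is_blocked(b"test"))   # True: a byte-identical re-upload is caught
print(is_blocked(b"test!"))  # False: one changed byte slips through

Because exact matching is this easy to evade, real filters use fuzzier "perceptual" matching that tolerates small changes, and that tolerance is exactly where misclassification begins.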
 
2. The technology behind upload filters is prone to errors
 
Once again, the EU is relying on a technology that, for example, mistook the fire at Notre-Dame for footage of the attacks of September 11, 2001. Since smaller websites in particular cannot guarantee working upload filters, they will resort to so-called geoblocking (country blocks). The same mechanism could also be used to keep European users away from non-European websites, raising the fear that in the end only websites within Europe would remain usable. That this would abandon the original idea of the "World Wide Web" is only the tip of the iceberg.
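Geoblocking is technically trivial, which is why it is the cheap way out for sites that cannot afford a filter. A minimal sketch, in which the lookup table stands in for a real GeoIP database and the addresses and abbreviated country list are invented for illustration:

# EU country codes, abbreviated for brevity.
EU_COUNTRIES = {"DE", "FR", "IT", "ES", "PL", "NL", "AT"}

def lookup_country(ip: str) -> str:
    # Placeholder for a real GeoIP lookup (e.g. via the geoip2 library).
    return {"93.184.216.34": "US", "88.77.66.55": "DE"}.get(ip, "??")

def allow_request(ip: str) -> bool:
    # A non-EU site that cannot comply with TERREG may simply refuse
    # all EU visitors instead of building an upload filter.
    return lookup_country(ip) not in EU_COUNTRIES

print(allow_request("93.184.216.34"))  # True:  US visitor is served
print(allow_request("88.77.66.55"))    # False: German visitor is locked out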
 
3. What should be filtered if there is no definition of what "terrorism" is?
 
The EU has been unable to agree on what counts as terrorism throughout the EU. For this reason, an extremely vague concept was written into the regulation. The big problem: an authority in one country can order deletion with effect across the entire EU. In extreme cases, authoritarian governments could exploit TERREG to have unwanted opinions deleted from the German Internet or EU-wide. Criticism from opposition members could then be deleted not only in Hungary, for example, but in the whole EU, even though the content would still be covered by freedom of expression in other countries. An example? Hungary currently regards environmentalists as "eco-terrorists".
Furthermore, it has not yet been clarified which authorities are to be given the power to order deletions, so their independence is not guaranteed. Both the village police officer and the minister of the interior could end up being able to order deletions.
 
4. Internet platforms as provisional courts
 
If a platform receives a deletion order from an authority, it is to decide on the basis of its own general terms and conditions whether the upload must be deleted or not. The user does not even have to be informed about the blocking or deletion of their content. This puts the platforms in a powerful position, which opens the door to, for example, the manipulation of opinion.

Quotes

 
 
"Terrorism cannot be stopped with a regulation to delete illegal content online. It can at most be a step towards a comprehensive strategy. We must be clear about this before we adopt a regulation, which sacrifices some of our fundamental freedoms, and therefore the very foundations of our liberal democracy, just to eliminate one of the many risk factors of radicalization". - Julia Reda
 
 
"As soon as the protests against upload filter censorship machines are over, the German government breaks all public promises behind closed EU doors. The German government rejects the European Parliament's demand to exclude filter obligations." – Patrick Breyer
 
 
"This puts journalism under general suspicion.“ – Petra Kammerevert (SPD Member of the European Parliament)

 
 
 
 

Background information

A noble concern
 
On September 12, 2018, Jean-Claude Juncker, then President of the EU Commission, announced that the Commission would propose new rules to remove terrorist propaganda from the Internet within one hour, the crucial time window during which the greatest damage is done. Many of the recent attacks in the EU have shown how terrorists misuse Internet platforms to spread their message. This was the birth of TERREG. In practice, however, the idea is now becoming a threat to small blogs, the European digital economy and, ultimately, freedom of expression.
 
The EU is therefore planning a regulation to reduce the circulation of terrorist online content. What are the criticisms?
 
For example, the regulation is to apply to all online platforms that host uploaded content, including small platforms and start-ups. All of them would be equally obliged to remove illegal content within one hour of a request. But this is exactly what cannot be demanded of websites run by small organizations or even individuals: they cannot provide the required 24/7 availability. One hour is also too short, for example, to perform a basic plausibility check of whether an upload really is illegal content or merely part of journalistic reporting. Finally, not even the Commission's definition of terrorist content is really clear.
 
Why is the Commission's definition of terrorism so vague?
 
Because the EU has not yet been able to define what counts as terrorism; the member states interpret the term differently. As a result, only a vague definition was laid down in the regulation. On the one hand, it appears to cover the classic propaganda material of terrorist organizations, including instructions for building weapons. On the other hand, as currently worded, it would also cover legal content such as scientific or journalistic articles.
The regulation gives authorities in every member state rights that apply EU-wide. A foreign authority can thus have content deleted from the Internet in Germany that it itself, under the vague definition, regards as terrorism. Then it is the platform's move: it is to assess the illegality of the upload against its own terms of service, not against the law. Conversely, this also means that the public offices and law enforcement authorities ordering a deletion must formally orient themselves by the platforms' standards. And these are, as a rule, profit-oriented private companies. Even now, the terms and conditions of the large platforms are worded so broadly that they prohibit any content that might be unwanted in some country. The result is that content which is in principle legal gets deleted as well (overblocking).
 
At the same time, this gives more power to both the authorities and the large platforms.
 
The platforms are formally to set the criteria by which something counts as terrorism and gets deleted. Elsewhere in the Commission's proposal it becomes even clearer that the large platforms will be strengthened: the regulation requires platforms to use "proactive methods" to prevent uploads with terrorist content, which implies upload filters. With all the error-proneness already known from previous debates: automated filters do not recognize context, satire or exaggeration; they only recognize the choice of words, the image, and so on. Developing filter technology is complex and costly, so only the large, multi-million-dollar platforms can afford it.
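A deliberately naive sketch makes the context problem concrete. The term list and the two sample sentences are invented for illustration; the point is that a filter which only matches words flags a news report exactly as it flags propaganda:

# Word matching without context: the core weakness of automated filters.
BLOCKED_TERMS = {"bomb", "attack"}

def flagged_as_terrorist_content(text: str) -> bool:
    words = {word.strip(".,!?").lower() for word in text.split()}
    return not BLOCKED_TERMS.isdisjoint(words)

propaganda = "Join us and attack the unbelievers"
journalism = "Police defused a bomb found near the station, officials said"

print(flagged_as_terrorist_content(propaganda))  # True
print(flagged_as_terrorist_content(journalism))  # True: a false positive.
# The filter sees the word "bomb", not the journalistic context around it.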
 
This means that upload filters are back in discussion in a new context.
 
The German government supports imposing "proactive methods" on the platforms. Germany holds the EU Council Presidency until the end of the year and thus represents the national governments in the negotiations with the European Parliament on pending legislative procedures.
 
This conflicts with the coalition agreement of 2018 as well as with the debates on the copyright reform, in which the German government rejected upload filters as "unreasonable"!
 
 
 
