Do you also just 'love' those sponsored ads that online platforms keep showing you, giving you the feeling that targeted advertising is following you around? I am probably not the only one who is extremely annoyed by this. In the following post, I will explain what I did to get this field regulated.

But let’s take it step by step. I will start with the numbers…

1373. That is how many amendments we voted on at this year’s Committee on Culture.

1374. That is how many there were for the Digital Services Act.

1375. And that many for the Digital Markets Act.

The reason I bring these figures up is that we have never voted on so many amendments at once in this term, that I was directly involved in drafting 773 of them, and, above all, that these two Acts are extremely important.

For both reports, I was the shadow rapporteur. That means I monitored the file for my group, Renew Europe, tabled amendments, gave voting recommendations, and negotiated with the rapporteurs of the other groups on the final text in the form of joint compromise amendments.

Other committees are also working on the proposals, and their contributions will be merged into the European Parliament’s common position. Meanwhile, the Member States are forming their own common position in the Council. These two positions, each a set of amendments to the European Commission’s original proposal, will then be reconciled in trilogues into a law that will be binding across the entire Union.

The Digital Markets Act (DMA) will define and designate the so-called gatekeepers: platforms which, due to their size (read Google, Amazon, Apple), often act as an entry point to the entire digital economy for a wide variety of companies. They offer different services and at the same time use the data of end users and business users from one or more segments of their platform to gain a competitive advantage for their other products or services (Amazon, Google, Apple – think of search engines), or they hold a monopoly or duopoly over small companies’ access to users (Google Play, the Apple App Store).

The Digital Services Act (DSA) establishes rules for online service providers, protects the fundamental rights of users and ensures a level playing field through a complex structure of platform obligations, user empowerment, transparency of algorithms and of the parameters used for disseminating online content, and, above all, advertising transparency and rules for removing illegal content.

Let me get to the point: what is this all about? Basically, it is about radically changing the rules of the online world. The Digital Services and Digital Markets Acts are highly technical and rather complex, so I will not describe everything they cover in detail; instead, I will present the key points of our committee’s position together with my amendments.

The two legislative files are complementary. Under the DMA, most competences will remain at EU level, although Member States will have the option to suggest which platforms should be designated as gatekeepers. Enforcement of the DSA, on the other hand, will lie with the Member States through the so-called Digital Services Coordinator, an independent authority responsible for overseeing the practices of the online giants.

Since online platforms have long since become public spaces, it is essential to introduce clear rules, which is exactly what we are doing with these Acts. So what should be different after the amendments on advertising that I (successfully!) added to the text?

Platforms must disclose to users who are shown advertising why they received a given ad and why exactly it was targeted at them, and, most importantly, users must not be targeted with advertising by default, but only if they have consented to it in advance. It was also important to me that users can choose which data they share and can adjust the parameters of the algorithms. In other words, you can actually choose how the world is ‘presented to you’.

In presenting the world, it is, of course, important to distinguish between legal, illegal and harmful content. Legal content must not be removed, and companies cannot be held liable for it at all. At the moment, however, only three types of content are defined as illegal at EU level: child pornography, terrorist content and copyright infringement. We have separate legislation for all three, and the DSA will provide horizontal rules for all of them.

Because I strongly opposed having this content checked exclusively by automated systems, algorithms or artificial intelligence, I emphasized throughout the text (in the end successfully) that the blocking of such content must remain under human control. I also added that the web giants must take into account the language of each Member State and employ moderators who speak the language of that country, Slovene in our case. They can undoubtedly afford it.

The texts contain some other very good decisions. Unfortunately, there are also some that are unacceptable to me, which is why I abstained in the final vote. One is that platforms should remove all copies of certain illegal content and prevent it from reappearing. In practice, this would mean that platforms would have to monitor everything that is posted, which would in turn require the use of artificial intelligence and could lead to censorship due to errors. The second is the requirement to prevent blocked users from registering again, which is technically almost impossible, and even if it were feasible, it would mean the end of online anonymity.

I have, of course, supported other good proposals and will continue to do so in the future. For example, we wanted to ensure that media which are regulated or self-regulated do not have their content removed, as they bear editorial responsibility. Although I was concerned that groups with bad intentions (such as foreign Chinese or Russian interests, or EU media houses subordinated to politics) could hide behind this exception, the good outweighed the bad. Platforms often take down content from independent media because their reporting systems are abused, which jeopardizes public debate and access to information. We would prefer to settle common rules for the media in the upcoming legislation dedicated to this field.

As for the timeline: the legislation can be expected to be adopted at European level next year, and I hope it will be implemented in national legislation shortly afterwards.

The European Commission’s initial proposal, which we want to change or supplement with our position, was good, but it needed enough safeguards added to prevent abuse. My biggest concern is countries without independent agencies and an independent judiciary, where these bodies are subject to political parties that could abuse even good regulation, to their own advantage, of course. I hope we can resolve this by the time the regulation is adopted.
