Digital Services Act (DSA)

Europe is ceasing to be naive about digital technology: under the impetus of European Commissioner Thierry Breton, a series of legal texts is arriving to regulate the digital sphere in Europe for the benefit of its citizens. Here we take a look at the DSA (Digital Services Act) and try to highlight the key elements of the text. Enjoy your reading!
#DSA #EuropeanSovereignty #regulation

The DSA is a new European regulation governing the digital sector. For Commissioner Thierry Breton, who is one of its advocates, and more generally for the European Commission, the laudable aim is to protect European citizens by regulating the activities of digital platforms in the fight against hate, manipulation, disinformation and counterfeiting…


While the obligations set out in the text come into force on February 17, 2024, very large online platforms and search engines have been subject to them since August 25, 2023.
Before going over the content of the text, it should be noted that it is provoking quite strong reactions, particularly on social networks, where many people see it as the introduction of a tool for censorship and control of information. And this concern is legitimate: who, in fact, decides that one piece of information is legitimate and another is "conspiracy theory"? The management of the health crisis has shown us that official information is not always trustworthy! On the other hand, we can only observe that certain hateful and aggressive behaviours, such as harassment, raise real questions, and that "moderation" by the platforms alone does not work…


Returning to the DSA, what are the main points to remember? This rich document is divided into several sections, covering the various aspects of the issue (obligations, control, penalties, etc.) and differentiating between the major players and the others. The sections are as follows:
  • General provisions
  • Duty of care for a safe and transparent online environment
    • Provisions applicable to all intermediary service providers
    • Additional provisions applicable to providers of hosting services, including online platforms
    • Additional provisions applicable to online platform providers
    • Additional provisions applicable to providers of online platforms enabling consumers to conclude distance contracts with professionals
    • Additional systemic risk management obligations imposed on providers of very large online platforms and very large online search engines
    • Other provisions concerning due diligence obligations
 
  • Implementation, cooperation, sanctions and enforcement
    • Competent authorities and national Digital Services Coordinators
    • Competences, coordinated investigation and consistency mechanisms
    • European Board for Digital Services
    • Supervision, investigation, enforcement and control of providers of very large online platforms or search engines
    • Common enforcement provisions
    • Delegated and implementing acts
  • Final provisions
First of all, the text sets out a number of definitions (article 3), although it is regrettable that the one defining "illegal content" lacks precision and remains rather vague and catch-all.
Next, the responsibilities of the various players in the digital sector are defined and adjusted according to their level of involvement. Apart from the transparency reporting obligation (article 24), the articles of section 3 (Additional provisions applicable to online platform providers) do not apply to digital players qualifying as micro-enterprises. The same applies to section 4 (Additional provisions applicable to providers of online platforms enabling consumers to conclude distance contracts with professionals).
As far as the first sections are concerned, the obligations do not, from my point of view, seem exorbitant; digital players are asked to:
  • Implement transparency measures for content and products
  • Remove illegal content at the request of the competent judicial or administrative authorities, within the applicable legislative framework
  • Designate a single point of contact for the authorities
  • Designate a single point of contact for customers' and suppliers' complaints
  • Implement effective means of protecting minors and their personal data
We then have section 5, which is specifically aimed at very large online platforms and search engines and imposes additional obligations on the management of systemic risks. In my opinion, this is the section that raises the most questions, both about the relevance of the actions requested and, above all, about the material possibility of implementing them. The ball starts rolling with article 34 and the requirement to carry out a diligent analysis of all systemic risks in the Union… an analysis which will have to be renewed every year.
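To make that annual-renewal obligation concrete, here is a minimal sketch, in Python, of how a platform might track its risk analyses internally and flag those due for re-assessment; the field names and the risk category used are illustrative assumptions, not wording from the regulation.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RiskAssessment:
    """One systemic-risk analysis, following the annual cycle of article 34."""
    category: str          # e.g. "dissemination of illegal content" (illustrative label)
    completed_on: date     # date the analysis was carried out
    mitigation_notes: str  # summary of the measures decided (article 35)

def due_for_renewal(assessment: RiskAssessment, today: date) -> bool:
    # Article 34 requires the analysis to be renewed at least once a year.
    return today - assessment.completed_on >= timedelta(days=365)

# Usage: flag every assessment that is more than a year old.
register = [
    RiskAssessment("dissemination of illegal content",
                   date(2023, 9, 1), "notice-and-action + filtering"),
]
overdue = [a for a in register if due_for_renewal(a, date(2024, 9, 6))]
print(overdue)  # the 2023 assessment is now due for renewal
```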


We are talking here, for example, about the dissemination of illegal content. For an international platform with millions of users, operating in territories with differing legislation, the task is likely to prove titanic. Even assuming platforms can easily regionalize the distribution of content without diminishing their appeal to users, who will prevent an EU user with a VPN (Virtual Private Network) from accessing content that is illegal in the EU but legal in another territory?
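A minimal Python sketch makes the loophole concrete, assuming a naive IP-based filter; the addresses (taken from documentation ranges), the country set and the content identifier are all made up for the illustration.

```python
# Minimal sketch of IP-based geo-blocking and why a VPN defeats it.
# A real platform would query a GeoIP database; this toy lookup table
# is a stand-in for the example.
GEOIP_TABLE = {
    "203.0.113.7": "FR",   # the user's real, direct connection
    "198.51.100.9": "US",  # a hypothetical VPN exit node outside the EU
}
EU_COUNTRIES = {"FR", "DE", "IT", "ES"}  # truncated for brevity
EU_BLOCKED = {"content-42"}              # illegal in the EU, legal elsewhere

def may_serve(content_id: str, client_ip: str) -> bool:
    country = GEOIP_TABLE.get(client_ip, "unknown")
    return not (content_id in EU_BLOCKED and country in EU_COUNTRIES)

# The platform only ever sees the VPN's exit IP, so the block is bypassed:
assert may_serve("content-42", "203.0.113.7") is False  # direct: blocked
assert may_serve("content-42", "198.51.100.9") is True  # via VPN: served
```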


The major platforms and search engines are also required to be able to take specific emergency measures in the event of a major crisis in the EU (health, terrorism, climate, natural disaster, etc.), under the impetus and on the recommendations of the competent EU authorities: adapting moderation processes, general terms and conditions or algorithmic systems, promoting reliable information, and so on.
Then comes an article (article 35) on risk mitigation. The suggested means of implementation include adapting the design, the conditions of use, and so on. Of course, it may also prove necessary to modify the algorithms. That last point is not necessarily a bad idea, but are the algorithms in use actually under control, and do we know how they work (are they black boxes or not)?
There is also talk of measures to protect children, something most platforms fail to do today, with the exception of a few atypical, ethics-driven players such as the social media platform Smartrezo.


The text also requires an independent audit of these risk factors and of the corrective measures put in place. Logical in itself, this measure will come up against reality: who will the auditors be, who will accredit them, and who will finance these audits? Is this the end of the "free" model, which was never really free, since most of the time these players use and resell your data (with or without your consent)?


The European Commission has not lost its sense of direction either: for the very large platforms and search engines, it is taking the opportunity to introduce a supervisory fee (so that the Commission has the means to implement the actions the DSA requires of it). A new cost line in sight for digital players…
Of course, the DSA provides for various types of sanctions in the event of non-compliance with its injunctions, such as a temporary restriction of the service for users, or fines of up to 6% of the player's annual worldwide turnover (that is starting to sting…).
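As an order of magnitude, a quick back-of-the-envelope calculation of that 6% ceiling (the turnover figure is a deliberately hypothetical example):

```python
# The 6% ceiling applied to a hypothetical annual worldwide turnover.
annual_worldwide_turnover = 100e9            # EUR, illustrative figure only
max_fine = 0.06 * annual_worldwide_turnover
print(f"Maximum fine: {max_fine:,.0f} EUR")  # -> Maximum fine: 6,000,000,000 EUR
```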


After reading this dense, technical and not always sufficiently precise text, what first impressions do I draw from it? I must stress that these are personal impressions, and I invite all readers to immerse themselves in the text to form their own opinion.
First of all, from an overall point of view, I think this is a text heading in the right direction. We have all seen that self-moderation by the digital players themselves, particularly the American and Chinese giants, without the slightest outside control, has shown its limits. Verifying the age (and therefore the identity) of users, even when they use a pseudonym, is not a problem in my view, but we still need to ensure that the measures protecting the personal data of European citizens are up to scratch, and in particular that this data is not transferred to the US or elsewhere. Such verification would prevent minors under the age of 13 from registering. It can be done, and a French platform (Smartrezo) is already doing it!
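Checking a declared birthdate against the 13-year threshold is, by itself, technically trivial, as the sketch below shows; the hard part, verifying that the declared birthdate is genuine, is out of scope here and is precisely what such platforms have to solve.

```python
from datetime import date

MIN_AGE = 13  # registration threshold discussed above

def old_enough(birthdate: date, today: date) -> bool:
    """True if the declared birthdate corresponds to an age of 13 or more."""
    years = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return years >= MIN_AGE

# A registration form would simply reject the first sign-up:
assert not old_enough(date(2013, 1, 1), today=date(2024, 9, 6))
assert old_enough(date(2010, 9, 1), today=date(2024, 9, 6))
```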



On the other hand, it is true that we will have to keep a close eye on how the text is applied, as the vagueness of what constitutes "illegal content", or of what the promotion of "reliable information" might mean, raises questions… This is probably the aspect that will require the most attention and amendment, to clarify matters and ensure that it does not become a form of technocratic censorship in which only one "official" truth is tolerated… Thierry Breton tried to defuse the issue by declaring: "Content moderation does not mean censorship. In Europe, there will be no Ministry of Truth" (Le Monde Informatique, 25-08-23). May he be right.


In conclusion, this text was necessary in view of the limitations observed in the way today's platforms operate. However, we need to remain vigilant, because digital technology, with its power and mass reach, can also be an effective means of controlling the masses, and even a proven tool of foreign influence. Look at how algorithms and moderation differ between the Chinese version of TikTok and the one served to young Westerners: the content proposed and promoted by the algorithms is not the same, with educational and science-oriented content in China and an infamous mush of ineptitude in the West… But the Americans have nothing to envy them: they use their digital giants to impose their vision of the world on the entire planet. It is up to us to find our own way, with protective tools like the DSA, but also by creating and promoting our own platforms, platforms that embody our vision of the world and a certain number of cultural values. For example, a platform offering quality informative content that promotes the economic and social fabric of different territories, while also offering educational content, particularly on understanding the risks of the Internet and how the media work.
 

Emmanuel MAWET

Link: https://effisyn-sds.com/2023/09/06/digital-service-act-dsa/?lang=en
