How the Digital Services Act forces web giants to act
These are digital calamities that can strike any teenager: a predator signing up on an online video game site to coax very young players into sending him intimate photos in exchange for money; a group of malicious middle-schoolers hijacking an unlucky classmate's photo on social networks to mock them; or an advertisement posted on an obscure website offering a rare species of parrot for sale.
All of these images and videos can fall foul of the law. Yet bringing order to the Wild West of the Internet is no easy task. Faced with the problem, the European Union has brought the web giants to heel by adopting the Digital Services Act (DSA). This regulation, fully applicable since February 2024, aims to protect Internet users and to make platforms such as YouTube or X (formerly Twitter) more accountable. They are required to remove illegal content and to be more transparent about how their systems work.
Thanks to the DSA, around thirty associations specializing in digital issues have been granted trusted flagger status, a valuable recognition that allows them to compel websites and social networks to take down illicit content.
Associations designated by Arcom
In Europe, France tops the podium, with no fewer than seven trusted flaggers. Some associations, such as Point de Contact, are devoted to protecting Internet users across the board. Others, such as the CRIF, run a digital service that combats specific forms of online offenses, such as anti-Semitic insults. All were recently designated by Arcom, the French regulator for audiovisual and digital communication, to serve as sentinels of the web.
Among the first to be designated, in November 2024, e-Enfance, which works to protect children online, runs prevention sessions in schools and has a full team of lawyers and psychologists specializing in digital issues. "Our professionals very often get calls from parents who have spotted highly inappropriate content involving their child and do not know what to do," explains Samuel Compening, the association's deputy director. "They see our number in their teenager's school correspondence book or on the school's website. Often, too, it is young people who contact us, in a panic: they have seen an image of themselves circulated without their consent and ask us how to make it disappear."
A binding status
As soon as a request comes in, the team assesses whether the content is illegal. It may have been published on Instagram, a social media giant, just as easily as on a site with only a handful of users. The web sentinels then contact the site hosting the publication to request its removal. The whole operation takes, on average, only a few hours, because trusted flagger status acts as an open sesame. "Platforms are obliged to open a dedicated channel with our lawyers and to treat our requests as a priority," explains Alejandra Mariscal, director of the Point de Contact association. "If they do not remove the content, we pass the case on to the Ministry of the Interior, and its team of digital gendarmes takes over to pursue a sanction."
For the time being, the details of these sanctions remain unknown, since Arcom has yet to publish an assessment. But the European Commission is already bringing proceedings against platforms that fail to comply with its legislation. Since the end of 2023, two sprawling investigations have been under way against the social networks TikTok and X. Accused of lacking transparency and moderation, the two behemoths could face a hefty fine of up to 6% of their worldwide turnover.