EU Probes Meta Over Child Safety Concerns On Facebook And Instagram
(RTTNews) - The European Commission announced that it has opened formal proceedings against Meta Platforms to assess whether the owner of Facebook and Instagram may have breached the Digital Services Act or DSA in areas linked to the protection of minors.
On the Nasdaq Stock Market, Meta shares were losing around 1.1 percent to trade at $476.20.
In a statement, the Commission said it is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called "rabbit-hole effects." There is also concern about Meta's age-assurance and verification methods.
The latest development is based on a preliminary analysis of the risk assessment report sent by Meta in September 2023, as well as the company's replies to the Commission's formal requests for information on the protection of minors and the methodology of the risk assessment. The probe is also based on publicly available reports as well as the Commission's own analysis.
On April 25, 2023, Facebook and Instagram were designated as Very Large Online Platforms or VLOPs under the EU's Digital Services Act, as they both have more than 45 million monthly active users in the EU. As VLOPs, both platforms were required to start complying with a series of obligations set out in the DSA four months after their designation, i.e., at the end of August 2023.
According to the Commission, the investigation will assess Meta's compliance with DSA obligations on the assessment and mitigation of risks caused by the design of Facebook's and Instagram's online interfaces. These may exploit the weaknesses and inexperience of minors and cause addictive behaviour, and/or reinforce the so-called "rabbit hole" effect.
Such an assessment would help counter potential risks to the fundamental right to the physical and mental well-being of children, as well as to respect for their rights.
Further, the Commission will assess Meta's compliance with DSA requirements on mitigation measures to prevent minors' access to inappropriate content, in particular the age-verification tools used by Meta, which may not be reasonable, proportionate and effective.
The current proceedings will also address Meta's compliance with DSA obligations to put in place measures ensuring a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of the platforms' recommender systems.
Under the formal proceedings, the Commission will now carry out an in-depth investigation and continue to gather evidence. The opening of proceedings also empowers the Commission to take further enforcement steps, such as adopting interim measures and non-compliance decisions.
In late April, the Commission had already opened a formal investigation into Meta in relation to both Facebook and Instagram, covering deceptive advertising, political content, notice and action mechanisms, and data access for researchers, among other issues.