The European Commission has launched investigations into the operations of two Meta platforms, Facebook and Instagram, citing concerns about the safety of children using the platforms.
The commission, in a statement released on Thursday, said it was concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’.
In addition, the commission said it was also concerned about the age assurance and verification methods put in place by Meta. The platforms are being investigated under the EU’s Digital Services Act, which all social media platforms operating within the bloc are expected to comply with.
Preliminary analysis
The commission said the launch of the formal investigation was spurred by its preliminary analysis of the risk assessment report Meta submitted in September 2023, as well as Meta’s replies to the commission’s formal requests for information on the protection of minors and on the methodology of the risk assessment.
“The Commission will now carry out an in-depth investigation as a matter of priority and will continue to gather evidence, for example by sending additional requests for information, conducting interviews or inspections.
“The opening of formal proceedings empowers the Commission to take further enforcement steps, such as adopting interim measures and non-compliance decisions. The Commission is also empowered to accept commitments made by Meta to remedy the issues raised in the proceedings.
“The opening of these formal proceedings relieves Digital Services Coordinators, or any other competent authority of EU Member States, of their powers to supervise and enforce the DSA in relation to a suspected infringement of Article 28,” the Commission stated.
What you should know
Facebook and Instagram were designated as Very Large Online Platforms (VLOPs) on 25 April 2023 under the EU’s Digital Services Act, as they both have more than 45 million monthly active users in the EU.
As VLOPs, Facebook and Instagram had to start complying with a series of obligations set out in the DSA four months after their designation, that is, by the end of August 2023. Since 17 February 2024, the Digital Services Act has applied to all online intermediaries in the EU.
On 30 April 2024, the commission opened formal proceedings against Meta, in relation to both Facebook and Instagram, on deceptive advertising, political content, notice and action mechanisms, and data access for researchers, as well as on the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the European Parliament elections.
In February this year, the commission had also opened formal proceedings to assess whether TikTok might have breached the DSA in areas linked to the protection of minors, among others.
The ByteDance-owned company is also being investigated over issues concerning advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content.
Under the EU’s DSA, which became fully applicable on 17 February 2024, penalties for confirmed breaches can reach up to 6% of a company’s global annual turnover.