Facebook parent company Meta on Thursday was hit with a major investigation from the European Union into alleged breaches of the bloc’s strict online content law over child safety risks.
The European Commission, the EU’s executive body, said in a statement that it is investigating whether the social media giant’s Facebook and Instagram platforms “may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’.”
The Commission added that it is concerned about age verification methods on Meta's platforms.
Meta did not immediately respond to a CNBC request for comment.
The Commission said that its decision to open an investigation comes on the back of a preliminary analysis of a risk assessment report that Meta provided in September 2023.
The EU said it will carry out an in-depth investigation into Meta’s child protection measures “as a matter of priority.” The bloc can continue to gather evidence via requests for information, interviews, or inspections.
The Commission can also consider commitments made by Meta to remedy its concerns.
Meta and other U.S. tech giants have increasingly come under EU scrutiny since the introduction of the bloc’s Digital Services Act, a landmark European Commission law aimed at tackling harmful content.
Under the EU’s DSA, companies can be fined up to 6% of their global annual revenues for violations. The bloc has yet to issue fines to any tech giants under the new law.
In December 2023, the EU opened infringement proceedings against X, the company formerly known as Twitter, over suspected failures to combat content disinformation and manipulation.