Facebook and Instagram probed over child safety concerns under landmark EU law
The European Commission has launched a formal investigation into Meta, the owner of Facebook and Instagram, over child safety and mental health concerns.
The US tech giant is suspected of breaching the Digital Services Act (DSA), which came into effect for the largest platforms in the European Union last year and aims to stop social media companies fostering addictive behaviour, spreading misinformation and hosting online scams.
“Today we open formal proceedings against Meta,” Thierry Breton, the EU commissioner for the internal market, said in a statement on Thursday.
“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram... We are sparing no effort to protect our children.”
One part of the investigation will examine so-called “rabbit hole” effects, in which recommendation algorithms steer users towards ever more of the same content, with potentially harmful consequences for their mental health.
Meta responded to the investigation by saying it has spent the past decade developing tools and policies to protect younger users online.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” a spokesperson said.
“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
If found in breach of the DSA rules, Meta faces a fine of up to 6 per cent of its global annual turnover.
The company generated $134.9 billion (£107 billion) in revenue in 2023, according to its latest financial results, meaning the maximum penalty, at 6 per cent of that figure, would be roughly $8.1 billion.