EU investigates Meta for child safety on Facebook and Instagram
The European Union has opened a formal investigation into Meta, the owner of Facebook and Instagram, over potential breaches of content rules relating to child safety, including fostering addictive behaviour among children and harming their mental health.
The EU's internal market commissioner, Thierry Breton, said on Thursday: "We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram."
In a statement, Meta said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them.”
If the Commission is not satisfied with Meta’s response, it can impose a fine equivalent to 6 per cent of its global turnover. More immediately, it could conduct on-site investigations and interview the company’s executives. There is no deadline for the investigations to be finalised.
The Commission launched a separate investigation into Meta last month under the Digital Services Act (DSA) over its handling of political content, amid concerns that the company had not done enough to counter Russian disinformation ahead of the EU elections.
One official said it was "clearly so easy to circumvent some of the controls". Officials also questioned the platforms' age verification tools: users are supposed to be at least 13 years old to open a Facebook or Instagram account.
In February, the Commission launched an investigation into TikTok on suspicion that the popular video-sharing app may not be doing enough to address negative effects on young people. The EU also forced TikTok to suspend the rewards programme in its TikTok Lite app in April, after warning that its "addictive" nature could threaten users' mental health.
Facebook and Instagram are among the 23 "very large" online platforms that must comply with the Digital Services Act. Other platforms include Snapchat, TikTok and YouTube.