Meta Platforms is under investigation by European Union regulators for potential violations of online content rules concerning child safety. This in-depth probe, announced by the European Commission, could result in significant fines for Meta if breaches are confirmed.
The investigation focuses on whether Meta has sufficiently addressed risks to children on its platforms. Concerns have been raised about the impact of Facebook and Instagram's algorithms, which may contribute to behavioural addictions and so-called "rabbit-hole effects" in children.
Additionally, the Commission is scrutinising the effectiveness of Meta's age-assurance and verification methods, as there are doubts about their ability to prevent children from accessing inappropriate content.
The Digital Services Act (DSA), which came into effect last year, requires tech companies to implement robust measures against illegal and harmful content. Companies failing to comply with the DSA can face fines amounting to as much as 6 per cent of their annual global turnover.
Meta, which submitted a risk assessment report in September, argues that it has developed more than 50 tools and policies over the past decade to protect young users. In a statement, a Meta spokesperson highlighted the company's commitment to ensuring safe, age-appropriate experiences online and expressed a willingness to cooperate with the European Commission.
(Inputs from Reuters)