The European Commission has formally demanded information from YouTube, Snapchat and TikTok about the algorithms they use to recommend content to users. The investigation, which began on Wednesday, focuses on how these algorithms may amplify systemic risks related to the electoral process, mental health and the protection of minors.
In a statement, the Commission said the requests, filed under the Digital Services Act (DSA), also concern each platform's efforts to mitigate the influence of its recommender systems on the spread of illegal content, such as the promotion of illegal drugs and hate speech.
TikTok has additionally been asked for more detailed information on the measures it has put in place to prevent bad actors from manipulating the app, particularly regarding risks linked to elections and civic discourse.
The companies must provide the requested information by 15 November. After the deadline, the Commission will review the replies and decide on possible next steps, which could include fines for non-compliance.
The EU has previously opened non-compliance proceedings under the DSA to push large tech companies to act more decisively against illegal and harmful content on their platforms. That scrutiny has also extended to Meta's Facebook and Instagram, as well as AliExpress and TikTok.