EU tech regulators have given Meta, the parent company of Instagram, a deadline to provide comprehensive details of the measures it has taken to address child sexual abuse material on its popular photo and video-sharing app.
The European Commission has set a deadline of December 22, warning that an unsatisfactory response could result in a formal investigation under the EU's newly established online content rules.
This directive follows a series of inquiries by the European Commission. In October, Meta received an initial request for information about its actions to combat the spread of terrorist and violent content on its platforms. Last month, a second request focused specifically on the measures in place to safeguard minors using Instagram.
The European Commission's latest statement also signals an additional inquiry into Instagram's recommender system and its role in amplifying potentially harmful content.
These requests for information are in accordance with the EU’s Digital Services Act (DSA), a set of stringent regulations compelling major tech companies to take more proactive steps in monitoring and curtailing illegal and harmful content circulating on their platforms.
Failure to comply with these information requests can lead to formal investigations and, in severe cases, financial penalties. Other tech giants, including ByteDance's TikTok and Elon Musk's X, have received similar requests for information from EU regulators.
The outcome of these inquiries has implications for the broader regulatory landscape in the tech industry. The EU's deadline places Meta under increased scrutiny to demonstrate its commitment to addressing child safety concerns on its widely used Instagram platform.