The European Commission has taken its first formal step towards investigating X, formerly known as Twitter, under the new Digital Services Act (DSA). This inquiry is aimed at assessing whether X is compliant with the stringent rules outlined in the DSA to ensure online user safety and restrict the spread of harmful content.

The investigation centers on X’s response to hate speech, misinformation, and violent terrorist content, particularly concerning the Israel-Hamas conflict. Under this legally binding request, X must provide information regarding its crisis response protocol by Wednesday, and responses to other queries are expected by October 31.

The European Commission has made it clear that the outcome of this investigation could lead to the opening of formal proceedings and potential penalties based on X’s replies.

Representatives for X did not immediately respond, though the company’s CEO, Linda Yaccarino, has previously outlined steps X has taken to address harmful content on the platform. Critics argue those measures fall short of what the situation requires.

Kolina Koltai, a researcher at the investigative collective Bellingcat, said that while some progress has been made, it is not enough: the platform remains awash in misinformation, and X’s moderation efforts only scratch the surface of the problem.

The Israel-Hamas conflict serves as a significant test for the EU’s Digital Services Act, which became effective in August. The European Union is actively assessing how platforms like X handle illegal content, particularly material related to terrorism or hate speech.

Notably, since acquiring Twitter and renaming it X, Elon Musk has introduced financial incentives for users to post content that attracts attention, while sharply reducing the company’s workforce, including its content moderation team.

These changes run counter to the requirements outlined in the Digital Services Act, which places the onus on social media companies to enhance their content policing to avoid heavy fines.

The European Commission emphasized that there’s no room for terrorist organizations or violent extremist groups on X. The platform claims to have removed or labeled tens of thousands of pieces of content related to the conflict. However, Koltai notes that there are still unmoderated videos and photos on X that continue to spread misleading claims.

Furthermore, X, now frequently criticized for amplifying falsehoods, was found to be the worst-performing platform for online disinformation in a study commissioned by the EU.

This is not an issue unique to X, as rivals like TikTok, YouTube, and Facebook are grappling with a similar surge in unverified rumors and falsehoods concerning the Middle East conflict. European Union officials have sent warning letters to the CEOs of these platforms, urging them to address disinformation and illegal content promptly.

EU officials have not limited their warnings to X. TikTok, another major social media platform, also received a strongly worded letter from European Commissioner Thierry Breton, who urged its CEO, Shou Zi Chew, to take immediate steps against disinformation and illegal content related to the Israel-Hamas conflict, particularly the hostage-taking footage and graphic videos circulating on the platform. TikTok did not immediately respond to the request.

This approach of sending warning letters to tech giants is not limited to X and TikTok. Mark Zuckerberg, the CEO of Meta, the parent company of Facebook and Instagram, also received a similar warning from Breton. These letters underscore the European Union’s commitment to holding social media platforms accountable for their content moderation and ensuring the safety of their users.

The issue of misinformation, especially during significant global events like the Israel-Hamas conflict, has posed a considerable challenge for social media platforms. It often leads to a game of “whack-a-mole,” where companies attempt to counter the rapid spread of rumors and false information. The EU’s Digital Services Act places a higher burden on these companies to proactively address this problem or face severe penalties.

As the investigation into X’s compliance with the DSA unfolds, the European Union is closely monitoring how major tech platforms handle these challenges. The outcome of these investigations is expected to have a significant impact on the regulation and oversight of social media content in the European Union and beyond.

The scrutiny of these platforms reflects the growing recognition of the role they play in shaping public discourse and the need for stricter regulation to safeguard against harmful content and disinformation.
