The EU is requesting information from X about its content moderation as part of an investigation under the Digital Services Act (DSA).

The European Union is investigating social media platform X over its content moderation resources as part of its scrutiny under the Digital Services Act (DSA). The investigation aims to assess whether X complies with the DSA, which requires online platforms to take a more proactive stance in policing illegal and harmful content.

One of the key concerns raised by the European Commission, the EU's executive arm, is the reduction of X's team of content moderators by nearly 20%, as reported in the transparency report the company submitted to the regulator in March 2024. X's decision to cut its linguistic coverage within the EU from 11 languages to seven has also prompted further inquiries from the Commission.

The Commission is particularly interested in understanding X’s risk assessments and mitigation strategies related to generative artificial intelligence’s impact on electoral processes, dissemination of illegal material, and protection of fundamental rights within the EU.

X, formerly known as Twitter, must provide the information the EU has requested on its content moderation resources and generative AI by deadlines set by the Commission.

The initial deadline for X to respond to the EU's questions on content moderation resources and generative AI is May 17, with answers to the remaining questions due no later than May 27.

The European Commission's investigation into X goes beyond the recent developments around content moderation resources. The Commission opened formal infringement proceedings against X in December 2023 after concerns were raised about the platform's handling of illegal content during the Israel-Hamas conflict.

The investigation is focused on X’s compliance with its obligations to combat the dissemination of illegal content, its efforts to address information manipulation, and its initiatives to enhance transparency on its platform.

To gather evidence for the DSA investigation into X, EU officials are examining X's actions as described in its March transparency report and its responses to earlier requests for information on disinformation risks associated with generative AI.

The Digital Services Act, enacted in November 2022, imposes strict requirements on large online platforms like X to proactively combat disinformation, remove hate speech, and uphold freedom of expression.

Non-compliance with the DSA can result in significant penalties: companies face fines of up to 6% of their global annual revenues for breaching the rules.

The European Commission’s efforts to hold platforms like X accountable for their content moderation practices and approach to handling illegal content underscore the EU’s commitment to ensuring a safe and transparent online environment for its citizens.

As the investigation unfolds, the outcome will not only impact X’s operations within the EU but also set a precedent for other tech giants to adhere to the stringent regulations outlined in the Digital Services Act.

With the evolving landscape of online content and the increasing challenges posed by disinformation and harmful material, regulatory oversight and enforcement mechanisms like the DSA play a crucial role in safeguarding digital spaces and promoting responsible online behavior.
