The European Union (EU) has opened an investigation into X (formerly Twitter) over lackluster moderation of illegal content and disinformation related to the war between Israel and Hamas. The move, first reported by the Financial Times, comes two days after European Commissioner Thierry Breton sent an “urgent” letter to X owner Elon Musk questioning the billionaire over the company’s handling of disinformation. The formal investigation is the first under the new Digital Services Act (DSA), which requires platforms operating in Europe to police harmful content – and can impose fines large enough to give it teeth.
EU officials have sent X a series of questions to which the company has until October 18 to respond. The commission says it will determine its next steps “based on the evaluation of X’s responses.” The DSA, passed in 2022, requires large online platforms to proactively moderate and remove illegal content. Failure to do so could result in periodic fines or penalties which, in the case of X, could total up to “five percent of the company’s daily global turnover,” according to the FT.
Researchers and fact-checkers have warned of widespread disinformation on X in the wake of Hamas’ attacks on Israel. Tuesday’s letter warned Musk about harmful content on X, signaling that Breton was prepared to use the full power of the DSA to enforce compliance. “Following the terrorist attacks carried out by Hamas against Israel, we have indications that your platform is being used to spread illegal content and disinformation in the EU,” Breton wrote. “I remind you that the Digital Services Act sets very precise obligations in terms of content moderation.”
Musk’s response seemed to contain at least a whiff of mockery. “Our policy is for everything to be open source and transparent, an approach I know the EU supports,” wrote the X owner and Tesla CEO. “Please list the violations you are referring to on X, so that [sic] the public can see them. Thank you so much.” Breton retorted: “You are well aware of reports from your users – and authorities – about false content and the glorification of violence. It’s up to you to demonstrate that you lead by example.”
European Commissioner Thierry Breton (Isabel Infantes / Reuters)
X CEO Linda Yaccarino also responded to Breton, claiming that the company has redistributed its resources and reshuffled its internal teams to address moderation issues related to the Middle East conflict. She said X had removed or labeled “tens of thousands of pieces of content” since the attacks began.
The CEO added that X had removed hundreds of Hamas-aligned accounts from the platform, while stating that the company was working with counterterrorism organizations. Yaccarino said that X’s Community Notes, a crowdsourced moderation feature, is now supported on Android and the web (with iOS “coming soon”). She also claimed that the company had “significantly improved” a feature that notifies people who liked, replied to, or reposted a post that later received a Community Notes fact-check.
The EU’s newly opened investigation also asks how X is prepared to react in the event of a crisis and what procedures it has for managing associated disinformation. The company reportedly has until the end of October to answer these questions.
Breton is not focused exclusively on X. The commissioner also sent letters this week to Meta CEO Mark Zuckerberg and TikTok owner ByteDance reminding them of their obligations under the DSA following the bloodshed in the Middle East.
This article was originally published on actualnewsmagazine.com.