DUBLIN, Ireland: Ireland has ordered X (formerly Twitter), TikTok, and Instagram to take immediate action to curb the spread of terrorist content on their platforms.
This directive, issued by Ireland's media regulator, Coimisiún na Meán, follows findings that weak moderation practices leave these social networks vulnerable to such content.
The ruling requires these companies to implement "effective, targeted, and proportionate" measures within three months to address and prevent the dissemination of terrorist material while also respecting freedom of speech.
"TikTok, X, and Meta (in respect of Instagram) will be obliged to take specific measures to protect their services from being used for the dissemination of terrorist content and to report to Coimisiun na Mean on the specific measures taken within three months from the receipt of the decision," it said in a statement.
If the measures are deemed insufficient, the commission may impose fines of up to 4 percent of global revenue.
This action stems from the EU's terrorist content regulation, which mandates the removal of flagged material within an hour of a removal order issued by EU authorities. Ireland's regulatory role is particularly significant because many major tech companies have their EU headquarters in Dublin.
The scale of the challenge for European regulators has grown significantly. EU Home Affairs Commissioner Ylva Johansson reported that more than 550 removal orders have been issued since the regulation took effect, underscoring concern about terrorist content on social platforms. That concern has intensified following recent global events, including the October Hamas attacks on Israel and an alleged coup plot in Germany.
Additionally, the Irish Media Commission noted that members of the public have faced difficulties reporting illegal content and announced plans to review these reporting processes to make them more accessible and responsive for users across platforms.