Before the 2024 EU elections, EDMO contributed to creating, using and monitoring the Rapid Response System (RRS) of the EU Code of Practice on Disinformation. The RRS, as explained in the Transparency Center of the Code of Practice, ‘is a time-bound dedicated framework of cooperation and communication among relevant signatories during the 2024 European Parliament election which allows non-platform signatories to swiftly report time-sensitive content, accounts, or trends that they deem to present threats to the integrity of the electoral process and discuss them with the platforms in light of their respective policies’.
During the EU election campaign, the RRS issued 18 notifications before the vote, addressed to various platforms: 7 to Meta, 6 to YouTube and 5 to TikTok. The response rate was notably high, with feedback received for all 18 notifications. The feedback came in different formats: 9 oral responses, 5 written and 4 a combination of both.
As a result of these notifications, there were 12 instances of content or accounts being removed or banned, either partially or entirely. In 3 cases, the platforms determined that the content did not breach their community guidelines and thus took no action. Additionally, in 2 instances content was labeled to provide context to viewers, and in one more case both labeling and other measures were applied.
In addition to the RRS, EDMO had a separate communication channel with X (formerly Twitter), which is not a signatory of the Code of Practice on Disinformation and therefore was not part of the RRS. Through this channel, 4 notifications were sent to X: 2 received feedback (partial removal/ban) and 2 were not addressed.
The RRS created for the EU elections performed well overall, but a few additional considerations are necessary.
First of all, no major disinformation incident happened in the last days or hours before the vote, so it is not possible to assess how effective the RRS would be if such an incident occurred (as it did, for example, during the 2020 US elections).
Secondly, the high percentage of removals is problematic in EDMO’s view: we advocate for providing more information, through labels and other tools that give users additional context, rather than for removing content, except in very exceptional situations (e.g. a high risk of imminent harm to individuals, or illegal conduct). Removing content without providing the necessary information (e.g. the rationale for the removal), as a general policy, can easily lead to the spread of more disinformation, leaving space for conspiracy theories and accusations of censorship, with the risk of turning the spreaders of disinformation into “martyrs” of the cause of freedom of expression. It is important that all actions are carried out with respect for fundamental rights.
Last but not least, the good performance of the RRS in preventing and countering major disinformation incidents before the EU elections should not be mistaken for a general victory over disinformation in Europe before the vote. In our understanding, disinformation does not work through a few big incidents before a specific election. It normally influences public opinion and voters through an endless hammering of the same narratives, conveyed through the systematic dissemination of false content (statements, news, videos, images, etc.) that is not, taken individually, particularly relevant or exceptional. It is more like the drop of water that wears away the rock year after year. And in this last election, we think the impact of years of disinformation narratives on public opinion across the EU is quite clear.
In this context, the RRS is certainly a useful tool, and it is desirable that similar initiatives are put in place before future elections, but it is by far not enough. All the tools and initiatives to tackle disinformation, such as media literacy campaigns, support for fact-checking organizations, stronger cooperation between the community that fights disinformation and traditional media, support for research, data access, the implementation of dedicated legislation and regulation, and so on, must be deployed and sustained consistently over time.
Photo: Flickr, Wayne S. Grazio