Algorithmic Influence on Elections: Insights from Romania’s Case Study

Author:

Madalina Botan, Senior Researcher, Bulgarian-Romanian Observatory of Digital Media (BROD)

Summary:

This post examines the role of two technology platforms in Romania's recent presidential elections, where fringe candidates outperformed traditional political figures. Meta appeared to provide more transparent ad disclosures, while TikTok's operations remained opaque and unaccountable, facilitating the rapid spread of polarizing content and bypassing traditional scrutiny. Pro-Russian independent candidate Călin Georgescu's campaign leveraged TikTok's viral mechanisms to amplify nationalist and anti-Western narratives, particularly targeting Romania's 7-million-strong diaspora.

While national and European stakeholders have taken initial steps to demand public scrutiny of TikTok, significant concerns remain about the platform's compliance with Articles 33 and 34 of the EU's Digital Services Act (DSA), which require platforms to mitigate systemic risks to democratic processes. Regulatory and enforcement gaps persist, and much remains to be done to address the evolving challenges of the digital age.

Romania has recently entered a period of significant political uncertainty, marked by a series of critical electoral events. These include a parliamentary vote and a presidential runoff annulled by the Constitutional Court less than two days before the vote scheduled for December 8, raising concerns about electoral integrity and the influence of technology platforms on democratic processes.

The first round of presidential elections had unexpected outcomes, disrupting the traditional dominance of mainstream political actors. Pre-election frontrunners Marcel Ciolacu of the Social Democrats (PSD) and George Simion of the far-right Alliance for Uniting Romanians (AUR) failed to secure a place in the runoff. Instead, independent far-right candidate Călin Georgescu and Elena Lasconi of the liberal-progressive Save Romania Union (USR) emerged as the top two contenders.

Călin Georgescu, a self-styled pro-peace, anti-system candidate, has become a polarizing figure. His anti-globalist, anti-Ukrainian rhetoric aligns with far-right nationalist sentiments and has resonated with voters disillusioned by the perceived corruption of Romania's political elite. Georgescu is skeptical of NATO and the EU, critical of Romania's involvement in Ukraine, and advocates for closer ties with Russia. His campaign employed a highly effective social media strategy, particularly on TikTok, which elevated him from a fringe candidate with approximately 5% in polling predictions (exit polls placed him at 16%) to a contender with nearly 23% of the vote in the first round.

A key issue surrounding the Romanian elections is the role of TikTok. The platform catered to disengaged voters and enabled non-mainstream candidates like Georgescu to gain prominence without the substantial advertising investments required on Meta platforms. BROD's preliminary report, published immediately after the election, analyzed the roles of both Meta and TikTok in the campaign. Our data indicates that the main political parties spent approximately €4.94 million on Meta ads in the month leading up to the elections. By contrast, Georgescu's Meta advertising efforts were minimal: only 25 political ads were active in the final 30 days of the campaign, financed by obscure media outlets with limited budgets. Instead, Georgescu leveraged TikTok's algorithmic design, exploiting far-right narratives that bypassed transparency rules for political advertising. His nationalist messages gained significant traction, with his TikTok posts advocating for stopping aid to Ukraine and emphasizing "Russia's wisdom" achieving remarkable engagement.

TikTok’s metrics discussed in BROD’s report illustrate Georgescu’s rise: a 223% increase in followers, a 222% surge in video views, and a 311% growth in comments during the week of November 10–16. By the first round of elections, his posts had amassed 62.7 million views, with a 214% increase in just one week. Hashtags such as #calingeorgescu experienced exponential growth, with over 73.2 million views in 7 days, reflecting a highly strategic campaign.

While Meta platforms dominated disclosed political advertising budgets—USR, PNL, PSD, and AUR collectively spent millions of euros on disclosed ads—TikTok remained in a regulatory grey zone. Our report highlights systemic weaknesses in TikTok's accountability. Despite claims that it does not host political advertising or permit candidates or public officials access to monetization features, our analysis shows that TikTok was used to amplify messages through influencer networks and fake accounts, thereby evading transparency obligations under the DSA.

Recently declassified documents from the National Defense Council indicate over 20,000 TikTok accounts were linked to boosting Georgescu’s campaign. Of these, 800 accounts were created as early as 2016. These accounts reportedly facilitated cryptocurrency payments to influencers, with some suspected payments reaching at least €1 million through TikTok’s gift donation systems during livestreams.

Regarding publicly reported transparency efforts, TikTok stated through its Transparency Center that in September, it had identified a network of 22 accounts spreading misinformation and promoting narratives critical of the Romanian government. In early December, TikTok announced it had disrupted a network of 78 accounts with 1,781 followers promoting Călin Georgescu’s political campaign. This network, primarily based in Romania, was dismantled in late November. However, the scale of this operation appears modest compared to the broader coordinated activity observed during the election.

Investigations by the think tank Digihumanism highlight significant enforcement failures. Between November 28 and December 3, after TikTok’s initial disclosure, researchers identified 47 new accounts impersonating Georgescu or engaging in inauthentic behavior. Digihumanism’s representatives point out that despite TikTok’s removal of flagged accounts for similar violations, new ones continue to emerge, demonstrating challenges in addressing coordinated inauthentic behavior. Additionally, their analysis of TikTok’s Commercial Content Library revealed 48 active political ads promoting Georgescu’s candidacy across 12 EU countries, Switzerland, and the UK, targeting approximately 300,000 users. Despite explicit policies against political advertising, these ads persisted even after the EU Commission asked TikTok to offer public explanations.

In response, the European Commission has requested additional information from TikTok regarding its management of information manipulation risks and recommender systems under the DSA. TikTok has until December 13 to address these concerns, including demonstrating the effectiveness of its efforts to mitigate inauthentic behavior and enable third-party oversight.

As of December 8, when this post was written, Romania is grappling with significant political and social turmoil following the canceled elections. Unresolved issues surrounding campaign financing, state interference, and potential foreign meddling have further fueled public discontent. In this context, a transparent response from TikTok, along with a firm commitment to implementing safer systems for future elections, is crucial to rebuilding trust in the platform's ability to mitigate electoral risks.

Equally important, Romanian authorities—both political and judicial—must prioritize transparency in their decision-making processes. A public investigation into the campaign is essential to restore trust in national institutions. Only through such measures can similar outcomes be prevented in future electoral cycles, reducing public mistrust and polarization while fostering more informed electoral choices.

In a previous post, we analyzed the risks posed by unaddressed disinformation during elections, emphasizing the role of emotions and the snowball effect generated by algorithmic propaganda. Prior research has also explored the declining trust in mainstream media, driven by the extreme politicization of editorial content and the use of political advertising budgets to silence independent media. The dissolution of journalism as the traditional gatekeeper of public interest information has further amplified the influence of technology platforms in shaping citizens' perceptions of the political landscape. This shift has had dramatic effects on access to unbiased, substantive information, replacing it with pre-digested, fast-consumption social media posts that reinforce polarized worldviews.

Last but not least, it is important to note that BROD’s research relies on publicly available Meta Ads Library data and data-scraping tools such as Exolyt, due to the lack of direct access to platform APIs despite repeated applications. These transparency gaps highlight the urgent need for stronger regulatory oversight and the implementation of effective data governance mechanisms for researchers.
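For readers interested in replicating this kind of monitoring, the sketch below shows how a query against Meta's public Ad Library API (the `ads_archive` Graph API endpoint) might be assembled. The field names and the `POLITICAL_AND_ISSUE_ADS` filter follow Meta's published API; the API version, search terms, and token are illustrative placeholders, and a valid access token from an identity-verified account is required before any request will succeed.

```python
import json
import urllib.parse
import urllib.request

# Meta Ad Library API endpoint (version string is an assumption; check current docs).
AD_ARCHIVE_URL = "https://graph.facebook.com/v18.0/ads_archive"


def build_query(search_terms: str, country: str, access_token: str,
                limit: int = 100) -> dict:
    """Assemble query parameters for a political-ads search in one country."""
    return {
        "search_terms": search_terms,
        "ad_reached_countries": country,        # e.g. "RO" for Romania
        "ad_type": "POLITICAL_AND_ISSUE_ADS",   # restrict to political/issue ads
        "fields": "page_name,spend,impressions,ad_delivery_start_time",
        "limit": limit,
        "access_token": access_token,           # requires identity verification
    }


def fetch_ads(params: dict) -> list:
    """Fetch one page of results; further pages follow the 'paging.next' link."""
    url = AD_ARCHIVE_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp).get("data", [])
```

Researchers without API access, as noted above, are left with the web interface and third-party scraping tools such as Exolyt, which is precisely the transparency gap the paragraph describes.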

To conclude, the Romanian case serves as a stark warning: the business model of major technology platforms—built on opacity, monetization, and a lack of transparency—poses a serious threat to democratic processes, not only in Romania but across Europe. The adage "hacking democracy through technology" now seems more fitting than "saving democracy with technology." Risks stemming from unchecked ads, undisclosed financing, and coordinated inauthentic behavior continue to challenge oversight. To address these risks, European and national authorities must enforce transparency and ensure that platforms fulfil their obligations to safeguard electoral integrity.