Super election year 2024: A super year for disinformation?

Cathleen Berger, Charlotte Freihse

Article

In 2024, there were more than 70 elections worldwide. In almost all of them, there were attempts to influence the outcome of the election with disinformation. AI-generated content was used in more than half of the cases. What do these figures tell us about the state of our democracies?   

Never before have so many people worldwide been called to vote within a single year. In Germany, citizens cast their votes at local, state and EU level, while in the USA, India, Mexico, and other countries, the population elected new presidents. In times of climate crisis, wars, populism, and polarisation, this super election year also threatened to become a super year for disinformation – a danger that the World Economic Forum highlighted in its Global Risks Report, which identified disinformation as the key risk in 2024 and pointed to the potential for manipulation during election periods. We have collated data on 78 scheduled elections, researched where and how disinformation was used, and reflected on the lessons that can be learnt for the future.

Disinformation is global, AI is not (yet)

The collected data shows: Disinformation was spread in 95 per cent of elections. Elections in Europe were no exception. For example, the re-election of Moldovan President Maia Sandu in October and November was subject to a massive disinformation campaign. At the beginning of December, the first round of the Romanian presidential elections even had to be annulled after political ads (and disinformation) ran on several platforms but were not declared as such.

Evidence of the use of artificial intelligence (AI) can be found in just over half of all elections. While the figures themselves suggest a preponderance of manipulation attempts and the media repeatedly pick up on the hype surrounding the dangers of AI-generated content, it is anything but clear whether and to what extent such disinformation campaigns are effective. There has been a lack of in-depth, systematic analysis and long-term studies to date. Many reports mainly refer to anecdotes and describe individual fake videos or manipulated voice messages – which are certainly highly newsworthy when AI candidates are on the electoral list in the UK or Belarus, for example. The extent to which such content has a broad impact and fundamentally weakens people’s trust in information remains unclear (so far).

Not everything is AI and not everything AI-generated is fake

The example of AI candidates already hints at this: Artificial intelligence crops up frequently, but the public discourse on its influence often remains blurry. Terms are diffuse, and it can quickly seem as if everything is ‘AI’. But AI is not the same as fake. A fake is not always a deepfake. And not every ‘cheap fake’ – i.e. an audiovisual fake created with conventional software – is used for manipulation.

What exactly counts as AI-generated manipulation and where does entertainment that is labelled as such begin? Ever since the Argentinian presidential election in 2023, which the New York Times described as the ‘first AI election’, it has been clear that AI is fundamentally changing the spread of disinformation – in Argentina, because AI was used for election campaigning. In Romania, it is also about political advertising and the effective enforcement of transparency requirements. A deeper understanding of these dynamics requires long-term, independent monitoring and detailed analyses that go beyond superficial risk assessments.

What is more, AI is not only used as a tool for disinformation but also for countermeasures – which is key to keeping up with the speed and diversity of attacks. This can lead to confusion: election reporting on Mozambique, for example, may mention the use of AI when the core issue is in fact an AI-supported platform to protect election integrity.

Voter turnout low on average, shift to the right a western trend

Numerous voices claim that the 2024 election year brought a further global shift to the right. Our research found evidence of this in 13 cases, but in only three countries outside Europe and the USA. The shift to the right is therefore primarily a Western trend.

What is more striking is the overall low voter turnout, which averaged just 57.8 per cent worldwide. The range runs from 16.3 per cent in Comoros to 98 per cent in Rwanda – both extremes that come with question marks over the freedom of the process. Comparative figures for previous election periods are difficult to find, and regional differences are significant. One thing seems clear: democratic institutions must offer much more than just elections in order to bring people along.

Geopolitical narratives: Russia, China, balance of power

While Western countries primarily focus on Russia as the originator of disinformation, the research shows that many African, Asian, and Latin American countries are increasingly influenced by narratives that promote a closer alignment with China. This discrepancy in perception highlights a systemic conflict in which disinformation is used as a tool to reinforce the formation of geopolitical camps. The Western emphasis on one direction in particular risks a distorted and short-sighted perception. The question of how these narratives influence global power relations and information ecosystems in the long term urgently requires in-depth and contextualised research.

Trends and outlook

The polarisation of discourses is a long-term and far-reaching trend. Issues such as war, migration, the climate crisis, religious conflicts, and gender identities are deliberately instrumentalised to deepen social division. Comprehensive and long-term monitoring is needed as an early warning system for such methods. Reports show that disinformation is often disseminated at an early stage to influence elections – for example, seven months before the elections in Belgium or six months before the elections in El Salvador. The project-based approach of many research and funding institutions, which is often limited to the period three months before to three months after an election, falls short of the mark. A two-pronged strategy is required: firstly, long-term, independent monitoring to identify and analyse developments and trends, and, secondly, a rapid response capability that enables analyses in acute situations such as elections and crises. Furthermore, we must not lose sight of the fact that there is not just one platform for dissemination, but many. It is true that TikTok dominates in terms of perception and reporting. However, smaller alternatives such as Mastodon or Bluesky are also worth keeping an eye on.

Our review rests on a one-week data-collection sprint and focuses on the preparation and evaluation of figures, not on an in-depth analysis of individual elections. For those who stumble over the number of elections: the list is as lively as (ideally) the democracies themselves. For example, announced elections were postponed (Egypt, Guinea-Bissau, Mali, Ukraine) or brought forward (France, Japan, Iran). Language barriers also restrict access to information, as we can easily overlook reports in the respective national languages. If you have any suggestions for additions or further data, please do not hesitate to contact us.


Cathleen Berger

Co-Lead

Cathleen Berger’s professional experience spans across sectors: academia, government, non-profit, corporate, and start-up. Her work and research focus on the intersection of digital technologies, sustainability, and social impact. She currently works with the Bertelsmann Stiftung as Co-Lead for Upgrade Democracy as well as the Reinhard Mohn Prize 2024 and Senior Expert on future technologies and sustainability. In addition, she occasionally advises and works with social purpose companies and organisations on their climate and social impact strategies.

Previously, she directed the B Corporation certification process of a pre-seed climate tech start-up, launched and headed up Mozilla’s environmental sustainability programme, worked within the International Cyber Policy Coordination Staff at the German Foreign Office, as a consultant with Global Partners Digital, a research assistant at the German Institute for International and Security Affairs (SWP), and a visiting lecturer at the Friedrich Schiller University Jena.

Charlotte Freihse

Project Manager

Charlotte Freihse is a project manager in the Bertelsmann Stiftung’s Upgrade Democracy project, where she focuses primarily on platform governance and disinformation, as well as the impact of digital technologies on public opinion-forming and discourse. Before joining the foundation, she was a freelancer in the newsroom of Norddeutscher Rundfunk (NDR). In parallel, she was a research assistant in the European research project NETHATE and developed a categorization system for intervention measures against online hate speech with the University of Jena and Das NETTZ. Charlotte holds a Master’s degree in Peace and Conflict Studies with a focus on digital technologies in conflicts as well as peace processes. 
