If you want to successfully counter disinformation, you need a global network

Cathleen Berger, Charlotte Freihse


Disinformation does not stop at national borders – countermeasures must therefore be just as international and interlinked. Based on comprehensive, international research, we identify key challenges in dealing with disinformation and call on political decision-makers to provide greater support for resilient civil society networks.

Our international research and the analyses by our regional research partners have vividly shown that the actors spreading disinformation are becoming increasingly professionalised, technically savvy, and networked. A disinformation industry has emerged, fuelled not only by known actors but also by the services of PR and marketing agencies; it recruits vulnerable, economically weak parts of the population as “keyboard warriors” for its dishonest purposes.

If agitators and aggressors become more professional, the protagonists who oppose their campaigns must be even better organised, coordinated, and resilient. There is no shortage of ideas and potential anywhere in the world; what is needed is to adapt existing structures and resources to digital reality and the speed it demands. The response to the disinformation industry must be an ecosystem of protagonists and strategies that acts in a cross-sectoral, coordinated, and global manner, because no single actor or measure on its own can bring about healthy digital discourse.

What is happening: Topics and the spread of disinformation

Analyses from all parts of the world show that disinformation prepares the ground for its influence over long periods of time by emotionalising socially controversial topics or distracting people with side issues. A direct attack on the integrity of elections, the trustworthiness of democratic institutions, or the credibility of individual candidates is often just the last drop in a slowly rising ocean. Elections can thus act as a catalyst and become a target of disinformation, but countermeasures are needed not only in the run-up to elections but on an ongoing basis. A wide range of countermeasures, approaches, and protagonists must be combined to thwart disinformation campaigns: prebunking, monitoring, demonetisation, debunking, regulation, and more. The toolbox is, and must remain, versatile (article by Joachim).

On the one hand, the digital spaces in which disinformation spreads are globally connected and largely built on large, private-sector platforms such as YouTube, TikTok, Instagram, or WhatsApp. On the other hand, usage patterns and preferences vary considerably between individual countries and regions: LINE is used almost exclusively in Asia, TikTok is growing particularly rapidly in Europe, WhatsApp dominates in Africa, and the picture is mixed in Latin America. Digital publics overlap, yet disinformation campaigns use different channels. Our research documents gaping holes in the platforms’ responses: they interpret regulatory provisions as narrowly as possible while applying their own rules vaguely, for example through “copy & paste” procedures used across very different contexts – especially in countries that they do not consider lucrative markets.

What we observe: Data, capacities, technological developments

Our understanding of the spread and influence of disinformation is based on the continuous monitoring of patterns, actors, and attempts to influence discourse on digital platforms. Access to data for research purposes could hardly be more crucial for developing evidence-based proposals and countermeasures. However, there are glaring gaps in the reliability, comparability, and analysability of platform data, especially with regard to non-European research that is not covered by the Digital Services Act (article by Cathleen).

The strength and resilience of civil society organisations is vital for the success of countermeasures and for the promotion of healthy digital public discourse. The range of tasks for civil society protagonists is growing worldwide – their expertise is in demand when it comes to regulation and platform oversight, they act as fact-checkers, offer training in media and digital literacy, monitor digital discourse, educate, bring people together, and fill gaps wherever they come to light. At the same time, their scope for action is shrinking due to dwindling resources, political repression, strategic lawsuits, targeted attacks, and more – all of which put an enormous strain on already hard-pressed civil society protagonists worldwide.

Technological changes, such as artificial intelligence (AI), have become an integral part of the digital public sphere. AI has also become a regular companion in election campaigns – not only for the manipulative purposes of disinformation but also as a tool in the campaigns of political candidates. Existing supervisory structures that ensure the transparency and fairness of political advertising need to be upgraded in many places to respond adequately to these new technological developments.

What needs to be done now: Politicians must listen to international perspectives and support networking

If the protagonists countering disinformation worldwide are to professionalise as successfully as the disinformation industry, political decision-makers must emphasise the fundamental value of networking and cooperation formats. And fund them.

Maintaining and activating networks is time-consuming and labour-intensive, and this must be reflected in the funding provided and in the public recognition these networks receive. Philanthropy and democratic governments, for example, must provide long-term support and build on existing successes instead of constantly “chasing” innovations and the latest technological trends in their requests for proposals. Not only do we need diverse, international perspectives to make smart policy decisions for a healthy digital public sphere; the resilience of our democracies also depends on the resilience of civil society engagement – all over the world and in interaction with each other.

More in-depth analyses of the aspects mentioned here have been published in a series of seven reports.


Cathleen Berger

Co-Lead

Cathleen Berger’s professional experience spans sectors: academia, government, non-profit, corporate, and start-up. Her work and research focus on the intersection of digital technologies, sustainability, and social impact. She currently works with the Bertelsmann Stiftung as Co-Lead of Upgrade Democracy and of the Reinhard Mohn Prize 2024, and as Senior Expert on future technologies and sustainability. In addition, she occasionally advises and works with social purpose companies and organisations on their climate and social impact strategies.

Previously, she directed the B Corporation certification process of a pre-seed climate tech start-up, launched and headed up Mozilla’s environmental sustainability programme, and worked within the International Cyber Policy Coordination Staff at the German Foreign Office, as a consultant with Global Partners Digital, as a research assistant at the German Institute for International and Security Affairs (SWP), and as a visiting lecturer at the Friedrich Schiller University Jena.


Charlotte Freihse

Project Manager

Charlotte Freihse is a project manager in the Bertelsmann Stiftung’s Upgrade Democracy project, where she focuses primarily on platform governance and disinformation, as well as the impact of digital technologies on public opinion-forming and discourse. Before joining the foundation, she was a freelancer in the newsroom of Norddeutscher Rundfunk (NDR). In parallel, she was a research assistant in the European research project NETHATE and developed a categorisation system for intervention measures against online hate speech with the University of Jena and Das NETTZ. Charlotte holds a Master’s degree in Peace and Conflict Studies with a focus on digital technologies in conflicts and peace processes.

