It’s all about trust! Why we need to focus on the affective level of online communication


Dr. Joachim Rother


“Checking a fact is challenging, but asking people to actually trust the fact is even more challenging,” says an employee of MAFINDO, an Indonesian NGO that participated in our most recent regional workshop on disinformation in Bangkok, Thailand. Facts, in other words, are one thing; trust is another.

Trust beats facts

Research has long recognized that “in social media, trust has emerged as an important governance mechanism that regulates the behavior of network members” (Trust, Media Credibility, Social Ties, and the Intention to Share towards Information Verification in an Age of Fake News – PMC). It makes sense if you think about it: if there is a high level of trust, recipients of news are more likely to refrain from verifying the source. And that is a problem which cannot be dealt with by fact-checks.

MAFINDO has realized this and adapted its strategies: it has started setting up prebunking campaigns to build trust and resilience against manipulative or outright false narratives in Indonesia before they spread. Watchdog, a Sri Lankan organization, stopped its fact-checking work altogether and began focusing on technological solutions for fighting disinformation. Fact-checks simply don’t cut it, it seems, because they don’t address the core of the problem.

However, the number of fact-checking organizations being founded globally remains high in 2023, according to the Duke Reporters’ Lab census (Latest News – Duke Reporters’ Lab). Research findings on the relevance of trust, then, do not yet resonate much among practitioners in the field. There seems to be a general lack of awareness of the overarching importance of the affective level that underlies any information being shared online. “[T]rust in people online negatively affects information verification,” concludes the study published in Behavioral Sciences. To a degree, one might add, it seems almost irrelevant whether the information itself is true or plausible, as long as there is enough trust in the source from which it originated.

This also explains some of the more disturbing communication dynamics taking place online, e.g., why information is shared that is blatantly implausible, or why manipulated video snippets, taken out of context and vilifying marginalized groups, run wild on WhatsApp and LINE groups. Likely because they are shared in networks of trust. Trust allows us to reduce complexity, a helpful tool for coping with the flood of information we are exposed to online. However, trust also often overrides information verification.

Weaponization of trust

The manipulative power of trust has, of course, long been noted by disinformation actors too. Undermining trust in political opponents or systems is the 101 of any disinformation campaign. However, it works the other way around as well. Representatives from Wikimedia’s Trust & Safety team say they increasingly observe strategies of weaponizing trust, even among governments. The playbook for this is strikingly versatile. On the one hand, there is misuse of trust by institutions or governments from which you may not expect it, e.g., a government actively spreading false or misleading narratives that suit its own interests. On the other hand, there are attempts to create a façade of trustworthiness, e.g., by installing fact-checking units that serve as fig leaves right before elections. The Philippine government set up a fact-checking unit just a few weeks before the elections and managed to pull off exactly one (!) fact-check during the entire time.

However, some attempts are more sophisticated and most likely more successful. Take the case of one fact-checking website that is not as “non-partisan” as it claims to be: it is fully owned and financed by the Daily Caller Inc., a right-wing media outlet founded by Fox News poster boy Tucker Carlson. The Daily Caller has repeatedly been called out for spreading false information in its articles, and has then famously declined to correct them. It was ranked among the least trusted news organizations in 2018, right next to Breitbart News, and now runs its own fact-checking branch (Here’s how much Americans trust 38 major news organizations (hint: not all that much!) | Nieman Journalism Lab). Disturbingly enough, the site has managed to become a signatory of Poynter’s International Fact-Checking Network (IFCN) and a partner in Meta’s fact-checking program, steps that have attracted widespread criticism among journalists worldwide (Facebook teams with rightwing Daily Caller in factchecking program | The Guardian; Facebook’s fact-checking deal with the Daily Caller, explained | Vox).

The point is: The claim to run fact-checks in itself creates trust. And trust is important because it has become one of the most valuable currencies in online information spaces.

Let’s put trust center stage

We need to focus more on all aspects involving trust when dealing with disinformation. Putting trust center stage also allows us to attribute agency to the main players in any information ecosystem: the users. Scientists from Hong Kong University’s School of Journalism and Communication highlight that citizens are not only passive consumers but also active producers and spreaders of information online. They act on impulses and sentiments that are not always rational, yet these are key to any information being distributed online. Especially since the global COVID pandemic, and with the next billion people joining online communication, the scientists are convinced that a deeper understanding of the affective context in which citizens share information will fundamentally contribute to preventing disinformation from going viral.

Understanding the emotional state of citizens by analyzing their digital footprint has, by the way, been a hot topic for quite a while. It is called emotion prediction, and various disciplines and, not surprisingly, platforms are already working on it. Emotion recognition algorithms are trained to recognize, infer, and harvest emotions from data sources such as social media behavior, streaming service use, voice, facial expressions, and biometrics (XLM-EMO: Multilingual Emotion Prediction in Social Media Text – ACL Anthology). The prospect of this becoming fully functional is as promising as it is dystopian.
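To make the basic idea tangible: real systems like XLM-EMO are trained neural models, but the core task, mapping a piece of text to an emotion label, can be sketched with a deliberately simple, hypothetical lexicon-based scorer. The word lists and function below are illustrative toys, not part of any actual emotion-prediction system.

```python
# Toy sketch of text-based emotion prediction (NOT a real model):
# count how many words in a post match small, hand-made emotion lexicons
# and return the best-scoring emotion label.
from collections import Counter

# Hypothetical miniature lexicons for illustration only.
EMOTION_LEXICON = {
    "anger":   {"outrage", "furious", "disgusting", "betrayed"},
    "fear":    {"dangerous", "threat", "scared", "panic"},
    "joy":     {"great", "love", "wonderful", "proud"},
    "sadness": {"tragic", "loss", "grief", "heartbroken"},
}

def predict_emotion(text: str) -> str:
    """Return the emotion whose lexicon matches the most tokens."""
    tokens = [tok.strip(".,!?") for tok in text.lower().split()]
    scores = Counter()
    for emotion, words in EMOTION_LEXICON.items():
        scores[emotion] = sum(tok in words for tok in tokens)
    emotion, hits = max(scores.items(), key=lambda kv: kv[1])
    return emotion if hits > 0 else "neutral"
```

Production systems replace the lexicon lookup with multilingual transformer models and far richer signals, which is precisely what makes them both powerful and, as noted above, potentially dystopian.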

It is interesting to see how large a part our emotional state, above all the factors constituting trust or distrust, really plays in our everyday communication behavior. And how neglected this notion remains when it comes to talking about the prevention of disinformation. It’s about time to change that and put trust at the heart of our endeavors to fight disinformation.

Dr. Joachim Rother

Project Manager

Dr. Joachim Rother is Project Manager in the Upgrade Democracy team at Bertelsmann Stiftung and responsible for the Reinhard Mohn Award 2024. Prior to this position, Joachim was in charge of the Israel portfolio of Bertelsmann Stiftung, where he focused on fostering the German-Israeli relationship on a cultural, economic, and political level. Joachim studied History, English, and Social Studies at the University of Bamberg and holds a PhD in Crusade Studies. Prior to his work at Bertelsmann Stiftung, he served as deputy director of the Konrad-Adenauer-Stiftung (KAS) in Jerusalem, Israel. Joachim is an alumnus of the KAS PhD scholarship program, the German Historical Institute (GHI) in Washington, D.C., and the Jerusalem Institute of the Goerres Gesellschaft.
