
Research engagement #2 to Bangkok, Thailand

Cathleen Berger, Dominik Hierlemann, Dr. Joachim Rother
30 October – 2 November 2023

Introduction

Our international good-practice research is supported and enriched by regional research engagements: workshops and bilateral discussions with decision-makers, experts, and relevant stakeholders, whom we bring together in one comparatively central location in each region. The goal of these research trips is to create a space for exchange among experts and for mutual learning about each other’s contexts – jointly exploring the landscape of counter-disinformation initiatives and pro-democracy mobilisation efforts, and highlighting particularly promising examples and good practices. In addition, networking with and among the respective actors aims to foster strong collaborations, alliances, and knowledge transfer, including assessing ideas for their potential to strengthen counter-disinformation efforts in Europe and Germany.

Focus: Role of governments, freedom of speech, and democratic resilience

In partnership with Digital Asia Hub (DAH), Upgrade Democracy’s second research engagement took place in Bangkok, Thailand. The mission: to explore the findings of a unique mapping study on misinformation and disinformation in the Asia-Pacific’s political landscape with 16 stakeholders from 10 countries in the region. A two-day workshop brought together fact-checkers, journalists, academics, health workers, government officials, and representatives of civil society groups to dive deep into the conditions under which misinformation cycles begin, and how they mutate and spread. This was followed by two days of bilateral meetings with researchers, activists, and civil society organisations. A summary of our findings, insights, and impressions follows below. The workshop summary is also available in PDF format, downloadable at the bottom of the page.

Critical topics and keywords that marked our conversations and that our team continues to reflect on included:

  • the persistent role of Chinese influence operations;
  • digital disinformation that is hyper-targeted and geared towards preventing pro-democracy mobilisation;
  • the exploitation of migration, LGBTQI+ issues, and the climate crisis across the board;
  • the weaponisation of regulation when governments do not act in their citizens’ best interest;
  • platforms as both a source of funding and a black box for research due to limited data access.


Setting the Scene: The Making of Misinformed Choice

Earlier this year, Upgrade Democracy and DAH commissioned a report to map out the kinds of information disorders that have mushroomed in the run-up to post-pandemic national elections across the Asia-Pacific region. The final report will be published in early 2024. At the convening, the research team from DAH shared their theoretical framework to garner feedback and help finetune insights into counter-disinformation efforts across various countries in the Asia-Pacific. The framework explores the concept of “informed choice” – a motivating factor for people’s participation in electoral processes – and how different factors play a role in creating “misinformed choice”, a phenomenon where voters have fallen prey to mis/disinformation from several sources.

The framework introduces a “stack” comprised of seven layers – self-recognition, verification, variety, representation, assurance, process, and practice – to analyse the vulnerabilities in the flow of information during election cycles in the digital age. Each layer of the stack is interconnected and forms a part of the electoral process: if one layer is targeted by disinformation, the risk of contamination of different parts of the stack is high. One of the key elements of this novel framework is its ability to connect intention and execution, allowing analysts to include seemingly irrational, emotional, and affective reasons behind individual actions and/or decisions.

30 October 2023: Published Information vs Produced Information

In a discussion that invited all 16 stakeholders to speak about their work in the field of mis/disinformation, one of the key points that emerged was that published information is not the same as produced information. The former has a distinct source and is easily verifiable, but on today’s digital platforms it is increasingly difficult to identify where information is coming from and who its authors are. This raised questions around democratisation: who has access, and who produces content?

Speakers from the Philippines, India, Pakistan, Sri Lanka, Thailand, and Indonesia quickly identified state-sponsored actors as sources of misinformation, while a fact-checker from Taiwan spoke of foreign information operators who sought to destabilise the democratic system in their country. The tactics range from pushing pro-government narratives that seek to consolidate power to releasing information that withstands debunking because it appeals to citizens’ sense of national pride; and in areas where any information can become a liability for the government, internet shutdowns occur. If mis/disinformation is power, creating an information blackout signals greater political power.

The stakeholders also established a pattern in the kinds of disinformation that take place in different countries – including gendered disinformation and misinformation surrounding COVID-19, the climate crisis, migration, and/or minority groups – thereby illuminating the ways in which mis/disinformation is intersectional. This was echoed in the keynote address by Dr. Pirongrong Ramasoota, Commissioner of Thailand’s National Broadcasting and Telecommunications Commission (NBTC), who talked about the role governmental fact-checking institutions play in elections but also highlighted their limitations in ascertaining whether disinformation campaigns have a direct impact on election results. Establishing such an impact as an unassailable fact would require long-term investigations and comparative data.

31 October 2023: Working Through the Stack and Identifying Future Challenges

Working in groups of three, the stakeholders interacted with the information stack. The exercise began with two simple questions: how can we use the seven layers of the stack to demonstrate the ways in which information is manipulated? And how does that corruption of the stack then lead to misinformed choice? The three groups offered several practical examples from their body of work and experience. This highlighted the interconnectedness of the different layers of the stack, as well as socio-cultural similarities across the groups that determine the affective choices of voters in the region. The workshop was instrumental in showing practitioners how theoretical frameworks can help them understand the mechanisms at play and provide insights for developing new and/or better-adjusted strategies to combat mis/disinformation in the region.

Another focus area of the workshop was the future: how do emerging technologies – such as large language models, generative AI, and synthetic video and audio content – shape our information ecosystems? The stakeholders divided themselves into two groups and tackled the question in two formats: first, by mapping both the risk and the time horizon of different technological trends on a time-risk matrix; and second, by creatively drafting future news headlines that illustrated the impact of different technologies.

Bilateral meetings

Following the workshop, our team had the chance to hold several in-depth conversations with civil society organisations, researchers, and practitioners focused on Thailand and its political context. These included bilateral meetings with iLaw, Agence France-Presse, Thai Netizen Network, and Chulalongkorn University.

One thing was crystal clear in all our conversations: mobilisation across civil society is hard and often threatened by governmental interference and/or targeted disinformation campaigns. And yet these actors’ commitment to, and success in, mobilising civil society is unfaltering.

We learned that in Thailand, state-sponsored disinformation campaigns are institutionalised, for instance in the form of the military branch ISOC (Internal Security Operations Command), which systematically implements information operations (IOs). These IOs tend to distort information rather than spread outright falsehoods. The important difference between these IOs and traditional propaganda is this: they are hyper-targeted and geared towards exploiting differences.

The sheer scale of hyper-targeted messaging also disrupts traditional fact-checking processes, which simply cannot keep up. In addition, a lack of resources often forces civil society organisations into uncomfortable trade-offs with social media platforms, which provide funding while refusing to allow outside scrutiny.

Wrap up

Summary

Download summary as PDF