Social media regulation can be weaponised – citizen participation is key
Cathleen Berger
A handful of people. A non-profit organisation focused on citizen participation and mobilisation to drive legislative change. Their core challenge: freedom of speech. The political system is marked by historical military structures, efforts to suppress counter-speech, and state-sponsored digital disinformation. The moment the conversation turns to the king, phones leave the room. People know their devices are infected with Pegasus spyware. The picture is bleak: how do you operate in a context where your government targets its own citizens and invests huge parts of the public budget in influence operations against pro-democracy voices?
Civil society mobilisation in Thailand
This is not fiction. This is the reality for civil society organisations in Thailand.
Thailand is a monarchy that is only just coming out from under military rule, established by a coup in 2014. While the 2023 election saw a landslide win for the Move Forward party, the former opposition, changes to the political system are slow to unfold. To push for change, strong civil society organisations and activists are needed to voice concerns and help frame future policies. However, the fear of persecution and the lingering threats of surveillance and imprisonment for anyone speaking out in favour of democracy, LGBTQ+ rights, or environmental and land rights are tangible. Even now, there are over 600 cases against protestors and activists in court, all of which are being defended pro bono by the Thai Lawyers for Human Rights consortium. Consequently, the scarcity of people who actively, physically support and work in civil society organisations on these issues is painfully obvious: fewer than 100 people, according to experts on the ground.
Online, however, the picture is different.
The power and ambiguity of social media platforms
Under the new Thai government, a constitutional referendum is within reach. For citizens to influence the framing and scope of the referendum, they need to collect 50,000 signatures to pose one question. Enter the mobilisation power of organisations like iLaw, whose demands are as straightforward as they are outrageous in the Thai context: increase public participation and change the constitution to strengthen citizen rights. In a whirlwind of online campaigning, the iLaw team held impromptu live streams on X and Facebook, asking people to sign their petition and mail it to their offices. Within three days, 200,000 letters flooded their entrance, unexpected pile upon unexpected pile. Online, the appetite for change is big – and hugely energising. In fact, the Move Forward party, too, mobilised their constituency on social media, mostly on TikTok.
And this is where things get complicated. Large online platforms like TikTok, X, Line, or Facebook continue to offer massive mobilisation power, notably in countries where activists are oppressed and threatened. Any social media regulation that is put in the hands of a government that targets and manipulates its population risks infringing upon this last resort of freedom for political mobilisation.
The Counter-Brussels effect of EU regulation
To a European audience this may be puzzling. The prevailing sentiment here is that platforms hold too much power and must be held accountable – which is why they must be regulated, to allow for curbing hate speech, digital disinformation, and other threats to democracy. The Digital Services Act, as a rights-respecting, empowering, and progressive piece of legislation, inspires confidence – and hope that its positive effects on platform accountability will spread beyond the EU's borders, similar to the EU's data protection regime.
And other countries do closely follow – and adopt – EU regulation, including Thailand. But the effects may well run counter to its intention, notably where due process and recourse to the rule of law are limited. As researchers from the GIGA Institute note: “In an age of proliferating disinformation, governments in South and Southeast Asia have come out with anti-fake news laws.” Too often, however, these laws are weaponised.
Who gets to flag and label content as hate speech or disinformation? Who can request content takedowns? Who is approved by whom as a vetted researcher to access platform data? If these questions get answered by non-democratic governmental structures, their answers are likely to infringe on freedoms, not protect them.
Should we push for global rules on digital disinformation?
Such concerns aren’t new, and many bodies are working towards defining core principles for platform governance that harmonise legislation to be more rights-respecting for everyone. UNESCO recently published the results of a global survey, spanning 16 countries and 8,000 participants. Their findings indicate that people perceive digital disinformation as a major threat (85%), even more so when it comes to electoral processes (87%), and that they feel more action from governments (88%) and platforms (90%) is necessary. UNESCO’s response: an action plan for social media regulation and the announcement of a World Conference of Regulators on the issue, to be convened in mid-2024.
And clearly, online discourse seems tainted these days, not just on X, which in the English-speaking world has become a major cesspit of hate. If you look at these results, the message seems obvious: 85+% are calling for more social media regulation – ideally at the UN level. But what if it isn’t as obvious?
Strengthen and focus on the policymaking process instead!
The question is: if the United Nations harmonise regulation on platforms and content, who will be responsible for providing safeguards? In other words, how do we make sure that organisations like iLaw will not be censored by governments that follow the “word” of the law, but not the process?
When governments are sources of disinformation, any effort to correct misleading content, challenge manipulation, or draw attention to censorship and/or the stifling of pro-democracy mobilisation puts a target on your back. Not just digitally, but in real life. If we want better social media platform regulation, we need to push for citizen involvement in the policymaking process, not narrow in on specific wording or consolidate even more power in the hands of arbitrary authorities.