Cathleen Berger, Charlotte Freihse, Katharina Mosene, Matthias C. Kettemann, Vincent Hofmann

Threats to democracy: Climate misinformation and gendered disinformation

Impulse #4
  • 1. How climate misinformation and gendered disinformation threaten democracies and societal transformation

    Our information ecosystem is under pressure, challenging democracies around the world. Digital disinformation reaches millions of people with devastating effects: online campaigns have turned into hate-motivated violent attacks in Christchurch, Halle an der Saale, and El Paso; deny, delay, and diffuse tactics in the context of the climate crisis are slowing or even halting necessary action and regulation, with severe consequences for people’s lives, biodiversity, and more. But what if the content is awful, but lawful? How do we balance the need to counter harmful content with the obligation to protect freedom of expression and privacy?

    Disinformation is intentional. That means that people spreading disinformation seek to influence, mislead, or manipulate. In practice, malign intention can be hard to pin down. Once information is out in the open, it gets shared, forwarded, amplified – including by people who may be unaware of its fabrication, lack of context, or manipulation. The consequences of this dynamic are multifaceted and contribute to undermining trust across the board: in the media, in public figures and scientific findings, in the importance and safety of electoral processes, and more. To a certain degree, targeted disinformation is illegal and can be countered on the grounds of foreign interference, public safety, or national security. In other areas, however, the picture gets blurry – gendered and climate disinformation among them.

    Both gender-based or gendered disinformation and climate misinformation touch upon topics that are closely interlinked with societal values, people’s openness to change, and ideas around progress and transformation. These changes often trigger emotional responses, which make both fields particularly vulnerable to disinformation campaigns that try to sow discord, divide people, and slow progress.

    Digital platforms present an increasingly conflicted space where women, gender non-conforming people, and marginalised groups are disproportionately targeted and harassed. The effect is that women are leaving online spaces, forgoing their fundamental right to participate in civil and political life. While coordinated disinformation campaigns target men as well, the nature and modus operandi of campaigns against women and sexual minorities are distinctly gendered. This comes with the recognition that targets of gendered disinformation have intersecting identities and those most vulnerable are often targeted on multiple grounds.

    Similarly, the climate crisis requires a whole-of-society response – and it challenges traditional realms of power and profit. Attempts to slow, discourage, and deny the need to act have been occurring for decades. Even the Intergovernmental Panel on Climate Change’s report stresses that misinformation undermining climate science and disregarding risks delays urgent action to tackle human-made climate change. Against this backdrop, malign actors are actively exploiting this tension to further polarise and divide democratic debates by tapping into emotions: Specifically, into the panic of those who urge faster and more decisive climate action as well as the exasperation of those wanting to simply keep going about their lives as they always have.

    Both topics have become symbolic turf for the division of society – and thus are a lucrative point of friction for disinformation campaigns. We need to understand the dynamics and patterns at play here to be better able to come up with solutions and ideas for more resilient debates, and progress, on both topics.

  • 2. Beyond regulation: Understand the pattern and respond appropriately

    To set the scene, let’s take a closer look at what we understand as climate and gendered disinformation:

    Climate misinformation involves the dissemination of false or misleading information about climate change, its causes, impacts, and potential solutions. It is considered misinformation if spread unknowingly, which can happen due to misunderstandings, misinterpretation of scientific data, or the propagation of outdated or flawed research. Climate disinformation, however, is the intentional spread of false or misleading information about climate change to deceive or manipulate public opinion or awareness. Climate change, driven primarily by human activities such as the burning of fossil fuels and deforestation, poses one of the most significant challenges facing our planet. However, disinformation surrounding climate change seeks to downplay its severity, disregard scientific consensus, and hinder collective action to mitigate its effects.

    In a thought piece in October 2022, Cathleen Berger put it like this: “In the past, climate misinformation often came in the form of outright climate change denial. Over time, this has shifted towards various delay tactics, such as “climate action is a threat to our economy or national security”, “a warmer climate means longer farming/agricultural periods”, “renewable energy is insufficient or unreliable” etc. […] The complexity of the challenge is used to point fingers at another piece of the puzzle before assuming responsibility for one’s own contribution—a phenomenon more actors are culpable of than one would hope, and maybe even expect.”

    She continues: “The consequences of climate misinformation are far-reaching and impact both present and future generations. Delayed or inadequate action in response to climate change threatens global ecosystems, exacerbates natural disasters, and intensifies socioeconomic disparities. Misinformation undermines public trust in scientific expertise, hampers policy-making efforts, and impedes the implementation of effective climate solutions. As a result, addressing climate change and transitioning to a sustainable future becomes increasingly challenging.”

    Gendered disinformation refers to the dissemination of false, inaccurate, or biased information against women and other non-conforming genders. It involves the distortion of facts, perpetuation of stereotypes, and promotion of harmful narratives about gender identities, expressions, and experiences, or personal attacks against individual women* or other non-conforming genders. In addition, gendered disinformation often touches upon sexual health issues, such as disinformation spread about abortions, undermining the sexual and reproductive rights of women. This type of disinformation can manifest in various forms, including but not limited to misleading statistics, fake or non-consensually spread pornographic material, threats, or similarly violent content. Gendered disinformation is a widespread issue.

    The effects of this sort of disinformation are grave. Gendered disinformation undermines the inclusivity of democratic institutions and presents a security concern, as it can lead to online and offline violence. This harmful practice has a detrimental impact on large parts of the global population, as women often feel compelled to withdraw from public discourse, self-censor their opinions, and steer away from pursuing careers in politics and other traditionally male-dominated and/or public fields. Gendered disinformation has clearly undermined sexual and reproductive rights in various countries, including but not limited to the European Union and the United States. And yet, oftentimes its wording and framing are protected by freedom of speech and hence lawful.

    In other words, disinformation campaigns targeting topics of societal progress and transformation require counterstrategies and tactics at all levels: legal, societal, corporate, and between communities.

    Against this background, we posed the following hypotheses as conversation starters:

    • Regulation will not be able to address gendered and climate misinformation content in a balanced, rights-respecting manner. We need to encourage, support, and expand civil engagement and explore a variety of tactics to ensure a healthy information ecosystem.
    • Both issues are related to societal transformation and, if challenged in a local or national context, spill-over effects will have global and long-lasting consequences. Identifying and supporting structures and tactics that foster collaboration among communities and raise the voices of marginalised groups is crucial for everyone’s sake.

    This raises three main questions that we will reflect on in the following:

    • How can we successfully design and enhance civil engagement efforts countering gendered and climate misinformation?
    • What is the role of different stakeholders, such as governments, media, academia, civil society, businesses, and others, in addressing these challenges and what are successful examples that may lend themselves for scaling?
    • Beyond the national context: how do we address these issues at a global level, including by highlighting their relevance to the resilience of democracies?

    2.1 The (limited) possibilities of regulation

    The question “How can we successfully design and bolster civil engagement efforts countering gendered and climate misinformation?” is trickier to answer than one might think at first. In our reflections and exchanges with experts, the conversations kept drifting towards the roles and responsibilities of governmental actors. Legislation on the issue may be challenging, and yet, many who have worked in the field for years are vocal on this being critical to curbing the dangerous effects of climate and gendered disinformation.

    States have the primary responsibility to protect human rights and fundamental freedoms, including in the digital environment. However, things can get complicated when fundamental freedoms contradict each other: a positive obligation to protect must not lead to the infringement of other human rights (negative obligation). Every intervention is therefore a balancing act that must be carefully evaluated in the specific context. States must attempt to fulfil this obligation by creating a legally secure environment in digital spaces as well – through their own laws and by monitoring the rules and conditions of large social platforms.

    Digital spaces are regulated at various levels: international law, national legislation, regional standards, and the rules of the platforms. When it comes to banning or prosecuting disinformation spread by state actors, there are clear legal limits: Origin, intention, target, and content must all be assessed in their specific context. Disinformation about individuals can be banned if it contains false statements, not just opinions. Insult and defamation can be prosecuted as criminal offenses. However, new rules that seek to restrict the right to spread falsehoods must show consideration for freedom of expression.

    There is a fine line between protecting common goods and individual freedoms and limiting these rights more broadly. Drawing clear boundaries and identifying specific instances of misinformation can be challenging, particularly in the context of rapidly evolving online environments. This also includes the problem that there is no such thing as one singular truth. It is a very slippery slope if public discourse on truth is left solely to the power of states, democratic and certainly autocratic – truth must not be defined by those in power. Societal discourse and progress depend on a multitude of perspectives that all provide facets to an argument. This challenge is exacerbated in digital spaces, since the sheer volume of content shared on online platforms can hamper both monitoring and enforcement of rules – legal as well as private.

    This highlights that banning or blocking of content has legal as well as practical limits. However, the toolkit of legislation and content moderation rules has a lot more to offer. Among the observations and proposals shared were the following:

    • Mandatory transparency: To be able to provide guidance and develop solutions, it is critical that very large online platforms (VLOPs) transparently provide information on how content is shared, amplified, and moderated. If sensationalist or emotional content is promoted for clicks and attention, this can feed harmful narratives that are often reproduced in disinformation campaigns. Researchers, civil society, and legislators must be able to understand content moderation decisions, as well as algorithmic filters and dynamics, and coordinated patterns to conduct balanced human rights assessments.
    • Policy co-creation structures: Notably when it comes to gendered and climate disinformation, the slate of actors that perpetuate such campaigns is worryingly broad. Societal progress and transformation challenge established power structures, and there are groups willing to stop, slow, or distract from changes to their power or influence in various contexts: the far right, conservative or religious groups, the fossil fuel industry, the military, highly privileged individuals, and others. The goal cannot be to censor such large portions of society; rather, it must be to build structures that allow for co-creation, engagement, and mediation. Be it through citizen councils, multistakeholder consultations, or community-driven policy development, there is a range of options that allow for progress over regress.
    • Financial incentives: Regulation can also target corporate responsibilities through a range of other obligations. For instance, reporting against environmental, social, and governance frameworks can be mandated; advertising can be limited (think about tobacco, alcohol or porn); investments can be restricted by social or environmental indicators; and supply chains can be monitored as well. All of these and others aim to incentivise responsible over traditionally profitable choices – and can help support societal progress.

    These observations and proposals all have one thing in common: they distribute responsibility and require various stakeholders to act together. This is why we looked at current trends and good practices across stakeholder groups next.

    2.2 Everyone has a responsibility: Current trends and good practices

    What we have learned so far is that the multifaceted nature of gendered and climate disinformation requires the involvement of diverse stakeholders as well as better cooperation among them. Below are observations of trends and good practices as well as related suggestions for further improvement:

    • Governments: State actors uphold a crucial role in developing and providing the legal frameworks and respective policies to address gendered and climate disinformation. However, as mentioned above, it is important that regulation does not focus on limiting freedom of speech, but instead incentivises change in terms of transparency and accountability. This could include regulatory efforts on environmental, social, and governance aspects as well as financial instruments and participatory practices. While controversial, examples like the Digital Services Act and the European Corporate Sustainability Regulation were mentioned as steps in the right direction.
    • Media: Journalism finds itself in a complex and challenging position, as it faces the dual task of countering disinformation while simultaneously grappling with its own structural challenges within the media industry: erosion of public trust, financial pressure, etc. In addition to that, we observe a shortcoming in the representation of diverse voices and perspectives, particularly those of marginalised communities in the field of journalism, which, in some contexts, can also contribute to further spreading climate or gendered disinformation. The responsibility of media outlets includes the provision of accurate, understandable, and unbiased reporting. Fact-checking initiatives, an increased presence in digital spaces as well as the attempt to train more science journalists have proven to be effective in-house measures employed by media outlets. In addition to that, various media outlets collaborate with external fact-checking initiatives and independent researchers, which contributes to the credibility of news reporting.
    • Corporate Actors and Social Media Platforms: As providers of digital spaces and large communicative infrastructures, corporate platforms hold a crucial role in tackling disinformation campaigns. They, too, must balance freedom of expression and content moderation as defined by their users and/or community. In principle, platforms have the right to and responsibility of content regulation, including the potential removal in accordance with their terms of use and community guidelines. Platforms of a certain size, though, are usually obligated to provide users with legal recourse after deleting content. When it comes to gendered and climate disinformation on platforms, content is often considered awful, but lawful – which means it does not usually get deleted immediately. The Digital Services Act mandates a certain level of action on such content if it presents a threat to democracy. In addition, social media platforms have a responsibility to develop responsible platform governance, including the implementation of the following measures:
    1. Algorithmic Interventions: Social media platforms can use algorithms to identify and reduce the visibility of potentially false or misleading content. By analysing user engagement and behaviour patterns, algorithms can prioritise credible sources and downrank or label content that may contain disinformation.
    2. Fact-Checking Partnerships: Collaborating with fact-checking organisations allows social media platforms to rely on expert analysis to identify and flag disinformation. These partnerships help provide users with accurate information and context and contribute to bolstering trust and credibility in sources.
    3. Warning Labels and Contextual Information: Platforms can add warning labels or provide contextual information alongside content flagged as potentially false. This helps users make more informed judgments about the information they encounter.
    4. Reducing Amplification of Disinformation: Social media platforms can limit the spread of disinformation by restricting the ability to share or promote posts that have been identified as false. This measure aims to prevent disinformation from going viral or spreading faster than it can be disproven.
    5. Community Reporting: Encouraging users to report suspicious or false content empowers communities to become active participants in identifying and countering disinformation, while also increasing overall awareness and confidence in the information ecosystem.
    6. Transparency and Accountability: Platforms must be transparent about their content moderation policies and measures to tackle disinformation campaigns. Publicly sharing information about the actions taken against disinformation promotes accountability and builds trust with users, notably when they cover technical, policy, and moderation efforts.
    7. Investing in Content Moderation: Content moderation is still a task primarily performed by humans, who often work under extremely poor conditions. Social media platforms should invest more financial resources and improve working conditions in recognition of the importance of this role.
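    To make the algorithmic interventions and warning labels described in points 1 and 3 more concrete, the core logic can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration – real platform ranking systems are vastly more complex – and all names, scores, and thresholds here are assumptions, not any platform’s actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    base_rank: float                # score from the regular engagement-based ranking
    credibility: float              # 0.0 (flagged false) .. 1.0 (credible), e.g. from fact-checkers
    labels: list = field(default_factory=list)

def moderate_rank(post: Post, downrank_factor: float = 0.2,
                  label_threshold: float = 0.5) -> float:
    """Reduce the visibility of content flagged as likely false and attach a warning label.

    Posts below the credibility threshold are downranked rather than deleted,
    mirroring the 'awful but lawful' approach: the content stays up, but its
    amplification is limited and context is added for users.
    """
    if post.credibility < label_threshold:
        post.labels.append("Context: independent fact-checkers dispute this claim")
        return post.base_rank * downrank_factor
    return post.base_rank

flagged = Post("Climate change is a hoax", base_rank=100.0, credibility=0.1)
credible = Post("IPCC report summary", base_rank=100.0, credibility=0.9)
print(moderate_rank(flagged))   # 20.0 – visibility cut, warning label attached
print(moderate_rank(credible))  # 100.0 – unchanged, no label
```

The design point the sketch makes is that downranking and labelling are graduated responses between doing nothing and deletion, which is why they are compatible with freedom-of-expression constraints.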
    • Academia: Research institutions play a critical role in studying and providing evidence-based insights on the spread of disinformation that inform policy decisions and public discourse, including on topics of gendered and climate disinformation. Collaboration between academia and other stakeholders facilitates knowledge sharing and the development of informed strategies. Academic institutions can contribute to media literacy and critical thinking education by integrating these topics into their curricula. Additionally, capacity building programmes for journalists, policymakers, and civil society organisations enhance their ability to address gendered and climate disinformation.
    • Civil Society Organisations (CSOs):  CSOs play a crucial role in advocating for gender equality and climate action, as well as countering disinformation. Through awareness campaigns, they raise public consciousness, challenge stereotypes, and disseminate accurate information. Grassroots organisations engage communities directly and amplify their voices. Civil society can monitor and report instances of gendered and climate disinformation, holding stakeholders to account. They collaborate with fact-checkers, conduct independent investigations, and provide platforms with reporting and documenting cases of disinformation. Such initiatives contribute to creating a robust evidence-based narrative and are fundamental to develop balanced policy responses.

    No single actor, and no individual stakeholder can tackle disinformation on their own. Everyone has a role to play – and all stakeholders must engage in coordinated efforts. Countermeasures will only be effective if actors collaborate and risks are addressed at a systemic, instead of an individual level. Otherwise, even promising or established practices will fall short of scaling and hence, competing with the breadth, reach, and pervasiveness of disinformation campaigns.

    Crucially, the importance of positive counter-narratives that support people on an emotional level in accepting transformation and progress, as well as the protection and support of individuals targeted by gendered and/or climate disinformation, cannot be stressed enough – at both local and global levels.

    2.3 Local progress, global effects?

    The tension as well as the mutual amplification of local and global levels was evident in most of our conversations and reflections. And this tension plays out on all levels. Our hypothesis was that gendered and climate disinformation are related to societal transformation and, if challenged in a local or national context, spill-over effects will have global and long-lasting consequences. While not as clear cut, the sentiment seems to be widely shared.

    Solid democracies must – carefully, but confidently – lead the way in providing legislative templates for transparency requirements, duty of care, and platform accountability. While people shared their worries about misguided and disproportionate copy-and-paste adoptions of anti-hate-speech and disinformation laws such as the German NetzDG from 2017 by less democratic, or autocratic, countries, there was a clear sense that regulatory efforts in democracies can prove beneficial. Importantly though, law enforcement must be adequately educated and equipped to operate without undermining encryption, privacy, and anonymity.

    Considering the continued pressure on (and resulting decline of) democratically organised societies, civil engagement and support structures continue to provide safe spaces where inclusive narratives and arguments countering regressive tendencies can take shape. Support structures are fundamental to anyone targeted by disinformation campaigns – at a local level, this can mean involving your personal network to respond to posts on your behalf or filter through your messages. At a global level, this may pan out through international solidarity or by raising and strengthening the visibility and voices of marginalised groups.

    Empowering communities to embed counter-narratives within their own context and lead their own initiatives is crucial in countering gendered and climate disinformation. Oftentimes, resources, training, and support for grassroots efforts enable targeted interventions and foster a sense of ownership among community members. Successful efforts, even if context-dependent, can then spill over and provide valuable lessons and ideas for other communities and regions. Therefore, international cooperation providing for local good practices to be discussed and shared among an international community is essential.

    2.4 Observations and recommendations to address climate and gendered disinformation successfully

    Climate and gendered disinformation are pressing issues that have significant consequences for individuals, societies, and the planet. They hinder progress towards gender equality, sustainable development, and effective climate action and, above all, undermine democracy as a whole – in the short and the long run. The complexity of these challenges, along with the limitations of regulatory efforts, requires a multifaceted approach involving diverse stakeholders and continued cooperation. Countering climate and gendered disinformation is a social task, not primarily a legal one. Drawing from our exchanges with experts as well as our own observations, we propose a range of recommendations to address climate and gendered disinformation going forward:

    • Legal regulation is part of, but not the only solution: Legislative measures to counter disinformation must be firmly embedded in rights-respecting frameworks and complement measures on other levels. Put differently, a responsible legal framework for handling climate and gender disinformation goes hand in hand with the measures and tools available to corporate actors, the media, civil society, academia, and others. Beneficial regulatory efforts include mandatory transparency, environmental, social, and governance obligations, and policy co-creation formats.
    • Incentives for global cooperation and alliances: Disinformation often transcends national borders, as do digital spaces and infrastructures. Implementing effective anti-disinformation measures therefore requires international cooperation, and reliable, cross-sectoral alliances.
    • Awareness and media literacy: It is important to learn and adapt to a constantly evolving disinformation landscape. This requires skills and competency training in the fields of media, information, and digital literacy. However, and this is to be stressed, it comes with its own limits, as targeted disinformation campaigns often build on controversial topics, emotionalised discourses, fear, and confusion – elements that cannot merely be countered by facts (or literacy). The root causes of these issues thus often lie within current social or environmental challenges that cannot solely be solved with digital competencies.
    • More resources for efforts and actors strengthening democracy – online and offline: Observations around gendered and climate disinformation have shown how much both are interlinked with deeply rooted social issues and concepts of progress. And both topics present just two of many themes being targeted and instrumentalised for disrupting democracy, dividing society, and hindering progress and transformation. In consequence, efforts that aim to strengthen democratic societies online and offline should be embedded within a sustainable system that allows them to operate efficiently. This goes for actors as well: The responsibility and the role of the media in countering any kind of disinformation is crucial – this requires that quality journalism is equipped with the necessary resources and journalists are safe and protected to do their work.
    • Incentives for platforms for self-regulation efforts: Platforms should establish mechanisms for detecting and flagging harmful campaigns and enforce their own terms of service to address gendered or climate disinformation effectively. They can be encouraged to take proactive steps against disinformation by offering recognition and rewards for maintaining their reputation. In addition, fostering collaborative efforts through public-private partnerships, as well as involving users in content moderation, can contribute to a more effective and transparent approach in addressing disinformation on platforms.
    • Content Moderation: Effective content moderation plays a crucial role in countering digital disinformation. Transparency in defining disinformation through clear policies is essential for evaluating content accurately. Leveraging advanced AI and machine learning tools can significantly improve the efficiency of identifying and flagging false information, although content moderation remains a task primarily conducted by humans. Consequently, collaborating with reputable fact-checking organisations is vital, as it ensures content accuracy and credibility. Additionally, it is important to focus on improving the working conditions of content moderators and encouraging user reporting to enhance the overall effectiveness of content moderation efforts.
    • Champion successful, positive examples: In our research and related conversations, we have encountered a plethora of great initiatives that all deserve broader visibility and support. Initiatives may benefit from amplification, financial resources, volunteers, or additional collaboration. Among the examples highlighted repeatedly were HateAid and the International Fact Checking Network (IFCN).
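    The content moderation workflow recommended above – machine flagging first, human judgment last – can be illustrated with a minimal triage sketch. This is a hypothetical example under assumed names and thresholds; it shows only the principle that automated tools prioritise the review queue while the final decision remains with human moderators:

```python
import heapq

def triage(items):
    """Order ML-flagged posts for human review: widest-reach, most-suspect first.

    Each item is assumed to carry a hypothetical classifier score
    ('model_score', 0..1) and an audience estimate ('reach'). Items below
    the flagging threshold are never queued; nothing is removed automatically.
    """
    queue = []
    for item in items:
        if item["model_score"] >= 0.8:          # assumed flagging threshold
            # negative priority: heapq pops the smallest value first
            priority = -item["reach"] * item["model_score"]
            heapq.heappush(queue, (priority, item["id"]))
    return [heapq.heappop(queue)[1] for _ in range(len(queue))]

posts = [
    {"id": "a", "model_score": 0.95, "reach": 10_000},
    {"id": "b", "model_score": 0.30, "reach": 50_000},   # below threshold, not queued
    {"id": "c", "model_score": 0.85, "reach": 200_000},
]
print(triage(posts))  # ['c', 'a'] – moderators see the riskiest, widest-reaching post first
```

Ordering the queue by potential reach reflects the report’s point that disinformation must be addressed before it goes viral, while keeping humans, not algorithms, as the decision-makers.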

    With all that said and done, we have yet to look at the dynamics and impact of gendered and climate disinformation on electoral processes. These aspects will surely feature in our next paper, where we will look at the upcoming EU elections in 2024 and ask whether and how we can strengthen discourse on social platforms today to protect the processes of tomorrow.

  • 3. Go deeper on climate misinformation and gendered disinformation

    Below is an overview of recommended reading around climate misinformation and gender-related disinformation – we also welcome further suggestions.

    1. Cathleen Berger explores the widespread issue of climate misinformation, which deflects responsibility and hinders effective climate action. Despite this challenge, growing awareness and sustained public support offer hope for transformative change in addressing the climate crisis: We “Pledge” action: The delay and diffuse tactics of climate misinformation | ORF (orfonline.org)
    2. Matthias C. Kettemann discusses the emerging calls for intervention to protect democratic publics from digital disinformation with a focus on laws, community standards and platform design. (BPB)
    3. Researchers around Jessica Colarossi at Boston University conducted a yearlong project called “Data and Misinformation in an Era of Sustainability and Climate Change Crises” to combat climate misinformation. The project delved into how false information about climate change spreads on social media platforms like Twitter and Reddit, through native advertising in mainstream media, and the language used to sow doubt about the urgency of climate change, aiming to promote accurate information and encourage fact-checking to address the pressing climate crisis: Tweets, Ads, and Lies: Researchers Are Fighting against Climate Misinformation
    4. During the RightsCon 2021 event, the EU DisinfoLab co-hosted a community meeting addressing the disproportionate targeting and harassment of women, gender non-conforming individuals, and marginalised groups on digital platforms. Recommendations included documenting the threat, raising early warnings, and maintaining an intersectional lens to effectively combat gendered disinformation: Gender-Based Disinformation: Advancing Our Understanding and Response – EU DisinfoLab
    5. Under the title “Antifeminism as a threat to democracy? Equality in times of right-wing populism”, the Amadeu Antonio Stiftung explains strategies of extreme right-wing and anti-feminist actors and presents recommendations for action to be able to defend oneself against attacks on equality work. Antifeminismus als Demokratiegefährdung?! Gleichstellung in Zeiten von Rechtspopulismus
    6. This study “Correcting climate change misinformation on social media: Reciprocal relationships between correcting others, anger, and environmental activism” by Isabelle Freiling and Jörg Matthes explores the drivers behind corrective efforts on social media to combat the spread of climate change misinformation. The research suggests that climate change-related anger and political environmental activism play significant roles in motivating individuals to correct others on social media, and that there are reciprocal relationships between corrective actions, anger, and activism. The findings shed light on the affective level of expression effects and their implications in misinformation research.
    7. Maria Giovanna Sessa highlights the impact of gendered disinformation on democratic representation, the silencing effect it has on women, and the need for comprehensive measures to address this growing threat in the digital age: What is Gendered Disinformation? | Heinrich-Böll-Stiftung | Tel Aviv – Israel
    8. Lucina Di Meco and Kristina Wilfore highlight the use of gendered disinformation campaigns that target women in politics, aiming to undermine their credibility and discourage their participation. They call for addressing this issue as a national security and foreign policy imperative: Gendered disinformation is a national security problem | Brookings
    9. The Special Committee on foreign interference in all democratic processes in the European Union, including disinformation, and the strengthening of integrity, transparency and accountability in the European Parliament Hearing on “Climate change disinformation” | Hearings | Events | ING2 | Committees | European Parliament (europa.eu)
    10. The BBC found that TikTok removes little content containing false information on the climate crisis, even though such content conflicts with the platform’s community guideline not to “undermine well-established scientific consensus”: The climate change-denying TikTok post that won’t go away – BBC News
    11. In this report, Fabian Klinker and Sven Brüggemann provide a first systematic overview of the different actors in the climate movement, their online presences and their public communication on the platforms Facebook, Twitter and Instagram: IDZ Jena: Beitrag #6 (idz-jena.de).
    12. In DeSmog’s Climate Disinformation Database, one can browse extensive research on the individuals and organisations that have helped to delay and distract the public and elected leaders from taking needed action to reduce greenhouse gas pollution and fight global warming: https://www.desmog.com/climate-disinformation-database/.
    13. This policy brief by the United Nations from June 2023 is focused on how threats to information integrity are having an impact on progress on global, national, and local issues: our-common-agenda-policy-brief-information-integrity-en.pdf (un.org).
  • Participants of the discussion on 20 July 2023

    Conversation Starters

    • Fabian Klinker, Institut für Demokratie und Zivilgesellschaft Jena
    • Lucina di Meco, Women’s Rights Advocate and founder of #ShePersisted

    Experts

    • Mauritius Dorn, Institute for Strategic Dialogue
    • Sara Schurmann, Network of Climate Journalism Germany
    • Shmyla Khan, Digital Rights Foundation
    • Sina Laubenstein, Gesellschaft Für Freiheitsrechte
    • Elisa Lindinger, SUPERRR Lab
    • Georgia Langton, Bertelsmann Stiftung
    • Katharina Mosene, Hans Bredow Institut
