Very Large Online Platforms pledge compliance with the DSA – researchers must put that to the test

    Platform Governance
    Regulation

Cathleen Berger

Article

Since August 28, 2023, very large online platforms (VLOPs) like Facebook, Instagram, YouTube, or TikTok must comply with new requirements under the EU’s Digital Services Act (DSA).

Almost all of the 19 designated platforms have made public announcements and pledges about the changes they implemented. In most cases, these include chronological feeds, reporting tools for harmful or misleading content, transparency around advertising, and clearer options to opt out of personalisation. Netzpolitik.org has a good overview of the changes for users (in German).

One of the more anticipated announcements came from TikTok. Until now, the platform has been comparatively opaque about its user base, algorithmic filtering, political interference, and other aspects of compliance. Its press release from August 4, 2023 announces a whole slate of features, among them: a new reporting option for illegal and harmful content, a new team of content moderators and legal specialists, more transparency around content decisions, and increased control over algorithmic filters and personalisation options, including a pledge to no longer allow the targeting of minors with ads.

But a pledge doesn’t mean compliance. Researchers Martin Degeling and Anna Semanova from the Germany-based think tank Stiftung Neue Verantwortung got to work immediately, testing TikTok’s API against its claim to no longer personalise ads for people under 17 years of age. Their findings are clear: nothing has changed in terms of algorithmic filtering and targeting.
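The core of such a check is simple: fetch the audience options an advertiser can still select, and flag any bracket that covers minors. The sketch below illustrates that logic only; the response shape, the `age_groups` field, and the `13-17` bracket are illustrative assumptions, not TikTok’s actual API or the researchers’ actual code.

```python
# Hypothetical sketch of a DSA-pledge check: given the age brackets an
# ads interface reports as targetable, flag any bracket covering minors.
# The bracket labels and response layout below are assumptions for
# illustration, not TikTok's real API schema.

MINOR_BRACKETS = {"13-17"}  # under-18 audiences the pledge should exclude


def check_minor_targeting(targetable_brackets):
    """Return the set of minor age brackets that are still targetable.

    An empty result is consistent with the pledge; a non-empty result
    means minors can still be selected as an ad audience.
    """
    return MINOR_BRACKETS & set(targetable_brackets)


# Example with a mock API response that still lists the 13-17 bracket:
mock_response = {"targeting_options": {"age_groups": ["13-17", "18-24", "25-34"]}}
violations = check_minor_targeting(mock_response["targeting_options"]["age_groups"])
print(sorted(violations))  # a non-empty list would indicate non-compliance
```

In a real replication, the mock response would be replaced by live API output collected under a researcher account, and the check rerun over time to see whether the pledged change actually lands.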

Not only is this critical information for EU regulators, who must ensure that VLOPs comply; it is also a call to researchers to make use of their new data access rights under the DSA and scrutinise the validity of platforms’ public claims. The DSA becomes fully applicable in February 2024, extending to smaller platforms and to member state implementation and oversight. The most important element to keep an eye on will be enforcement, including civil society engagement and research access: as the TikTok example shows, the relevance of the DSA will be measured in actual change on digital platforms, not in pledges or minor tweaks for optics.

And, if you need help getting started or would like to replicate Degeling and Semanova’s work, good news: next week we’ll be launching a Knowledge Hub for Social Media Monitoring that includes code samples for researching TikTok, X, blogs, and more. The Knowledge Hub will be open source, under a CC BY licence, and open for additional contributions. Time to put the new research and access tools to the test.


Cathleen Berger

Co-Lead

Cathleen Berger’s professional experience spans across sectors: academia, government, non-profit, corporate, and start-up. Her work and research focus on the intersection of digital technologies, sustainability, and social impact. She currently works with the Bertelsmann Stiftung as Co-Lead for Upgrade Democracy as well as the Reinhard Mohn Prize 2024 and Senior Expert on future technologies and sustainability. In addition, she occasionally advises and works with social purpose companies and organisations on their climate and social impact strategies.

Previously, she directed the B Corporation certification process of a pre-seed climate tech start-up, launched and headed up Mozilla’s environmental sustainability programme, worked within the International Cyber Policy Coordination Staff at the German Foreign Office, as a consultant with Global Partners Digital, a research assistant at the German Institute for International and Security Affairs (SWP), and a visiting lecturer at the Friedrich Schiller University Jena.
