How to Ensure Broad Access to Social Media Data for Maximum Transparency and Accountability

Mark Scott / Sep 3, 2024

Mark Scott is a senior resident fellow at the Atlantic Council's Digital Forensic Research Lab's Democracy + Tech Initiative where he focuses on comparative digital regulatory policymaking topics.

When the European Commission charged X for failing to uphold its new social media rules, Brussels regulators came out swinging.

Under the European Union’s Digital Services Act, or DSA, Elon Musk’s social network was accused in late July of hoodwinking its users via content from ‘verified accounts’ that have the blue check marks anyone can now buy. The platform also didn’t sufficiently explain how people were targeted with online advertising — another must-have under the EU’s new rules.

But it was the third charge — that of failing to give outsiders legally required access to X’s public data on posts, shares and other social media engagements — that will be the most far-reaching.

Under Europe’s social media rules, platforms like X, Facebook, and YouTube must open up their treasure troves of data (everything from people’s viral TikTok videos to public-facing Instagram posts) to outside groups of academics, researchers and civil society groups. The goal: to boost transparency and accountability about how these platforms affect society by giving outsiders mandatory access to the inner workings of these social networks, for the very first time.

X will now have a chance to respond to these charges, though Musk posted: “The DSA is misinformation!” More charges are likely, with both Meta and TikTok facing ongoing investigations into how they also allow outside groups to tap into their public-facing data.

Calling out these platforms is only half of the solution.

As I start my new job as senior resident fellow at the Atlantic Council’s Digital Forensic Research Lab’s Democracy + Tech Initiative, my goal is to help solve the second half of this policy dilemma — one that builds on the data access rules within the DSA to make such transparency and accountability tools available to the largest number of independent researchers possible.

Currently, much of this research — often based on long-standing relationships with social media networks and confined, primarily, to US Ivy League academic institutions — is limited to the few, not the many.

It has led to unprecedented social media analysis, including the recent 2020 Facebook Election Project overseen by independent academics that detailed that platform’s role in the most recent US presidential election.

But to maximize the potential of Europe’s transparency and accountability rules, we must find ways to widen the scope of who can conduct such research; create user-friendly tools that allow non-technical subject experts to participate; and ensure — above all — that people’s privacy and security on social media is protected against potential harm.

For meaningful research — let alone greater corporate transparency — to take place, there needs to be a clear plan to turn such rulemaking into a hard-nosed reality. It must build on existing, and ongoing, work across the regulatory, civil society and academic worlds to meet the needs of the wider research community, whose efforts are crucial if social media companies are to be held accountable for their positive and negative effects on the world.

In my previous role as POLITICO’s chief technology correspondent, I spent more than a decade scouring social media for potential harm. Now, I see the challenge ahead as twofold.

The devil is in the detail

The first challenge relates to the basics: who should be allowed to access social media data, who should shoulder the costs associated with conducting this research, and what the underlying technical infrastructure should look like to make social media data access as open and intuitive as possible.

Much of this work is already underway. Under Europe’s social media rules, only so-called ‘vetted researchers,’ or those associated with academic institutions or non-commercial entities like civil society groups or not-for-profit media outlets, can access social media data. That, unfortunately, means that almost all journalists — still the primary means by which potential harm is uncovered across these global platforms — are excluded.

To maximize the potential of these mandatory data access rules, the net needs to be cast as wide as possible so that all potential vetted researchers, even those with limited technical knowledge, are included. That includes those with subject expertise in topics like climate change, immigration and LGBTQ+ rights, alongside more traditional social media and foreign interference researchers.

The participation of non-EU based experts, particularly those from the US with, collectively, decades of understanding of how these platforms operate, is also essential if Europe is to tap into all the voices that share its goal of boosting transparency and accountability for social media companies.

Such efforts will not come for free. While social media companies can’t charge for mandatory data access under the DSA, the infrastructure needed to implement the new rules — cloud storage capacity to hold reams of social media posts, engineering skills to analyze these large data sets, stakeholder engagement programs required to bring in as-yet untapped subject experts — will require significant long-term funding.

Much of that currently relies on mostly US-based philanthropic organizations, many of which are shifting away from supporting social media research in favor of untangling how artificial intelligence is shaping society. Into this likely void, governments — arguably, the only entities with the financial resources to invest in such long-term society-wide ‘goods’ — must step up in ways that fund independent research without giving officials direct access to reams of social media data.

The third question related to this challenge is technical. Leading ongoing data access programs, like those overseen by the Social Media Archive at the University of Michigan’s Institute for Social Research and the Media and Democracy Data Cooperative, run by a group of US academic institutions, have shown what is possible when outsiders can dive into the often murky world of social media. They are best-in-class efforts, albeit mostly accessible to those in the US, on which wider access should be built.

Yet currently, these projects focus almost entirely on giving academics access to social media for long-term research, often published years after potential harm is first flagged. To meet the needs of the wider, non-academic community, much of which is dedicated to the near-term threat posed by social media, non-technical tools and ways to compare what happens on different platforms must be developed to understand these platforms’ immediate impact on society.

That does not mean abandoning legitimate academic researchers’ needs for those of civil society organizations. Instead, it requires a recognition that what both groups need for their accountability work differs, and that we should build on existing data access projects — and not reinvent such infrastructure — to meet those different demands.

Global digital policy questions

The second challenge is tied to the wider global digital policy debate — something the Atlantic Council’s Digital Forensic Research Lab has prioritized during almost a decade of social media research and policy expertise.

Currently, Europe’s DSA is the only democratic regulatory playbook that requires social media companies to provide outsiders with mandatory access to public-facing data. It sits squarely in the so-called ‘Brussels Effect,’ or the EU’s effort to cajole other countries and regions to follow its rulemaking lead.

But we cannot rely on Europe alone to hold social media platforms to account.

Policymakers in other democratic countries – most notably in the United Kingdom, Australia and Canada, where lawmakers are mulling new digital legislation – must build on Brussels’ efforts to similarly force companies to be more transparent. Existing collaborations, including the Global Online Safety Regulators Network, a group of online safety agencies from nine democratic countries, should prioritize promoting social media data access for outside groups.

In the US, where digital lawmaking has been mired for years in partisan disputes, any future White House administration should also look to piggyback on what Europe has started, particularly via ongoing transatlantic cooperation like the EU-US Trade and Technology Council.

Each jurisdiction has its own challenges and its own regulatory approaches. But the core tenets of accountability and transparency — borne out of mandatory data access rules mirroring those already in place within the EU — are universal. They should form the basis of any country’s efforts to corral social media’s potentially harmful effects on democratic institutions, elections and the broader society.

This digital rulemaking does not come without risks. No one wants to create a readily-accessible social media database that gives governments unfettered access to people’s social media accounts. That would quickly stray into potential Big Brother-style online surveillance, and would give authoritarian regimes like those in China and Russia cover to create similar data access requirements that would put their citizens at risk of real harm.

What is required is clear-cut policymaking about who can access social media data, and about what happens if, or when, things go wrong.

Under Europe’s social media rules, an independent body — empowered by non-political regulators — will soon oversee the bloc’s data access regime so that only those who fulfill the EU’s vetted researcher designation can tap into the data. That group will not be beholden to either governments or platforms, ensuring that only the appropriate groups will have access under Europe’s social media rules. It provides a pragmatic solution that other countries should also adopt.

We also need to acknowledge that mistakes will happen.

The Cambridge Analytica scandal, sparked when an academic collected Facebook users’ personal information that he then passed on to a political consulting firm, remains the most high-profile example of what can happen when appropriate checks are not in place. Tough data protection rules, including hefty fines for those who do not protect users’ privacy, are an essential component of any country’s data access rules so that people’s rights are upheld, even while researchers track potential harm across social media.

None of this will be easy, nor will it be clear cut. Companies legitimately push back against outsider data access, arguing it puts their users’ security and privacy at risk. Outside research groups often have widely different priorities, many of which place them at loggerheads over how to carry out social media research.

But as we try to unpick the potential harm from social media associated with the 2024 global election cycle — when more than half of the world’s democratic population will have gone to the polls, from France to Bangladesh to, in November, the US — there is an urgent need to better understand how these platforms affect society and what, if anything, policymakers and politicians must do to counter potential harms.

For me, such questions can only be answered with a better understanding of what happens on these global networks, so that any legislative response, if required, meets the actual societal need, and not what is merely perceived to be the underlying problem.

That requires a well-functioning social media data access regime, based on democratic principles, that is as inclusive and secure as possible. Currently, Europe offers the only way forward, something I hope will change. Yet for now, the need for such transparency and accountability is urgent. It’s time to get to work.
