The UK's Opportunity to Lead on Social Media Transparency
Mark Scott / Jul 15, 2025
Mark Scott is a contributing editor at Tech Policy Press.

Houses of Parliament at dusk, London, UK by Eric Hossinger is licensed under CC BY 2.0.
In the world of online safety rulemaking, most attention has focused on the European Union’s Digital Services Act.
But just across the English Channel, the United Kingdom’s Online Safety Act (OSA), a set of rules requiring social media, video-sharing, and internet messaging companies to remove illegal content such as terrorist material and online financial fraud, is now well underway.
That rulebook, which includes potential fines of up to 10 percent of a firm’s global revenue, just got a double revamp.
On June 19, British lawmakers became the second group of politicians — behind their counterparts in the EU — to mandate social media giants open up to independent researchers in the name of accountability and transparency.
Under the country’s Data (Use and Access) Act, the British government tweaked the OSA to require so-called regulated online services, or any tech company that falls under the British digital rulebook, to provide researchers with social media data related to “online safety matters.”
And then, on July 8, the country’s online safety regulator, known as Ofcom, published suggestions that detailed how this new system could work.
These options ranged from amending existing data protection rules to give researchers seeking access to information greater legal clarity, to creating new regulatory structures that would allow academics and civil society groups to examine the inner workings of Meta, Alphabet, and TikTok. (Disclaimer: I sit on an independent advisory committee at Ofcom, and everything I write here is in a personal capacity.)
Yet despite the one-two digital policymaking punch, how the UK’s nascent data access regime will work is still very much up in the air.
Over the next 12 months, British policymakers and politicians need to transform these proposals into a data access structure that balances the needs of independent researchers to access social media data in the name of accountability and transparency with the protection of social media users’ privacy and security rights and the companies’ legitimate trade secrets.
That task comes at a time when some US social media companies are taking increasingly adversarial positions against non-US regulators seeking to impose online safety rules on American firms.
Donald Trump’s administration has already stated publicly that such rules may infringe on US citizens’ fundamental rights. The White House explicitly called out the UK — alongside the EU — for potentially forcing American companies to develop products that “foster censorship.”
London is eager to remain a bridge between Washington and the rest of Europe, and in the wake of the recent US-UK tariff deal, British politicians are wary of alienating their most important international partner.
Yet the UK’s foray into social media data access offers a unique opportunity to learn lessons from the EU’s ongoing data access efforts to create a regime that is more nimble, more secure, and better aligned with society’s growing need to understand global platforms’ real-world impacts.
Under the EU’s Digital Services Act (DSA), independent researchers have, for the last 12 months, been able to apply to social media giants for access to publicly available information. This fall, researchers, primarily academics, will also be able to request access to more sensitive user information, such as location data, posts, and other engagement metrics, so long as they adhere to strict safeguards to prevent data leaks.
But so far, the EU’s world-first attempt to boost social media transparency and accountability via data access mandates has been more bark than bite.
Few have been able to access the publicly facing data. Companies have either designed overly complicated application procedures or have rejected submissions for being out of scope of what’s allowed under the DSA. The private data access process has yet to launch and is unlikely to give researchers meaningful insight before 2026, at the earliest. It is also expected to face pushback from companies nervous about opening up their systems to outsiders.
The UK should learn from these missteps as it develops its own data access regime.
To figure out what that should look like, I am working with three British academics as part of a yearlong Social Platforms Data Access Taskforce designed to advocate for greater independent social media access — but in ways that protect privacy, uphold world-class security, and deliver on the UK government’s stated goals of improving transparency on platforms that deeply affect the British public.
At this stage, there are more questions than answers.
But in a post-Brexit world where the UK stands apart from its counterparts in the EU, merely replicating the 27-country bloc’s data access regime is a missed opportunity.
Such a copy-and-paste approach to social media transparency would double down on limitations already apparent within the DSA.
Worse, it could prevent other countries from forging their own paths if two of the leaders on online safety regulation, the EU and UK, adopted identical — but flawed — rules.
Instead, British policymakers should take inspiration from the useful parts of the DSA — such as definitions around societal risks, criteria for researchers seeking to access social media data, and data protection protocols — that do not need to be reinvented from scratch.
But the UK should then strike out on its own path to offer an alternative to what is underway within the EU.
That could mean, for example, making it explicitly clear that so-called public interest scraping, or automated collection of publicly available data from social media platforms, is legally permitted under the country’s online safety regime — something that remains ambiguous under the EU’s DSA.
It could also involve creating a central body to securely collect and store sensitive social media data as opposed to the more decentralized, ad hoc approach the EU envisions under recently published guidelines that will come into force in September. That was one of Ofcom’s suggestions in its recent data access report.
This is not about creating an entirely different set of rules for the UK compared to the EU.
But just as the UK’s OSA is not the same as the EU’s DSA, so too should its approach to social media data access reflect local needs, while also offering other countries an alternative to Brussels’ efforts at platform transparency.
If done correctly, the UK’s data access regime — just like many of the country’s rules in the post-Brexit era — can be complementary to the EU’s approach, giving independent researchers additional tools to unlock greater society-wide insight about online harms. It can also help address an underlying societal concern: the lack of quantifiable understanding of what actually happens on social media.
That is the task that awaits British policymakers and politicians over the next 12 months. They must now grasp this opportunity to develop a data access structure that demonstrates why the UK deserves to be seen as a global leader in digital policymaking.