Can Data Portability Shift Power in Europe’s Digital Ecosystem?
Megan Kirkwood / Mar 12, 2025
Megan Kirkwood is a fellow at Tech Policy Press.
Data portability is the ability of individuals to obtain and reuse their personal data for their own purposes across different services. For instance, if an individual chooses to leave a social media site that they have been using for a long time, the policy would enable them to transfer their profile, content, photos, and other relevant data to a new platform rather than rebuilding their digital presence from scratch or losing valuable data.
As data portability advocate Chris Riley explained in a 2020 paper on the topic, the benefit of data portability is that users:
“can control the data by extracting it from a platform you no longer trust, and manage it directly, or offer it instead to a different service provider in whom you have greater trust. Competition thus emerges as the second purpose of data portability; regardless of your interest in data ownership or privacy values, if you can port your data to another service provider, you can switch services with low transactional cost.”
In theory, data portability shifts data ownership away from platform owners and toward the users creating the data. And, as pointed out in the quote above, it offers opportunities to open digital markets to new services and products, reducing the market concentration in digital markets. As many global policymakers grapple with the dominance of US tech platforms, data portability could drive the development of alternative platforms and business models for social networks and promote digital sovereignty.
Data portability is recognized in the European Union under Article 20 of the General Data Protection Regulation (GDPR), which provides that individuals have the right to access personal data from a ‘data controller’ and transport it to another ‘data controller’ in a machine-readable format, where technically feasible. Outside the EU, data portability is also found in the data protection laws of Canada, Brazil, and Singapore.
However, despite being legally established in Europe in 2018, data portability remains underutilized. Tools such as Google Takeout and Facebook’s Download or Transfer Your Information only support manual data transfers, requiring users to find the data download tools in their settings and wait hours or days for the data to be downloaded. This limitation means data cannot be kept up to date easily, nor can it be merged with data from other services due to differences in data format. Moreover, downloaded data is often “delivered as a bundle of technical files, hard to understand and often delivered without explanation,” which massively reduces the ability of the average user to do anything with it.
The biggest hurdle is the lack of services that will accept ported data. For example, Facebook’s data transfer tools offer the option to deliver data directly to one of five endpoints. However, all of the endpoints on offer are cloud storage services, and none are alternative social media platforms. This creates a classic chicken-and-egg problem: without viable destinations for their data, users have little incentive to move it; and because user data stays in the hands of the incumbents, new services struggle to emerge as competitors to established platforms.
The role of the DMA in improving data portability
Despite the limited success of data portability under the GDPR, the EU Digital Markets Act (DMA) introduces an opportunity to remedy the situation. Under Article 6(9), designated gatekeepers, the large incumbents such as Google (Alphabet), Meta, Apple, Amazon, Microsoft, TikTok (ByteDance), and Booking, must provide:
“end users and third parties authorised by an end user, at their request and free of charge, with effective portability of data provided by the end user or generated through the activity of the end user in the context of the use of the relevant core platform service, including by providing, free of charge, tools to facilitate the effective exercise of such data portability, and including by the provision of continuous and real-time access to such data.”
By expanding the scope of transferable data, this provision in the DMA could be used to enhance the utility of data portability. Previously, users could only port data they directly provided to a service, but the DMA extends this to include data generated by a user’s activity. Additionally, while GDPR Article 20 does not require data controllers “to adopt or maintain data processing and transfer systems that are technically compatible with other controllers in different organizations,” the DMA addresses this issue. Under Recital 59, “data should be received in a format that can be immediately and effectively accessed” by the end receiver (e.g., a competing social media platform or other related service).
Perhaps the most significant change is that where previous data download tools were cumbersome for the user, who had to manually download and port their data, under Article 6(9), services must provide “continuous and real-time access to such data.” To comply, the gatekeepers have, for the most part, introduced or improved their Application Programming Interfaces (APIs). APIs are “an interface of a computer program that allows the software to ‘speak’ with other software” and can facilitate automatic and secure transfers, allowing verified third parties to access user data more seamlessly, with access granted by users through an intuitive consent flow.
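The “continuous and real-time access” requirement is the key shift from one-off downloads. A minimal sketch of what an authorized third party’s polling client might look like, assuming a hypothetical REST endpoint and a user-granted OAuth token (no gatekeeper exposes exactly this API; the URL, parameter names, and response shape are illustrative only):

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

# Hypothetical endpoint; real gatekeeper APIs differ in shape and auth.
BASE_URL = "https://portability.example-platform.com/v1/items"

def build_poll_request(token: str, since: datetime) -> tuple[str, dict]:
    """Construct the URL and headers to fetch only items created after
    `since`, so the receiving service can stay continuously up to date
    instead of re-downloading a full archive."""
    url = f"{BASE_URL}?{urlencode({'since': since.isoformat()})}"
    headers = {
        "Authorization": f"Bearer {token}",  # token from the user's consent flow
        "Accept": "application/json",        # machine-readable format
    }
    return url, headers

def merge_items(store: dict, fetched: list[dict]) -> dict:
    """Merge newly ported items into the local store, keyed by id,
    so repeated polls are idempotent and data stays up to date."""
    for item in fetched:
        store[item["id"]] = item
    return store
```

A receiving service would call `build_poll_request` on a schedule, issue the request with any HTTP client, and pass the decoded JSON to `merge_items`; this is what distinguishes continuous access from the manual, bundle-of-files exports described above.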
Is the regulation working?
To evaluate the effectiveness of the new data portability APIs, I conducted a study to capture the perspectives of developers attempting to build products and services utilizing the new tools. Between September and November 2024, I interviewed seven developers, mostly business owners at various stages of app development and deployment, to understand their experiences with the new APIs.
The findings revealed that developers faced significant challenges in accessing these tools. Many struggled with the cumbersome verification processes, unsuitable security verification assessments, and lack of communication from gatekeepers. In some cases, such as with Meta, an API or other suitable automatic transfer has not yet been implemented. Although Meta had been working on creating an improved data portability tool, it recently removed all documentation, suggesting it has abandoned the effort.
The need for oversight
While concerns about preventing unauthorized access to personal data are valid, platforms have a history of restricting access to APIs under the guise of privacy and security, often as a strategy to protect the firm’s market power. In the early stages of growth, platforms, particularly social media platforms, tend to encourage third-party developers to access APIs and build complementary services that add value to their business and increase use.
An example of this pattern is found in the story of a third-party application called Tweetie. Originally released in 2008, it was an application made for iOS and Mac devices to access Twitter (now X), which at that time did not have a mobile or desktop application but only ran on web browsers. The Tweetie application even developed the pull-to-refresh function, which is well-known to Twitter users. It was acquired by Twitter in 2010 to become the official app. Similarly, broader examples, such as smartphone app developers creating value for Apple and Android or the very structure of the internet as an open network, highlight the benefits of openness in digital ecosystems.
However, once a platform gains market share, it often limits or strategically controls third-party access, particularly as potential competitors emerge. For example, researchers documented changes made by Facebook and found that “by limiting and restructuring API access to user and friends data [...], Facebook intended to undermine any competitors who used friend data and to reward complementors who added value to Facebook.” This can be observed in the case of the video-sharing app Vine, where Facebook cut off API access due to an “ongoing feud between Facebook and Twitter,” the latter owning Vine. Facebook made clear at the time that “apps that are using Facebook to either replicate our functionality or bootstrap their growth in a way that creates little value for people on Facebook” would not be supported by the platform.
Moreover, following the Cambridge Analytica data-sharing scandal of 2018, Facebook “reportedly suspended some 200 third-party apps,” although data collection and use for ad targeting continued to be the company’s main revenue driver. More recently, Twitter, now X, put its APIs behind steep paywalls in 2023, inhibiting journalists and researchers from studying the platform and raising the cost for third-party developers.
This trend reveals a fundamental power imbalance between large platform operators and businesses dependent on API access. Changes to APIs “may cause disturbances or ripple effects across the entire ecosystem of apps and services relying on an API, potentially impacting the viability of all apps and services supported or sustained by it.” This underscores the importance of maintaining regulatory oversight of how remedies are implemented when they rely on opening up platforms through technical tools like APIs.
Ensuring effective implementation
Recent developments suggest some progress is being made due to the DMA. In a workshop led by the Data Transfer Initiative on February 11, 2025, Google announced several API improvements that address issues raised in my study. Notably, it introduced a Data Minimization feature in its API, allowing users to port only the relevant data within a specific time range rather than an entire history. This feature also benefits developers by reducing the volume of unnecessary data they need to process. Google also announced plans to improve the verification process, which they recognized as unsuitable.
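The data-minimization idea is straightforward in practice: filter the export to a requested time window rather than returning a user’s entire history. A hypothetical sketch of that filtering logic (field and parameter names are illustrative, not Google’s actual API):

```python
from datetime import datetime, timezone

def minimize_export(items: list[dict], start: datetime, end: datetime) -> list[dict]:
    """Return only items whose `created_at` timestamp falls within
    [start, end], so a user ports a relevant slice of their history
    and the receiving developer processes less unnecessary data."""
    return [
        item for item in items
        if start <= datetime.fromisoformat(item["created_at"]) <= end
    ]
```

Served server-side, a filter like this reduces both the privacy surface of each transfer and the volume developers must ingest, which is why it addresses concerns on both sides.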
However, the developers I interviewed expressed deeper concerns about the verification process and whether they can trust gatekeepers to manage API access fairly. Many favored independent verification systems or oversight by the European Commission rather than having gatekeepers control access. Without independent oversight, they could continue the historical trend of revoking or altering API access without explanation or justification, effectively shutting down emerging businesses and services. Thus, ensuring data portability regulations function as intended requires robust enforcement that prevents gatekeepers from using their control over APIs to entrench their dominance.
Can data portability reduce Europe’s dependence on US incumbents?
Data portability has the potential to reduce Europe’s reliance on big tech, but only if it is designed to empower users to move beyond the ecosystems these companies control. As Cory Doctorow argues, “a mandate to let users take their data from one company to another—or to send messages from one service to another—should be the opener, not the end-game. Any kind of interoperability mandate has the risk of becoming the ceiling on innovation, not the floor.”
To truly challenge the dominance of the US tech incumbents, and to prevent the DMA or similar ex-ante laws from merely securing gatekeepers’ power over their ecosystems, Europe must do two things: encourage competition and new business models within the markets gatekeepers still control, and empower users with the tools to move outside those ecosystems and experiment with new services.
Many policymakers increasingly recognize the risks of relying on a few corporations for essential digital infrastructure. As the Trump Administration upends transatlantic alliances, the need for EU alternatives is increasingly being prioritized, fueling calls to create sovereign digital ecosystems. This new focus on digital sovereignty requires European users to have “the ability to have control over your own digital destiny – the data, hardware and software that you rely on and create.”
To secure this independence, the European Union is urging European member states to increase digital investments in an effort to strengthen the EU’s “digital sovereignty and [...] standards, rather than following those of others – with a clear focus on data, technology, and infrastructure.” Building sovereign infrastructure was also highlighted in the Commission’s 2025 Competitiveness Compass to ensure that the continent is “at the forefront of innovation in tech sectors that will matter in tomorrow’s economy.”
Moreover, multiple voices are campaigning to build a European tech stack, often dubbed the EuroStack, or alternative visions including a public-led digital stack. These efforts argue the need to build core European infrastructure technologies through the entire stack to ensure European values, such as human rights protection, are encoded into those systems. Thus, digital sovereignty is not just about regulation; it requires structural power over data governance and technical standards. As Robin Berjon writes, controlling the infrastructure ultimately controls the power dynamics:
“Most importantly, the value of data depends entirely on what you are able to do with it. [...] What matters is who makes the rules according to which data is processed and actions executed. [...] What matters is who has the structural power to deploy the standards they want to see and avoid those they dislike. [...] Both for data governance and standards, what matters is structural power. If you have it, you can meaningfully steer both, if you don't, you can't.”
The power of data portability to reduce our dependencies on incumbents will rely equally on the ability to shift the power away from big tech, be it through the control of APIs or other means, and the ability to create a new digital ecosystem where data can be transformed from “a tool of exploitation into a shared resource for societal progress.”