Digital Media Research Changes Will Alter Election Studies
Josephine Lukito, Megan A. Brown / Jun 4, 2024
Josephine Lukito and Megan A. Brown are among the authors of The State of Digital Media Data Research, 2024.
Digital media research is always changing. The development of new methods, policies, and data access strategies means that researchers regularly adjust not only how they conduct studies, but also what they study. While shifting platform policies and regulatory efforts like Europe’s Digital Services Act (DSA) can create new opportunities for researchers, as we note in our recent report, other changes in internet technology will also create many challenges for conducting research about elections taking place around the globe in 2024.
From 2023 to 2024, one of the biggest changes in this research landscape has been data access. While researchers over the past decade benefited from relatively open access through social media APIs, access to data from multiple platforms has diminished over the last year. Researchers have been priced out of data access from the X (formerly Twitter) API, which had been commonly used in academic research. And Pushshift, an archive of Reddit data, went private to comply with Reddit’s API policies.
In particular, Meta recently announced that it will sunset CrowdTangle, a transparency tool popular among researchers and journalists, on August 14, 2024. The shutdown will greatly reduce researchers’ ability to understand how major platforms such as Facebook and Instagram are leveraged during the election, and it has prompted several calls to keep CrowdTangle available until January 2025. In light of these recent data access changes, researchers in both academia and civil society are left without the data needed for key transparency efforts investigating the impact of social media platforms on social and civic life.
At the same time, however, many platforms have announced academic data access programs, including the YouTube researcher program, TikTok’s Research API, and the Meta Content Library. These programs have been met with both optimism and skepticism in the research community. While the development of researcher programs within tech companies is a promising step forward, researchers have struggled to balance best research practices with the data security and publication requirements mandated by platforms.
Neither the loss of these data access tools nor the stringent policies surrounding academic data access is likely to be fully addressed by the time of the 2024 US elections on November 5th.
As these conversations unfold, we are also seeing policymakers play a larger role in platform accountability. Most notably, on February 17 of this year, the DSA went into full effect, mandating that large platforms give researchers timely access to public data. We don’t yet know how these policies will affect data access for researchers in the US, and it remains unclear what this access will look like in practice. On Capitol Hill, legislative efforts to mandate researcher access have stalled, though there have been robust discussions over possible provisions in the American Privacy Rights Act (APRA) and the Kids Online Safety Act (KOSA) that would address researcher access.
In addition to seismic changes in data access, we are also seeing changes in the platform landscape in the wake of Elon Musk’s purchase of Twitter. Notably, federated social media platforms have gained some popularity: some Twitter/X users have flocked to Mastodon, Threads, BlueSky, and other federated (or soon-to-be-federated) platforms. While some, such as BlueSky, have begun to encourage political leaders to use their platforms, the influence of these newer platforms remains unclear, and they may present unique challenges, as researchers must build new tools to analyze them or wait for third parties or the platforms themselves to make data available.
Finally, the explosion of generative AI has affected both elections and election studies. Already, we have seen generative AI used to produce deepfakes and other forms of misinformation in the 2024 Indonesian and Indian elections. In the US, most Americans expect that AI-driven deception, abuse, or misinformation will affect the 2024 presidential election. These developments will likely shift how researchers assess the scale, scope, and style of AI-generated misinformation.
However, generative AI may also present an opportunity for researchers. Already, there are many studies in which researchers use OpenAI’s ChatGPT and other large language models (LLMs) to classify content or attempt to predict election outcomes. Still, we encourage researchers to proceed with caution: LLMs and generative AI are promising avenues for future research, but AI models cannot replace people, and they may produce a range of unintended results.
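To make the kind of workflow these studies describe concrete, the snippet below is a minimal sketch of LLM-assisted content labeling, assuming OpenAI’s official Python client (v1.x) and an API key in the environment; the model choice, prompt, and label set are illustrative examples rather than a method drawn from our report.

```python
# A minimal, illustrative sketch of LLM-assisted content labeling.
# Assumes the official `openai` Python package (v1.x) and an API key in the
# OPENAL_API_KEY-style environment variable OPENAI_API_KEY; the model, prompt,
# and label set are hypothetical examples, not a prescribed research method.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

LABELS = ["election-related", "not election-related"]

def classify_post(text: str) -> str:
    """Ask the model to assign one of the predefined labels to a post."""
    response = client.chat.completions.create(
        model="gpt-4o",   # illustrative model choice
        temperature=0,    # reduce randomness for a coding/classification task
        messages=[
            {
                "role": "system",
                "content": (
                    "You label social media posts. Respond with exactly one "
                    f"of these labels: {', '.join(LABELS)}."
                ),
            },
            {"role": "user", "content": text},
        ],
    )
    label = response.choices[0].message.content.strip()
    # Fall back to a flag value if the model strays from the label set.
    return label if label in LABELS else "unparsed"

if __name__ == "__main__":
    print(classify_post("Remember to check your polling place before November 5th!"))
```

Even with a tightly constrained prompt like this, model outputs can drift or misread context, which is why studies using this approach typically validate a sample of machine-assigned labels against human coders before trusting them at scale.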
Given politicians’ growing reliance on digital media, particularly during elections, it is imperative for researchers to have responsible access to platforms’ public data. Drawing from our report, we argue that digital media data research should be guided by collaboration, transparency, preparation, and consistency.