TikTok and a Broader Policy Agenda

Emma Leiken / Mar 31, 2023

Emma Leiken is a Tech Policy Fellow at the CITRIS Policy Lab and Goldman School of Public Policy, UC Berkeley. Currently on the Responsible Technology team at Omidyar Network, Emma leads a portfolio focused on youth organizing and responsible technology.

TikTok CEO Shou Chew testifies at a House Energy and Commerce Committee hearing, March 23, 2023.

A week ago, TikTok’s CEO, Shou Zi Chew, testified before Congress amid national security concerns. Policymakers expressed fears (some founded, some not) that TikTok and its owner, ByteDance, put Americans at risk of Chinese surveillance and manipulation because of the parent company’s relationship with the CCP.

In response, policymakers across the aisle have introduced a range of bills that seek to address safety and security threats. They want to restrict or ban apps like TikTok that are linked to countries deemed foreign adversaries, such as China, or, alternatively, force ByteDance to sell TikTok.

While some of the provisions in these proposals are worthy of consideration, the TikTok fixation is misguided. Past experience tells us that when foreign adversaries have access to troves of Americans’ data, along with complex processing capabilities, it can lead to attempts at manipulation at scale via targeted ads and political messaging. But opaque data harvesting practices and highly attuned digital profiling capabilities aren’t unique to TikTok. Nor is the general lack of transparency about how the platform’s recommendation algorithms or corresponding trust and safety apparatus work. We have a bigger, systems-level problem that cannot be addressed by banning one platform. As Tech Policy Press Editor and CEO Justin Hendrix pointed out, “TikTok is not a product of Chinese communism, it is a product of American surveillance capitalism.”

Systems-level issues require systems-level solutions.

In 2022, despite mounting evidence that an array of platforms engage in harmful or reckless practices to the detriment of all users, with younger users bearing a substantial cost, Congress was unable to enact meaningful federal policy to rein in the harms of big tech companies. Here is what Congress should be paying attention to in 2023.

Data Privacy

In the aftermath of the TikTok hearing, advocates, academics, and even some policymakers have rightly emphasized that the United States is one of the few developed nations with no meaningful data protection or privacy laws. They know that the country’s data privacy problem looms larger than just one platform. As Washington Post Staff Writer Will Oremus highlights, “reams of data on Americans’ shopping habits, browsing history and real-time location, collected by websites and mobile apps, is bought and sold on the open market in a multi-hundred-billion-dollar industry. If the Chinese Communist Party wanted that data, it could get huge volumes of it without ever tapping TikTok.” We need policies in place that raise the national standard for data privacy, including:

  • Data minimization obligations wherein user data cannot be collected or used beyond what is strictly necessary to provide a specific service;
  • Heightened protections for sensitive categories of data, such as health, biometric, and precise location information;
  • Clear user rights, like the right to withdraw previous consent, particularly when privacy policies change;
  • The ability to opt out of data transfers to third parties and targeted advertising;
  • And the creation of special protections for minors, such as a ban on platforms collecting large amounts of sensitive data on children.

Meaningful Transparency

Information asymmetries, a lack of clear standards, and dependence on voluntary and piecemeal mechanisms for data sharing hamper our ability to fully understand online harms. As the Federal Trade Commission (FTC) stated in its report to Congress on Combating Online Harms Through Innovation, “Platforms should provide not only public reports but also researcher access to data on the use of automated decision tools for potentially harmful content.” In other words, researchers and the wider public should be able to understand the myriad ways data is generated, collected, and combined by tech companies, and to what end. Platforms should provide researchers, and ultimately the public, with meaningful transparency into recommender systems and content moderation measures.

This kind of access will best position researchers to study the scope and nature of harms that occur on social media platforms and to propose evidence-based policy solutions. A well-resourced independent research institute should be created to facilitate researchers’ access to data, including through the creation of a code of conduct and meaningful metrics for reporting on recommender systems and content moderation systems. There is, of course, critical work that must be done first to operationalize transparency provisions, including working through security and privacy protections for user data, the vetting and protection of researchers, and distinctions between types of researchers.

Trustworthy Design

Lawmakers should take note of the safety-by-design provisions embedded in the recently passed California Age-Appropriate Design Code. The law will require tech platforms to:

  • Turn on high privacy settings by default for kids;
  • Proactively consider how the design of products could endanger minors (for example, when recommendation algorithms favor harmful content);
  • Alert minors when their location is being monitored; and
  • Avoid dark patterns that trick minors into giving up personal information.

Further, policymakers should reframe harms so that negligent design choices map directly to product liability. (This was the basis of a lawsuit against the dating app Grindr, which alleged that the platform was negligent in its age verification process and that it actively sought to bring underage users onto the app by targeting its advertising on TikTok to minors.)

Competition

Many of the biggest tech platforms, TikTok included, rely on surveillance advertising: mining and leveraging personal information to feed users targeted ads. As the American Economic Liberties Project points out, not only is this a massive data privacy issue that undergirds our entire tech ecosystem, it is also a manifestation of the lack of real competition in the system. Smaller players frequently have neither the core infrastructure to conduct such data processing nor the various business lines through which to employ such targeting. The outsized market share these large platforms hold, combined with their ability to self-preference their own products, undermines a competitive marketplace. Congress should prohibit self-preferencing and ensure that all platforms compete on a level playing field.

- - -

We need a legislative framework that can hold all platforms, TikTok included, to standards of online safety, privacy, and trustworthy design. And there is a role for an array of stakeholders, from Congressional leaders to the growing Office of Technology at the FTC to researchers, civil society advocates, and, critically, platforms themselves, in helping us realize a safer, more private, trustworthy, and competitive platform landscape.

Disclosure: Tech Policy Press receives support from Omidyar Network.
