Beyond Checkboxes: Privacy Protections That Work for the Future Generation

Lama Mohammed, Meri Baghdasaryan, Reema Moussa / Oct 26, 2023

Meri Baghdasaryan is an international lawyer currently working as a Senior Case and Policy Officer at the Oversight Board. Lama Mohammed is a public affairs and communications expert specializing in artificial intelligence, cybersecurity, and privacy. Reema Moussa is a J.D. Candidate at USC Gould School of Law.

2023 will be remembered as a momentous year for state privacy regulation in the United States. While federal privacy legislation appears stalled, privacy laws are slowly taking effect in California, Colorado, Connecticut, Delaware, Indiana, Iowa, Montana, Oregon, Tennessee, Texas, Utah, and Virginia.

Meanwhile, the recent EU-US Data Privacy Framework has passed the EU’s adequacy threshold (for now). Globally, new privacy laws continue to pop up, from Tanzania to Indonesia, and from Cuba to India. These developments are coupled with fast-paced advances in generative artificial intelligence (AI), a field that raises a host of privacy, trust and safety, intellectual property, and bias issues.

But do these laws actually make consumers feel that their privacy is protected? Do they make us safer in this digital age of pervasive CCTV and other surveillance systems, when our online activity can be tracked and used against us as we try to exercise fundamental intimate privacy rights?

Privacy threats are everywhere and cut across sectors. But do consumers, policymakers, or privacy professionals view them in that light? Are privacy norms and practices aimed at compliance, allowing companies and government agencies to check boxes and call it a day on their privacy-ensuring efforts, or do they strive to go further and build user trust, ensuring the protection of user rights? Some privacy regulations, such as the EU’s General Data Protection Regulation (GDPR), started as a step in the right direction toward a more consumer-focused framework, one that acknowledges the difficulty of shifting corporate practices toward a privacy-conscious culture. In practice, however, implementation is often reduced to a sequence of checkboxes.

Have we lost sight of the goal? And if so, how can we course-correct? What does the notice and consent model of privacy compliance mean for the next generation, and what perspective do young people bring to advocating for equitable, effective, and actual privacy?

We bring three points of view to these questions, those of a human rights attorney, a communications specialist, and a law student, and all three of us are attuned to what our generation is prioritizing when it comes to privacy. We spoke on the subject at the International Association of Privacy Professionals’ Global Privacy Summit in 2023, one of the largest privacy conferences in the world. This article is our way of summarizing key takeaways from the conversations we engaged in. We do not disregard the positive impact of having frameworks that put privacy at the forefront and strive towards better practices. Instead, we invite all stakeholders to take stock and move forward collaboratively, to ensure that privacy, as a value and as a fundamental right, does not get lost in the weeds of compliance.

If Gen Z is Desensitized, Has Privacy Law Failed?

Although the continuing enactment of new privacy-focused legislation across the globe is laudable, little is changing in how young people feel about the protection of their individual privacy. Stories of government agencies suing data brokers for collecting sensitive data that could be used to build a case against those seeking abortions, of groups outing individuals within the queer community without their knowledge or consent, and of immigration authorities increasing surveillance of migrants at countries' borders are more salient and frequent in today's news cycles.

Despite the gravity of these stories, the strides made in privacy law largely leave these harms unaddressed. Why? Corporate privacy policies implemented in line with various regulations do not address them head-on. Even for the rights that are outlined within these policies, many users and consumers do not advocate for themselves through the prescribed processes: the policies are wordy and difficult to understand, and they often bury protocols around the collection and usage of individuals' data deep within service agreements. For a generation growing up in a technology ecosystem governed by such privacy policies and punctuated by regular instances of data abuse, it is no surprise that many young people may have developed a learned helplessness around their privacy.

Some privacy regulations require breach notices, but young people are less likely to remember experiencing a breach, much less notice it in the first place (perhaps owing to the average of 199 unread emails sitting across Millennials’ and Generation Z’s two or more inboxes). Even when individuals do pay attention to a breach notification, with the number of these attacks increasing every year, many may feel they have no choice but to keep using these technologies, regardless of their security practices (or lack thereof) and despite the data misuse and other harms that may occur when a threat actor obtains personal information.

Desensitized to such phenomena, young people could be forgiven for failing to recognize the signs of increasingly sophisticated and subtle cyberattacks in personal and professional contexts alike: from romance scams and identity theft to business email compromise and spear phishing. Members of Generation Z are now three times more likely than older generations to fall victim to an online scam. And even a single intern falling for a gift card scam, owing to limited cybersecurity awareness, can disrupt an entire organizational ecosystem.

Research shows a troubling trend: a majority of youth lack the skills or tools to protect their information, even when they wish to do so. Many young people seem to care about privacy, but largely only after incidents occur, and this should not be the case. Despite coming of age in the digital ecosystem, we still do not know the extent to which our data is collected and used, and the limited policy action around data breaches and mishandling has failed to adequately protect us from the severe consequences of identity theft and personal data falling into the wrong hands. It is, therefore, the privacy profession’s responsibility to limit the degree of desensitization that “helpless” individuals experience when, for instance, they receive overwhelming data breach notices and are left with an unclear sense of what rights they actually possess. These legal devices are better than nothing, but we can do more.

We have watched hearings with top technology executives and read about lawsuits challenging the designs of digital platforms, only to see little to no change for our generation’s future. Despite investigations into technology companies’ security practices, privacy violations and cyberattacks that impact millions of individuals through data breaches, leaks, and mishandling continue to occur. As professionals in the field, we are aware and concerned enough about the risks to take additional steps to protect our online identities. Other privacy-aware individuals in our generation have expressed “digital resignation” by creating groups such as the "Luddite Club," which consists of teenagers who have rejected smartphones and social media. Groups like this one highlight how Generation Z is starting to think critically about its relationship with technology. If the privacy protections afforded by today's regulations fail to resolve these digital problems, then today’s data economy may come to face a reckoning.

Below, we detail the stakes of the digital issues youth face every day. Privacy obligations need to be re-evaluated to reflect these concerns and to help instill trust and accountability between the next generation, technology developers and innovators, and the government.

The Real Issues

Surveying the landscape of issues in the digital ecosystem, young professionals in privacy and technology policy identify addiction and mental health, along with algorithmic justice, as the top technology issues.

Addiction & Mental Health

In what is increasingly called the “tobacco moment” of the youth mental health crisis, journalists and researchers have explored the various ways that social media and digital platforms can play a negative role in young people’s overall well-being: from comparisons ushered in by unrealistic beauty standards to a constant feedback loop of jarring images and content around climate change, war, and other traumatizing subjects.

In the attention economy, even when we recognize these patterns of declining mental health, the purposefully addictive designs of social networking and other digital platforms, from manipulative mechanics in video games to targeted advertising and personalized recommendations, continue to serve the pursuit of profit over the public good.

Even those who choose to disengage from the digital realm experience side effects from the “fear of missing out.” In an increasingly interconnected world, technology is necessary to connect with one another, see family and peers’ live whereabouts, participate in activism, and hear and engage with popular culture. Not knowing the latest meme or TikTok trend can translate into social exclusion for young people hoping to engage and connect with their peers in the physical world.

Meanwhile, states across the US have enacted or are drafting social media bans and parental approval policies. While the effects of digital and social media on young people are getting more attention, policymakers must consider the relationship between addiction, declining mental health, and social media platforms, as well as the privacy implications of the contemporary approaches proposed to solve these issues.

Algorithmic Justice and AI Governance

Youth-centered organizations, including Design It For Us and Encode Justice, recently sent a letter to Congress and the White House Office of Science and Technology Policy, calling to include more young people in discussions on AI governance.

Algorithmic justice is core to protecting everyone, including young people, and privacy has a key role to play in establishing processes for redress that respect individuals’ rights. “Bossware,” AI in classrooms, and the potential use of generative AI to create deepfakes and nonconsensual sexual images as a unique form of cyberbullying are a few examples that highlight the high stakes for policymakers in making informed choices about the most effective solutions to these emerging issues. As early adopters, youth are quick to embrace technologies that depend heavily on mining the information they share, despite having a limited understanding of how that mining can contribute to their own harm.

With AI development showing few signs of slowing down, many regulators may feel overwhelmed, especially those who have been working in this area for decades. It is critical that, after the era of hype, AI governance remains a priority so that technology does not get too far ahead of policy. We still need methods of informed consent and designs that make it easy for individuals to understand what information is being collected and how it is used to power this technology.

Is Privacy Compliance Helping Anyone?

The notice-and-consent model falls short in addressing contemporary issues for digital citizens. In the US, the patchwork of privacy and data breach notification laws creates divergent compliance requirements for businesses. With 12 states issuing their own privacy laws carrying different obligations, and each of the 50 states (as well as US territories) maintaining its own breach notification law, compliance professionals are struggling to keep up with the complex web of requirements their organizations must meet or else face the threat of fines and penalties. The broad applicability of many privacy laws to small businesses, nonprofits, and other organizations that lack the resources to comply means that privacy compliance can cripple innovation and entrepreneurship. This is not where we need privacy law to go.

Further, even when a company does have the resources to comply, what does compliance actually provide by way of consumer protection? Of course, having a baseline cybersecurity and privacy-by-design infrastructure within an organization goes a long way toward protecting consumers from contemporary threats to their digital civil rights. However, the current slate of privacy compliance obligations does little to address the most impactful issues facing online users; instead, it has become a checklist of items for companies to tick off rather than an integrated component of a company’s competitive advantage, strategy, and corporate social responsibility.

One salient element of this problem is the divide between anti-surveillance advocates and privacy compliance professionals; between trust and safety teams and national security workers; between founders, legislators, and regulators; and across all the permutations of these groups. The tech policy field is siloed despite its interdisciplinary nature. Now, more than ever, collaboration is required if we want to move beyond the deficiencies of checkbox-based models.

Conclusion

What is the way forward? We call for action and collaboration:

  • Ensure Privacy by Design: Collect only the data necessary for the product to function effectively and in the best interests of the user. Ask for explicit, informed, and meaningful consent when offering new products, services, features, or technologies, presenting the changes to the ongoing privacy regime and the newly requested permissions in a user-friendly, easy-to-comprehend manner (a brief code sketch of this idea follows this list).
  • Consider Trust and Safety Implications — Diversity and Inclusion Are Key: Trust and safety implications are not the same for all users. Build products with the full scope of potential users in mind. AI can enhance the user experience and is widely deployed in trust and safety, but algorithmic solutions should not discriminate between users.
  • Listen and Engage With Youth Organizers: The next generation has by and large lived with digital technologies increasingly integrated into their daily lives, and this will be even more true of future generations. Youth should be actively given a seat at the table in policy discussions and offered platforms to speak on these issues, such as through the Internet Law & Policy Foundry, product design focus groups, and regulation development.
  • Professionals Should Strive to Cross-Pollinate in Adjacent Fields: In these fast-paced times, it is challenging to stay abreast of all the developments in any one field. But as privacy intersects with antitrust, cybersecurity, trust and safety, artificial intelligence, and more, it is crucial to keep tabs on the main trends in those adjacent fields. These issues all exist within the same ecosystem, and by working together, we can achieve much more.
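To make the first recommendation concrete, the sketch below shows one way consent-gated data minimization might look in code. It is a minimal, hypothetical Python example: the names (ConsentRecord, UserProfile, the "age_based_features" purpose) are our own illustrations, not drawn from any specific law, standard, or product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical privacy-by-design data model: the product stores only what it
# needs, and every optional field is gated on an explicit, per-purpose consent.

@dataclass
class ConsentRecord:
    purpose: str          # e.g., "age_based_features" (illustrative purpose name)
    granted: bool
    recorded_at: datetime

@dataclass
class UserProfile:
    email: str                                   # required for the service to function
    consents: list[ConsentRecord] = field(default_factory=list)
    birth_year: int | None = None                # optional; stored only with consent

def record_consent(profile: UserProfile, purpose: str, granted: bool) -> None:
    """Record an explicit, timestamped consent decision for a single purpose."""
    profile.consents.append(ConsentRecord(purpose, granted, datetime.now(timezone.utc)))

def has_consent(profile: UserProfile, purpose: str) -> bool:
    """Check the most recent consent decision for a purpose (absence means no)."""
    decisions = [c.granted for c in profile.consents if c.purpose == purpose]
    return decisions[-1] if decisions else False

def set_birth_year(profile: UserProfile, year: int) -> None:
    """Store optional data only if the user consented to the purpose it serves."""
    if not has_consent(profile, "age_based_features"):
        raise PermissionError("no consent recorded for purpose 'age_based_features'")
    profile.birth_year = year

# Usage: optional data cannot be stored until consent is explicitly granted.
profile = UserProfile(email="user@example.com")
record_consent(profile, "age_based_features", granted=True)
set_birth_year(profile, 2004)
```

The shape matters more than the specifics: optional data has no default pathway into storage, and every consent decision leaves a timestamped record that a product team can surface back to users in plain language.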

- - -

Meri Baghdasaryan, Lama Mohammed, and Reema Moussa engage with Tech Policy Press voluntarily and in their personal capacities. The views and opinions expressed do not necessarily reflect those of the organizations with which they are affiliated.
