As Privacy Policy Heats Up, Lawmakers Should Heed Gen Z’s Preferences
Jen Sidorova / Aug 13, 2025

We are in the midst of an ambitious legislative moment for data privacy regulation. But as lawmakers debate legal frameworks that will shape the future of online interactions, one question remains underexplored: what should privacy regulators learn from the habits and preferences of younger generations?
Europe has long enforced privacy through its notorious General Data Protection Regulation (GDPR), which centralizes data protection standards, mandates explicit user consent, and empowers regulators to levy fines of up to 4% of global revenue for noncompliance. Meanwhile, 19 US states have enacted comprehensive privacy laws, and many others are actively considering them. While Congress has not yet passed federal legislation, recent contenders — the American Data Privacy Protection Act (ADPPA) of 2022 and the American Privacy Rights Act (APRA) of 2024 — have their foundations in comprehensive laws that are heavy on compliance and have been found to impact user experience and stall innovation.
Practically, these laws are premised on an outdated understanding of how much and what kind of privacy individuals value — and how much they are willing and able to control. Many current privacy proposals, whether it's Europe’s GDPR or US bills like APRA, rest on the assumption that people want maximal protection from all forms of data collection. These frameworks are primarily built around principles like state preemption, opt-in requirements for sensitive data, private rights of action, and data minimization. But examining the online behavior of younger generations offers valuable lessons about where these assumptions diverge from reality.
Privacy, but on their terms
Gen Z and Millennials are more likely to be comfortable with personalized advertising (22%) than Gen X and Baby Boomers+ (15%), according to online research firm YouGov. Online ads following them around or apps logging their clicks are practically baked into the experience of being online. According to Pew Research, while 56% of those 50 and older take issue with their data being used for personalization, for those under 50, the number drops to 41%. In the same Pew series, 72% of adults under 30 say they immediately click “agree” on privacy policies without reading them, compared with 39% of those 65+ — a good behavioral signal of being less alarmed by background data flows.
While this is hardly a majority, it points to a generational pattern: many young people see practices such as targeted ads and location tracking as acceptable trade-offs for modern convenience (or simply unavoidable). Gen Z is also more comfortable with certain forms of surveillance in personal relationships. For example, they are more open to sharing location data with friends or significant others, whereas older generations might deem that less appropriate.
But crucially, ‘not minding’ institutional tracking does not mean Gen Z has no privacy boundaries. On the contrary, this generation places enormous value on consent and control in their social sphere. Gen Z is more likely to seek permission before posting about others, across all relationship types — from close friends to acquaintances. This is a generation that views privacy not as secrecy, but as narrative control — deciding what to share, with whom, and when.
The personalization paradox
Most modern privacy frameworks treat personalization and tracking as risks to be minimized through consent mandates and strict opt-in requirements for sensitive data. But this framing is increasingly out of step with how younger users, particularly Gen Z, approach their digital lives.
Perhaps the clearest illustration of Gen Z’s pragmatic privacy stance is their love-hate relationship with social media. They are digital natives and are constantly plugged into platforms where they share enormous amounts of personal data. This is what is often referred to as the “data privacy paradox”: Gen Z willingly surrenders data to social apps in exchange for the customized experiences they prefer. An Oliver Wyman Forum survey found that about 88% of Gen Z respondents said they were willing to share some personal data with a social media company if it improved their experience, a far higher share than the 67% of older adults who agreed. This suggests that Gen Z has largely accepted personalization as the price of admission.
They’ve grown up with algorithmic feeds tuned to their tastes, and they know those algorithms run on data. When asked hypothetically, young people are much more likely than prior generations to say they’d trade personal data for a better website or free content, according to a study commissioned by the hosting company WP Engine. In fact, by one measure, Gen Z rated their willingness to share data for a better online experience about 15% higher than non-Gen Z did. As the researchers summarized, Gen Z finds personalization to be “more of a non-negotiable need — even if it puts their privacy and data at risk.” This helps explain why many Gen Z-ers have rallied against the TikTok ban in the US — essentially choosing a beloved, tailored social experience over abstract data protection concerns. The value they get from curated content, viral trends, and algorithmic discovery often outweighs the privacy they give up in return.
Empowerment over restriction
The same WP Engine study found that Gen Z-ers clear their cookies, browse anonymously, and use encrypted communications “twice as often as other generations.” In practice, that means habits like opening incognito windows, using virtual private networks (VPNs) and encrypted messaging apps, and regularly purging trackers are second nature to many young people. They grew up with a smartphone in hand and understand the levers of digital privacy intuitively — adjusting app permissions, disabling location tags, and finding workaround tools to stay private. This everyday privacy hygiene reflects a generational belief that protecting personal data is each user’s responsibility.
So, what does all this mean for how we protect user privacy? The next generation doesn’t need a flood of consent pop-ups, like European users saw once GDPR took effect — they need privacy defaults that respect their intelligence. Gen Z’s behavior sends a clear message: empower us with technical solutions, not restrictive regulations.
Tools like browser-based universal opt-out mechanisms, tiered consent systems, and self-sovereign identity (SSI) frameworks offer a smarter, more user-centric approach to privacy.
Ironically, one of the cleanest fixes — Global Privacy Control (GPC), a browser‑based universal opt‑out — caught on because California required sites to honor it, even though the same standard could have emerged from industry coordination. GPC lets users set a single signal in their browser that websites must respect, reducing the need for pop‑ups or per‑site toggles.
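To make the mechanism concrete: under the GPC proposal, a participating browser attaches a `Sec-GPC: 1` HTTP header to requests (and exposes `navigator.globalPrivacyControl` to page scripts), which a site can then honor server-side. A rough sketch of what honoring that signal might look like — the handler names and opt-back-in policy here are hypothetical, not part of the GPC specification:

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out signal."""
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    # GPC defines "1" as the do-not-sell/do-not-share signal.
    return normalized.get("sec-gpc") == "1"

def tracking_allowed(headers: dict, user_opted_in: bool = False) -> bool:
    """Hypothetical site policy: honor GPC unless the user explicitly opted back in."""
    if gpc_opt_out(headers) and not user_opted_in:
        return False
    return True
```

Because the signal is set once in the browser and applies everywhere, a check like this replaces the per-site banners and toggles users currently click through.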
Tiered consent matches the sensitivity of data to the level of required user involvement: low-risk data might require only a one-time agreement, while highly sensitive information — like health or biometric data — would trigger more detailed, explicit consent. Self‑sovereign identity systems — such as wallets built on W3C Decentralized Identifiers (DIDs) and Verifiable Credentials, like Microsoft Entra Verified ID — take this further by giving individuals portable digital identities that include their privacy preferences. Once set, these preferences would automatically apply across platforms and services, saving users time while reinforcing their autonomy.
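The tiered-consent idea can be sketched as a simple mapping from data sensitivity to the consent interaction required. The tiers and the wording of each requirement below are illustrative assumptions, not drawn from any existing law or standard:

```python
from enum import Enum

class Sensitivity(Enum):
    LOW = 1       # e.g., device type, coarse analytics
    MODERATE = 2  # e.g., precise location, browsing history
    HIGH = 3      # e.g., health or biometric data

# Hypothetical policy: each sensitivity tier maps to a consent requirement.
CONSENT_REQUIRED = {
    Sensitivity.LOW: "one-time blanket agreement",
    Sensitivity.MODERATE: "per-purpose opt-in, revocable in settings",
    Sensitivity.HIGH: "explicit per-use opt-in with plain-language disclosure",
}

def consent_requirement(category: Sensitivity) -> str:
    """Look up the consent interaction a given data category would trigger."""
    return CONSENT_REQUIRED[category]
```

The point of the structure is that user friction scales with risk: routine data flows are handled once, while only genuinely sensitive requests interrupt the user.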
Together, these tools reduce friction, improve compliance, and better reflect how younger users want to manage their digital lives, all without overburdening small firms or chilling innovation.
But to build these tools at scale, the market needs room to experiment. Blanket regulatory mandates — especially rigid opt-in systems or inflexible consent formats — risk stifling innovation. Compliance with fragmented and overly prescriptive frameworks disproportionately burdens startups and small firms, which must divert resources from product design to legal conformity.
As Congress yet again debates how to legislate the future of data, it would do well to look at what younger users are already doing: customizing their own privacy boundaries and demanding tools — not rules — to support them. In other words, the lesson from Gen Z is not to tailor laws just for them, but to recognize how their habits illuminate a broader truth about modern privacy: people want agency, adaptability, and meaningful control over their data.