Considering Age Verification and Impacts on LGBTQ+ Youth

Mariel Povolny / Jul 9, 2024

Everyone wants to protect kids. This extends to the internet, where even diametrically opposed politicians want to champion child safety measures. Unfortunately, laws that seek to make the internet safer for children often overlook their rights to privacy, free speech, and free expression. Efforts to 'protect' children online can also be veiled attempts to curtail LGBTQ+ rights, or may curtail them unintentionally by failing to account for queer youth's specific circumstances.

Age assurance measures, an increasingly popular way for platforms to manage children’s access to their sites, are poised to become a legal requirement for platforms hosting user-generated content worldwide. However, these measures can disproportionately affect queer and marginalized youth, depriving them of access to vital online resources, communities, and support networks.

‘Age Assurance’ Methods and Concerns

Age assurance technology is advancing rapidly. If you’ve ever clicked “Yes, I’m 18 or older” or entered your date of birth to access a website, you’ve interacted with an age gate. This widespread and relatively non-invasive form of age assurance is known as “declaration” or “age screening.” Some platforms have moved beyond it, asking users to upload government IDs. Known as 'age verification,' this method requires users to prove not only their age but also their identity by providing personally identifiable information.

Widespread privacy concerns around identity verification, together with data minimization obligations, have pushed many platforms even further toward AI-based tools that estimate users’ ages by collecting and analyzing biometric data (fingerprints, voiceprints, iris scans, facial scans, etc.). Other methods include parental consent for minors, vouching (in which someone other than a parent confirms a user’s age), analysis of online usage patterns, credit card verification, and third-party intermediary verification.
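To make the distinction concrete, a declaration-style age gate is little more than a self-reported date-of-birth check. The sketch below (in Python, with hypothetical names; not any platform's actual implementation) shows how minimal the method is, and why it offers so little assurance:

```python
from datetime import date

# Minimal sketch of a "declaration" age gate (hypothetical, illustrative only).
MINIMUM_AGE = 18  # threshold set by the platform or by law


def is_old_enough(birth_date: date, today: date | None = None) -> bool:
    """Trusts whatever birth date the user declares; no identity check."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= MINIMUM_AGE


# Nothing prevents a user from entering an earlier year, which is why
# regulators increasingly push platforms toward ID- or biometric-based methods.
print(is_old_enough(date(2006, 5, 1)))
```

The check trusts whatever the user types: that is exactly what makes declaration low-friction and privacy-preserving, and also what makes it easy to evade.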

Lawmakers are increasingly mandating that platforms implement these age assurance measures to moderate and filter content for minors. Legislative approaches range from restricting access for certain age groups to barring minors altogether. They aim to protect children from harmful content, but in practice they incentivize platforms to be overly cautious, often removing content and restricting access broadly to shield themselves from liability.

Many digital rights advocates oppose age assurance mandates across the board, citing privacy violations and calling them “surveillance systems” because they require the collection of sensitive data. From an equity perspective, document-based verification systems reinforce existing social disparities by excluding anyone who lacks official ID (including millions of adults, who are then denied access to legitimate, legal content). This deepens structural inequality and further isolates those with limited access to resources.

Privacy isn’t the only concern. Mandated age assurance threatens children’s right to express themselves freely and access information online, rights recognized by the United Nations and UNICEF. These concerns are amplified for LGBTQ+ youth, for whom the internet can be a lifeline for self-expression and a gateway to crucial resources.

This tension is evident in the debate over the Kids Online Safety Act (KOSA) in the United States. The original version of the legislation, which has enjoyed bipartisan support since its introduction in 2022, mandated a study to explore a "technologically feasible method" for verifying user age at the device or operating system level. KOSA, along with measures like the TikTok ban, reflects a broader legislative trend aimed at protecting children by restricting access to online spaces and content.

Related: Read more perspectives on KOSA

LGBTQ+ advocacy groups raised further concerns about KOSA's broad language, arguing that it would give attorneys general in conservative states license to indiscriminately mandate the removal of LGBTQ+ content under the guise of child safety. The legislation has since been amended to focus on the design features of social media platforms rather than the content itself. For instance, mentions of “grooming” have been removed, as the term is increasingly used in a homophobic context. Some queer advocacy groups withdrew their opposition to KOSA in light of these changes, but many remain opposed.

Dangers of Age Assurance to LGBTQ+ Youth

Age assurance poses a clear threat to LGBTQ+ youth. Similar measures on social media have already caused harm. For instance, Facebook's "real name policy" deprives users of anonymity, potentially forcing youth to abandon platforms altogether if they face harassment or bullying. For transgender and genderqueer youth, this can mean creating profiles under legal names they don't identify with, causing significant mental health burdens for an already vulnerable population.

Recent attempts to scale content moderation also illustrate the risk that broad policies pose to queer youth. In the context of social media specifically, content moderation and filtering are often touted as the ‘least restrictive’ methods for shielding kids from harmful material. Studies show, however, that automated content moderation systems remove or filter LGBTQ+ content at a disproportionately high rate. Evan Greer, Executive Director of Fight for the Future, notes that, in its original formulation, KOSA’s duty of care applied to content recommendation systems in a way that incentivized platforms to filter and remove content deemed risky. As a result, LGBTQ+ youth may find themselves blocked from content relevant to their reproductive health, mental well-being, or healthy sexual development.

Queer youth are more likely to face bullying and rejection and often turn to online spaces as alternative forums for self-expression. For those lacking family support, or even facing the threat of physical or emotional violence at home because of their sexuality or gender identity, the internet is a critical resource. A report from the Gay, Lesbian & Straight Education Network found that LGBTQ+ youth use the internet at higher rates than their peers and are more civically engaged online than offline.

It’s not an exaggeration to say that, for some queer youth, access to digital communities and resources may be the difference between life and death. Different methods of age assurance (parental consent, biometric verification) pose unique dangers to queer youth.

Child protection legislation often seeks to safeguard children by giving parents more insight into and control over their online activity. Parental consent is a common form of age assurance, wherein parents must vouch that their child is a certain age, consent to them having an account, or both. Florida Governor Ron DeSantis recently signed HB 3 into law, which bars social media accounts for users under 14, requires parental consent for 14- and 15-year-olds, and gives parents the power to delete their child’s account. Increased parental oversight is risky for LGBTQ+ youth who lack familial support or would be in danger of abuse if their families were aware of their sexual orientation.

Identity disclosure and external consent come with clear risks, but biometric verification can also be problematic. Research by the Algorithmic Justice League and other organizations reveals significant bias in facial recognition technology: AI-based systems struggle to accurately identify women and people with darker skin tones, reflecting a lack of diverse training data. If training data does not adequately represent queer youth, biometric estimation tools are more likely to miscategorize them or misjudge their age, barring them from content they are legally entitled to access.

Collecting sensitive biometric data also raises obvious privacy concerns for youth who lack the digital literacy to understand the risks and give meaningful consent. Misidentification, especially for genderqueer youth, can have significant psychological consequences: being misgendered or inaccurately identified can lead to feelings of alienation and low self-esteem. These risks are heightened for queer youth of color, who are even more likely to be affected by algorithmic shortcomings and bias.

LGBTQ+ youth are often in more precarious situations than their peers and are thus at higher risk of harms such as exploitation. This remains true online: queer-identifying youth are more likely to experience online sexual exploitation, in part because more of their early sexual encounters take place over the internet. This is by no means a call to restrict LGBTQ+ youth’s access to online spaces. Instead, it underscores the importance of inclusive approaches to online safety that consider marginalized groups' experiences, emphasizing their agency alongside their security. Both are crucial for their long-term health and well-being.

Despite these potential threats to LGBTQ+ youth’s rights, meaningful work is underway to address privacy concerns about age assurance technologies. In France, a ‘digital intermediary’ that acts as a third-party verifier is under government review. The intermediary serves as a firewall between the content provider and the verifier: the site receives only confirmation of the user’s age, and the verifier does not learn which site the user ultimately visits. This approach addresses profiling, a major concern for the LGBTQ+ community in political and cultural environments where minority sexual identities face persecution.
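To illustrate the underlying principle (not the actual French system), the sketch below models a double-blind flow in Python. The AgeVerifier, Intermediary, and ContentSite classes are hypothetical stand-ins for the real parties; the only thing that reaches the site is an anonymous, single-use token.

```python
import secrets

# Hypothetical sketch of a "double-blind" age check; not the actual French design.
# The verifier confirms age but never learns which site the user visits;
# the site sees only an anonymous, single-use token, never the user's identity.


class AgeVerifier:
    """Checks a user's age (e.g., against an ID) and issues anonymous tokens."""

    def __init__(self) -> None:
        self._valid_tokens: set[str] = set()

    def issue_token(self, user_is_adult: bool) -> str | None:
        if not user_is_adult:
            return None
        token = secrets.token_urlsafe(32)  # carries no identity and no destination
        self._valid_tokens.add(token)
        return token

    def redeem(self, token: str) -> bool:
        """Single-use check, requested by the intermediary rather than the site."""
        if token in self._valid_tokens:
            self._valid_tokens.discard(token)
            return True
        return False


class Intermediary:
    """Firewall between the site and the verifier: neither sees the other's data."""

    def __init__(self, verifier: AgeVerifier) -> None:
        self._verifier = verifier

    def confirm(self, token: str) -> bool:
        return self._verifier.redeem(token)


class ContentSite:
    """Grants access on the strength of the token alone: no name, ID, or birth date."""

    def __init__(self, intermediary: Intermediary) -> None:
        self._intermediary = intermediary

    def grant_access(self, token: str | None) -> bool:
        return token is not None and self._intermediary.confirm(token)


verifier = AgeVerifier()
site = ContentSite(Intermediary(verifier))
token = verifier.issue_token(user_is_adult=True)
print(site.grant_access(token))  # True, without the site learning who the user is
```

The key property is separation of knowledge: the verifier sees proof of age but not the destination, while the site sees a token but no identity, which is what makes profiling harder.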

Reducing Harm: Fostering Digital Literacy and Empowerment

Protecting kids online will not come from age gates, parental consent, or facial recognition alone. Even if age assurance measures were ubiquitous, tech-savvy teens would likely find ways to bypass them. Instead, we should adopt a harm-reduction approach: recognizing real dangers online while prioritizing digital literacy and providing minors with the resources to navigate the online world safely.

There is little consensus about how to define ‘age-appropriate’ material, leading to age gates in situations where they aren’t necessary. To avoid the discriminatory outcomes that have plagued content moderation, regulators and platforms should seek input from LGBTQ+ advocates in civil society to create inclusive guidelines. This is especially important for "sexual content," a contentious area of moderation, particularly where queer content is concerned.

Sex education, which already faces increasing opposition, should expand to cover online themes. LGBTQ+ teens, especially genderqueer or cisgender homosexual males, are more likely to explore sexuality online through sexting or sharing nude photos. They must understand not only the dynamics of consent and safety online but also the legal implications of online sexual exploration (e.g., the distribution of self-generated child sexual abuse material).

Age assurance is problematic on two levels. First, aside from simple declaration, current methods do not adequately address privacy or data minimization. Second, blocking youth from platforms entirely denies them agency in online spaces. Both concerns are amplified for marginalized youth.

Exploration and social interaction are woven into the fabric of the online experience. This is especially true for LGBTQ+ youth, for whom online communities are a vital lifeline for social connection, exploration of identity, and access to resources that may be limited elsewhere. For queer-identifying youth, the internet is a crucial space to explore sexuality in a safe and judgment-free environment. Sweeping measures like age assurance don’t protect LGBTQ+ youth and may even push them further into the margins.

Authors

Mariel Povolny
Mariel Povolny is an MA candidate at Columbia's School of International and Public Affairs where she is a researcher with the Trust & Safety Tooling Consortium, housed at the Columbia Institute of Global Politics. Her research interests are in digital rights and equitable internet governance. Before...
