How the Passage of KOSPA May Impact Tech Roadmaps at Small and Medium-sized Companies

Vaishnavi J / Aug 20, 2024

Senate Majority Leader Chuck Schumer (D-NY) at a press briefing following the passage of the Kids Online Safety & Privacy Act on July 30. Sens. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), to his left, were sponsors of the legislation.

On July 30, the Kids Online Safety & Privacy Act (KOSPA) passed in the United States Senate with resounding bipartisan support by a vote of 91-3. It will now need to make its way through the House, where its prospects are far from certain. I’ve written previously in Tech Policy Press about why child safety regulation has become so popular in the US and analyzed the arguments for and against KOSPA’s predecessor, the Kids Online Safety Act (KOSA). If the bill passes the House, becomes law, and withstands the inevitable legal challenges, companies providing online services may need to significantly adjust their product and policy roadmaps over the next 24 months.

Much of what compliance will entail is uncertain, but as the House debates the bill, the issue is an important one for small and medium-sized companies. Larger companies with extensive resources will most likely be able to absorb KOSPA’s obligations no matter what the final law requires, but the same is not true for more resource-constrained companies. In an effort to illuminate those potential challenges, I explore what is included in KOSPA and what it may mean for small and medium-sized companies.

Safety and privacy are now part of the same discussion

KOSPA combines the Kids Online Safety Act (KOSA) and the Children & Teens Online Privacy Protection Act 2.0 (COPPA 2.0) as a single piece of legislation. Under KOSPA, companies would have the following broad obligations:

  • Duty of care: Companies should exercise reasonable care when creating or implementing a design feature to prevent and mitigate harms to minors, including mental health disorders, addiction-like behaviors, physical violence, sexual exploitation, promotion of substance abuse, or predatory marketing practices. KOSPA is careful to avoid any content-based requirements; the Federal Trade Commission (FTC) is tasked with enforcing this standard and developing guidelines for compliance.
  • Default and design safeguards: Companies should turn on the highest privacy and safety settings by default when they know a user or visitor is a minor and allow them to limit or opt out of features like personalized recommendations. Companies should also limit design features that may increase the amount of time minors spend on a platform; examples include autoplay, infinite scrolling, rewards for time spent on the platform, push notifications, personalized recommendations, in-game purchases, and appearance-altering filters (a minimal sketch follows this list).
  • Personal information and targeting: Companies are prohibited from collecting personal information from 13- to 16-year-olds without their consent; COPPA 1.0 only covers children under the age of 13. Companies also cannot target them with personalized advertising, but contextual and first-party advertising is still permitted. Companies should also allow teenagers to delete information that has been collected about them and “refuse to permit the operator’s further use” of personal information gathered from the teen.
  • Age assurance: Companies should consider whether “a reasonable and prudent person under the circumstances would have known that the user is a minor.” This expands on COPPA 1.0’s “actual knowledge” standard. While KOSPA clearly states that the act should not be interpreted as requiring a platform to “implement an age gating or age verification functionality,” it tasks the FTC with providing guidance within 18 months of enactment on best practices for determining when a user is “reasonably likely” to be a minor.
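To make the default-safeguards obligation concrete, here is a minimal sketch in Python of how it might be wired into account creation. Every name here – the `AccountSettings` fields, the `likely_minor` flag – is a hypothetical illustration; KOSPA does not prescribe any particular implementation.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    personalized_recommendations: bool
    push_notifications: bool
    autoplay: bool
    profile_discoverable: bool

# The most protective configuration: every engagement-extending or
# discovery feature starts off.
SAFEST_DEFAULTS = AccountSettings(
    personalized_recommendations=False,
    push_notifications=False,
    autoplay=False,
    profile_discoverable=False,
)

def initial_settings(likely_minor: bool) -> AccountSettings:
    """Return the settings a new account starts with.

    Users who are known or reasonably likely to be minors get the
    safest defaults and may later opt back in to individual features
    where the law allows.
    """
    if likely_minor:
        return SAFEST_DEFAULTS
    return AccountSettings(
        personalized_recommendations=True,
        push_notifications=True,
        autoplay=True,
        profile_discoverable=True,
    )
```

The design point is that protective settings are the starting state rather than an opt-in buried in a menu.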

If KOSPA becomes law, larger companies that have already been preparing for European regulation may find it easier to adapt to its requirements. However, smaller and medium-sized companies will need to significantly modify their product development processes to comply. The following sections examine some of the potential changes and approaches that small and medium-sized companies may need to consider if the legislation is enacted.

First, companies may need to reallocate resources

In the tech layoffs of the last few years, small and medium companies alike have significantly pared back teams focused on trust & safety, user experience, design, data science, and analytics. KOSPA’s passage may require bringing back some of those employees to prioritize safety and privacy features for younger users or engaging with a growing marketplace of external vendors to supplement this loss in capacity.

Companies serving children and teens may also need to invest in building or buying significantly improved data management systems to comply with new data protection and potential age assurance requirements. Small and medium firms could see a substantial portion of their legal budget redirected to compliance. At the same time, teams like user experience and design, data science & analytics, and trust & safety will also become crucial to the compliance process. These teams will need to work alongside legal teams to understand user behavior, including access to harmful content, and will play a pivotal role in assessing the potential risks of new products for children and teens.

Most small- and medium-sized firms with youth users have not traditionally had dedicated teams to understand the youth experience. KOSPA calls for research disaggregated by the age bands 0-5, 6-9, 10-12, and 13-16, suggesting that the FTC or state attorneys general may evaluate companies based on the age-appropriate protections they have designed for each band. Services like e-commerce sites, marketplaces, and game developers will therefore need to actively consider how they design their experiences for the different age bands of users who may be reasonably likely to access their service.
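The bucketing itself is trivial; the hard work is deciding what changes per band. As a hypothetical illustration in Python:

```python
# The age bands KOSPA references, so protections, research, and
# metrics can be disaggregated per band. Helper names are hypothetical.
AGE_BANDS = [(0, 5), (6, 9), (10, 12), (13, 16)]

def age_band(age: int) -> tuple[int, int] | None:
    """Return the KOSPA age band for a given age, or None for 17+."""
    for low, high in AGE_BANDS:
        if low <= age <= high:
            return (low, high)
    return None  # adults fall outside the minor bands
```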

Finally, product and engineering teams at small and medium-sized firms should expect to see their resources reallocated to either implement new features or modify existing ones to comply with these regulations. Larger players with a strong European footprint are already building out these roadmaps to account for regulations like the Digital Services Act (DSA) in the EU and the Online Safety Act in the UK. However, smaller companies or those mainly targeting US audiences may now see significant reallocations as similar regulations now come home. Companies with the budget might expand their existing teams; others will look to deprioritize existing projects to meet these new regulatory expectations.

Preventing and mitigating harms to minors (‘Duty of care’)

KOSPA states that companies must “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors.” This requirement, known as a “duty of care” requirement, is one of the bill’s most significant developments and likely to be one of the most disputed and litigated. If this requirement survives legal challenges and is ultimately regarded by courts as content-agnostic, companies will need to review their product development processes and risk mitigation strategies to account for this duty of care.

KOSPA outlines serious harms such as anxiety, depression, eating disorders, substance use disorders, suicidal behaviors, and harassment – topics on which few small companies have dedicated product or policy expertise. To address their duty of care, they may need to more proactively seek out external academic and civil society experts, children, and parents and incorporate their insights into product and policy development. External experts in these areas may not necessarily have expertise in internal product or policy development, so companies may need to dedicate resources to understanding how these external recommendations can be integrated into the product and policy development process. Companies may also need to increase their investment in explainability, providing more granular insight into how they are building their products.

Considering the range of services covered by KOSPA, companies will also need to fit the broader, principled guidance from external experts into the specific ways in which their products operate. The list below is not meant to be exhaustive but provides a sample of the range of services and platforms that may need to reassess their product design:

  • Online multiplayer games: Companies may need to understand how broader expert guidance around avoiding anxiety or depression can be factored into designing their players’ gameplay or the social elements of these games, such as voice or text chat features. This might involve increased investment in proactive moderation instead of relying on community moderation or reactive user reports to protect children from abuse.
  • Online marketplaces: Marketplaces where children and teens buy and sell goods may need to assess whether the way in which they recommend, review, or sell their products risks exacerbating any of the harms to children that KOSPA outlines.
  • Direct messaging services: These services may need to review their anti-bullying and harassment measures to ensure that they are truly effective. When paired with recommendation or discovery features that influence where people can find one another, they will also need to take more “top-of-funnel” measures – more proactive screening for potentially concerning interactions. This may look like preventing all unknown adults from connecting with minors or not recommending posts from children or teens to adults (a minimal sketch follows this list).
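Here is one way such a top-of-funnel gate might look in Python. The `Account` type, `is_minor` flag, and contact set are hypothetical stand-ins for whatever account model and relationship data a service already maintains; this is a sketch, not a prescribed design.

```python
from dataclasses import dataclass

@dataclass
class Account:
    user_id: str
    is_minor: bool

def may_initiate_contact(sender: Account, recipient: Account,
                         recipients_known_contacts: set[str]) -> bool:
    """Top-of-funnel gate: unknown adults cannot message minors.

    An adult may contact a minor only if the minor has already
    accepted them as a contact; all other pairings are unaffected.
    """
    if recipient.is_minor and not sender.is_minor:
        return sender.user_id in recipients_known_contacts
    return True
```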

Companies that have been preparing for European regulations may already have some of these mechanisms in place – the DSA, for example, requires very large online platforms and search engines to share an assessment of their platforms’ systemic risks, the mitigations they’ve taken to address those risks, and the crisis plans they have in place. Compliance with the DSA has been a significant part of larger companies’ roadmaps for a while, and smaller companies are likely to feel the effect of similar requirements in the US if KOSPA is enacted.

Still, this is likely to be a drawn-out process for most companies, particularly small and midsize ones that do not currently set aside headcount or budget to assess whether their services are aggravating these kinds of harms. Even companies that do engage with external experts report struggling to determine which aspects of these harms are broader societal issues and which are being amplified by their products. They will still need to better understand how general youth behaviors translate into observable on-service actions that they can then account for in product design and in the user experience for children and teens. Product, engineering, user research, and policy roadmaps will need to expand significantly to fulfill this duty of care requirement.

Design requirements and default settings

These two aspects of KOSPA are particularly interesting and will have the most significant knock-on consequences for roadmaps across companies. Product and design teams will need to evaluate whether a particular feature merely makes the experience more engaging or tips over into being addictive. KOSPA names features such as autoplay, infinite scrolling, rewards for time spent on the platform, push notifications, personalized recommendations, in-game purchases, and appearance-altering filters. Though these provisions are likely to be challenged both in courts and through ongoing research, companies can focus on centering user expectations in their product development. Additional research may help to clarify whether particular features exacerbate addictive behaviors.

More significantly, companies will need to understand how to keep the value of some of these features – e.g., serving users content that is relevant to their interests – without compromising content quality. This will come down to aligning on a set of safety- and privacy-centric standards for content quality. If companies cannot factor in a child or teen’s potential interest in a certain piece of content, they will need to develop innovative new models to decide what to surface for them, ranging from what is generally popular with their age band to other contextual signals that are not personalized to the individual.
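One plausible shape for such a model, sketched in Python below: rank candidates by quality-screened popularity within the viewer’s age band plus a contextual boost for the topic currently on screen, with no individual behavioral history involved. The `quality_score` and `band_popularity` functions, the `topic` field, and the 0.8 threshold are all hypothetical.

```python
def rank_for_minor(candidates, viewer_band, context_topic,
                   quality_score, band_popularity):
    """Rank content for a minor without personal profiling.

    Signals: a quality screen, popularity within the viewer's age
    band, and a boost for matching the topic currently being viewed.
    """
    eligible = [c for c in candidates if quality_score(c) >= 0.8]
    return sorted(
        eligible,
        key=lambda c: band_popularity(c, viewer_band)
                      + (1.0 if c.topic == context_topic else 0.0),
        reverse=True,
    )
```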

The proposed changes to default settings will also significantly impact the social graph experience within services that prioritize social connection and discoverability. Social media might be the first example that comes to mind when we think about building personal connections, but this will impact a range of services, from gaming to AI startups with a social component.

I recently published an abridged version of a framework built around age-appropriate AI that includes several default settings for AI interactions with children. Similarly, in gaming, players in many games decide who they want to play with by looking at information like the games other players play, their lists of friends, or the badges they’ve acquired. The highest levels of privacy defaults would require growth and connection teams to find other ways to help players with similar interests find each other, potentially doubling down on investment in automated matchmaking.
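A minimal sketch of what that matchmaking might look like in Python, assuming each queued player carries self-declared interest and skill-tier tags (both hypothetical fields) rather than exposing friend lists or play histories:

```python
import random

def build_lobby(queue: list[dict], size: int = 4) -> list[dict]:
    """Group queued players who share an interest tag and skill tier,
    without surfacing anyone's friend list or play history."""
    pools: dict[tuple, list[dict]] = {}
    for player in queue:
        key = (player["interest"], player["skill_tier"])
        pools.setdefault(key, []).append(player)
    for pool in pools.values():
        if len(pool) >= size:
            return random.sample(pool, size)
    return []  # not enough compatible players yet; keep queueing
```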

Personal information, targeting, and data erasure

KOSPA’s requirements around the collection, use, and erasure of children and teens’ data will require small and medium-sized companies to build or buy advanced consent management systems and give children and teens a way to delete their personal data. These features must be integrated into existing surfaces to ensure full compliance, which may require reallocating resources from other projects on the roadmap at the time of enactment. As with the earlier requirements, companies may find resources being redirected to meet these new standards, requiring specialized software engineering and legal expertise.
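The engineering challenge is less the delete call itself than knowing every place a teen’s data lives. The Python sketch below assumes a hypothetical index of data stores and an audit log; a real system would also have to purge backups, derived datasets, and vendor copies.

```python
# Hypothetical index of everywhere a user's personal data is kept.
DATA_STORES = ["profiles", "messages", "ad_interactions", "analytics_events"]

def erase_personal_data(user_id: str, stores: dict, audit_log) -> None:
    """Delete a teen's personal data from every store and log the action.

    `stores` maps store names to hypothetical clients exposing
    delete_all(); `audit_log` records the erasure for accountability.
    """
    for name in DATA_STORES:
        stores[name].delete_all(user_id=user_id)
    audit_log.record(action="erasure", user_id=user_id)
```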

These regulatory changes will also incentivize small and mid-sized companies to rethink their broader product marketing strategies. Bans on personalized ads for younger audiences may push services to pivot to contextual and first-party advertising methods, developing new data analytics tools and campaign designs that do not depend on personalized targeting. Companies will need to invest in these new technologies and strategies over the long term, potentially impacting their short-term revenue streams and forcing a reevaluation of marketing budgets and priorities.

In practice, this might look like integrating ads into gaming experiences that align with the game's content (e.g., in-game billboards or sponsored skins). Social media platforms may explore contextual targeting algorithms focusing on user-agnostic experiences in different parts of their services rather than personal data. These strategies will require substantial investment in technical, marketing, and creative teams.
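At its simplest, contextual selection means the ad decision is a function of the content on screen, never of who is viewing it. A hypothetical Python sketch, where each ad in inventory declares the topics it fits:

```python
def pick_contextual_ad(page_topics: set[str], inventory: list[dict]):
    """Choose the ad whose declared topics best overlap the current
    content; the viewer's identity and history are never consulted."""
    scored = [(len(page_topics & set(ad["topics"])), ad) for ad in inventory]
    scored = [(score, ad) for score, ad in scored if score > 0]
    if not scored:
        return None  # no topically relevant ad; show nothing targeted
    return max(scored, key=lambda pair: pair[0])[1]
```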

Protecting against unintended harms

Any new piece of legislation has the potential to be weaponized against marginalized communities, and KOSPA is no exception. After multiple rounds of revision, however, KOSPA makes it clear that companies must act to prevent and mitigate harm in a way that is consistent with medical information. This reduces the likelihood of the bill being used against marginalized communities, but companies will continue to have a unique responsibility to protect access to critical information and connections.

By mandating more rigorous standards for age-appropriate design features, some companies may be tempted to more broadly age-gate access to supportive resources around topics like medical care or LGBTQ support, which is vital and potentially life-saving information. Companies will need to continue to build in ways that are privacy-preserving, restrict disproportionate surveillance of teens, and continue to provide them with access to valuable online communities.

Understanding the age of minors on your service

The undercurrent running through any legislation around age-appropriate design is whether a service can reasonably know that it is being accessed by minors. KOSPA effectively covers most online services, including online platforms, video games, messaging apps, and streaming services. Yet most services across a range of industries have so far relied on simply stating that users under a certain age are not allowed to use their services, taking down underage accounts when they are reported, or, at most, asking users to self-declare their age. This minimal approach has been the norm, with little proactive enforcement or verification to date.

If KOSPA becomes law, companies will need to closely monitor guidance from the FTC to assess whether these current measures are likely to remain adequate. KOSPA explicitly says that nothing in the act should be read as requiring an age verification solution, but the FTC’s guidance will likely play a significant role in that risk assessment. Age assurance requirements are likely to have the most significant impact on product roadmaps, particularly for smaller companies, but for now, this is something to monitor.

Conclusion

If KOSPA becomes law, it will mark a seismic shift in the US towards scrutinizing youth safety and privacy across online services. Product and engineering teams at smaller companies may need to reprioritize roadmaps, ensuring that their design features and interactions are not just engaging but also appropriate for younger audiences. This is unlikely to be a box-ticking exercise and will require firms to rethink how they design digital experiences for children and teens.

Additionally, if the FTC takes a page from overseas regulators like the UK’s Ofcom, it could levy significant fines on companies that do not demonstrate they are adequately protecting children’s data or designing for their safety. Ofcom recently fined TikTok nearly $2.6 million, stating that its data governance processes had “a number of failings” and that the company had “insufficient checks in place leading to an inaccurate data submission to us.”

The FTC has shown that it is similarly willing to litigate. The Department of Justice recently filed a lawsuit against TikTok on the FTC’s behalf, alleging that it continued to host millions of underage users and collect their personal data even after knowing about their presence on the platform. The FTC also recently banned the anonymous messaging app NGL from hosting minors on its service for unfairly marketing its products to kids and teens and required it to return $4.5 million to customers whom it charged for a “pro” paid version of the app.

KOSPA’s potential impact on product roadmaps will not be a surprise to larger, more established companies with significant resources. For smaller companies and startups, however, compliance with KOSPA may require some significant adjustments to product prioritization and reinvestment in teams that were previously reduced, such as legal, trust & safety, and even marketing.

All product teams are invested in maintaining vibrant, engaging services. Now, they may need to prioritize safeguarding users as part of their design processes. Services that typically did not account for distinctive youth experiences will now need to do so, building responsibly in ways that still promote growth and innovation. As companies reevaluate their roadmaps against the possibility that KOSPA may ultimately become law, this is a prime opportunity for innovative approaches to protecting young people online.

Authors

Vaishnavi J
Vaishnavi J is the founder & principal of Vyanams Strategies, advising companies and civil society on how to build safer, more age-appropriate experiences for young people. She is the former head of youth policy at Meta.