
Is There a Future for Social Media Parental Consent Laws? Comparing Florida’s HB 3 and Ohio’s Social Media Parental Notification Act

Tim Bernard / Apr 4, 2024

Florida Governor Ron DeSantis signs HB 3, March 25, 2024.

On March 25, Florida Governor Ron DeSantis signed into law HB 3. The law has three parts: the first restricts the ability of social media platforms to create or maintain accounts for those under 16 years of age; the second requires age verification for websites where more than one-third of the content is pornographic; and the third establishes standards for “anonymous age verification.”

Most media attention has been on the first part of the bill, and while that will also be the focus of this article, it is worth noting that the second part is very similar to bills considered and passed in other states over the last few years, including Louisiana and Utah. Its only notable new component is the requirement that the sites in question use “anonymous” third-party age verification, which is elaborated on in the third part of the bill. This appears to be an attempt to answer a privacy concern commonly raised about age verification measures of this type.

Florida’s new law is not the first to require parental consent for young people to use social media. Utah’s SB 152, passed last year, was another law in this vein, but it was slated for amendment before a court could rule on challenges by NetChoice and FIRE. Arkansas’s SB 396, also from 2023, was blocked by the courts last summer due to a range of problems, including confusion about which platforms were covered. Ohio’s Social Media Parental Notification Act was enjoined by a district court on February 12. As there are significant similarities between Ohio’s law and the first part of Florida’s, examining the court order’s conclusions about the merits of that case may shed light on the chances that the Florida law will ever go into effect.

What was the reasoning of the federal judge in granting the injunction in Ohio?

1. First Amendment applicability

The primary gambit of the Ohio legislators was to frame the law in terms of contracts rather than content:

The operator of an online web site, service, or product that targets children, or is reasonably anticipated to be accessed by children, shall do all of the following:

(1) Obtain verifiable consent for any contract with a child, including terms of service, to register, sign up, or otherwise create a unique username to access or utilize the online web site, service, or product, from the child's parent or legal guardian... (emphasis added)

Their theory was that a law governing access to content would have to be justified within the bounds of the First Amendment, whereas a law about commercial transactions is not subject to that scrutiny. The court squarely rejected this argument, declaring that “(1) it regulates operators’ ability to publish and distribute speech to minors and speech by minors; and (2) it regulates minors’ ability to both produce speech and receive speech,” while also observing that the First Amendment is pertinent to many commercial endeavors, such as those of most publishers.

2. Level of scrutiny: (a) is the law content-based?

The next question taken up in the preliminary injunction order is whether “strict scrutiny” or “intermediate scrutiny” should apply in this case, which determines how strong the government interest needs to be and how precisely the law must be calibrated to justify the suppression of speech. A law requires strict scrutiny if it appears to target particular content, that is, if the state is attempting to suppress the expression of certain kinds of speech.

Here, the court concluded that two factors regarding how the law defines its scope reveal that it is content-based. Firstly, the court determined that “[t]he features that the Act singles out [in defining a social media platform] are inextricable from the content produced by those features.” The social functionalities of these platforms influence the kind of content that is present, and so discriminating against these platforms is discriminating against their content. The second factor is that the law’s exemptions for traditional media outlets and sites with product reviews are clearly content-based, implying that the in-scope services are targeted for having different content.

3. Level of scrutiny: (b) minors’ rights

The order states that minors have rights to access “constitutionally protected, non-obscene content.” The fact that parents may exercise some control over their children, and that this law gives them the power to veto the restrictions, does not permit the government to proactively keep minors from this content. According to the court, this also points to strict scrutiny as the appropriate standard for adjudicating the law.

4. State interest and tailoring: (a) protecting minors

Under strict scrutiny, the state must show that it has a compelling interest justifying speech-restrictive measures. The court recognized that there is some evidence of harm to minors from social media. Privacy concerns are deemed “less clear[ly]” a compelling interest, and ones that in any case could more appropriately be remedied by a law that regulates terms of service themselves. Furthermore, these concerns could also apply to the news sites specifically excluded from the scope of the law.

Regarding damage to mental health, the court tentatively agreed that “it very well may be” a compelling interest to protect children from these harms. However, the order describes the law as a “breathtakingly blunt instrument” in meeting this concern, as all of the content on social media will be blocked from teens and children under 16 unless parents grant consent. At the same time, a one-off parental permission for each platform is deemed sufficient for granting access to even the purportedly harmful aspects of social media. The law is therefore both overbroad and underinclusive.

5. State interest and tailoring: (b) parents’ rights

Citing the same Supreme Court decision as it did in the discussion of minors’ rights, the court notes that even if the state has some interest in strengthening parental authority, that does not justify “punishing third parties for conveying protected speech to children just in case their parents disapprove,” especially when parents already have other tools at their disposal to control the access of their children to speech that they disapprove of.

6. Vagueness

Lastly, the Fourteenth Amendment is interpreted as precluding the enactment of vague laws because they do not give those subject to them enough information to comply and avoid penalties. Two sections of the Act are cited in the order as impermissibly vague. The first is the scoping of the Act to any service that “targets children, or is reasonably anticipated to be accessed by children,” despite the stated list of factors that may be taken into account. The second is the traditional media exemption, which applies only to organizations that are “established” and “widely recognized.”

How does Florida’s law compare in these aspects?

In speeches preceding the signing of Florida’s law, the Governor and State House Speaker stressed that they thought this bill would withstand First Amendment scrutiny by the courts because of its focus on the addictive features of social media platforms rather than on content. But how does this law compare to Ohio’s on the issues addressed in the court’s order?

1. First Amendment applicability

The Florida law is also framed as a law about contracts, which would seem no more likely to be excluded from First Amendment applicability than Ohio’s. HB 3 also includes the following stipulation:

(8) If a social media platform allows an account holder to use the social media platform, the parties have entered into a contract.

This may reinforce the argument that the language of contracts is, more or less, a ruse to disguise the fact that this bill regulates access to expression.

2. Level of scrutiny: (a) is the law content-based?

This is perhaps the most significant point of divergence from Ohio. Another court may ultimately come to a similar conclusion regarding the inextricability of content and functionality in the bill’s definition of social media platforms. But Florida legislators were deliberate in requiring that a platform have at least one “addictive” feature to be in scope, which lends credibility to their argument that the bill targets addictive design and is neutral as to the content that may be accessed. The definition is also well-constructed enough that the policymakers did not feel the need to add transparently content-based exemptions. A court applying the same standards as the Ohio court may therefore conclude that the Florida law is not content-based.

3. Level of scrutiny: (b) minors’ rights

Here, however, the Florida law likely fares no better, and perhaps worse. HB 3 outright prohibits social media platforms from allowing those under 14 to have accounts, and it contains a provision extending that prohibition to 14- and 15-year-olds if the parental consent portions of the bill are struck down. Parental consent may be a defense for Ohio in restricting minors’ access to protected expression (albeit one that the court did not buy); Florida has no such defense here, and so it appears that strict scrutiny would apply under the same logic.

4. State interest and tailoring: (a) protecting minors

Florida appears to be most focused on mental health concerns (rather than privacy), which may be more acceptable to the courts as a compelling interest, but its remedy could well be seen as just as overbroad and underinclusive. The state may argue that platforms could simply remove their addictive features to take themselves out of scope, but that points to a less speech-restrictive approach the state could have taken: compelling platforms to disable these features for young users while still allowing teenagers to keep using the platforms (though some would argue that this too would be precluded by the First Amendment).

5. State interest and tailoring: (b) parents’ rights

This interest is clearly important to Florida, as the absence of a parental consent provision was one of the main reasons that Governor DeSantis vetoed an earlier version of the bill. However, the interest does not appear to be significantly different from the one that the judge in Ohio dismissed, except that it cannot be presented as Florida’s primary concern, given the outright prohibition for those under 14 and the fallback position if the parental consent provisions are struck down.

6. Vagueness

The Florida law does not contain the same vague terms as those identified by the court in the Ohio act, but other language raises the same concern. In particular, a platform is in scope only if at least 10 percent of its daily active users under 16 spend, on average, two hours or more per day on the platform. Platforms, however, may not know the ages of their users at all, or not to any particular standard of certainty, so it is unclear how they are to apply this criterion. Similar concerns may be raised about:

  • How the identity and relationship of parents must be confirmed (Ohio’s law gave more detail here); and
  • How platforms are expected to conduct age assurance going forward in order to comply with the bill; in Ohio, the state attorney informed the court that no age verification requirement would be enforced. (It is curious that age verification is central to the remaining parts of the bill, but no reference to it is made in part one.)

***

From this analysis, it seems that the Florida bill may have insufficient advantages over the Ohio bill to be acceptable to a court employing the same reasoning, and it may even have a few disadvantages. It is hard, in fact, to imagine how any social media parental consent law would satisfy the Ohio district court’s standards regarding minors’ rights to access non-obscene content. Florida could find a court that applies the precedents differently, but the confidence expressed by the state’s legislative and executive leaders may well be unwarranted.
