Good intentions often motivate problematic policies. In technology policy, lawmakers spend much time coming up with policy solutions to tackle real and perceived harms from innovation.

The worthy aim of ensuring online safety for kids and teens has sparked state and federal legislation. Bills have been introduced, and some passed, to limit minors' access to sexual or otherwise harmful content through measures such as age verification, automatic parental controls, and limits on how long a minor can be online. The question is whether these bills will be effective and what the unintended consequences could be.

As my colleague Patricia Patnod observed recently:

“These social media laws likely won’t have a severe effect on how children use the internet. Although they wouldn’t be able to make a profile, they can still view content anonymously and directly text one another, maintaining that instant communication. The interconnectivity that the internet and phones gave us won’t be eliminated for children under any current proposal or law unless a parent physically takes away a cell phone or installs a blocking tool.”

One federal bill under consideration is the Kids Online Safety Act (KOSA) (S. 3663). We have concerns about its possible unintended consequences for speech.

According to a fact sheet, the bill strengthens controls for kids and parents. It requires that platforms provide minors with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations. The bill also gives parents and children a dedicated channel to report harm to kids on the platform.

KOSA also requires that platforms take greater accountability for the content on their sites by creating “a duty for social media platforms to prevent and mitigate harms to minors, such as content promoting self-harm, suicide, eating disorders, substance abuse, and sexual exploitation.” KOSA imposes a new annual auditing mandate for independent assessment of the platforms’ risks to minors, their compliance with the bill, and the meaningful steps they are taking to prevent those harms.

Our sister organization, Independent Women’s Voice, signed a coalition letter raising concerns about the sweeping nature of this bill. While the messaging of the bill appears targeted at social media platforms like TikTok, Facebook, and YouTube, KOSA also sweeps in video streaming services alongside user-generated content (UGC) providers, creating new liability for them.

Not all platforms are the same. Non-UGC streaming services and others such as libraries, academic databases, and video game distribution services have “human-curated services” that vet and determine what content users can access. UGC sites like YouTube, TikTok, and Instagram do not have the same stringent human vetting processes because the sheer volume of submissions makes it impossible. They largely rely on algorithms. Human oversight occurs after questionable or harmful content is posted and then flagged. Should we really treat both types of content providers the same?

The letter also highlights the burdens of new compliance costs on smaller competitors in the content space:

Additionally, the Big Tech platforms most popular with children and teenagers, such as TikTok and YouTube, possess vastly greater resources to comply with burdensome regulation than their smaller competitors like Kidoodle or PBS Kids. Saddling those smaller companies with needless compliance costs would therefore only entrench the market position of Big Tech platforms. Perversely, that would undermine not only the underlying goals of KOSA, but also broader efforts by the Senate Commerce Committee to increase competition with Big Tech and curtail social media overuse among kids. 

The enforcement powers KOSA provides regulators should raise a red flag. Agencies like the Federal Trade Commission—already a foe of Big Tech and not for principled reasons—or partisan state attorneys general could weaponize KOSA to censor and suppress ideological opponents and opposing viewpoints.  

Advocates of this legislation suggest that good conservative platforms should have nothing to worry about because they don’t provide the kind of harmful content this bill targets. Consider a Daily Signal piece responding to the coalition letter we signed onto:

Folks would be hard pressed when arguing that conservative outlets, like The Daily Wire or TheBlaze, “promote … suicide, eating disorders, substance abuse, sexual exploitation” or encourage minors to take narcotics, ingest tobacco products, gamble, or consume alcohol.

Most conservative outlets, given their ideological stances, promote content that advances the prevention of these behaviors—particularly for children. It is unclear how advocates of this line of thinking square the circle here when, in fact, the act would have the opposite effect of what the bill’s naysayers claim.

In theory, that may be true. However, reality is fraught with pitfalls. 

The risk of political or viewpoint censorship is not one we take lightly. It would be unfortunate if KOSA unintentionally fueled greater censorship or increased legal exposure for companies that promote unpopular conservative views.

IWF has faced censorship over our positions on issues, particularly those dealing with gender identity, such as protecting women’s sports for biological women, featuring the stories of detransitioners, and demanding that women-only spaces be protected.

In fact, just this week, Eventbrite canceled a posting for an event hosted by one of our chapters in Texas on protecting single-sex spaces like sororities, prisons, and locker rooms. The company labeled it “Hateful, Dangerous, or Violent Content and Events.” 

In the current cultural environment, which holds that gender is not binary and that children’s gender identity choices should be affirmed, it’s not hard to imagine a platform such as the Daily Wire becoming a target of an FTC investigation or a state lawsuit. The claim would be that, by promoting children’s content reinforcing the basic, scientific truth that there are only two sexes, the platform failed in its “duty of care” to prevent or mitigate suicidal behavior among gender-questioning minors.

Parents should be active participants in their children’s social media usage. No amount of legislation can substitute for that. Tools to empower parents are helpful and needed. As Patricia noted, the private sector is stepping up with new tools and resources.

However, we should be wary of fully embracing well-intentioned government interventions. They may create new problems without solving the ones they set out to fix.