Proposals to increase government regulation of children’s online presence and usage are a hot topic. Understandably, policymakers want to protect children from dangerous content and predators online. Illegal activity aside, there are also concerns about the mental and emotional harms that social media may be causing young people, as I wrote about earlier this week.
There are no easy answers. Magic-bullet bills will fail to ensure that children are not harmed and do not encounter harmful content, in part because it’s difficult even to define a term like “harm.”
A few principles should guide any policy considerations. First, overregulation can stifle innovation. The internet and social media blossomed from the light-touch regulatory approach that policymakers adopted in the 1990s.
Second, unintended consequences must be considered because even well-intentioned public policy can rob individuals of their freedom of speech and privacy, leave users worse off, and destroy opportunities for enterprising individuals.
Government’s legislative agenda
The government is limited in the outcomes it can achieve through legislation, regulation, and legal remedies. Regulating undesirable human behavior offline like drinking and driving is difficult. Regulating online human behavior is even more challenging because young people can outsmart the mouse traps, technology changes faster than policy is enacted, and regulations can raise a host of other issues including constitutional concerns.
In a policy analysis titled “Would New Legislation Actually Make Kids Safer Online?” Cato tech policy expert Jennifer Huddleston explained the range of approaches that policymakers have adopted in recent years, as well as the challenges they pose:
- Model 1: Complete (or Nearly Complete) Bans on Minors’ Use of Social Media
- Model 2: Age‐Appropriate Design Codes
- Model 3: Limited‐Topic Age‐Appropriate Design Code
Bans on Minors’ Use of Social Media
The most extreme examples of social media regulatory action are policies that would ban teenagers from using online platforms… These draconian approaches to regulation would inevitably face legal challenges on First Amendment grounds.
Age‐Appropriate Design Codes
An age‐appropriate design code is a regulatory measure that limits platforms from displaying certain types of content deemed inappropriate for users younger than a certain age… An age‐appropriate design code may be less obviously problematic than a direct ban of young people’s online access but could have significant consequences, such as silencing voices, decreasing the privacy of users, and hampering the ability of young people to find help online.
Limited‐Topic Age‐Appropriate Design Code
The third model that has appeared is an age‐appropriate design code applied to a specific and limited type of content that is already regulated. Unlike the general bills in Model 2, this proposal specifically responds to pornography, a form of speech where regulation in some forms—particularly regarding minors’ access—has previously been upheld. As such, the law will likely face fewer and more‐limited First Amendment challenges; but that does not mean it is without additional tradeoffs or concerns for privacy and speech.
Tech companies take a proactive approach
Sometimes, the best defense is offense. Tech companies and social media platforms are rolling out policies and services to address minors’ online safety. Giving parents more control over what and how much content their children consume seems to be the leading response.
The question of how much responsibility platforms have to protect children online from “harm” is an ongoing debate. However, anyone who places 100% of the blame for “harm” to children’s mental and physical well-being and responsibility for oversight of content on the shoulders of tech platforms ignores the critical role that parents play and opens the door to other concerns.
As my colleague Patricia Patnode wrote recently, there are many tools available to manage children’s time and presence online. Tech platforms seem to be embracing self-regulation and enhanced parental control tools rather than waiting for the government to regulate them.
Recently, YouTube released its guiding principles for how it approaches features and content for children and teens.
Two principles stand out: parental empowerment and involvement and age-appropriateness. For example, Principle #2 reads:
Parents and caregivers play an important role in setting the rules for their family’s online experiences, particularly for the youngest children.
From setting clear screen time boundaries to teaching their children and teenagers to make responsible choices online, parents and caregivers can deeply influence young people’s behavior.
This is a recognition that a one-size-fits-all approach to how much screen time kids need or which websites they should be restricted from will never fit the needs of millions of U.S. kids. That is a job for parents, both actively by setting rules and passively by setting an example through their own online habits.
Just as parents have rallied behind having a greater voice in public school education, they should exert energy in overseeing their kids’ online consumption.
Principle #4 reads:
The developmental needs of children differ greatly from those of teenagers and should be reflected in their online experiences.
As teenagers build independence, find identity and look for community, they need the freedom to explore a broader range of important content and form connections that bring understanding and acceptance.
Treating all minors the same ignores the developmental and maturity differences between young kids and teens. Preschoolers don’t need to understand eating disorders, but high school students should be able to find help for such conditions. Those are different online experiences.
As policymakers seek to navigate the murky waters of online policy for kids, YouTube also proposes a framework reflective of its principles. The guidelines that YouTube has set for itself seem reasonable but may not be appropriate for all platforms.
Raising children in this digital age is undoubtedly more difficult than during the analog age. Back then, parents needed only to shield kids from the skin flicks and adult channels like Showtime and Cinemax. Today, social media content is readily available 24/7 at their fingertips.
Asking the government to step in and set the rules online may feel comforting to parents, but it won’t solve the mental and physical problems afflicting today’s teens.
Whichever principles guide policymaking, conservatives would be wise not to forget the First Principles. As Adam Thierer cautioned conservatives last year:
When all your best efforts to help or protect your kids don’t seem to work according to plan, it’s only natural to call for help. But there are very serious problems associated with calling on the government for that help. When legislators and regulators are asked to play the role of National Nanny, it comes with all the same baggage that accompanies many other efforts by the government to intervene in our lives or control what people or organizations can say or do.
Thierer recommends a “layered approach” to online safety that includes empowering parents with controls, education and digital literacy for young people, and enforcement of existing laws.
None of these is a silver bullet, but they are bottom-up approaches that work by changing habits among young users and are less likely to be challenged or invalidated by courts.
Protecting children and minors from dangerous content and dangerous people is critical. We want to prevent sex traffickers from exploiting teens or pedophiles from preying on little kids. However, the definition of “harmful” content seems to be wider than just abusive, violent, and vile.
Sweeping policies that aim to prevent harmful content could eliminate life-saving or at least age-appropriate information. Parents need to be in the driver’s seat of kids’ safety online. This calls for thoughtful approaches that balance good intentions with legitimate concerns and measured expectations about what government can actually deliver.