The heads of Facebook, Twitter, and Google were back on Capitol Hill recently to answer questions about misinformation, extremism, and kids’ access to social media.
The big takeaway from the hearing is that there are no easy answers and little agreement on how to police speech. That’s a good thing. First Amendment speech rights are under fire as Congress aims to ensure safety online but also wants to use Big Tech for politically motivated censorship of opposing views. Any one-size-fits-all regulation will leave everyone worse off, not better.
Last Thursday’s hearing marked the first time since the January 6th attack on the Capitol Building that Facebook’s Mark Zuckerberg, Twitter’s Jack Dorsey, and Google’s Sundar Pichai appeared virtually before the U.S. House Energy and Commerce Committee.
3 Key Takeaways from the Hearing:
- Social media companies are mixed on whether they bear responsibility for the January 6th attack.
Twitter: “Yes. But you also have to take into consideration the broader ecosystem. It’s not just about the technological systems that we use.”
Facebook: “I think the responsibility lies with the people who took the action to break the law and do the insurrection. And secondarily with the people who spread that content, including [President Trump].”
- Everyone wants Section 230 reform, but Republicans and Democrats back opposing approaches.
As the Washington Post reported, “Democrats want it to hold companies to a higher standard for the spread of racism and misinformation. Republicans want the companies to cut back on moderation, arguing that current practices threaten free speech.”
- Kids’ online usage is now on Congress’s radar.
The news that Facebook plans to launch an Instagram for kids did not sit well with lawmakers. They raised concerns about excessive screen time, social media’s mental and emotional toll on teens, and safety. Washington Republican Cathy McMorris Rodgers noted, “Your platforms are my biggest fear as a parent.” Democrats expressed similar concerns, indicating that this may be an area of bipartisan agreement.
There is consensus that Congress must and will do something on content moderation. It’s just a matter of time and which reforms lawmakers can coalesce around.
A common-sense approach, though, takes a step back and asks what Congress is trying to accomplish and which approaches achieve those ends with the least damage and fewest unintended consequences.
Dorsey made a good point about mandating content moderation: he admitted that only the largest platforms could afford staffs large enough to meet imposed standards, which could hurt competition.
We also need to ask: what would those standards be, to whom would they apply, and who would set them?
Fighting misinformation is a difficult task, especially when there’s no agreement on what qualifies as misinformation. The medical community’s knowledge about the coronavirus and how to treat it has evolved from one year ago and continues to change with new data. Unfortunately, legitimate dissenting opinions sometimes get mislabeled as misinformation and even removed. Imagine if Congress set the parameters for misinformation, leaving little leeway for the evolution of science or dissent.
We also know partisans will weaponize content moderation rules to silence their opponents.
There are no easy answers. That’s all the more reason for lawmakers to tread slowly and carefully, if at all.