Perspective: Should Section 230 of the Communications Decency Act be repealed?


YES: Biden should revoke Section 230 before we lose our democracy

By Steven Hill

The mob attack on the U.S. Capitol was incited and planned over Facebook, Twitter, YouTube and other digital media platforms, with a tragic nudge from the president of the United States. The gripping images of gunshots and cowering lawmakers inside the People's House are a warning to us all. How did we arrive here?

Since the birth of the Big Tech media platforms 15 years ago, democracies around the world have been subjected to a grand experiment: Can a nation's news and information infrastructure, which is the lifeblood of any democracy, be dependent on digital technologies that allow a global free speech zone of unlimited audience size, combined with algorithmic (nonhuman) curation of massive volumes of mis/disinformation that can be spread with unprecedented ease and reach?

The evidence has become frighteningly clear that this experiment has veered off course, like a Frankenstein monster marauding across the landscape.

Facebook is no longer simply a "social networking" website - it is the largest media giant in the history of the world, a combination publisher and broadcaster with 2.6 billion regular users, and billions more on the Facebook-owned WhatsApp and Instagram. A mere 100 pieces of COVID-19 misinformation on Facebook were shared 1.7 million times and had 117 million views - far more viewers than The New York Times, Washington Post, Wall Street Journal, ABC, Fox News and CNN combined.

The Facebook-Google-Twitter media giants have been misused frequently by bad actors for disinformation campaigns in more than 70 countries to undermine elections, even helping elect a quasi-dictator in the Philippines; and to amplify and even livestream child abusers, pornographers and the Christchurch mass murderer.

How can we unite to take action on climate change when a majority of YouTube climate change videos denies the science, and 70% of what YouTube's 2 billion users watch comes from its sensation-saturated, conspiracy-driven recommendation algorithms?

It is time to push reset in a major way. President Joe Biden should start by revoking Section 230 of the Communications Decency Act. That's the law that grants Big Tech Media "blank check" immunity for the mass content that is published and broadcast across their platforms. Revoking Section 230 is not a perfect solution, but it would make these companies somewhat more responsible, deliberative and potentially liable for the worst of their toxic content, including illegal content - just as traditional media is liable.

But let's be clear: Some of the worst past outrages would not likely be affected by 230's revocation. While President Donald Trump's inciting speech to the mob about a stolen election was false and provocative, other media outlets publish untrue nonsense all the time. It would be difficult to prove legally that any particular individuals or institutions were harmed or motivated by the president's many outrageous statements.

So revoking Section 230 will likely not be as impactful as its proponents wish or its critics fear. The next step involves recognizing that these Silicon Valley companies are creating the new 21st century infrastructure of the digital age, requiring a whole new business model.

The Biden administration should treat these companies more like investor-owned utilities, as the U.S. previously did with telephone, railroad and power companies. (Facebook founder Mark Zuckerberg has suggested such an approach.)

As utilities, they would be guided by a digital license - just like traditional brick-and-mortar companies must apply for various licenses and permits - that defines the rules and regulations of the business model according to a "duty of care" obligation, a kind of Hippocratic oath that says "first, do no harm."

One abuse that is ripe for stricter rules is "data grabs" of users' personal info. These companies never asked for permission to start sucking up our private data, or to track our physical locations, or mass collect every "like," "share" and "follow" into psychographic profiles of each user that can be targeted and manipulated by advertisers and bad political actors. The platforms started that sneaky practice secretly, forging their destructive brand of "surveillance capitalism."

Now that we know, should society continue to allow this? Shouldn't the default regulation require platforms to obtain users' permission before collecting any of our personal data, i.e., opt-in rather than opt-out?

The new business model also should encourage competition by limiting the mega-scale audience size of these digital media monopolies. And it should restrain the use of specific "engagement" techniques, such as hyper-targeting of content, automated recommendations, and addictive behavioral nudges (like pop-up screens and autoplay) that allow manipulation.

These companies' frequent outrages against our humanity are supposedly the price we must pay for being able to post our summer vacation and new puppy pics to our "friends," or for political dissidents and whistleblowers to alert the world to their just causes. Those are all important uses, but the price paid is very high. We can do better.

The challenge now is to establish sensible guardrails for this 21st century digital infrastructure, so that we can harness the good that these technologies provide, and greatly mitigate the dangerous effects.

Steven Hill is the former policy director at the Center for Humane Technology and author of seven books. He wrote this for InsideSources.com.

Tribune Content Agency


NO: Repealing Section 230 would limit Americans' speech

By Will Duffield

Section 230 of the Communications Decency Act prevents digital intermediaries from being treated as the "publisher or speaker" of their users' speech and blocks litigation over platforms' decisions to remove speech they deem violent, obscene or otherwise objectionable. Platforms are under no obligation to remove speech, with some exceptions, but cannot be required to carry speech either. The law applies universally to digital intermediaries; Facebook is not liable for its users' speech, and The New York Times is not liable for its comments section. By properly placing responsibility for harmful or unlawful speech with the speaker, Section 230 maximizes the ability of companies to produce publishing tools.

In the 25 years since its passage, this prescient rule has paid tremendous dividends. Americans are served by a dizzying array of publishing intermediaries, allowing us to communicate in real time via text, audio and video. We have created forums for work, worship, play and romance, serving every imaginable niche interest and minority. Of course, not all interconnection has been positive. Extremists and criminals use the internet too. Some argue that amending or repealing Section 230 would compel platforms to suppress extremist speech and criminal activity.

However, exposing platforms to broad liability for user speech would lead to the removal of much more than dangerous speech.

Platforms already make extensive use of their ability to remove unwanted speech, filtering spam, threats, advertisements for illegal goods, foreign propaganda and even simply off-topic speech. Popular platforms review millions of posts a day, often with the assistance of imperfect software. At this scale, some innocent speech will inevitably be misunderstood, mislabeled and removed. Over the past few years, major platforms' rules have become more stringent and expansive, prompting concerns about censorship and bias.

Demanding that platforms assume liability for their users' speech will at best exacerbate the accidental removal of innocent speech. However, it also runs the risk of limiting who can speak online at all. Digital intermediaries usually review speech after publication. Speech may be flagged by other users, human moderators or algorithms, and placed in a queue for adjudication. Section 230 allows platforms to remain open by default and worry about excluding misuse when it occurs, giving a voice to everyone with an internet connection.

In contrast, newspapers and other traditional publishers filter, edit and modify submissions before publication. While this allows them to safely assume full ownership of the speech they publish, it dramatically limits who can speak. Editing is a laborious and time-consuming process. Even if a newspaper wanted to publish every letter to the editor, it would have neither the space nor the time to do so. This model often produces consistently high-quality speech, but tends to favor some perspectives over others, offering only a narrow slice of elite sentiment.

Repealing Section 230 would make social media more like traditional media by making it exclusive. With limited resources to review speech before publication, platforms would have to determine whose perspectives should be prioritized. There is little reason to think their selections would differ greatly from newspapers. If replies and responses had to be reviewed as well, social media would lose most of its interactivity, becoming another conduit through which speech is passively received.

Without Section 230, platform moderators would not become more deliberate, they would simply remove more. The threat of costly litigation does little to inspire thoughtful decision making - moderators will act quickly to eliminate any source of legal risk. When Congress amended Section 230 in 2018 to expose platforms to liability for speech promoting prostitution or sex trafficking, Craigslist did not moderate its personal advertisements page more cautiously, it shut the page down.

Indeed, without Section 230's protections, many smaller forums would simply shut down, or look to be acquired by larger firms. Could the operators of V8Buick.com, a forum for antique car collectors with 38,000 users, afford even a single yearslong defamation lawsuit? The easiest way to avoid legal liability is acquisition.

Apart from suppressing speech, repealing Section 230 would suppress competition, agglomerating activity onto large platforms such as Facebook. Without Section 230, Facebook, but not V8Buick.com, could afford to litigate controversies over user speech.

Repealing Section 230 is a drastic step that would upend the internet, punishing successful firms and internet users for the behavior of an antisocial minority. Heaping legal liability on platforms will not render them more thoughtful or judicious. It will cause some to close, and others to exclude all but the most inoffensive sentiments.

Will Duffield is a policy analyst in the Cato Institute's Center for Representative Governance. He wrote this for InsideSources.com.


