Opinion Facebook’s First Amendment rights complicate Section 230 debate
Last week, House Democrats introduced a bill, the Justice Against Malicious Algorithms Act, designed to alter social media companies’ business practices — this time by punishing “personalized algorithms.” This is the latest in a yearlong bipartisan assault on Section 230, the primary statute governing the hosting of user-generated content online.
Given the ongoing heat and light on this issue, it’s important to recognize Section 230 is not the only legal framework in play. Even without Section 230, another significant obstacle exists to both parties’ efforts to micromanage platforms’ editorial decisions: the First Amendment.
The First Amendment right of editorial control
Over 40 years ago, the Supreme Court unanimously recognized that the First Amendment protects editorial choices of what content to publish and how. Miami Herald Publishing Co. v. Tornillo involved a constitutional challenge to Florida’s right-to-reply statute, which required newspapers to print a political candidate’s response to criticism printed by the paper. The Herald published an editorial critical of union boss Pat Tornillo, a House of Representatives candidate who would later be imprisoned for embezzling millions from the union’s coffers. When the newspaper refused to publish Tornillo’s response, Tornillo sued under Florida law. But the Supreme Court found that the First Amendment prohibits the government from intruding into editors’ functions. The court continued: “The choice of material to go into a newspaper, and the decisions made as to limitations on the size and content of the paper, and treatment of public issues and public officials — whether fair or unfair — constitute the exercise of editorial control and judgment.” The court ultimately concluded that governmental regulation of this “crucial process” cannot “be exercised consistent with First Amendment guarantees.”
Importantly, the court recognized its decision could lead to stifling of certain viewpoints. Tornillo argued that concentration of media in a handful of corporations requires government to enforce equal access in the interests of fairness and accuracy. (Read: “The First Amendment interest of the public in being informed is said to be in peril because the ‘marketplace of ideas’ is today a monopoly controlled by the owners of the market.”) But the court explained that “however much validity may be found in these arguments,” government-coerced access was an unconstitutional solution. Justice Byron White’s concurrence stated starkly: “Liberty of the press is in peril as soon as the government tries to compel what is to go into a newspaper.”
The First Amendment also protects against government censorship, even censorship claimed to be in the public interest. In New York Times Co. v. United States, the Richard Nixon administration sought to prohibit the New York Times from printing the Pentagon Papers, a classified study about the Vietnam War. But the court found that the government did not overcome the “heavy presumption against” government suppression of content, no matter how harmful publication might be. Justice William Brennan’s concurrence explained that even in wartime, only proof that publication “must inevitably, directly, and immediately” jeopardize lives “can support even the issuance of an interim restraining order.”
Applicability to modern platforms
Read together, the Tornillo and Pentagon Papers cases shield platforms from the strongest impulses of the right and left in the campaign against Big Tech. Republicans want to force Facebook and its competitors to include conservative viewpoints. But the Tornillo court held that “a compulsion to publish that which reason tells them should not be published is unconstitutional.” Similarly, Democrats want to stamp out disinformation and other harmful material online, but the Pentagon Papers case establishes a “heavy presumption against” actions that, in Justice Hugo Black’s words, would “abrogate the fundamental law embodied in the First Amendment.” The government can compel neither the publication nor the suppression of content by private actors. Editorial control also presumably encompasses subsidiary decisions such as how to display certain information (even by algorithm), which are the high-tech equivalents of deciding which story goes on the front page and which is buried on Page 13.
Not immune from consequences
Of course, freedom of speech does not insulate the speaker from consequences for the message. Printing a libelous statement can lead to damages; printing classified material can lead to jail. And the First Amendment provides no greater shield from consequences of online speech than offline. That’s where the rubber hits the road in the Section 230 debate, which is primarily a shield against consequences.
But the constitutional backdrop shows how the stakes differ for the right and left. If Section 230 is repealed, the left may achieve some aims. Private law doctrines such as defamation could expose platforms to liability for what users say, which could lead platforms to take down legally questionable content. But after Tornillo, no comparable private law doctrine would compel platforms to host material against their will.
Republicans should tread lightly. Their current flailing against Big Tech could lead to greater deplatforming of the right, not less.