The Fate of the Internet Shouldn’t Rest with Clarence Thomas

Photo from Wikimedia Commons

SIONA MONDAL: On February 21, 2023, the Supreme Court heard oral arguments in Gonzalez v. Google, a case that could overhaul the internet as we know it. The Gonzalez family, after losing their daughter in an ISIS terror attack, is suing Google, YouTube’s parent company, arguing that YouTube’s recommendation algorithms aided and abetted ISIS by spreading the terrorist organization’s message. 

While the case raises free speech concerns, oral arguments focused largely on understanding algorithms: what they are, how they work and what their implications are for the internet. This focus arose largely because the law the Gonzalez family is challenging, Section 230 of the Communications Decency Act, was written in 1996. It was not written to address the advanced algorithms of today; as Justice Kagan put it, Section 230 is “a pre-algorithm statute [that] applies in a post-algorithm world.”

Section 230 shields companies from liability for content posted by third parties. For example, if someone posted a call to violence on Twitter, Twitter could not be held liable for that person’s words. The original intent of Section 230 was to encourage websites to screen and block offensive content while still promoting speech on the internet. The fear was that, without it, companies would over-censor content for fear of being sued, chilling free speech. Ideally, Section 230 lets companies moderate violent and vulgar content without exposing themselves to lawsuits.  

The question that arises in Gonzalez is whether recommendation algorithms are themselves a form of content. In other words, if YouTube auto-plays an ISIS video after a cooking video someone just viewed, does that make YouTube complicit in ISIS’s mission? 

My answer is largely no. Algorithms are a necessary part of the internet: for the internet to be consumable, its information has to be organized in some way, and that is what algorithms do. However, if an algorithm were designed with the intent of perpetuating racism or extremism on a site, that would not be protected. In Google’s case, there was no proof that its algorithm was meant to proliferate ISIS content. 

Simply put, algorithms are messy. They are complex, and I would argue a little too complex for Supreme Court justices to understand. As Justice Kagan herself admitted, “these are not like the nine greatest experts on the Internet.” Therein lies the root of the problem, regardless of which way the court rules: you have to ask why the court is making these technical decisions in the first place. 

Even though the justices accepted the case, throughout oral arguments they continually seemed to be asking why Congress has not passed new legislation. In the middle of arguments, Justice Kavanaugh asked, “isn’t it better…to put the burden on Congress to change [current statutes] and they can consider the implications and make these predictive judgments?” 

Congress’s inability to pass legislation has put immense pressure on the courts to make decisions they are not equipped to handle. In Carpenter v. United States in 2018, the Court held that the government must obtain a warrant before accessing cell phone location records held by carriers. However, the most recent data privacy statute the Court could interpret dated to 1986. Congress is not passing laws that keep up with changing technology, putting the Court in a Catch-22: either it rules on something it does not understand, or it lets the issue go unaddressed. This is especially relevant in Gonzalez, with Chief Justice Roberts himself acknowledging the possibility that “if we wait for Congress to make [amendments to the statute], the Internet will be sunk.”

Three justices admitted confusion during oral arguments, illustrating how difficult technology cases can be for justices to adjudicate. Ultimately, the issue begs for comprehensive legislation. The social media algorithm behind TikTok has been linked to the growing mental health crisis among teens: a study from the Center for Countering Digital Hate found that the algorithm can push suicidal content at kids within as few as 2.6 minutes. Instagram is no different, with Facebook’s own research finding that one in three teen girls feel worse about their bodies after being on the app. It is no secret that algorithms can be dangerous, but they are also necessary, filtering billions of results to organize the internet. 

Nine justices who barely understand how YouTube works should not be making a decision that could control the internet’s future. Congresspeople should be working with experts in the field to write comprehensive legislation that holds companies accountable while preserving a free internet. 

Passing this legislation will not be easy. Although calls to revise Section 230 are bipartisan, some Democrats are calling for more content moderation while some Republicans are calling for less. On top of that, social media companies have the money and power to stop such legislation in its tracks. But just because it may be difficult does not mean it is not necessary. If our free speech rights are to be protected in the age of the internet, it is up to Congress, not the courts, to preserve them. 


Siona Mondal is a columnist for On the Record originally from Pleasanton, California. She is currently a freshman in the College studying Political Economy with a minor in Statistics.