What Gonzalez v. Google might mean

By Douglas Carswell
February 23, 2023

In the early days of the internet, a law was passed that has had a critical role in enabling it to evolve. Long before many of today’s social media titans even existed, Section 230 of the 1996 Communications Decency Act made it clear that internet companies are not liable for content posted on their sites by third parties.

It is hard to imagine what the internet would look like today without this key protection. If social media sites were liable for what people posted, they would have no choice but to police what was said.

“Policing what people post is exactly what social media sites already do,” say some critics of Section 230. Frustrated by the apparent anti-conservative bias of certain sites, a number of conservatives advocate ending Section 230.

Social media companies like Facebook, they argue, are in effect editorializing content when they use algorithms to control what we read. If they are going to editorialize like a newspaper, why not make them liable like a newspaper for what appears on their platform?

Conservative critics of Section 230 have been joined by those on the left who believe that social media sites should do much more to tackle ‘misinformation’. They actively want to oblige tech companies to control what we read and see.

This week the Supreme Court is hearing oral arguments in a case that could have profound implications for the future of Section 230. In Gonzalez v. Google, the argument is being made that Google is liable for the way its algorithms organize and present content.

Think of how many items are uploaded to the internet every day. The internet would be unusable without some way to order and organize all of that content. It is hard to see how we could use the internet without relying on companies like Google to do this for us.

Even the most basic way of ordering content, by date or alphabetically, requires some sort of algorithm. It seems absurd to me to make a search engine liable for how its algorithm operates.
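To make that point concrete, here is a minimal, purely illustrative sketch in Python (the Post records and sample data are invented, not drawn from any real platform): even "newest first" or "alphabetical" ordering is an algorithm applied to a ranking key.

```python
# Illustrative only: even the simplest ordering of content is an algorithm.
from dataclasses import dataclass
from datetime import date

@dataclass
class Post:
    title: str
    published: date

# Hypothetical sample data.
posts = [
    Post("Zoning board minutes", date(2023, 1, 5)),
    Post("A note on Section 230", date(2023, 2, 21)),
    Post("Bake sale this weekend", date(2022, 12, 30)),
]

# "Newest first": a sorting algorithm applied to a date key.
by_date = sorted(posts, key=lambda p: p.published, reverse=True)

# "Alphabetical": a different key, the same underlying machinery.
by_title = sorted(posts, key=lambda p: p.title.lower())

for p in by_date:
    print(p.published, p.title)
```

Any rule for deciding what appears first, however mechanical, is a choice about presentation of the kind the case puts at issue.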

I worry that if a search engine like Google were made liable for what it shows me every time I use it, we would soon see all kinds of unintended consequences. Far from searching gazillions of sources, I imagine the search engine would limit itself to a pool of approved sources.

Until 1695, anyone in England with a printing press needed a license for it. Only those who published things the powerful approved of could legally operate one. I worry that we are about to recreate a world in which there are, in effect, licenses for having a digital printing press. Only social media companies that operate in an approved manner will be viable. Those who post on them will need approval too. The internet will have less diversity of thought and more uniformity. We will all be poorer.
