Supreme Court on “Content Moderation”

On April 5th, 2021, Supreme Court Justice Thomas wrote a concurring opinion that for the first time correctly frames the debate around internet censorship, or its politically correct euphemism, “content moderation”. Censorship of individuals by Big Tech is not a matter for the much-discussed Section 230, but a matter of common carriage and public accommodation. For a long time the discourse around online censorship has centered on Section 230 and the safe harbor it offers to “platforms”, which, unlike publishers, are not responsible for the content that people post on them. Much ink has been spilled on whether Facebook/Twitter are neutral platforms or publishers with editorial power. What Justice Thomas made clear is that these entities are so ubiquitous, and exert such monopoly power over communication, that they are common carriers and places of public accommodation. As such, the doctrine of non-discriminatory access and use applies to them the same way it applies to the phone company, a train, a bus, or a taxi.

It is time to hold these companies to the same standards that the rest of the business world must abide by: no more discrimination under the guise of “community standards” or “content moderation”.

The complete text of the decision is available here (starting on page 9):

An excellent overview from ReclaimTheNet.org is here: https://reclaimthenet.org/justice-clarance-thomas-big-tech-public

Our view

We think this is the correct way to look at the topic. Facebook/Twitter/etc. are so large, and such an integrated part of our communication infrastructure, that the rules that apply to other firms must apply to them. As a society we long ago decided that large and powerful companies cannot discriminate against certain people, and those rules must apply to Big Tech as well. It is simple: no discrimination.

Aren’t they private companies that can do whatever they want?

Let us start with this common, and completely incorrect, claim. Private companies of all types are subject to government regulation. A restaurant cannot serve rotten food. A pharmaceutical company cannot sell untested drugs. In many states you cannot cut hair without a government license. The reality is that all businesses operate under a regulatory umbrella, and none can “do whatever they want”.

Facebook/Twitter are common carriers, and as such must offer their services in a consistent and fair manner to all who wish to use them.

Just as the phone company cannot deny service to a person because it believes that person should not have access to phone communication, no platform can deny service. Common carrier doctrine is long-established law. In the era of the Robber Barons, railroad companies could not refuse to carry one company’s cargo to the benefit of another’s. Entire government bodies, such as the Interstate Commerce Commission, were created to regulate these common carriers and ensure they did not discriminate against anyone. Justice Thomas rightfully points out that the internet platforms of today are the modern common carriers and should be subject to the same rules: consistent, open, and fair access for all. Justice Thomas states that “our legal system and its British predecessor have long subjected certain businesses, known as common carriers, to special regulations, including the general requirement to serve all comers.”

Facebook/Twitter are places of public accommodation and must play by the same rules that others do

In the second part of his opinion, Justice Thomas points out that digital platforms are also places of public accommodation, the same as a restaurant or hotel. Long ago, as a society, we decided that businesses that hold themselves out to the public must welcome all and cannot discriminate. A hotel cannot refuse to house the guests of an NAACP convention based on the color of their skin; a restaurant cannot refuse to serve women or LGBTQ+ individuals. The fair exchange we have made as a society is that if your business holds itself out to the public at large, then you must accommodate all without discrimination. This must apply to Big Tech as well: they should not be allowed to discriminate against individuals or groups. In a fair society, they must abide by the rules.

But these platforms only discriminate based on political views, and that isn’t a protected class

This is incorrect under the accepted theory of “disparate impact”, which is often used in discrimination cases. For a description, see Wikipedia: https://en.wikipedia.org/wiki/Disparate_impact . In summary, the theory holds that regardless of the intent behind a policy or action, if its impact falls disproportionately on a protected class, then it is de facto discriminatory.

If the effect of content moderation falls more heavily on people of a certain race, gender, or national origin (which are protected classes), then even if the policy was targeted at political views (which are not a protected class), the policy and its enforcement are still discriminatory and must end. The key here is not the intent behind the discrimination, but the results of the actions.

Don’t I have a right not to see things I don’t like?

No. Growing up I was told “if you don’t like what is on the TV, just turn it off”. That applies here. If you don’t want to see something on Facebook, then just don’t use it. The burden is on the consumer, not the platform.

Isn’t Section 230 the right discussion to be having?

No. The EFF has a good write-up on this here: https://www.eff.org/deeplinks/2021/01/its-not-230-you-hate-its-oligopolies

Other court rulings of interest on this topic

  1. Three cases that show what free speech means | ShareAmerica
  2. Marsh v. Alabama – Wikipedia
