This has been a tough week.
It started with the terrible events last weekend in Charlottesville, VA, where clashes involving neo-Nazi and white supremacist groups erupted into violence and led to the death of one protester.
Throughout the week, the story continued to gain steam as President Trump commented on the incident, then made a second statement, then held an unprecedented press conference that even members of his own party condemned.
As prominent CEOs on the President's manufacturing council began to drop out, several tech companies began or intensified their crackdown on hate speech and their banning of alt-right and neo-Nazi websites. According to PBS News, here are just a few big names and their actions:
- GoDaddy: Ceased hosting the domain of The Daily Stormer, a well-known neo-Nazi website.
- Apple Pay: Blocked websites that sell white nationalist merchandise.
- Discord: Booted white nationalist groups and users off the app.
- Spotify: Removed dozens of white supremacist artists that the Southern Poverty Law Center had identified as hate music.
- Facebook: Banned pages associated with the white nationalist movement.
- PayPal: Cut ties with the KKK and with white supremacist and neo-Nazi-affiliated groups.
- Squarespace: Promised to take down white nationalist sites and gave 48-hour notice to those it deemed hate groups.
- GoFundMe: Shut down crowdfunding pages dedicated to raising money for the white nationalist who drove into a crowd of people in Charlottesville.
- OkCupid: Removed alt-right members looking for love.
Cloudflare, a company that provides security services to protect internet companies from hackers, joined the movement by dropping The Daily Stormer from its network services. The move was a bit of a surprise, because Matthew Prince, co-founder and CEO of Cloudflare, has long been an advocate of free speech, saying that "a website is speech, it is not a bomb."
Cloudflare took the action, however, because management determined that The Daily Stormer was harassing individuals who were reporting the site as abusive. Prince was also clear that he and the company found the content on the site "abhorrent and vile," and in a company memo he stated that "the tipping point for us making this decision was that the team behind Daily Stormer made the claim that we were secretly supporters of their ideology ... we could not remain neutral after these claims of secret support by Cloudflare."
While these actions by tech companies are seen by most as the proper and moral thing to do, some have rightly questioned whether businesses in general should have such a significant influence on the fundamental right of free speech online -- censoring it or even removing it altogether.
Prince goes on to say that entrepreneurs -- and society at large -- need to ask who should be responsible for policing and regulating online content. "I sit in a very privileged position," said Prince. "I see about 10 percent of all online traffic, and I can make a decision whether they can be online anymore. And I'm not sure I am the one who should be making that kind of decision."
The question for all of us is: who should be?
We are all afforded the freedom of speech and expression -- a unique, precious and delicate gift. We have also been afforded, through the sacrifice of many generations, the right to life, liberty and the pursuit of happiness.
When these two rights intersect and conflict, we need a moral standard -- not the Constitution -- to moderate.
Of course, the question then becomes who gets to decide the moral standard?
Luckily, we have a democratic system in place that allows the country's citizens to select representatives who serve as the lawmakers that mold this standard. Is our system flawed? Absolutely. But as Winston Churchill astutely recognized, "Democracy is the worst form of government, except for all the others."
When it comes to tech companies -- or any company, for that matter -- they have an obligation to follow the law, and that is about it. As Prince contends, the right policy is for content providers to be "content neutral." The community can be policed by its users in the form of reporting reprehensible content, and companies have an obligation to engage experts and law enforcement authorities to determine what should be removed.
Of course, if a company wishes to write and maintain an internal set of codes, and as long as those codes do not infringe upon or otherwise break the law, it has every right to do so. Customers who disagree can exercise their freedom of speech to voice their opinions or simply "protest with their wallets."
This debate will surely not end anytime soon, and by all indications, it is just getting started.
What do you think? Should censorship be under the management of companies, or should content continue to be protected under the right to free speech? Please share your (constructive and civil) comments below.