Deplatforming, also known as no-platforming, is a form of Internet censorship in which an individual or group is prevented from posting on the platforms they use to share their information and ideas. It typically takes the form of suspension, outright banning, or reduced visibility (shadow banning).[1] [2]
As early as 2015, platforms such as Reddit began to enforce selective bans based, for example, on terms of service that prohibit "hate speech".[3] Among the most notable examples of deplatforming were Twitter's ban of then-US President Donald Trump shortly after the January 2021 storming of the US Capitol, and its temporary blocking of the New York Post during the 2020 presidential election.[4]
In the United States, the banning of speakers on university campuses dates back to the 1940s and was carried out under the policies of the universities themselves. The University of California had a policy known as the Speaker Ban, codified in university regulations under President Robert Gordon Sproul, that mostly, but not exclusively, targeted communists. One rule stated that "the University assumed the right to prevent exploitation of its prestige by unqualified persons or by those who would use it as a platform for propaganda." This rule was invoked in 1951 to block the socialist Max Shachtman from speaking at the University of California, Berkeley. In 1947, former U.S. Vice President Henry A. Wallace was banned from speaking at UCLA because of his views on U.S. Cold War policy,[5] and in 1961, Malcolm X was prohibited from speaking at Berkeley as a religious leader.
Controversial speakers invited to appear on college campuses have faced deplatforming attempts to disinvite them or to otherwise prevent them from speaking. The British National Union of Students established its No Platform policy as early as 1973. In the mid-1980s, visits by South African ambassador Glenn Babb to Canadian college campuses faced opposition from students opposed to apartheid.[6]
In the United States, recent examples include the March 2017 disruption by protestors of a public speech at Middlebury College by political scientist Charles Murray. In February 2018, students at the University of Central Oklahoma rescinded a speaking invitation to creationist Ken Ham, after pressure from an LGBT student group. In March 2018, a "small group of protesters" at Lewis & Clark Law School attempted to stop a speech by visiting lecturer Christina Hoff Sommers. In the 2019 film No Safe Spaces, Adam Carolla and Dennis Prager documented their own disinvitations along with those of others.[7]
The Foundation for Individual Rights in Education, a speech advocacy group, has documented 469 disinvitation or disruption attempts at American campuses since 2000, including both "unsuccessful disinvitation attempts" and "successful disinvitations". The group divides the latter category into three subcategories: formal disinvitation by the sponsor of the speaking engagement; the speaker's withdrawal "in the face of disinvitation demands"; and "heckler's vetoes" (situations in which "students or faculty persistently disrupt or entirely prevent the speakers' ability to speak").
Beginning in 2015, Reddit banned several communities on the site ("subreddits") for violating the site's anti-harassment policy. A 2017 study published in the journal Proceedings of the ACM on Human-Computer Interaction, examining "the causal effects of the ban on both participating users and affected communities," found that "the ban served a number of useful purposes for Reddit" and that "Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage. Communities that inherited the displaced activity of these users did not suffer from an increase in hate speech." In June 2020 and January 2021, Reddit also issued bans to two prominent online pro-Trump communities over violations of the website's content and harassment policies.
On May 2, 2019, Facebook and the Facebook-owned platform Instagram announced a ban of "dangerous individuals and organizations" including Nation of Islam leader Louis Farrakhan, Milo Yiannopoulos, Alex Jones and his organization InfoWars, Paul Joseph Watson, Laura Loomer, and Paul Nehlen. In the wake of the 2021 storming of the US Capitol, Twitter banned then-president Donald Trump, as well as 70,000 other accounts linked to the event and the far-right movement QAnon.
Some studies have found that the deplatforming of extremists reduced their audience, although other research has found that some content creators became more toxic after deplatforming and migrating to alt-tech platforms.[8]
On November 18, 2022, Elon Musk, the newly appointed CEO of Twitter, reinstated the previously banned Twitter accounts of high-profile users including Kathy Griffin, Jordan Peterson, and The Babylon Bee as part of the new Twitter policy.[9] [10] As Musk put it, "New Twitter policy is freedom of speech, but not freedom of reach".
On August 6, 2018, Facebook, Apple, YouTube and Spotify removed all content by Jones and InfoWars for policy violations. YouTube removed channels associated with InfoWars, including The Alex Jones Channel.[11] On Facebook, four pages associated with InfoWars and Alex Jones were removed over repeated policy violations. Apple removed all podcasts associated with Jones from iTunes.[12] On August 13, 2018, Vimeo removed all of Jones's videos because of "prohibitions on discriminatory and hateful content".[13] Facebook cited instances of dehumanizing immigrants, Muslims and transgender people, as well as glorification of violence, as examples of hate speech.[14] [15] After InfoWars was banned from Facebook, Jones used another of his websites, NewsWars, to circumvent the ban.[16] [17]
Jones's accounts were also removed from Pinterest,[18] Mailchimp[19] and LinkedIn.[20] At the time, Jones retained active accounts on Instagram,[21] Google+[22] and Twitter.[23] [24]
In September 2018, Jones was permanently banned from Twitter and Periscope after berating CNN reporter Oliver Darcy.[25] [26] On September 7, 2018, the InfoWars app was removed from the Apple App Store for "objectionable content".[27] Jones was also banned from using PayPal for business transactions, having violated the company's policies by expressing "hate or discriminatory intolerance against certain communities and religions."[28] After Elon Musk's purchase of Twitter, several previously banned accounts were reinstated, including those of Donald Trump, Andrew Tate and Ye, prompting questions about whether Jones would be unbanned as well. Musk, however, said that Jones would not be reinstated, criticizing him as a person who "would use the deaths of children for gain, politics or fame".[29]
InfoWars remained available on Roku devices in January 2019, a year after the channel's removal from multiple streaming services. Roku indicated that it does not "curate or censor based on viewpoint" and that, while it had policies against content that is "unlawful, incited illegal activities, or violates third-party rights," InfoWars was not in violation of them. Following a social media backlash, Roku removed InfoWars and stated: "After the InfoWars channel became available, we heard from concerned parties and have determined that the channel should be removed from our platform."[30] [31]
In March 2019, YouTube terminated the Resistance News channel due to its reuploading of live streams from InfoWars.[32] On May 1, 2019, Jones was barred from using both Facebook and Instagram.[33] [34] [35] Jones briefly moved to Dlive, but was suspended in April 2019 for violating community guidelines.[36]
In March 2020, the InfoWars app was removed from the Google Play store due to claims of Jones disseminating COVID-19 misinformation. A Google spokesperson stated that "combating misinformation on the Play Store is a top priority for the team" and apps that violate Play policy by "distributing misleading or harmful information" are removed from the store.[37]
See main article: Donald Trump on social media. On January 6, 2021, during a joint session of the United States Congress, the counting of the Electoral College votes was interrupted by a breach of the United States Capitol chambers. The rioters were supporters of President Donald Trump who hoped to delay and overturn his loss in the 2020 election. The event resulted in five deaths and at least 400 people being charged with crimes.[38] The certification of the electoral votes was completed only in the early morning hours of January 7, 2021. In the wake of several tweets by President Trump on January 7, 2021, Facebook, Instagram, YouTube, Reddit, and Twitter all deplatformed Trump to some extent.[39] [40] [41] [42] Twitter deactivated his personal account, which the company said could be used to promote further violence. Trump subsequently tweeted similar messages from the official US Government account @POTUS, and on January 8 Twitter announced that his ban from the platform would be permanent.[43]
According to Jason Miller on a Fox News broadcast, Trump planned to return to social media through a new platform of his own by May or June 2021.[44] [45]
The same week Musk announced Twitter's new freedom-of-speech policy, he tweeted a poll asking whether Trump should be brought back to the platform.[46] The poll ended with 51.8% in favor of unbanning Trump's account, and Twitter reinstated @realDonaldTrump on November 19, 2022, although by then Trump was posting on his own platform, Truth Social.[47]
In 2017, Andrew Tate was banned from Twitter for tweeting that women should "bare some responsibility" in response to the #MeToo movement.[48] Similarly, in August 2022, Tate was banned on four more major social media platforms: Instagram, Facebook, TikTok, and YouTube. These platforms indicated that Tate's misogynistic comments violated their hate speech policies.[49]
Tate has since been unbanned from Twitter as part of the platform's new freedom-of-speech policy.
Social media platforms such as YouTube and Instagram allow their content producers, or influencers, to earn money from the content (videos, images, etc.) they create, typically through payment per set number of new "likes", clicks, or similar metrics. When content is deemed inappropriate for compensation but is left on the platform, this is called "demonetization": the producer receives no compensation for the content, even though it remains publicly available for viewing or listening.[50] In September 2016, Vox reported that demonetization—as it pertained to YouTube specifically—involved the following key points:
Deplatforming tactics have also included attempts to silence controversial speakers through various forms of personal harassment, such as doxing, the making of false emergency reports for purposes of swatting, and complaints or petitions to third parties. In some cases, protesters have attempted to have speakers blacklisted from projects or fired from their jobs.
In 2019, students at the University of the Arts in Philadelphia circulated an online petition demanding that Camille Paglia "should be removed from UArts faculty and replaced by a queer person of color." According to The Atlantic's Conor Friedersdorf, "It is rare for student activists to argue that a tenured faculty member should be denied a platform." Paglia, a tenured professor for over 30 years who identifies as transgender, had long been unapologetically outspoken on controversial "matters of sex, gender identity, and sexual assault".
In December 2017, after learning that a French artist it had previously reviewed was a neo-Nazi, the San Francisco punk magazine Maximum Rocknroll apologized and announced that it has "a strict no-platform policy towards any bands and artists with a Nazi ideology".
In May 2021, the UK government under Boris Johnson announced a Higher Education (Freedom of Speech) Bill that would allow speakers at universities to seek compensation for no-platforming, impose fines on universities and student unions that promote the practice, and establish a new ombudsman charged with monitoring cases of no-platforming and academic dismissals.[51] In addition, the government published an Online Safety Bill that would prohibit social media networks from discriminating against particular political views or removing "democratically important" content, such as comments opposing or supporting political parties and policies.[52]
Some critics of deplatforming have proposed that governments should treat social media as a public utility to ensure that constitutional rights of the users are protected, citing their belief that an Internet presence using social media websites is imperative in order to adequately take part in the 21st century as an individual.[53] Republican politicians have sought to weaken the protections established by Section 230 of the Communications Decency Act—which provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users—under allegations that the moderation policies of major social networks are not politically neutral.[54] [55] [56] [57]
See also: Paradox of tolerance and Defensive democracy. According to its defenders, deplatforming has been used as a tactic to prevent the spread of hate speech and disinformation. Social media has evolved into a significant source of news reporting for its users, and support for content moderation and banning of inflammatory posters has been defended as an editorial responsibility required by news outlets.[58]
Supporters of deplatforming have justified the action on the grounds that it produces the desired effect of reducing what they characterize as hate speech. Angelo Carusone, president of the progressive organization Media Matters for America, who had run deplatforming campaigns against conservative talk hosts Glenn Beck in 2010 and Rush Limbaugh in 2012, pointed to Twitter's 2016 ban of Milo Yiannopoulos, stating that "the result was that he lost a lot.... He lost his ability to be influential or at least to project a veneer of influence."
In the United States, the argument that deplatforming violates rights protected by the First Amendment is sometimes raised as a criticism. Proponents say that deplatforming is a legal way of dealing with controversial users online or in other digital spaces, so long as the government is not involved with causing the deplatforming. According to Audie Cornish, host of the NPR show Consider This, "the government can't silence your ability to say almost anything you want on a public street corner. But a private company can silence your ability to say whatever you want on a platform they created."[59]
In the words of technology journalist Declan McCullagh, "Silicon Valley's efforts to pull the plug on dissenting opinions" began around 2018 with Twitter, Facebook, and YouTube denying service to selected users of their platforms; he said they devised "excuses to suspend ideologically disfavored accounts". In 2019, McCullagh predicted that paying customers would become targets for deplatforming as well, citing protests and open letters by employees of Amazon, Microsoft, Salesforce, and Google who opposed policies of U.S. Immigration and Customs Enforcement (ICE), and who reportedly sought to influence their employers to deplatform the agency and its contractors.
Law professor Glenn Reynolds dubbed 2018 the "Year of Deplatforming" in an August 2018 article in The Wall Street Journal. Reynolds criticized the decision of "internet giants" to "slam the gates on a number of people and ideas they don't like", naming Alex Jones and Gavin McInnes. Reynolds cited further restrictions on "even mainstream conservative figures" such as Dennis Prager, as well as Facebook's blocking of a campaign advertisement by a Republican candidate "ostensibly because her video mentioned the Cambodian genocide, which her family survived."
In a 2019 article in The Atlantic, Conor Friedersdorf described what he called "standard practice" among student activists. He wrote: "Activists begin with social-media callouts; they urge authority figures to impose outcomes that they favor, without regard for overall student opinion; they try to marshal antidiscrimination law to limit freedom of expression." Friedersdorf pointed to evidence of a chilling effect on free speech and academic freedom. Of the faculty members he had contacted for interviews, he said a large majority "on both sides of the controversy insisted that their comments be kept off the record or anonymous. They feared openly participating in a debate about a major event at their institution—even after their university president put out an uncompromising statement in support of free speech."