Oversight Board | |
Purpose: | "… promot[ing] free expression by making principled, independent decisions … issuing recommendations on the relevant Facebook company content policy."[1] |
Leader Title: | Co-chairs |
The Oversight Board is a body that makes consequential precedent-setting content moderation decisions (see Table of decisions below) on the social media platforms Facebook and Instagram, in a form of "platform self-governance".[2]
Meta (then Facebook) CEO Mark Zuckerberg approved the creation of the board in November 2018, shortly after a meeting with Harvard Law School professor Noah Feldman, who had proposed the creation of a quasi-judiciary on Facebook.[3] Zuckerberg originally described it as a kind of "Supreme Court", given its role in settlement, negotiation, and mediation, including the power to override the company's decisions.[4]
Zuckerberg first announced the idea in November 2018, and, after a period of public consultation, the board's 20 founding members were announced in May 2020. The board officially began its work on October 22, 2020,[5] and issued its first five decisions on January 28, 2021, with four out of the five overturning Facebook's actions with respect to the matters appealed. It has been subject to substantial media speculation and coverage since its announcement, and has remained so following the referral of Facebook's decision to suspend Donald Trump after the 2021 United States Capitol attack.
In November 2018, after meeting with Harvard Law School professor Noah Feldman, who had proposed the creation of a quasi-judiciary on Facebook to oversee content moderation, CEO Mark Zuckerberg approved the creation of the board.[6] [7] [8] Among the board's goals were improving the fairness of the appeals process, providing outside oversight and accountability, and increasing transparency.[8] The board was modeled after the United States' federal judicial system, in that the Oversight Board gives precedential value to its previous decisions.[9]
Between late 2017 and early 2018, Facebook hired Brent C. Harris, who had previously worked on the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling and as an advisor to non-profits, as its Director of Global Affairs.[10] [3] [11] Harris led the effort to create the board, reporting to Nick Clegg, who reported directly to Zuckerberg.[12] Harris also credited Clegg's involvement, saying that efforts to establish the board "wouldn't have moved absent Nick's sponsorship", and that it was "stalled within the company until Nick really took it on".[13]
In January 2019, Facebook released a draft charter for the board[14] and began a period of public consultations and workshops with experts, institutions, and people around the world.[15] [16] In June 2019, Facebook released a 250-page report summarizing its findings and announced that it was looking for people to serve on a board of up to 40 members (the board ultimately launched with 20).[17]
In January 2020, it appointed British human rights expert and former Article 19 Executive Director Thomas Hughes as Director of Oversight Board Administration.[18] It also said that board members would be named "in the coming months".[19]
On May 6, 2020, Facebook announced the 20 members that would make up the Oversight Board.[20] Facebook's VP of Global Affairs and Communications Nick Clegg described the group as having a "wide range of views and experiences", collectively living in "over 27 countries" and speaking "at least 29 languages",[21] but a quarter of the group and two of the four co-chairs are from the United States, which drew concern from some free speech and internet governance experts.[20] In July 2020, it was announced that the board would not start work until "later in the year".[22] It started accepting cases on October 22, 2020.[5] Members of the board have noted that it will take several years for the full impact of the board and its decisions to be understood.[7] [23] The board officially began to cover cases related to Threads in May 2024.[24]
On January 28, 2021, the board ruled on five moderation decisions made by Facebook, overturning four of them and upholding one.[25] [7] [26] All but one were unanimous.[27] Each ruling was decided by a majority vote of a panel of five members of the board, including at least one member from the region where the moderated post originated.[7]
In October 2020, a Facebook user in Myanmar posted images of photographs taken by Turkish photojournalist Nilüfer Demir of the corpse of Kurdish Syrian toddler Alan Kurdi, accompanied by text in Burmese to the effect that there was "something wrong" with the psychology or the mindset of Muslims or Muslim men.[28] The text further contrasted terrorist attacks in France in response to depictions of Muhammad with an asserted relative silence by Muslims in response to the persecution of Uyghurs in China,[7] [28] and asserted that this conduct had led to a loss of sympathy for those like the child in the photograph.[28]
In reviewing Facebook's decision to remove the post, the board sought a re-translation of the post,[7] and noted that while the post could be read as an insult directed towards Muslims, it could also be read as commentary on a perceived inconsistency between Muslims' reactions to the events in France and to those in China.[7] [28]
A post showing churches in Baku, Azerbaijan was captioned with a statement in Russian that "asserted that Armenians had historical ties with Baku that Azerbaijanis didn't", referring to Azerbaijanis with the ethnic slur taziks. The board found that the post was harmful to the safety and dignity of Azerbaijanis, and therefore upheld its removal.[7]
In October 2020, a Brazilian woman posted a series of images on Facebook subsidiary Instagram including uncovered breasts with a visible nipple, as part of an international campaign to raise breast cancer awareness.[29] [28] The post asserted, in Portuguese text, that the photographs showed breast cancer symptoms, which the platform's automated review system failed to recognize.[7]
The images were removed and then later restored.[7] [28] Facebook asked that the review be dropped as moot, but the board chose to review the action nonetheless, finding that the importance of the issue made it more beneficial for the board to render a judgment on the underlying question.[7] The board further held that removal of the post was improper, as it impacted the human rights of women, and recommended improvements to the decision-making process for the removal of such posts.[7] In particular, the board recommended that users be informed of the use of automated content review mechanisms, that Instagram community standards be revised to expressly permit images with female nipples in breast cancer awareness posts, and that Facebook should clarify that its community standards take precedence over those of Instagram.[29]
In October 2020, a Facebook user posted a quote incorrectly attributed to Nazi propagandist Joseph Goebbels, stating that appeals to emotion and instinct are more important than appeals to truth.[7] The post contained no images or symbols. Facebook took down the post under its policy prohibiting the promotion of dangerous individuals and organizations, including Goebbels. The account user appealed, asserting that the post was intended as a commentary on Donald Trump. The board found that the evidence supported this assertion, held that the post did not indicate support for Goebbels, and ordered that it be restored, with the recommendation that Facebook should indicate to users posting about such persons that "the user must make clear that they are not praising or supporting them".[7]
In October 2020, a French user posted a French-language video in a Facebook group criticizing the Agence nationale de sécurité du médicament for its refusal to authorize hydroxychloroquine and azithromycin to treat COVID-19.[26] Facebook removed the post for spreading COVID-19 misinformation, a decision the board reversed, in part because the drugs mentioned are prescription-only in France, which would require individuals seeking them to interact with a physician. The board recommended that Facebook correct such misinformation rather than removing it.[7]
Although Facebook restored the post, it also noted that its approach to COVID-19 misinformation reflects the guidance of the U.S. Centers for Disease Control and Prevention and the World Health Organization, and that it would therefore not change its approach to such matters.[7]
On February 12, 2021, the Board overturned the removal of a Facebook forum post made in October 2020, containing an image of a TV character holding a sheathed sword, with Hindi text translated as stating "if the tongue of the kafir starts against the Prophet, then the sword should be taken out of the sheath", with hashtags equating French President Emmanuel Macron to the devil, and calling for a boycott of products from France. The board found that the post was not likely to cause harm.[30]
On April 13, 2021, the board upheld the removal of a Facebook post by a Dutch user containing a 17-second video of a child and three adults wearing traditional Dutch "Sinterklaas" costumes, including two white adults dressed as Zwarte Piet (Black Pete), with faces painted black and wearing Afro wigs. The board found that although the cultural tradition is not intentionally racist, the use of blackface is a common racist trope.[31]
Facebook's deplatforming of U.S. President Donald Trump was not among the initial decisions, as the board was still collecting public comments on the case.[32]
On January 6, 2021, amid the attack on the Capitol while Congress was counting the electoral votes, Trump posted a short video to social media in which he praised the rioters even while urging them to end the violence, and reiterated his baseless claim that the 2020 presidential election was fraudulent.[33] Several platforms, including Facebook, removed it, with Facebook's vice president of integrity, Guy Rosen, explaining that the video "contributes to rather than diminishes the risk of ongoing violence".[34] That day, Facebook also blocked Trump's ability to post new content; the next day, Facebook said the block would remain at least until the end of Trump's term on January 20.[35]
On April 16, 2021, the board announced that it was delaying the decision on whether to overturn Trump's suspensions on Facebook and Instagram to sometime "in the coming weeks" in order to review the more than 9,000 public comments it had received.[36] Notably, on January 27, 2021, incoming board member Suzanne Nossel had published an op-ed in the Los Angeles Times titled "Banning Trump from Facebook may feel good. Here's why it might be wrong",[37] but a spokesperson announced that she would not participate in the deliberations over the Trump case and would spend the upcoming weeks in training. On the same day Nossel's appointment was announced, the board also announced a new case.
On May 5, 2021, the board announced its decision to uphold Trump's account suspension, but instructed Facebook to reassess its decision to indefinitely ban Trump within six months.[38] The board noted that Facebook's standard procedures involve either a time-limited ban or a complete removal of the offending account, and stated that Facebook must follow a "clear, published procedure" in the matter.[39]
On June 4, 2021, Facebook announced that it had changed the indefinite ban to a two-year suspension, ending on January 7, 2023.[40] Trump's Facebook account was later reinstated in March 2023, with Meta saying the public should be allowed to hear from politicians, but that Trump would be subject to "heightened penalties" for repeated violations of its rules.[41]
In September 2021, the board announced it would review Facebook's internal XCheck system, which fully exempted some high-profile users from certain platform rules and, for other high-profile users, routed flagged posts into a separate review system and queue; the program covered around 5.8 million users.[42] The board's quarterly report, issued on October 21, 2021, stated that the company had not been transparent about the XCheck program and had not provided the board with complete information on which to base a review.[43] The board also noted that the company's lack of transparency with users about the reasons for content deletion was unfair.[44] In response, the company stated that it would aim for greater clarity in the future.
In October 2021, the board announced that it would be meeting with former Facebook employee and whistleblower, Frances Haugen, to discuss her statements about the company that she previously shared with The Wall Street Journal and United States Senate Commerce Committee's Sub-Committee on Consumer Protection, Product Safety, and Data Security.[45] [46]
As the Oversight Board is not a tribunal, court of law, or quasi-judicial body, it is not guided by enabling legislation created by any government. Instead, a corporate charter, bylaws, and a series of governing documents set out the scope and powers of the board. Opinions written by the board reference Meta's corporate human rights policy, which "voluntarily incorporates the United Nations Guiding Principles on Business and Human Rights, the International Bill of Human Rights, and numerous international human rights treaties".[47]
In order to ensure the board's independence, Facebook established an irrevocable trust with $130 million in initial funding, expected to cover operational costs for over half a decade.[48] [49] The board is able to hear appeals submitted by both Facebook and its users, and Facebook "will be required to respond publicly to any recommendations".[48] Notably, while the board's initial remit gave it broad scope to hear anything that can be appealed on Facebook, the company stated that it would need to build technical infrastructure before that scope could extend beyond appeals of content removals.[50] [51] The entire Oversight Board is overseen by the Oversight Board Trust, which has the power to confirm or remove board appointees, as well as ensure that the board is operating in accordance with its stated purpose.[48] [49]
In legal terms, the Oversight Board is incorporated as a Delaware-based LLC, with the Oversight Board Trust as its sole member.
Board members indicated that the board would begin its work slowly and deliberately, with a focus on producing meaningful opinions in cases carefully selected to be representative of substantial issues. Facebook also developed software to enable it to transfer cases to the board without compromising user privacy. On April 13, 2021, the Oversight Board announced that it would begin accepting appeals from users seeking the removal of other people's content that Facebook had declined to take down after an objection.[52]
The charter provides for future candidates to be nominated for board membership, through a recommendations portal operated by the U.S. law firm Baker McKenzie.[53]
The 20 members of the Oversight Board were announced on May 6, 2020.[54] The co-chairs, who selected the other members jointly with Facebook, are former U.S. federal circuit judge and religious freedom expert Michael McConnell, constitutional law expert Jamal Greene, Colombian attorney Catalina Botero-Marino, and former Danish Prime Minister Helle Thorning-Schmidt.[54] Among the initial cohort were former European Court of Human Rights judge András Sajó, Internet Sans Frontières Executive Director Julie Owono, Yemeni activist and Nobel Peace Prize laureate Tawakkol Karman, former editor-in-chief of The Guardian Alan Rusbridger, Pakistani digital rights advocate Nighat Dad, and Ronaldo Lemos, a lawyer who created the Brazilian Civil Rights Framework for the Internet law.[55]
On April 20, 2021, its newest board member, PEN America CEO Suzanne Nossel, was appointed to replace Pamela S. Karlan, who had resigned in February 2021 to join the Biden administration.[56] As of April 2021, the United States has the most substantial representation, with five members, including two of the four co-chairs of the board. Two board members come from South America, six come from countries across Asia, three come from Africa (including one with both African and European ties, who also counts among the three from Europe), and one comes from Australia.
Name | Country | Term | Details
---|---|---|---
Helle Thorning-Schmidt, Co-chair | Denmark | 2020–Present | Former Prime Minister of Denmark
Catalina Botero Marino, Co-chair | Colombia | 2020–Present | Lawyer
Michael W. McConnell, Co-chair | United States | 2020–Present | Former Judge of the U.S. Court of Appeals for the 10th Circuit
Evelyn Aswad, Co-chair | United States | 2020–Present | University of Oklahoma College of Law Professor
Afia Asantewaa Asare-Kyei | Ghana | 2020–Present | Human rights lawyer
Endy Bayuni | Indonesia | 2020–Present | Journalist
Katherine Chen | Taiwan | 2020–Present | Public relations and statistics professor at National Chengchi University
Nighat Dad | Pakistan | 2020–Present | Lawyer and internet activist
Tawakkol Karman | Yemen | 2020–Present | Journalist and human rights activist
Sudhir Krishnaswamy | India | 2020–Present | Vice-Chancellor of the National Law School of India University
Ronaldo Lemos | Brazil | 2020–Present | Lawyer and academic
Julie Owono | Cameroon, France | 2020–Present | Lawyer and executive director of Internet Sans Frontières
Emi Palmor | Israel | 2020–Present | Former Director General of Israeli Ministry of Justice
Alan Rusbridger | United Kingdom | 2020–Present | Journalist
András Sajó | Hungary | 2020–Present | Legal scholar and former judge of the European Court of Human Rights
John Samples | United States | 2020–Present | Vice President of the Cato Institute
Nicolas Suzor | Australia | 2020–Present | Queensland University of Technology Law Professor
Suzanne Nossel | United States | 2021–Present | CEO of PEN America
Khaled Mansour | Egypt | 2022–Present | Journalist
Pamela San Martín | Mexico | 2022–Present | Lawyer, former National Electoral Institute Councilor
Paolo Carozza | United States | 2022–Present | University of Notre Dame Law and Political Science Professor
Kenji Yoshino | United States | 2023–Present | New York University School of Law Professor of Constitutional Law
Name | Country | Term | Details
---|---|---|---
Pamela S. Karlan | United States | 2020–2021 | Stanford Law School Professor
Jamal Greene, Co-chair | United States | 2020–2023 | Columbia Law School Professor
Maina Kiai | Kenya | 2020–2023 | Lawyer and human rights activist
Name | Country | Term | Details
---|---|---|---
Stephen Neal, Chair | United States | 2021–Present | Chairman Emeritus and Senior Counsel at the law firm Cooley LLP, former Board Chairperson of Levi Strauss & Co.
 | | 2020–Present | Professor and former Dean of Yale Law School
 | | 2020–Present | Former Deputy Chief Justice of South Africa
Kristina Arriaga[57] | United States | 2020–Present | Former Vice-Chair of the U.S. Commission on International Religious Freedom
Cherine Chalaby[58] | | 2020–Present | Former Chairman of the Board of the Internet Corporation for Assigned Names & Numbers (ICANN)
Marie Wieck[59] | United States | 2022–Present | Former General Manager for Blockchain for IBM Industry Platform
Name | Country | Term | Details
---|---|---|---
Paul G. Haaga Jr., Inaugural Chairperson | United States | 2020–2021 | Former Chairman of the Capital Group
Wanda Felton[60] | United States | 2020–2021 | Former Vice-Chair of the Export–Import Bank of the United States
Date | Action reviewed | Board decision | Country | Policy area | Case ID
---|---|---|---|---|---
January 28, 2021 | Removal | n/a | Malaysia | Hate speech | 2020-001-FB-UA
January 28, 2021 | Removal | Overturn | Myanmar, France, China | Hate speech | 2020-002-FB-UA
January 28, 2021 | Removal | Uphold | Armenia, Azerbaijan | Hate speech | 2020-003-FB-UA
January 28, 2021 | Removal | Overturn | Brazil | Adult nudity and sexual activity | 2020-004-IG-UA
January 28, 2021 | Removal | Overturn | United States | Dangerous individuals and organizations | 2020-005-FB-UA
January 28, 2021 | Removal | Overturn | France | Violence and incitement | 2020-006-FB-FBR
February 12, 2021 | Removal | Overturn | France, India | Violence and incitement | 2020-007-FB-FBR
April 13, 2021 | Removal | Uphold | Netherlands | Hate speech | 2021-002-FB-UA
April 29, 2021 | Removal | Overturn | India | Dangerous individuals and organizations | 2021-003-FB-UA
May 5, 2021 | Account suspension | Uphold | United States | Dangerous individuals and organizations | 2021-001-FB-FBR
July 10, 2021 | Removal | Overturn | Russia | Bullying and harassment | 2021-004-FB-UA
See also: Real Facebook Oversight Board. Facebook's introduction of the Oversight Board elicited a variety of responses, with St. John's University law professor Kate Klonick describing its creation as an historic endeavor,[61] and technology news website The Verge deeming it "a wild new experiment in platform governance".[62] Politico described it as "an unapologetically globalist mix of academic experts, journalists and political figures".[13]
Even before the board made its first decisions, critics speculated that the board would be too strict, too lenient, or otherwise ineffective. In May 2020, Republican Senator Josh Hawley described the board as a "special censorship committee".[63] Other critics expressed doubts that it would be effective, leading to the creation of an unrelated and unaffiliated group of "vocal Facebook critics" calling itself the "Real Facebook Oversight Board".[62] Facebook issued no official comment on the effort, while Slate described it as "a citizen campaign against the board".[7]
Legal affairs blogger Evelyn Douek noted that the board's initial decisions "strike at matters fundamental to the way Facebook designs its content moderation system and clearly signal that the FOB does not intend to play mere occasional pitstop on Facebook's journey to connect the world".[63]