Disinformation Explained

Disinformation is false information deliberately spread to deceive people. It is an orchestrated adversarial activity in which actors employ strategic deception and media manipulation tactics to advance political, military, or commercial goals.[1]

In contrast, misinformation refers to inaccuracies that stem from inadvertent error.[2] Misinformation can become disinformation when it is knowingly and purposefully disseminated. "Fake news" has sometimes been categorized as a type of disinformation, but scholars have advised against using the two terms interchangeably, and against using "fake news" at all in academic writing, since politicians have weaponized it to describe any unfavorable news coverage or information.[3]

Etymology

The English word disinformation comes from the application of the Latin prefix dis- to information, giving the meaning "reversal or removal of information". The rarely used word had appeared in print with this sense at least as far back as 1887.[4] [5] [6] [7]

Some consider it a loan translation of the Russian дезинформация, transliterated as dezinformatsiya, apparently derived from the title of a KGB black propaganda department. Soviet planners in the 1950s defined disinformation as "dissemination (in the press, on the radio, etc.) of false reports intended to mislead public opinion."

Disinformation first appeared in dictionaries in 1985, specifically Webster's New College Dictionary and the American Heritage Dictionary. In 1986, the term was still absent from Webster's New World Thesaurus and the New Encyclopædia Britannica. After the Soviet term became widely known in the 1980s, native speakers of English broadened it to mean "any government communication (either overt or covert) containing intentionally false and misleading material, often combined selectively with true information, which seeks to mislead and manipulate either elites or a mass audience."

By 1990, use of the term disinformation had fully established itself in the English language within the lexicon of politics. By 2001, the term disinformation had come to be known as simply a more civil phrase for saying someone was lying. Stanley B. Cunningham wrote in his 2002 book The Idea of Propaganda that disinformation had become pervasively used as a synonym for propaganda.

Operationalization

The Shorenstein Center at Harvard University defines disinformation research as an academic field that studies "the spread and impacts of misinformation, disinformation, and media manipulation," including "how it spreads through online and offline channels, and why people are susceptible to believing bad information, and successful strategies for mitigating its impact".[8] According to a 2023 research article published in New Media & Society, disinformation circulates on social media through deception campaigns implemented in multiple ways, including astroturfing, conspiracy theories, clickbait, culture wars, echo chambers, hoaxes, fake news, propaganda, pseudoscience, and rumors.

To distinguish between similar terms, including misinformation and malinformation, scholars broadly agree on the following definitions: (1) disinformation is the strategic dissemination of false information with the intention to cause public harm;[9] (2) misinformation is the unintentional spread of false information; and (3) malinformation is factual information disseminated with the intention to cause harm.[10] [11] Collectively, these terms are abbreviated 'DMMI'.[12]

In 2019, Camille François devised the "ABC" framework for understanding different modalities of online disinformation: Actors, Behaviors, and Content.[13]

In 2020, the Brookings Institution proposed amending this framework to include Distribution, defined by the "technical protocols that enable, constrain, and shape user behavior in a virtual space".[14] Similarly, the Carnegie Endowment for International Peace proposed adding Degree ("distribution of the content ... and the audiences it reaches") and Effect ("how much of a threat a given case poses").[15]

Comparisons with propaganda

Whether and to what degree disinformation and propaganda overlap is subject to debate. Some (like the U.S. Department of State) define propaganda as the use of non-rational arguments to either advance or undermine a political ideal, and use disinformation as an alternative name for undermining propaganda, while others consider them separate concepts altogether. One popular distinction holds that disinformation also describes politically motivated messaging designed explicitly to engender public cynicism, uncertainty, apathy, distrust, and paranoia, all of which disincentivize citizen engagement and mobilization for social or political change.

Practice

Disinformation is the label often given to foreign information manipulation and interference (FIMI).[16] [17] Studies on disinformation are often concerned with the content of activity whereas the broader concept of FIMI is more concerned with the "behaviour of an actor" that is described through the military doctrine concept of tactics, techniques, and procedures (TTPs).

Disinformation is primarily carried out by government intelligence agencies, but has also been used by non-governmental organizations and businesses. Front groups are a form of disinformation, as they mislead the public about their true objectives and who their controllers are. Most recently, disinformation has been deliberately spread through social media in the form of "fake news": disinformation masked as legitimate news articles and meant to mislead readers or viewers.[18] Disinformation may include distribution of forged documents, manuscripts, and photographs, or spreading dangerous rumours and fabricated intelligence. Use of these tactics can lead to blowback, however, causing unintended consequences such as defamation lawsuits or damage to the disinformer's reputation.

Disinformation by intelligence agencies undermines public integrity and public trust.

Worldwide

American disinformation

The United States Intelligence Community appropriated the term disinformation in the 1950s from the Russian dezinformatsiya, and began to use similar strategies during the Cold War and in conflict with other nations. The New York Times reported in 2000 that during the CIA's effort to substitute Mohammad Reza Pahlavi for then-Prime Minister of Iran Mohammad Mossadegh, the CIA placed fictitious stories in the local newspaper. Reuters documented how, after the 1979 Soviet invasion of Afghanistan during the Soviet–Afghan War, the CIA put false articles in newspapers of Islamic-majority countries, inaccurately stating that Soviet embassies had "invasion day celebrations". Reuters noted that, according to a former U.S. intelligence officer, agents would attempt to gain the confidence of reporters and use them as secret agents to affect a nation's politics by way of their local media.

In October 1986, the term gained increased currency in the U.S. when it was revealed that two months previously, the Reagan Administration had engaged in a disinformation campaign against then-leader of Libya, Muammar Gaddafi. White House representative Larry Speakes said reports of a planned attack on Libya as first broken by The Wall Street Journal on August 25, 1986, were "authoritative", and other newspapers including The Washington Post then wrote articles saying this was factual. U.S. State Department representative Bernard Kalb resigned from his position in protest over the disinformation campaign, and said: "Faith in the word of America is the pulse beat of our democracy."

The executive branch of the Reagan administration kept watch on disinformation campaigns through three yearly publications by the Department of State: Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns (1986); Report on Active Measures and Propaganda, 1986–87 (1987); and Report on Active Measures and Propaganda, 1987–88 (1989).

According to a report by Reuters, the United States ran a propaganda campaign to spread disinformation about the Sinovac Chinese COVID-19 vaccine, including using fake social media accounts to spread the disinformation that the Sinovac vaccine contained pork-derived ingredients and was therefore haram under Islamic law.[19] Reuters said the ChinaAngVirus disinformation campaign was designed to "counter what it perceived as China’s growing influence in the Philippines" and was prompted by the "[fear] that China’s COVID diplomacy and propaganda could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing". The campaign was also described as "payback for Beijing's efforts to blame Washington for the pandemic".[20] The campaign primarily targeted people in the Philippines and used a social media hashtag for "China is the virus" in Tagalog. The campaign ran from 2020 to mid-2021. The primary contractor for the U.S. military on the project was General Dynamics IT, which received $493 million for its role.

Response

Responses from cultural leaders

Pope Francis condemned disinformation in a 2016 interview, after being made the subject of a fake news website during the 2016 U.S. election cycle which falsely claimed that he supported Donald Trump. He said the worst thing the news media could do was spread disinformation. He said the act was a sin, comparing those who spread disinformation to individuals who engage in coprophilia.

Ethics in warfare

In a contribution to the 2014 book Military Ethics and Emerging Technologies, writers David Danks and Joseph H. Danks discuss the ethical implications of using disinformation as a tactic during information warfare. They note there has been a significant degree of philosophical debate over the issue as it relates to the ethics of war and the use of the technique. The writers describe a position whereby the use of disinformation is occasionally allowed, but not in all situations. Typically the ethical test is whether the disinformation was performed out of a motivation of good faith and is acceptable according to the rules of war. By this test, the World War II tactic of placing fake inflatable tanks in visible locations on the Pacific Islands, in order to falsely create the impression of a larger military presence, would be considered ethically permissible. Conversely, disguising a munitions plant as a healthcare facility in order to avoid attack would be outside the bounds of acceptable use of disinformation during war.

Use of disinformation is inherently dishonest, as it intends to deceive, and therefore it is always ethically questionable to some degree even when used with the best of intentions. When discovered, disinformation undermines trust and thereby jeopardises the ability of parties to generate a lasting and just peace upon cessation of hostilities. Use of disinformation is a Machiavellian act of duplicity, usually based on the premise that the ends justify the means. However, ethically, such acts distort the truth or are outright falsehoods, and are therefore wrong. They also irresponsibly expose people to the risk of unintended consequences based on false information, in addition to the intended harms or deceptions that can be caused to the targets of disinformation.

Harms can also accrue to those that deploy disinformation, such as intelligence agencies, military commands, paramilitary bodies, and covert operations personnel, including moral injury such as a bad conscience, loss of perceived moral high ground, potential falsification of their own history resulting in dangerous delusions and ignorance, and the generation of long lasting ill will and recriminations against the disseminator by the target groups.

Crucially, there is the serious risk of blowback that can harm the disseminators upon discovery in the form of terrorism, mistrust, or punitive sanctions such as exclusion from international bodies. Accordingly, use of disinformation can undermine the long-term security of those who deploy it. Other risks and harms include the erosion of public life, respect for the truth, and mutual trust in society. Legal accountability for war crimes and provocations is essential under international law.

Research

Research related to disinformation studies is increasing as an applied area of inquiry.[21] [22] Advocates have called for disinformation to be formally classified as a cybersecurity threat, citing its spread on social networking sites.[23] Researchers at the University of Oxford found that over a three-year period the number of governments engaging in online disinformation rose from 28 in 2017, to 40 in 2018, and 70 in 2019. Despite the proliferation of social media websites, Facebook and Twitter showed the most activity in terms of active disinformation campaigns. Techniques reported included the use of bots to amplify hate speech, the illegal harvesting of data, and paid trolls to harass and threaten journalists.[24]

Whereas disinformation research focuses primarily on how actors orchestrate deceptions on social media, chiefly via fake news, newer research investigates how people take what started as deceptions and circulate them as their personal views.[25] This research shows that disinformation can be conceptualized as a program that encourages engagement in oppositional fantasies (i.e., culture wars), through which disinformation circulates as rhetorical ammunition for never-ending arguments. As disinformation entangles with culture wars, identity-driven controversies become a vehicle through which it spreads on social media. Disinformation thus thrives not despite raucous grudges but because of them: controversies provide fertile ground for never-ending debates that solidify points of view.

Scholars have pointed out that disinformation is not only a foreign threat, as domestic purveyors of disinformation are also leveraging traditional media outlets such as newspapers, radio stations, and television news media to disseminate false information.[26] Current research suggests right-wing online political activists in the United States may be more likely to use disinformation as a strategy and tactic.[27] Governments have responded with a wide range of policies to address concerns about the potential threats that disinformation poses to democracy; however, there is little agreement in elite policy discourse or the academic literature as to what it means for disinformation to threaten democracy, or how different policies might help to counter its negative implications.[28]

Consequences of exposure to disinformation online

There is broad consensus amongst scholars that a high degree of disinformation, misinformation, and propaganda exists online; however, it is unclear what effect such disinformation has on political attitudes in the public and, therefore, on political outcomes.[29] This conventional wisdom has come mostly from investigative journalists, with a particular rise during the 2016 U.S. election: some of the earliest work came from Craig Silverman at Buzzfeed News.[30] Cass Sunstein supported this in #Republic, arguing that the internet would become rife with echo chambers and informational cascades of misinformation leading to a highly polarized and ill-informed society.[31]

Research after the 2016 election found: (1) for 14 percent of Americans social media was their "most important" source of election news; (2) known false news stories "favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times"; (3) the average American adult saw fake news stories, "with just over half of those who recalled seeing them believing them"; and (4) people are more likely to "believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks."[32] Correspondingly, whilst there is wide agreement that the digital spread and uptake of disinformation during the 2016 election was massive and very likely facilitated by foreign agents, there is an ongoing debate on whether all this had any actual effect on the election. For example, a double-blind randomized controlled experiment by researchers from the London School of Economics (LSE) found that exposure to online fake news about either Trump or Clinton had no significant effect on intentions to vote for those candidates. Researchers who examined the influence of Russian disinformation on Twitter during the 2016 US presidential campaign found that exposure to disinformation was (1) concentrated among a tiny group of users, (2) primarily among Republicans, and (3) eclipsed by exposure to legitimate political news media and politicians. Finally, they find "no evidence of a meaningful relationship between exposure to the Russian foreign influence campaign and changes in attitudes, polarization, or voting behavior."[33] As such, despite its mass dissemination during the 2016 presidential election, online fake news or disinformation probably did not cost Hillary Clinton the votes needed to secure the presidency.[34]

Research on this topic is continuing, and some evidence is less clear. For example, internet access and time spent on social media does not appear correlated with polarisation.[35] Further, misinformation appears not to significantly change political knowledge of those exposed to it.[36] There seems to be a higher level of diversity of news sources that users are exposed to on Facebook and Twitter than conventional wisdom would dictate, as well as a higher frequency of cross-spectrum discussion.[37] [38] Other evidence has found that disinformation campaigns rarely succeed in altering the foreign policies of the targeted states.[39]

Research is also challenging because disinformation is meant to be difficult to detect and some social media companies have discouraged outside research efforts.[40] For example, researchers found disinformation made "existing detection algorithms from traditional news media ineffective or not applicable...[because disinformation] is intentionally written to mislead readers...[and] users' social engagements with fake news produce data that is big, incomplete, unstructured, and noisy." Facebook, the largest social media company, has been criticized by analytical journalists and scholars for preventing outside research of disinformation.[41] [42] [43] [44]

Alternative perspectives and critiques

Researchers have criticized the framing of disinformation as being limited to technology platforms, removed from its wider political context and inaccurately implying that the media landscape was otherwise well-functioning.[45] "The field possesses a simplistic understanding of the effects of media technologies; overemphasizes platforms and underemphasizes politics; focuses too much on the United States and Anglocentric analysis; has a shallow understanding of political culture and culture in general; lacks analysis of race, class, gender, and sexuality as well as status, inequality, social structure, and power; has a thin understanding of journalistic processes; and, has progressed more through the exigencies of grant funding than the development of theory and empirical findings."[46]

Alternative perspectives have been proposed:

  1. Moving beyond fact-checking and media literacy to study a pervasive phenomenon as something that involves more than news consumption.
  2. Moving beyond technical solutions including AI-enhanced fact checking to understand the systemic basis of disinformation.
  3. Developing theory that goes beyond Americentrism toward a global perspective, understanding cultural imperialism and Third World dependency on Western news,[47] and understanding disinformation in the Global South.[48]
  4. Developing market-oriented disinformation research that examines the financial incentives and business models that nudge content creators and digital platforms to circulate disinformation online.
  5. Including a multidisciplinary approach, involving history, political economy, ethnic studies, feminist studies, and science and technology studies.
  6. Developing understandings of gender-based disinformation (GBD), defined as "the dissemination of false or misleading information attacking women (especially political leaders, journalists and public figures), basing the attack on their identity as women."[49] [50]

Strategies for spreading disinformation

Disinformation attack

See main article: Disinformation attack.

The research literature on how disinformation spreads is growing. Studies show that disinformation spread on social media can be classified into two broad stages: seeding and echoing. "Seeding" is when malicious actors strategically insert deceptions, like fake news, into a social media ecosystem; "echoing" is when the audience disseminates the disinformation argumentatively as their own opinions, often by incorporating it into a confrontational fantasy.

Internet manipulation

Studies show four main methods of seeding disinformation online:

  1. Selective censorship
  2. Manipulation of search rankings
  3. Hacking and releasing
  4. Directly sharing disinformation


Notes and References

  1. Diaz Ruiz . Carlos . 2023 . Disinformation on digital media platforms: A market-shaping approach . New Media & Society . Online first . 1–24 . 10.1177/14614448231207644 . 264816011 . free .
  2. Web site: Ireton, C & Posetti, J (2018) "Journalism, fake news & disinformation: handbook for journalism education and training" UNESCO . 7 August 2021 . 6 April 2023 . https://web.archive.org/web/20230406163611/https://unesdoc.unesco.org/ark:/48223/pf0000265552 . live .
  3. Freelon . Deen . Wells . Chris . 2020-03-03 . Disinformation as Political Communication . Political Communication . en . 37 . 2 . 145–156 . 10.1080/10584609.2020.1723755 . 1058-4609 . 212897113 . 17 July 2023 . 17 July 2023 . https://web.archive.org/web/20230717173304/https://www.tandfonline.com/doi/full/10.1080/10584609.2020.1723755 . live .
  4. News: 1887-02-17 . City & County Cullings (Early use of the word "disinformation" 1887) . 3 . Medicine Lodge Cresset . 2021-05-24 . 24 May 2021 . https://web.archive.org/web/20210524135425/https://www.newspapers.com/clip/7726932/early-use-of-the-word-disinformation/ . live .
  5. News: 1892-08-18 . Professor Young on Mars and disinformation (1892) . 4 . The Salt Lake Herald . 2021-05-24 . 24 May 2021 . https://web.archive.org/web/20210524135429/https://www.newspapers.com/clip/7729235/professor-young-on-mars-and/ . live .
  6. News: 1907-09-26 . Pure nonsense (early use of the word disinformation) (1907) . 8 . The San Bernardino County Sun . 2021-05-24 . 24 May 2021 . https://web.archive.org/web/20210524135452/https://www.newspapers.com/clip/7729323/pure-nonsense-early-use-of-the-word/ . live .
  7. News: 1917-12-18 . Support for Red Cross helps U.S. boys abroad, Rotary Club is told (1917) . 4 . The Sheboygan Press . 2021-05-24 . 24 May 2021 . https://web.archive.org/web/20210524135431/https://www.newspapers.com/clip/7818737/support-for-red-cross-helps-us-boys/ . live .
  8. Web site: Disinformation . 2023-10-30 . Shorenstein Center . en-US . 30 October 2023 . https://web.archive.org/web/20231030110857/https://shorensteincenter.org/research-initiatives/disinformation/ . live .
  9. Center for Internet Security. (3 October 2022). "Essential Guide to Election Security: Managing Mis-, Dis-, and Malinformation". CIS website. Retrieved 18 December 2023.
  10. Baines . Darrin . Elliott . Robert J. R. . April 2020 . Defining misinformation, disinformation and malinformation: An urgent need for clarity during the COVID-19 infodemic . Discussion Papers . en . 14 December 2022 . 14 December 2022 . https://web.archive.org/web/20221214124131/https://ideas.repec.org//p/bir/birmec/20-06.html . live .
  11. Web site: Information disorder: Toward an interdisciplinary framework for research and policy making . 2022-12-14 . Council of Europe Publishing . en . 14 December 2022 . https://web.archive.org/web/20221214125635/https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html . live .
  12. Newman . Hadley . 16 September 2021 . Understanding the Differences Between Disinformation, Misinformation, Malinformation and Information – Presenting the DMMI Matrix . Draft Online Safety Bill (Joint Committee) . UK . UK Government . 4 January 2023 . 4 January 2023 . https://web.archive.org/web/20230104112358/https://committees.parliament.uk/writtenevidence/39289/html/ . live .
  13. Web site: François . Camille . 2019-09-20 . Actors, Behaviors, Content: A Disinformation ABC - Highlighting Three Vectors of Viral Deception to Guide Industry & Regulatory Responses . https://web.archive.org/web/20230321071912/https://docs.house.gov/meetings/SY/SY21/20190926/109980/HHRG-116-SY21-Wstate-FrancoisC-20190926-SD001.pdf . 2023-03-21 . 2024-05-17.
  14. Web site: Alaphilippe . Alexandre . 2020-04-27 . Adding a 'D' to the ABC disinformation framework . https://web.archive.org/web/20231027042531/https://www.brookings.edu/articles/adding-a-d-to-the-abc-disinformation-framework/ . 2023-10-27 . 2024-05-18 . . en-US.
  15. The ABCDE Framework . Pamment . James . 2020 . . 5–9 . https://web.archive.org/web/20240318053702/https://carnegieendowment.org/files/Pamment_-_Crafting_Disinformation_1.pdf . 2024-03-18.
  16. Book: Newman, Hadley . 2022 . Foreign information manipulation and interference defence standards: Test for rapid adoption of the common language and framework 'DISARM' . NATO Strategic Communications Centre of Excellence . 60 . PDF . Latvia . European Centre of Excellence for Countering Hybrid Threats . 978-952-7472-46-0 . 28 December 2022 . 28 December 2022 . https://web.archive.org/web/20221228165610/https://stratcomcoe.org/publications/foreign-information-manipulation-and-interference-defence-standards-test-for-rapid-adoption-of-the-common-language-and-framework-disarm-prepared-in-cooperation-with-hybrid-coe/253 . live .
  17. Web site: European External Action Service (EEAS) . 27 October 2021 . Tackling Disinformation, Foreign Information Manipulation & Interference .
  18. Tandoc. Edson C. Lim. Darren. Ling. Rich. 2019-08-07. Diffusion of disinformation: How social media users respond to fake news and why. Journalism. 21. 3. en. 381–398. 10.1177/1464884919868325. 202281476. 1464-8849.
  19. News: Bing . Chris . Schechtman . Joel . June 14, 2024 . Pentagon Ran Secret Anti-Vax Campaign to Undermine China during Pandemic . Reuters.
  20. Web site: Toropin . Konstantin . 2024-06-14 . Pentagon Stands by Secret Anti-Vaccination Disinformation Campaign in Philippines After Reuters Report . live . https://web.archive.org/web/20240614223757/https://www.military.com/daily-news/2024/06/14/pentagon-stands-secret-anti-vaccination-disinformation-campaign-philippines-after-reuters-report.html . 2024-06-14 . 2024-06-19 . . en.
  21. Web site: Defining "Disinformation", V1.0. Spies. Samuel. 2019-08-14. MediaWell, Social Science Research Council. en. 2019-11-09. 30 October 2020. https://web.archive.org/web/20201030094140/https://mediawell.ssrc.org/literature-reviews/defining-disinformation/versions/1-0/. live.
  22. Tandoc. Edson C.. 2019. The facts of fake news: A research review. Sociology Compass. en. 13. 9. e12724. 10.1111/soc4.12724. 201392983. 1751-9020.
  23. Book: Caramancion, Kevin Matthe. 2020 3rd International Conference on Information and Computer Technologies (ICICT) . An Exploration of Disinformation as a Cybersecurity Threat . 2020. 440–444. 10.1109/ICICT50521.2020.00076. 978-1-7281-7283-5. 218651389.
  24. Web site: Samantha Bradshaw & Philip N. Howard. (2019) The Global Disinformation Disorder: 2019 Global Inventory of Organised Social Media Manipulation. Working Paper 2019.2. Oxford, UK: Project on Computational Propaganda. comprop.oii.ox.ac.uk. 17 November 2022. 25 May 2022. https://web.archive.org/web/20220525105011/https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/09/CyberTroop-Report19.pdf. live.
  25. Diaz Ruiz . Carlos . Nilsson . Tomas . 16 May 2022 . Disinformation and Echo Chambers: How Disinformation Circulates in Social Media Through Identity-Driven Controversies . Journal of Public Policy & Marketing . 42 . 18–35 . 10.1177/07439156221103852 . 248934562 . 20 June 2022 . 20 June 2022 . https://web.archive.org/web/20220620070343/https://journals.sagepub.com/doi/10.1177/07439156221103852 . live .
  26. Miller . Michael L. . Vaccari . Cristian . July 2020 . Digital Threats to Democracy: Comparative Lessons and Possible Remedies . The International Journal of Press/Politics . en . 25 . 3 . 333–356 . 10.1177/1940161220922323 . 218962159 . 1940-1612 . 14 December 2022 . 14 December 2022 . https://web.archive.org/web/20221214125205/https://journals.sagepub.com/doi/10.1177/1940161220922323 . live .
  27. Freelon. Deen. Marwick. Alice. Kreiss. Daniel. 2020-09-04. False equivalencies: Online activism from left to right. Science. 369. 6508. 1197–1201. EN. 10.1126/science.abb2428. 32883863. 2020Sci...369.1197F. 221471947. 2 February 2022. 21 October 2021. https://web.archive.org/web/20211021142705/https://www.science.org/doi/abs/10.1126/science.abb2428. live.
  28. Tenove . Chris . July 2020 . Protecting Democracy from Disinformation: Normative Threats and Policy Responses . The International Journal of Press/Politics . en . 25 . 3 . 517–537 . 10.1177/1940161220918740 . 219437151 . 1940-1612 . 14 December 2022 . 14 December 2022 . https://web.archive.org/web/20221214125205/https://journals.sagepub.com/doi/10.1177/1940161220918740 . live .
  29. Tucker. Joshua. Guess. Andrew. Barbera. Pablo. Vaccari. Cristian. Siegel. Alexandra. Sanovich. Sergey. Stukal. Denis. Nyhan. Brendan. 2018. Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. SSRN Working Paper Series. 10.2139/ssrn.3144139. 1556-5068. 29 October 2019. 21 February 2021. https://web.archive.org/web/20210221202942/https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3144139. live.
  30. Web site: This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook. BuzzFeed News. 16 November 2016. 2019-10-29. 17 July 2018. https://web.archive.org/web/20180717155014/https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook. live.
  31. Book: Sunstein, Cass R.. #Republic: Divided Democracy in the Age of Social Media. 978-0691175515. Princeton. 958799819. 14 March 2017. registration.
  32. Allcott, Hunt; Gentzkow, Matthew (May 2017). "Social Media and Fake News in the 2016 Election". Journal of Economic Perspectives 31 (2): 211–236. doi:10.1257/jep.31.2.211.
  33. Eady, Gregory; Paskhalis, Tom; Zilinsky, Jan; Bonneau, Richard; Nagler, Jonathan; Tucker, Joshua A. (9 January 2023). "Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 US Election and its Relationship to Attitudes and Voting Behavior". Nature Communications 14: 62. doi:10.1038/s41467-022-35576-9.
  34. Leyva, Rodolfo (2020). "Testing and unpacking the effects of digital fake news: on presidential candidate evaluations and voter support". AI & Society 35 (4): 970. doi:10.1007/s00146-020-00980-6.
  35. Boxell, Levi; Gentzkow, Matthew; Shapiro, Jesse M. (3 October 2017). "Greater Internet use is not associated with faster growth in political polarization among US demographic groups". Proceedings of the National Academy of Sciences 114 (40): 10612–10617. doi:10.1073/pnas.1706588114.
  36. Allcott, Hunt; Gentzkow, Matthew (May 2017). "Social Media and Fake News in the 2016 Election". Journal of Economic Perspectives 31 (2): 211–236. doi:10.1257/jep.31.2.211.
  37. Bakshy, E.; Messing, S.; Adamic, L. A. (5 June 2015). "Exposure to ideologically diverse news and opinion on Facebook". Science 348 (6239): 1130–1132. doi:10.1126/science.aaa1160.
  38. Wojcieszak, Magdalena E.; Mutz, Diana C. (1 March 2009). "Online Groups and Political Discourse: Do Online Discussion Spaces Facilitate Exposure to Political Disagreement?". Journal of Communication 59 (1): 40–56. doi:10.1111/j.1460-2466.2008.01403.x.
  39. Lanoszka, Alexander (2019). "Disinformation in international politics". European Journal of International Security 4 (2): 227–248. doi:10.1017/eis.2019.6.
  40. Shu, Kai; Sliva, Amy; Wang, Suhang; Tang, Jiliang; Liu, Huan (1 September 2017). "Fake News Detection on Social Media: A Data Mining Perspective". ACM SIGKDD Explorations Newsletter 19 (1): 22–36. doi:10.1145/3137597.3137600. arXiv:1708.01967. Archived: https://web.archive.org/web/20220205204457/https://dl.acm.org/doi/10.1145/3137597.3137600.
  41. Edelson, Laura; McCoy, Damon. "How Facebook Hinders Misinformation Research". Scientific American. Archived: https://web.archive.org/web/20220202025821/https://www.scientificamerican.com/article/how-facebook-hinders-misinformation-research/.
  42. Edelson, Laura; McCoy, Damon (14 August 2021). "Facebook shut down our research into its role in spreading disinformation". The Guardian. Archived: https://web.archive.org/web/20220324171518/https://www.theguardian.com/technology/2021/aug/14/facebook-research-disinformation-politics.
  43. Krishnan, Nandita; Gu, Jiayan; Tromble, Rebekah; Abroms, Lorien C. (15 December 2021). "Research note: Examining how various social media platforms have responded to COVID-19 misinformation". Harvard Kennedy School Misinformation Review. doi:10.37016/mr-2020-85. Archived: https://web.archive.org/web/20220203040557/https://misinforeview.hks.harvard.edu/article/research-note-examining-how-various-social-media-platforms-have-responded-to-covid-19-misinformation/.
  44. "Only Facebook knows the extent of its misinformation problem. And it's not sharing, even with the White House". Washington Post. Archived: https://web.archive.org/web/20220205042625/https://www.washingtonpost.com/technology/2021/08/19/facebook-data-sharing-struggle/.
  45. Kuo, Rachel; Marwick, Alice (12 August 2021). "Critical disinformation studies: History, power, and politics". Harvard Kennedy School Misinformation Review. doi:10.37016/mr-2020-76. Archived: https://web.archive.org/web/20231015223538/https://misinforeview.hks.harvard.edu/article/critical-disinformation-studies-history-power-and-politics/.
  46. "What Comes After Disinformation Studies?". Center for Information, Technology, & Public Life (CITAP), University of North Carolina at Chapel Hill. Archived: https://web.archive.org/web/20230203095200/https://citap.unc.edu/ica-preconference-2022/.
  47. Tworek, Heidi (2 August 2022). "Can We Move Beyond Disinformation Studies?". Centre for International Governance Innovation. Archived: https://web.archive.org/web/20230601164258/https://www.cigionline.org/articles/can-we-move-beyond-disinformation-studies/.
  48. Wasserman, Herman; Madrid-Morales, Dani, eds. (2022). Disinformation in the Global South. Wiley. doi:10.1002/9781119714491. ISBN 978-1-119-71444-6. Archived: https://web.archive.org/web/20240304064723/https://onlinelibrary.wiley.com/doi/book/10.1002/9781119714491.
  49. Sessa, Maria Giovanna (4 December 2020). "Misogyny and Misinformation: An analysis of gendered disinformation tactics during the COVID-19 pandemic". EU DisinfoLab. Archived: https://web.archive.org/web/20230919005420/https://www.disinfo.eu/publications/misogyny-and-misinformation:-an-analysis-of-gendered-disinformation-tactics-during-the-covid-19-pandemic/.
  50. Sessa, Maria Giovanna (26 January 2022). "What is Gendered Disinformation?". Archived: https://web.archive.org/web/20220721155302/https://il.boell.org/en/2022/01/26/what-gendered-disinformation.