TESCREAL is an acronym neologism proposed by computer scientist Timnit Gebru and philosopher Émile P. Torres that stands for "transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism".[1][2] Gebru and Torres argue that these ideologies should be treated as an "interconnected and overlapping" group with shared origins. They say this is a movement that allows its proponents to use the threat of human extinction to justify expensive or detrimental projects. They consider it pervasive in social and academic circles in Silicon Valley centered around artificial intelligence.[3] As such, the acronym is sometimes used to criticize a perceived belief system associated with Big Tech.[4][5]
Gebru and Torres coined "TESCREAL" in 2023, first using it in a draft of a paper titled "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence". First Monday published the paper in April 2024, though Torres and Gebru popularized the term elsewhere before the paper's publication. According to Gebru and Torres, transhumanism, extropianism, singularitarianism, (modern) cosmism, rationalism, effective altruism, and longtermism are a "bundle" of "interconnected and overlapping ideologies" that emerged from 20th-century eugenics, with shared progenitors. They use the term "TESCREAList" to refer to people who subscribe to, or appear to endorse, most or all of the ideologies captured in the acronym.
According to critics of these philosophies, TESCREAL describes overlapping movements endorsed by prominent people in the tech industry to provide intellectual backing for pursuing and prioritizing projects including artificial general intelligence (AGI), life extension, and space colonization.[6] Science fiction author Charles Stross, using the example of space colonization, argued that the ideologies allow billionaires to justify massive personal projects, driven by a right-wing interpretation of science fiction, by claiming that failure to pursue such projects poses an existential risk to society.[7] Gebru and Torres write that, using the threat of extinction, TESCREALists can justify "attempts to build unscoped systems which are inherently unsafe". Media scholar Ethan Zuckerman argues that because only goals valued by the TESCREAL movement are considered, futuristic projects can be justified despite more immediate drawbacks such as racial inequity, algorithmic bias, and environmental degradation.[8] Speaking on Radio New Zealand, politics writer Danyl McLauchlan said that many of these philosophies may have started off with good intentions but might have been pushed "to a point of ridiculousness."
Philosopher Yogi Hale Hendlin has argued that by both ignoring the human causes of societal problems and over-engineering solutions, TESCREALists ignore the context in which many problems arise.[9] Camille Sojit Pejcha wrote in Document Journal that TESCREAL is a tool for tech elites to concentrate power. In The Washington Spectator, Dave Troy called TESCREAL an "ends justifies the means" movement that is antithetical to "democratic, inclusive, fair, patient, and just governance". Gil Duran wrote that "TESCREAL", "authoritarian technocracy", and "techno-optimism" were phrases used in early 2024 to describe a new ideology emerging in the tech industry.[10]
Gebru, Torres, and others have likened TESCREAL to a secular religion due to its parallels to Christian theology and eschatology.[11][12] Writers in Current Affairs compared these philosophies and the ensuing techno-optimism to "any other monomaniacal faith... in which doubters are seen as enemies and beliefs are accepted without evidence". They argue pursuing TESCREAL would prevent an actual equitable shared future.[13]
Much of the discourse about existential risk from AGI occurs among supporters of the TESCREAL ideologies.[14][15] TESCREALists are considered either "AI accelerationists", who see AI as the only way to achieve a utopian future in which humanity's problems are solved, or "AI doomers", who consider AI likely to be unaligned with human survival and likely to cause human extinction. Despite this risk, many doomers consider the development of AGI inevitable and argue that existential risk can be averted only by developing and aligning AGI first.
Gebru has likened the conflict between accelerationists and doomers to a "secular religion selling AGI-enabled utopia and apocalypse". Torres and Gebru argue that both groups use hypothetical AI-driven apocalypses and utopian futures to justify unlimited research, development, and deregulation of technology. They allege that, by considering only far-reaching future consequences, creating hype for unproven technology, and fear-mongering, TESCREALists distract from the impacts of technology that may adversely affect society, disproportionately harm minorities through algorithmic bias, and damage the environment.
Neşe Devenot has used the TESCREAL acronym to refer to "global financial and tech elites" who promote new uses of psychedelic drugs as mental health treatments, not because they want to help people, but so that they can make money on the sale of these pharmaceuticals as part of a plan to increase inequality.[16]
Gebru and Torres claim that TESCREAL ideologies directly originate from 20th-century eugenics and that the bundle of ideologies advocates a second wave of new eugenics.[17] Others have similarly argued that the TESCREAL ideologies developed from earlier philosophies that were used to justify mass murder and genocide.[18] Some prominent figures who have contributed to TESCREAL ideologies have been alleged to be racist and sexist.[19][20] McLauchlan has said that, while "some people in these groups want to genetically engineer superintelligent humans, or replace the entire species with a superior form of intelligence" others "like the effective altruists, for example, most of them are just in it to help very poor people ... they are kind of shocked ... that they've been lumped into this malevolent ... eugenics conspiracy".
Ozy Brennan, writing in a magazine affiliated with the Centre for Effective Altruism, criticized Gebru and Torres's grouping of different philosophies as if they were a "monolithic" movement. Brennan argues that Torres has misunderstood these philosophies and taken philosophical thought experiments out of context.[21] James Pethokoukis, of the American Enterprise Institute, disagrees with the criticism of TESCREAL proponents, arguing that the tech billionaires accused in a Scientific American article of espousing TESCREAL have significantly advanced society.[22] McLauchlan has noted that critics of the TESCREAL bundle object to what they see as disparate and sometimes conflicting ideologies being grouped together, but opines that TESCREAL is a good way to describe and consolidate many of the "grand bizarre ideologies in Silicon Valley". Eli Sennesh and James Hughes, writing in the blog of the transhumanist Institute for Ethics and Emerging Technologies, have argued that TESCREAL is a left-wing conspiracy theory that unnecessarily groups disparate philosophies together without understanding the mutually exclusive tenets of each.[23]
According to Torres, "If advanced technologies continue to be developed at the current rate, a global-scale catastrophe is almost certainly a matter of when rather than if." Torres believes that "perhaps the only way to actually attain a state of 'existential security' is to slow down or completely halt further technological innovation", and criticized the longtermist view that technology, although dangerous, is essential for human civilization to achieve its full potential.[24] Brennan contends that Torres's proposal to slow or halt technological development is a more extreme position than the TESCREAL ideologies themselves, as it would forestall many improvements in quality of life, healthcare, and poverty reduction that technological progress enables.
Venture capitalist Marc Andreessen has self-identified as a TESCREAList.[25] He published the "Techno-Optimist Manifesto" in October 2023, which Jag Bhalla and Nathan J. Robinson have called a "perfect example" of the TESCREAL ideologies. In the document, he argues that more advanced artificial intelligence could save countless future potential lives, and that those working to slow or prevent its development should be condemned as murderers.
Elon Musk has been described as sympathetic to some TESCREAL ideologies. In August 2022, Musk tweeted that William MacAskill's longtermist book What We Owe the Future was a "close match for my philosophy".[26] Some writers believe Musk's Neuralink pursues TESCREAList goals.[27] Some AI experts have criticized the focus of Musk's xAI company on existential risk, arguing that it and other AI companies have ties to TESCREAL movements.[28][29] Dave Troy believes Musk's natalist views originate from TESCREAL ideals.
It has also been suggested that Peter Thiel is sympathetic to TESCREAL ideas.[30] Benjamin Svetkey wrote in The Hollywood Reporter that Thiel and other Silicon Valley CEOs who support the Donald Trump 2024 presidential campaign are pushing for policies that would shut down "regulators whose outdated restrictions on things like human experimentation are slowing down progress toward a technotopian paradise".
Sam Altman and much of the OpenAI board have been described as supporting TESCREAL movements, especially in the context of Altman's attempted firing in 2023.[31][12] Gebru and Torres have urged Altman not to pursue TESCREAL ideals.[32] Lorraine Redaud, writing in Charlie Hebdo, described Altman and multiple other Silicon Valley executives as supporting TESCREAL ideals.
Self-identified transhumanists Nick Bostrom and Eliezer Yudkowsky, both influential in discussions of existential risk from AI, have also been described as leaders of the TESCREAL movement. Redaud said Bostrom supported some ideals "in line with the TESCREALists movement".
Sam Bankman-Fried, former CEO of the FTX cryptocurrency exchange, was a prominent and self-identified member of the effective altruist community.[33] According to The Guardian, since FTX's collapse, administrators of the bankruptcy estate have been trying to recoup about $5 million that they allege was transferred to a nonprofit to help secure the purchase of a historic hotel that has been repurposed for conferences and workshops associated with longtermism, rationalism, and effective altruism. The property hosted liberal eugenicists and other speakers whom The Guardian said had racist and misogynistic histories.
Longtermist and effective altruist William MacAskill, who frequently collaborated with Bankman-Fried to coordinate philanthropic initiatives, has been described as a TESCREAList.