"Our Fragile Intellect" is a 2012 article by American biochemist Gerald Crabtree, published in the journal Trends in Genetics. Crabtree's speculative and controversial thesis argues that human intelligence peaked sometime between 2,000 and 6,000 years ago and has been in steady decline since the advent of agriculture and increasing urbanization. Modern humans, according to Crabtree, have been losing their intellectual and emotional abilities due to accumulating gene mutations that are not being selected against as they once were in our hunter-gatherer past.[1] [2] This theory is sometimes referred to as the "Idiocracy hypothesis".[3]
Crabtree argues that advances in modern science allow new predictions to be made about both the past and the future of humanity, and that we can now predict "that our intellectual and emotional abilities are genetically surprisingly fragile". Studies of genes on the X chromosome correlated with human intelligence indicate that normal intellectual and emotional function depends on roughly 10% of the genes on that chromosome. Intelligence-dependent (ID) genes appear to be distributed throughout the entire genome, and extrapolating this fraction genome-wide yields a figure of between 2,000 and 5,000 genes underlying our cognitive abilities. Deleterious mutations in these genes can impair normal intellectual and emotional functioning in humans. On this view, in just the last 120 generations (about 3,000 years), humans have received two or more harmful mutations to these genes, or one every 20-50 generations.[4][5] Crabtree adds that he values our society's supportive institutions and wishes they could be extended to more of the population.

The data supporting the theory that our intellectual abilities are particularly susceptible to the accumulation of mutations begin with measurements of the human intergenerational mutation rate. In several human populations this rate has been determined to be about 1.20 × 10⁻⁸ per position per haploid genome,[6][7][8][9] at an average paternal age of 29.7 years. The rate doubles with every additional 16.5 years of the father's age, and most new mutations are ascribed to the father during the production of sperm.[10] Contrary to popular opinion, this means the biological clock, in terms of the accumulation of deleterious mutations over time, ticks faster for men than for women. A rate of 1.20 × 10⁻⁸ mutations per nucleotide per generation predicts that about 45 to 60 new mutations will appear in each generation. These mutations may accumulate or be removed by natural selection.

The speculation that the nervous system and brain would be more sensitive than other cell types and organs to the accumulation of these new mutations rests on estimates of the fraction of genes necessary for normal development of the nervous system. The data quantifying the number of genes required for normal intellectual development come from thousands of published studies (about 23,000 on PubMed, from the National Library of Medicine) in which scientists have identified a mutated gene or region of DNA associated with, or causing, human intellectual disability. These genes need not even be expressed in the brain: the phenylalanine hydroxylase gene, for example, is expressed primarily in the liver, yet its mutation leads to severe intellectual disability through the accumulation of metabolites.[11][12] Many of these genes operate like links in a chain rather than as a robust network, underscoring the fragility of our intellectual abilities. For example, a mutation of a single nucleotide, out of the roughly 3 billion in the human genome, in one copy of the ARID1B gene is a common cause of intellectual disability.[13] The total number of genes that, when mutated, can give rise to intellectual disability is thought to be several thousand, perhaps 10-20% of all human genes, presenting a very large target for random mutation. In addition, neuronal genes tend to be large,[14][15] further increasing the size of the genomic target region for random mutations.
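The quantitative core of the argument is simple arithmetic, and it can be reproduced in a few lines. The sketch below is a rough illustration, not code from the paper: the size of the callable diploid genome is an assumed parameter, chosen so that the product matches the quoted 45-60 range, and the example paternal age is arbitrary. It multiplies the per-nucleotide mutation rate by the genome size, applies the paternal-age doubling, and converts the one-hit-per-20-50-generations figure into the "two or more" harmful mutations over 120 generations.

```python
# Back-of-the-envelope reproduction of the figures quoted above.
# Parameters come from the article's text except where flagged as assumptions.

MUTATION_RATE = 1.20e-8      # new mutations per position per generation
CALLABLE_DIPLOID_BP = 5.0e9  # assumption: ~2.5 Gb of callable sequence per haploid copy

# Expected de novo mutations per child at the average paternal age of 29.7 years:
new_mutations = MUTATION_RATE * CALLABLE_DIPLOID_BP
print(f"new mutations per generation: ~{new_mutations:.0f}")  # ~60, within the quoted 45-60

# The rate doubles with every additional 16.5 years of the father's age:
def mutations_at_paternal_age(age, base_age=29.7, doubling_years=16.5):
    return new_mutations * 2 ** ((age - base_age) / doubling_years)

print(f"expected at paternal age 46.2: ~{mutations_at_paternal_age(46.2):.0f}")  # ~120

# At one deleterious hit to an ID gene every 20-50 generations, the 120
# generations (~3,000 years) Crabtree considers yield "two or more" hits:
for gens_per_hit in (20, 50):
    print(f"hits in 120 generations at 1 per {gens_per_hit}: {120 / gens_per_hit:.1f}")
```

Note that the one-hit-per-20-50-generations figure already folds in the most uncertain quantity in the argument: the fraction of de novo mutations that actually land in, and damage, an ID gene.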
The simple combination of the number and size of genes required for normal brain development (>1,000) with the fact that each new human generation carries 45-60 new mutations per genome led Crabtree to suggest that our intellectual abilities are particularly genetically fragile over many generations. Seemingly the only practical implication of the theory is that men should perhaps have their children while young, and that women should prefer younger men as mates.
Several counterarguments have been raised. The Flynn effect, for example, shows an apparent worldwide increase in IQ since 1930. Crabtree attributes this rise to advances in environmental and public health measures as well as improved education and other factors, arguing that the Flynn effect reflects not an increase in intelligence but more intelligent test taking.[4][16]
Kevin Mitchell, associate professor at the Smurfit Institute of Genetics at Trinity College Dublin, agreed that genetic mutations could harm the development of the brain in humans and diminish intelligence, and that new mutations would appear with each new generation. However, Mitchell criticizes Crabtree for failing to acknowledge the role of natural selection, which, according to Mitchell, "definitely has the ability to weed out new mutations that significantly impair intellectual ability". Mitchell describes Crabtree's argument as a conceptual fallacy and says Crabtree is "thinking about things in a wrong way".[2]
Biologist Steve Jones, Emeritus Professor of Genetics at University College London, questioned the journal's decision to publish the paper, calling the study "a classic case of Arts Faculty science. Never mind the hypothesis, give me the data, and there aren't any".[17] Crabtree acknowledges that such data do not exist, because a slow genetic deterioration in intelligence cannot be detected by comparisons among people living today. Instead, he argues, he is synthesizing existing data into a purely mathematical argument that estimates the number of new mutations likely to produce cognitive deficits in future generations.[18]
Anthropologist Robin Dunbar at Oxford University disputes Crabtree's assumption that brain size was driven by tool use, arguing instead that the social environment drives intelligence. "In reality what has driven human and primate brain evolution is the complexity of our social world", says Dunbar. "That complex world is not going to go away. Doing things like deciding who to have as a mate or how best to rear your children will be with us forever."[19]
Writer Andrew Brown notes that Crabtree's paper represents a familiar, recurring notion in both fiction and evolutionary biology. "The idea that civilized man is a degenerate and self-domesticated variation on the wild type is partly a cultural trope, a result of the anxieties of industrialized life", writes Brown. The idea, Brown observes, was popular in the early 20th-century fiction of E. M. Forster ("The Machine Stops") and Jack London (The Scarlet Plague). It could also be found in the work of biologists such as Ronald Fisher, who espoused similar concepts in The Genetical Theory of Natural Selection (1930). The most important parts of Fisher's book, Brown writes, expound on the theme that "civilization is dreadfully threatened by the way the lower classes outbreed the aristocracy." Brown finds related sentiments expressed in the work of W. D. Hamilton, who believed that the "life-saving efforts of modern medicine" threatened the human genome.[20]