In the philosophy of mind, the China brain thought experiment (also known as the Chinese Nation or Chinese Gym) considers what would happen if each member of the Chinese nation were asked to simulate the action of one neuron in the brain, using telephones or walkie-talkies to simulate the axons and dendrites that connect neurons. Would this arrangement have a mind or consciousness in the same way that brains do?
Early versions of this scenario were put forward in 1961 by Anatoly Dneprov,[1][2][3] in 1974 by Lawrence Davis,[4] and again in 1978 by Ned Block.[5] Block argues that the China brain would not have a mind, whereas Daniel Dennett argues that it would.[6] The China brain problem is a special case of the more general problem of whether minds could exist within other, larger minds.[7]
The Chinese room scenario analyzed by John Searle[8] is a similar thought experiment in the philosophy of mind that relates to artificial intelligence. Instead of people each modelling a single neuron of the brain, in the Chinese room a clerk who does not speak Chinese accepts notes written in Chinese and returns answers in Chinese according to a set of rules, without ever understanding what the notes mean. In fact, the original short story The Game (1961) by the Soviet physicist and writer Anatoly Dneprov contains both the China brain and the Chinese room scenarios. In the story, all 1,400 delegates of the Soviet Congress of Young Mathematicians willingly agree to take part in a "purely mathematical game" proposed by Professor Zarubin. The game requires the participants to execute a certain set of rules, communicating with each other using sentences composed only of the words "zero" and "one". After several hours of playing, the participants have no idea what is going on and grow progressively tired; a young woman becomes too dizzy and leaves the game just before it ends. On the next day, Professor Zarubin reveals, to everyone's excitement, that the participants were simulating a computer that translated a sentence from Portuguese, a language none of the participants understood ("Os maiores resultados são produzidos por pequenos mas contínuos esforços"), into Russian, a language all of them understood: "The greatest goals are achieved through minor but continuous ekkedt". The last word, which should have been "efforts", came out garbled because the young woman who had become dizzy had left the simulation.[1][2][3]
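The core mechanism of the story can be made concrete with a small sketch. The following Python fragment is purely illustrative and not taken from Dneprov's story (the gate assignments, names, and the half-adder task are invented here): each "delegate" applies only one private rule to the words "zero" and "one", and the group thereby carries out a computation that no individual participant needs to understand.

```python
RULES = {
    "delegate_XOR": lambda a, b: a ^ b,   # this delegate announces the sum bit
    "delegate_AND": lambda a, b: a & b,   # this delegate announces the carry bit
}

def words_to_bits(words):
    return [1 if word == "one" else 0 for word in words]

def bits_to_words(bits):
    return ["one" if bit else "zero" for bit in bits]

def play_game(spoken_inputs):
    """Each delegate hears the two input words, applies only their own rule,
    and speaks a result; nobody needs to know the overall computation
    (here, adding two one-bit numbers with a half adder)."""
    a, b = words_to_bits(spoken_inputs)
    sum_bit = RULES["delegate_XOR"](a, b)
    carry_bit = RULES["delegate_AND"](a, b)
    return bits_to_words([carry_bit, sum_bit])

print(play_game(["one", "one"]))   # ['one', 'zero'], i.e. 1 + 1 = binary 10
```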
Many theories of mental states are materialist; that is, they describe the mind as the behavior of a physical object such as the brain. One formerly prominent example is the identity theory, which says that mental states simply are brain states. One criticism of it is the problem of multiple realizability: the same mental state could, in principle, be realized by physically very different systems, so mental states cannot be straightforwardly identified with states of a particular kind of brain. The physicalist theory that responds to this problem is functionalism, which states that a mental state can be whatever functions as a mental state. That is, the mind can be composed of neurons, or it could be composed of wood, rocks or toilet paper, as long as the material provides the right mental functionality.
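The idea of multiple realizability can be illustrated with a short sketch. The following Python fragment is only an illustration and is not drawn from the cited literature: one toy functional role (turn a stimulus into a response and a next state) is "realized" by two very different substrates that are indistinguishable from the outside.

```python
from abc import ABC, abstractmethod

class FunctionalState(ABC):
    """A state defined only by its causal role: stimulus in, response and next state out."""

    @abstractmethod
    def step(self, stimulus):
        """Return (response, next_state) for a given stimulus."""

class NeuronRealizer(FunctionalState):
    """Realizes the role with a toy threshold 'neuron' that accumulates charge."""

    def __init__(self, charge=0):
        self.charge = charge

    def step(self, stimulus):
        charge = self.charge + stimulus
        if charge >= 3:                       # fire and reset
            return 1, NeuronRealizer(0)
        return 0, NeuronRealizer(charge)

class PeopleWithRadiosRealizer(FunctionalState):
    """Realizes the same role with people keeping a tally of radio messages."""

    def __init__(self, tally=()):
        self.tally = tuple(tally)

    def step(self, stimulus):
        tally = self.tally + (stimulus,)
        if sum(tally) >= 3:                   # the group 'fires' and starts over
            return 1, PeopleWithRadiosRealizer(())
        return 0, PeopleWithRadiosRealizer(tally)

def run(state, stimuli):
    """Drive any realizer through a sequence of stimuli and collect its responses."""
    responses = []
    for s in stimuli:
        response, state = state.step(s)
        responses.append(response)
    return responses

# Functionally equivalent: same input-output behavior, entirely different substrate.
assert run(NeuronRealizer(), [1, 1, 1, 1]) == run(PeopleWithRadiosRealizer(), [1, 1, 1, 1])
```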
Suppose that the whole nation of China were reordered to simulate the workings of a single brain (that is, to act as a mind according to functionalism). Each Chinese person acts as (say) a neuron and communicates with the other people by special two-way radio in a way that corresponds to the connections between neurons. The current mental state of the China brain is displayed on satellites that may be seen from anywhere in China. The China brain would then be connected via radio to a body, which provides the sensory inputs and behavioral outputs of the China brain.
Thus, the China brain possesses all the elements of a functional description of mind: sensory inputs, behavioral outputs, and internal mental states causally connected to other mental states. If the nation of China can be made to act in this way, then, according to functionalism, this system would have a mind. Block's goal is to show how counterintuitive it is to think that such an arrangement could create a mind capable of thoughts and feelings.
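Read computationally, the setup amounts to a message-passing simulation in which each person follows a rulebook for one "neuron". The sketch below is only an illustration, with invented wiring and thresholds rather than anything from Block's argument; it shows the three functional ingredients (sensory inputs, internal states, behavioral outputs) in miniature.

```python
class PersonAsNeuron:
    """One person following a rulebook: collect incoming radio signals, then
    'fire' (radio a 1) if the total meets a personal threshold."""

    def __init__(self, name, threshold):
        self.name = name
        self.threshold = threshold
        self.inbox = []          # the person's internal state between steps

    def receive(self, signal):
        self.inbox.append(signal)

    def act(self):
        fired = 1 if sum(self.inbox) >= self.threshold else 0
        self.inbox.clear()
        return fired

# A tiny two-layer "China brain": sensory inputs -> hidden people -> output person.
hidden = [PersonAsNeuron("person_A", 2), PersonAsNeuron("person_B", 1)]
output = PersonAsNeuron("person_C", 2)

def china_brain_step(sensory_inputs):
    """Sensory inputs are radioed to every hidden person; their signals go to
    the output person, whose firing counts as the system's behavioral output."""
    for person in hidden:
        for signal in sensory_inputs:
            person.receive(signal)
    for person in hidden:
        output.receive(person.act())
    return output.act()

print(china_brain_step([1, 0]))  # -> 0: weak stimulus, no behavioral response
print(china_brain_step([1, 1]))  # -> 1: stronger stimulus, the system 'responds'
```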
The China brain thought experiment is intended to show that consciousness poses a problem for functionalism. Block's Chinese nation presents a version of what is known as the absent qualia objection to functionalism, because it purports to show that something could be functionally equivalent to a human being and yet have no conscious experience. A creature that functions like a human being but does not feel anything is known as a "philosophical zombie", so the absent qualia objection to functionalism could also be called the "zombie objection".
Some philosophers, like Daniel Dennett, have concluded that the China brain does create a mental state.[6] Functionalist philosophers of mind endorse the idea that something like the China brain can realise a mind, and that neurons are, in principle, not the only material that can create a mental state.[9]