Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG.[1] The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued languages are so diverse that such universality is rare,[2] and the theory of universal grammar remains controversial among linguists.[3]
The term "universal grammar" is a placeholder for whichever domain-specific features of linguistic competence turn out to be innate. Within generative grammar, it is generally accepted that there must be some such features, and one of the goals of generative research is to formulate and test hypotheses about which aspects those are.[4][5] In day-to-day generative research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.[4]
The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments.[6] [7] For example, one famous poverty of the stimulus argument concerns the acquisition of yes-no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are.[6] [7] [8]
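The contrast between the two candidate rules can be made concrete with a minimal sketch (not drawn from the cited literature; the function names and sentence are illustrative). It applies a purely linear rule and a structure-sensitive rule to a sentence whose subject contains a relative clause:

```python
# Contrast a linear rule ("front the first auxiliary in the word string")
# with a structure-dependent rule ("front the auxiliary of the main clause")
# on "the man who is tall is happy", which contains two occurrences of "is".

sentence = ["the", "man", "who", "is", "tall", "is", "happy"]

def linear_rule(words):
    """Front the first 'is' in linear order (the rule children never try)."""
    w = words.copy()
    i = w.index("is")
    return [w.pop(i)] + w

def structural_rule(words):
    """Front the main-clause 'is' (here the last one, because the relative
    clause 'who is tall' is embedded inside the subject)."""
    w = words.copy()
    i = len(w) - 1 - w[::-1].index("is")
    return [w.pop(i)] + w

print(" ".join(linear_rule(sentence)))      # ungrammatical: *is the man who tall is happy
print(" ".join(structural_rule(sentence)))  # grammatical: is the man who is tall happy
```

The linear rule produces the ungrammatical string, yet children acquiring English are not attested making this error, which is the observation the argument builds on.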
Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature-specifications in the lexicon.[5] [9] On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.[5] [10] In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.[11]
In an article entitled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where they have a universal grammar.
The first hypothesis states that the faculty of language in the broad sense (FLb) is strictly homologous to animal communication. This means that homologous aspects of the faculty of language exist in non-human animals.
The second hypothesis states that the FLb is a derived and uniquely human adaptation for language. This hypothesis holds that individual traits were subject to natural selection and came to be specialized for humans.
The third hypothesis states that only the faculty of language in the narrow sense (FLn) is unique to humans. It holds that while mechanisms of the FLb are present in both human and non-human animals, the computational mechanism of recursion has evolved recently, and solely in humans. This hypothesis aligns most closely with the typical theory of universal grammar championed by Chomsky.
The existence of creole languages is sometimes cited as further support for this theory, most notably in Bickerton's controversial language bioprogram theory. Creoles are languages that develop when disparate societies with no common language come together and are forced to devise a new system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items, known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole. Unlike pidgins, creoles have native speakers (those who acquire the language from early childhood) and make use of a full, systematic grammar.
According to Bickerton, the idea of universal grammar is supported by creole languages because certain features are shared by virtually all languages in the category. For example, their default point of reference in time (expressed by bare verb stems) is not the present moment but the past. They uniformly express tense, aspect, and mood using pre-verbal auxiliaries. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish). Another similarity among creoles is that questions are created simply by changing the intonation of a declarative sentence, not its word order or content.
However, extensive work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar at all. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. They found that children tend to ignore minor variations in the input when those variations are infrequent, and reproduce only the most frequent forms. In doing so, they tend to standardize the language that they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin-development situation (and in the real-life situation of a deaf child whose parents are or were disfluent signers), children systematize the language they hear based on the probability and frequency of forms, rather than on the basis of an innate universal grammar.[12][13] Further, it seems to follow that creoles would share features with the languages from which they are derived, and thus look similar in terms of grammar.
Many researchers of universal grammar argue against relexification, the hypothesis that a language replaces its lexicon almost entirely with that of another. An account along these lines would attribute a creole's grammar to its source languages rather than to an innate universal grammar.
Recent work has also suggested that some recurrent neural network architectures are able to learn hierarchical structure without an explicit constraint, indicating that it may in fact be possible for human infants to acquire natural language syntax without an explicit universal grammar.[14]
The empirical basis of poverty of the stimulus arguments has been challenged by Geoffrey Pullum and others, leading to back-and-forth debate in the language acquisition literature.[15] [16]
Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of that expectation serves as a form of implicit negative feedback, allowing them to correct their errors over time; for example, overgeneralizations such as goed are gradually corrected to went through this repeated failure.[17]
In addition, it has been suggested that people learn probabilistic patterns of word distributions in their language, rather than hard and fast rules (see Distributional hypothesis).[18] For example, children overgeneralize the past-tense marker "-ed" and conjugate irregular verbs as if they were regular, producing forms like goed and eated, and correct these deviations over time.[19] It has also been proposed that the poverty of the stimulus problem can be largely avoided if it is assumed that children employ similarity-based generalization strategies in language learning, generalizing about the usage of new words from similar words that they already know how to use.[20]
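The overgeneralization pattern described above can be sketched in a few lines (an illustrative toy model, not a claim about any published implementation; the lexicon and function names are invented for the example). A child-like rule that treats every verb as regular reproduces exactly the attested errors, while the adult grammar checks an irregular lexicon first:

```python
# Toy model of past-tense overgeneralization: applying the regular "-ed"
# rule to every verb yields child forms like "goed" and "eated".

IRREGULAR = {"go": "went", "eat": "ate"}  # tiny illustrative lexicon

def overgeneralized_past(verb):
    """Child-like stage: treat every verb as regular."""
    return verb + "ed"

def adult_past(verb):
    """Adult grammar: look up irregulars first, else apply '-ed'."""
    return IRREGULAR.get(verb, verb + "ed")

print(overgeneralized_past("go"))   # goed  (attested child error)
print(adult_past("go"))             # went
print(adult_past("walk"))           # walked
```

On usage-based accounts, development consists of gradually moving high-frequency irregular forms out of the reach of the regular rule, driven by the statistics of the input.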
Neurogeneticists Simon Fisher and Sonja Vernes consider Chomsky's "Universal Grammar" an example of a romantic simplification of genetics and neuroscience. According to them, the link from genes to grammar has not been consistently mapped by scientists; what has been established by research relates primarily to speech pathologies. The resulting lack of certainty has provided an audience for unconstrained speculation that has fed the myth of "so-called grammar genes".[21]
Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages, rather than predictions about what is possible in a language.[22][23] Similarly, Jeffrey Elman argues that the unlearnability of languages assumed by universal grammar is based on a too-strict, "worst-case" model of grammar that is not in keeping with any actual grammar. In keeping with these points, James Hurford argues that the postulate of a language acquisition device (LAD) essentially amounts to the trivial claim that languages are learnt by humans, and thus that the LAD is less a theory than an explanandum looking for theories.[24]
Morten H. Christiansen and Nick Chater have argued that the relatively fast-changing nature of language would prevent the slower-changing genetic structures from ever catching up, undermining the possibility of a genetically hard-wired universal grammar. Instead of an innate universal grammar, they claim, "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics".[25]
Wolfram Hinzen has also summarized the most common criticisms of universal grammar.
In the domain of field research, Daniel Everett has claimed that the Pirahã language is a counterexample to the basic tenets of universal grammar because it lacks clausal embedding. According to Everett, this trait results from Pirahã culture emphasizing present-moment concrete matters.[27] Other linguists have responded that Pirahã does in fact have clausal embedding, and that even if it did not this would be irrelevant to current theories of universal grammar.[28]
The modern conception of universal grammar is generally attributed to Noam Chomsky. However, similar ideas are found in older work. A related idea is found in Roger Bacon's Overview of Grammar and Greek Grammar, where he postulates that all languages are built upon a common grammar, even though it may undergo incidental variations. In the 13th century, the speculative grammarians postulated universal rules underlying all grammars.
The concept of a universal grammar or language was at the core of the 17th-century projects for philosophical languages. An influential work of that time was Grammaire générale by Claude Lancelot and Antoine Arnauld, who attempted to describe a general grammar for languages and concluded that grammar has to be universal.[29] A Scottish school of universal grammarians, distinct from the philosophical language project, existed in the 18th century and included authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith. The article on grammar in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar".
In the late 19th and early 20th centuries, Wilhelm Wundt and Otto Jespersen responded to these earlier arguments, contending that their view of language was overly influenced by Latin and ignored the breadth of worldwide linguistic variation. Jespersen did not fully dispense with the idea of a "universal grammar", but reduced it to universal syntactic categories or super-categories, such as number and tense.
During the rise of behaviorism, the idea of a universal grammar was discarded in favor of the idea that language acquisition, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success.[30] In other words, children learned their mother tongue by simple imitation, through listening and repeating what adults said. For example, when a child says "milk" and the mother smiles and gives her child milk as a result, the child finds this outcome rewarding, which enhances the child's language development.[31]
In 2016, Chomsky and Berwick co-wrote the book Why Only Us, in which they defined both the minimalist program and the strong minimalist thesis and their implications for UG theory. According to Berwick and Chomsky, the strong minimalist thesis states that "The optimal situation would be that UG reduces to the simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is ... called the Strong Minimalist Thesis (SMT)."[32] The significance of the SMT is to shift the previous emphasis on universal grammar to the operation that Chomsky and Berwick call "merge". They define it in their 2016 book: "Every computational system has embedded within it somewhere an operation that applies to two objects X and Y already formed, and constructs from them a new object Z. Call this operation Merge." The SMT dictates that "Merge will be as simple as possible: it will not modify X or Y or impose any arrangement on them; in particular, it will leave them unordered, an important fact... Merge is therefore just set formation: Merge of X and Y yields the set {X, Y}."[33]
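Since the quoted definition reduces Merge to set formation, it can be rendered directly as a one-line sketch (an illustrative formalization, not code from the source; the example syntactic objects are invented). An unordered set captures the claim that Merge imposes no linear order, and feeding outputs back into Merge yields hierarchical structure:

```python
# Minimal sketch of Merge as set formation: Merge(X, Y) builds the
# unordered set {X, Y}, without modifying X or Y or ordering them.

def merge(x, y):
    """Combine two already-formed syntactic objects into a new one."""
    return frozenset({x, y})

# Merge applies to its own outputs, so hierarchy comes for free:
vp = merge("read", "books")   # {read, books}
tp = merge("T", vp)           # {T, {read, books}}

# Unordered: the result is the same set regardless of argument order.
assert merge("read", "books") == merge("books", "read")
```

The use of frozenset (rather than a list or tuple) reflects the quoted point that Merge "will leave them unordered": set equality is insensitive to the order in which the two objects were supplied.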