The basis of Noam Chomsky's linguistic theory lies in biolinguistics, the linguistic school that holds that the principles underpinning the structure of language are biologically preset in the human mind and hence genetically inherited. He argues that all humans share the same underlying linguistic structure, irrespective of sociocultural differences. In adopting this position, Chomsky rejects the radical behaviorist psychology of B. F. Skinner. Chomsky argues that language is a unique evolutionary development of the human species, distinct from the modes of communication used by any other animal species. Chomsky's nativist, internalist view of language is consistent with the philosophical school of "rationalism" and contrasts with the anti-nativist, externalist view of language consistent with the philosophical school of "empiricism", which contends that all knowledge, including language, comes from external stimuli.
See main article: Universal grammar. Since the 1960s, Chomsky has maintained that syntactic knowledge is at least partially inborn, implying that children need only learn certain language-specific features of their native languages. He bases his argument on observations about human language acquisition and describes a "poverty of the stimulus": an enormous gap between the linguistic stimuli to which children are exposed and the rich linguistic competence they attain. For example, although children are exposed to only a very small and finite subset of the allowable syntactic variants within their first language, they somehow acquire the highly organized and systematic ability to understand and produce an infinite number of sentences, including ones never before uttered, in that language. To explain this, Chomsky reasoned that the primary linguistic data must be supplemented by an innate linguistic capacity. Furthermore, while a human baby and a kitten are both capable of inductive reasoning, if they are exposed to exactly the same linguistic data, the human will always acquire the ability to understand and produce language, while the kitten will never acquire either ability. Chomsky referred to this difference in capacity as the language acquisition device, and suggested that linguists needed to determine both what that device is and what constraints it imposes on the range of possible human languages. The universal features that result from these constraints would constitute "universal grammar". Multiple scholars have challenged universal grammar on the grounds of the evolutionary infeasibility of its genetic basis for language, the lack of characteristics shared universally across languages, and the unproven link between innate or universal structures and the structures of specific languages. Michael Tomasello has challenged Chomsky's account of innate syntactic knowledge on the grounds that it is based on theory rather than on behavioral observation.
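The gap between finite input and unbounded competence can be illustrated with a toy recursive grammar: a handful of rewrite rules licenses arbitrarily many sentences, including ones the learner has never encountered. The sketch below is purely illustrative; its rules and vocabulary are invented for the example and are not drawn from Chomsky's own analyses.

```python
import random

# Toy context-free grammar (invented for illustration): a finite rule set
# whose recursive NP rule licenses unboundedly many distinct sentences.
GRAMMAR = {
    "S":         [["NP", "VP"]],
    "NP":        [["Det", "N"], ["Det", "N", "RelClause"]],  # recursion enters here
    "RelClause": [["that", "VP"]],
    "VP":        [["V", "NP"]],
    "Det":       [["the"]],
    "N":         [["cat"], ["dog"], ["mouse"]],
    "V":         [["saw"], ["chased"]],
}

def generate(symbol="S"):
    """Expand a symbol by randomly chosen rules until only words remain."""
    if symbol not in GRAMMAR:  # terminal word
        return [symbol]
    words = []
    for part in random.choice(GRAMMAR[symbol]):
        words.extend(generate(part))
    return words

# Each call may produce a sentence never generated before, e.g.
# "the cat chased the dog that saw the mouse that chased the cat"
print(" ".join(generate()))
```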
Although it was influential from the 1960s through the 1990s, Chomsky's nativist theory was ultimately rejected by the mainstream child language acquisition research community owing to its inconsistency with research evidence. Linguists including Geoffrey Sampson, Geoffrey K. Pullum, and Barbara Scholz also argued that the linguistic evidence Chomsky offered for it was false.
See main article: Transformational grammar, Generative grammar, Chomsky hierarchy and Minimalist program. Transformational-generative grammar is a broad theory used to model, encode, and deduce a native speaker's linguistic capabilities. These models, or "formal grammars", show the abstract structures of a specific language as they may relate to structures in other languages. Chomsky developed transformational grammar in the mid-1950s, whereupon it became the dominant syntactic theory in linguistics for two decades. "Transformations" refers to syntactic relationships within language, e.g., being able to infer that two related sentences, such as a statement and the corresponding question, have the same subject. Chomsky's theory posits that language consists of both deep structures and surface structures: outward-facing surface structures are turned into sound by phonetic rules, while inward-facing deep structures relate words to conceptual meaning. Transformational-generative grammar uses mathematical notation to express the rules that govern the connection between meaning and sound (deep and surface structures, respectively). By this theory, linguistic principles can mathematically generate potential sentence structures in a language.
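As a rough illustration of the idea (a sketch, not Chomsky's notation or formal machinery), the example below treats one underlying structure as the shared "deep" representation of a statement and the corresponding yes/no question, with the question's surface order produced by a reordering that stands in for a transformation such as subject-auxiliary inversion.

```python
# Toy illustration (not Chomsky's formalism): one underlying "deep" structure,
# two surface realizations related by a reordering transformation.
deep_structure = {
    "subject": "the student",
    "aux": "has",
    "verb": "read",
    "object": "the book",
}

def declarative(ds):
    # Surface order: subject - auxiliary - verb - object.
    return f"{ds['subject']} {ds['aux']} {ds['verb']} {ds['object']}"

def yes_no_question(ds):
    # Subject-auxiliary inversion: the auxiliary precedes the subject,
    # while the underlying relations (who read what) stay the same.
    return f"{ds['aux']} {ds['subject']} {ds['verb']} {ds['object']}?"

print(declarative(deep_structure))      # the student has read the book
print(yes_no_question(deep_structure))  # has the student read the book?
```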
Chomsky is commonly credited with inventing transformational-generative grammar, but his original contribution was considered modest when he first published his theory. In his 1955 dissertation and his 1957 textbook Syntactic Structures, he presented recent developments in the analysis formulated by Zellig Harris, who was Chomsky's PhD supervisor, and by Charles F. Hockett. Their method is derived from the work of the Danish structural linguist Louis Hjelmslev, who introduced algorithmic grammar to general linguistics. Based on this rule-based notation of grammars, Chomsky grouped logically possible phrase-structure grammars into four nested, increasingly complex types, together known as the Chomsky hierarchy. This classification remains relevant to formal language theory and theoretical computer science, especially programming language theory, compiler construction, and automata theory.
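In the hierarchy, the regular (Type-3) languages form a proper subset of the context-free (Type-2) languages, which in turn sit inside the context-sensitive (Type-1) and unrestricted (Type-0) classes. The following contrast is a minimal illustration, not an example from Chomsky's texts: a regular pattern can describe "any number of a's followed by any number of b's", but recognizing equal numbers of a's and b's requires the matched counting that finite automata lack.

```python
import re

# Regular (Type-3): recognizable by a finite automaton.
regular = re.compile(r"a*b*$")

def is_anbn(s):
    """Recognize the context-free language { a^n b^n } with a simple count check."""
    n = len(s) // 2
    return s == "a" * n + "b" * n

print(bool(regular.match("aaabb")))   # True: "aaabb" fits a*b*
print(is_anbn("aaabb"))               # False: three a's but only two b's
print(is_anbn("aaabbb"))              # True: counts match
```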
Transformational grammar was the dominant research paradigm through the mid-1970s. The derivative government and binding theory, proposed by Chomsky, replaced it and remained influential through the early 1990s. Government and binding centered on the principles and parameters framework, which explained children's ability to learn any language in terms of a set of universal grammar principles containing open parameters that are set as the child encounters linguistic data. The Minimalist Program, initiated by Chomsky in the 1990s, asks what the most elegant, natural, and simple form of principles and parameters theory would be. In an attempt to simplify language into a system that relates meaning and sound using the minimum possible faculties, Chomsky dispenses with concepts such as "deep structure" and "surface structure" and instead emphasizes the plasticity of the brain's neural circuits, with which come an infinite number of concepts, or "logical forms". When exposed to linguistic data, a hearer-speaker's brain associates sound and meaning, and the rules of grammar we observe are in fact only the consequences, or side effects, of the way language works. Thus, while much of Chomsky's prior research focused on the rules of language, he now focuses on the mechanisms the brain uses to generate these rules and regulate speech.
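The parameter-setting idea can be sketched with a toy example (an illustration only, not the actual formal machinery): a single binary "head-direction" parameter, fixed by exposure to linguistic data, determines whether a verb precedes or follows its object, while the underlying principle that a phrase consists of a head plus its complement stays constant.

```python
def build_vp(verb, obj, head_initial=True):
    """Combine a head (the verb) with its complement (the object).

    The combinatory principle is constant; only the head-direction
    parameter, set from linguistic experience, differs.
    """
    return f"{verb} {obj}" if head_initial else f"{obj} {verb}"

print(build_vp("read", "the book", head_initial=True))   # head-initial, English-like: "read the book"
print(build_vp("read", "the book", head_initial=False))  # head-final, Japanese-like: "the book read"
```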