Minimalist program

In linguistics, the minimalist program is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky.[1]

Following Imre Lakatos's distinction, Chomsky presents minimalism as a program, understood as a mode of inquiry that provides a conceptual framework which guides the development of linguistic theory. As such, it is characterized by a broad and diverse range of research directions. For Chomsky, there are two basic minimalist questions—What is language? and Why does it have the properties it has?—but the answers to these two questions can be framed in any theory.[2]

Conceptual framework

See also: Biolinguistics.

Goals and assumptions

Minimalism is an approach developed with the goal of understanding the nature of language. It models a speaker's knowledge of language as a computational system with one basic operation, namely Merge. Merge combines expressions taken from the lexicon in a successive fashion to generate representations that characterize I-Language, understood to be the internalized intensional knowledge state as represented in individual speakers. By hypothesis, I-language—also called universal grammar—corresponds to the initial state of the human language faculty in individual human development.

Minimalism is reductive in that it aims to identify which aspects of human language—as well as the computational system that underlies it—are conceptually necessary. This is sometimes framed as questions relating to perfect design (Is the design of human language perfect?) and optimal computation (Is the computational system for human language optimal?). According to Chomsky, a human natural language is not optimal when judged based on how it functions, since it often contains ambiguities, garden paths, etc. However, it may be optimal for interaction with the systems that are internal to the mind.[3]

Such questions are informed by a set of background assumptions, some of which date back to the earliest stages of generative grammar:[4]

  1. Language is a form of cognition. There is a language faculty (FL) that interacts with other cognitive systems; this accounts for why humans acquire language.
  2. Language is a computational system. The language faculty consists of a computational system (CHL) whose initial state (S0) contains invariant principles and parameters.
  3. Language acquisition consists of acquiring a lexicon and fixing the parameter values of the target language.
  4. Language generates an infinite set of expressions, each given as a sound-meaning pair (π, λ).
  5. Syntactic computation interfaces with phonology: π corresponds to phonetic form (PF), the interface with the articulatory-perceptual (A-P) performance system, which includes articulatory speech production and acoustic speech perception.
  6. Syntactic computation interfaces with semantics: λ corresponds to logical form (LF), the interface with the conceptual-intentional (C-I) performance system, which includes conceptual structure and intentionality.
  7. Syntactic computations are fully interpreted at the relevant interface: (π, λ) are interpreted at the PF and LF interfaces as instructions to the A-P and C-I performance systems.
  8. Some aspects of language are invariant. In particular, the computational system (i.e. syntax) and LF are invariant.
  9. Some aspects of language show variation. In particular, variation reduces to Saussurean arbitrariness, parameters and the mapping to PF.
  10. The theory of grammar meets the criterion of conceptual necessity; this is the Strong Minimalist Thesis introduced by Chomsky (2001).[5] Consequently, language is an optimal association of sound with meaning; the language faculty satisfies only the interface conditions imposed by the A-P and C-I performance systems; PF and LF are the only linguistic levels.

Strong minimalist thesis

Minimalism develops the idea that human language ability is optimal in its design and exquisite in its organization, and that its inner workings conform to a very simple computation. On this view, universal grammar instantiates a perfect design in the sense that it contains only what is necessary. Minimalism further develops the notion of economy, which came to the fore in the early 1990s, though it had remained peripheral to earlier transformational grammar. Economy of derivation requires that movements (i.e., transformations) occur only if necessary, and specifically to satisfy feature-checking, whereby an interpretable feature is matched with a corresponding uninterpretable feature. (See discussion of feature-checking below.) Economy of representation requires that grammatical structures exist for a purpose. The structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.

Within minimalism, economy—recast in terms of the strong minimalist thesis (SMT)—has acquired increased importance.[6] The 2016 book entitled Why Only Us—co-authored by Noam Chomsky and Robert Berwick—defines the strong minimalist thesis as follows:

Under the strong minimalist thesis, language is a product of genetically inherited traits, developmentally enhanced through intersubjective communication and social exposure to individual languages (amongst other things). This reduces to a minimum the "innate"—that is, genetically inherited—component of the language faculty, a component which has been criticized over many decades and which is separate from the developmental component.

Intrinsic to the syntactic model (e.g. the Y/T-model) is the fact that social and other factors play no role in the computation that takes place in narrow syntax—what Chomsky, Hauser and Fitch refer to as the faculty of language in the narrow sense (FLN), as distinct from the faculty of language in the broad sense (FLB). Thus, narrow syntax concerns itself only with interface requirements, also called legibility conditions. The SMT can be restated as follows: syntax, narrowly defined, is a product of the requirements of the interfaces and nothing else. This is what is meant by "Language is an optimal solution to legibility conditions" (Chomsky 2001:96).

Interface requirements force the deletion of features that are uninterpretable at a particular interface, a necessary consequence of Full Interpretation. A PF object must consist only of features that are interpretable at the articulatory-perceptual (A-P) interface; likewise, an LF object must consist only of features that are interpretable at the conceptual-intentional (C-I) interface. The presence of an uninterpretable feature at either interface causes the derivation to crash.

Narrow syntax proceeds as a set of operations—Merge, Move and Agree—carried out upon a numeration (a selection of features, words etc., from the lexicon) with the sole aim of removing all uninterpretable features before being sent via Spell-Out to the A-P and C-I interfaces. The result of these operations is a hierarchical syntactic structure that captures the relationships between the component features.
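The convergence condition just described can be sketched as a toy check at Spell-Out. The sketch below is illustrative only: the function name `spell_out` and the feature inventory are hypothetical, standing in for whatever uninterpretable features (e.g. unvalued Case or φ-features) a given analysis assumes.

```python
# Hypothetical inventory of uninterpretable features; any of these
# surviving at an interface makes the derivation "crash".
UNINTERPRETABLE = {"uCase", "uPhi", "EPP"}

def spell_out(features):
    """Return 'converge' if no uninterpretable feature survives, else 'crash'."""
    leftover = set(features) & UNINTERPRETABLE
    return "crash" if leftover else "converge"

# A derivation that has checked off all uninterpretable features converges;
# one that reaches the interface with an unchecked uCase feature crashes.
assert spell_out({"past", "3sg"}) == "converge"
assert spell_out({"past", "uCase"}) == "crash"
```

The design point is simply that legibility is a filter: narrow syntax may build anything, but only feature bundles fully interpretable at the interface survive.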

Technical innovations

The exploration of minimalist questions has led to several radical changes in the technical apparatus of transformational generative grammatical theory. Some of the most important are:[7]

Basic operations

Early versions of minimalism posit two basic operations: Merge and Move. Earlier theories of grammar—as well as early minimalist analyses—treat phrasal and movement dependencies differently than current minimalist analyses do. In the latter, Merge and Move are different outputs of a single operation: Merge of two syntactic objects (SOs) is called "external Merge", while Move is an instance of "internal Merge", involving the re-merge of an already merged SO with another SO.[8] How Move should be formulated remains a matter of active debate, but the differences between current proposals are relatively minor.

More recent versions of minimalism recognize three operations: Merge (i.e. external Merge), Move (i.e. internal Merge), and Agree. The emergence of Agree as a basic operation is related to the mechanism which forces movement, which is mediated by feature-checking.

Merge

See also: Merge (linguistics).

In its original formulation, Merge is a function that takes two objects (α and β) and merges them into an unordered set with a label, either α or β. More recent treatments also consider the possibility that the derived syntactic object is unlabelled; this is called "simple Merge" (see the Label section).

In the version of Merge which generates a label, the label identifies the properties of the phrase. Merge will always occur between two syntactic objects: a head and a non-head.[9] For example, Merge can combine the two lexical items drink and water to generate drink water. In the Minimalist Program, the phrase is identified with a label. In the case of drink water, the label is drink since the phrase acts as a verb. This can be represented in a typical syntax tree as follows, with the name of the derived syntactic object (SO) determined either by the lexical item (LI) itself, or by the category label of the LI:

- Merge (drink, water) → Merge (drinkV, waterN) → {drinkV, {drinkV, waterN}}
Merge can operate on already-built structures; in other words, it is a recursive operation. If Merge were not recursive, only two-word utterances would be predicted to be grammatical. (This is relevant for child language acquisition, where children are observed to go through a so-called "two-word" stage; this is discussed below in the implications section.) As illustrated in the accompanying tree structure, if a new head (here γ) is merged with a previously formed syntactic object (here, a phrase), the function has the form Merge (γ, {α, {α, β}}) → {γ, {γ, {α, {α, β}}}}. Here, γ is the head, so the output label of the derived syntactic object is γ.
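The schema above can be sketched as a small function. This is only an illustration of the set-formation idea, not an implementation of the theory: Python sets cannot nest, so frozensets stand in for the unordered sets, and the label is carried as the first element of a pair purely for convenience.

```python
def merge(head, non_head):
    """External Merge: form {head, {head, non_head}}, labelled by the head.

    Returned as (label, frozenset) because Python sets cannot contain sets.
    """
    return (head, frozenset({head, non_head}))

# Merge(drink, water) → {drink, {drink, water}}, labelled "drink":
# the phrase acts as a verb, so the verbal head projects.
vp = merge("drink", "water")
assert vp[0] == "drink"

# Recursion: a new head merges with the already-built object,
# Merge(γ, {α, {α, β}}) → {γ, {γ, {α, {α, β}}}}
tp = merge("γ", vp)
assert tp[0] == "γ" and vp in tp[1]
```

Because the output of `merge` is a legal input to `merge`, the operation applies without bound, mirroring the recursion that rules out a "two-word-only" grammar.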

Chomsky's earlier work defines each lexical item as a syntactic object that is associated with both categorial features and selectional features.[10] Features—more precisely, formal features—participate in feature-checking, which takes as input two expressions that share the same feature and checks them off against each other in a certain domain.[11] In some but not all versions of minimalism, projection of selectional features proceeds via feature-checking, as required by locality of selection:[12][13][14]

Selection as projection: In the bare phrase structure tree for the sentence The girl ate the food, a notable feature is the absence of distinct labels (see Labels below). The selectional features of a lexical item determine how it participates in Merge:

Feature-checking: When a feature is "checked", it is removed.

Locality of selection (LOS) is a principle that forces selectional features to participate in feature-checking. LOS states that a selected element must combine with the head that selects it either as complement or specifier. Selection is local in the sense that there is a maximum distance that can occur between a head and what it selects: selection must be satisfied within the projection of the head.
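The interaction of selection and feature-checking can be sketched as a constraint enforced at each Merge step. The representation below (feature dictionaries, the name `check_and_merge`) is hypothetical; it only illustrates "checked means removed": the head's matched selectional feature is deleted when selection is satisfied within its projection.

```python
def check_and_merge(head, dependent):
    """Merge head with dependent only if the head selects the dependent's
    category; the matched selectional feature is checked, i.e. removed."""
    if dependent["category"] not in head["selects"]:
        raise ValueError(f"{head['label']} does not select {dependent['category']}")
    remaining = list(head["selects"])
    remaining.remove(dependent["category"])   # feature checked -> deleted
    return {"label": head["label"], "category": head["category"],
            "selects": remaining, "daughters": (head, dependent)}

drink = {"label": "drink", "category": "V", "selects": ["N"]}
water = {"label": "water", "category": "N", "selects": []}

vp = check_and_merge(drink, water)   # selection satisfied locally, under Merge
assert vp["label"] == "drink" and vp["selects"] == []
```

Note that locality falls out of the architecture: a selectional feature can only ever be checked against the sister that Merge supplies, never against a distant element.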

Move

Move arises via "internal Merge".

Movement as feature-checking: The original formulation of the extended projection principle states that clauses must contain a subject in the specifier position of TP/IP.[15] In the tree above, the EPP feature—shown as a subscript on the T head—is a strong feature which forces re-Merge (also called internal Merge) of the DP the girl: T needs a subject in its specifier position, so the DP the girl moves to the specifier position of T.
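This EPP-driven movement can be sketched as re-Merge of material already inside the structure. The dictionary representation and the `epp` flag below are illustrative conveniences, not part of the theory.

```python
def internal_merge(tp):
    """Move as internal Merge: if T bears an unchecked strong EPP feature,
    re-merge the DP already inside the structure as Spec,T."""
    if not tp.get("epp"):
        return tp                        # nothing forces movement
    subject_dp = tp["vp"]["subject"]     # the DP merged lower down
    return {"spec": subject_dp,          # DP re-merged in specifier of T
            "head": "T",
            "vp": tp["vp"],              # the lower structure stays in place
            "epp": False}                # the strong feature is checked off

tp = {"head": "T", "epp": True, "vp": {"subject": "the girl", "verb": "ate"}}
moved = internal_merge(tp)
assert moved["spec"] == "the girl" and moved["epp"] is False
```

Once the EPP feature is checked, a second application of `internal_merge` changes nothing, mirroring economy of derivation: movement occurs only if some unchecked feature forces it.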

Beyond basic operations

Label

A substantial body of literature in the minimalist tradition focuses on how a phrase receives a proper label.[16] The debate about labeling reflects the deeper aspiration of the minimalist program, which is to remove all redundant elements in favour of the simplest possible analysis.[17] While earlier proposals focus on how to distinguish adjunction from substitution via labeling, more recent proposals attempt to eliminate labeling altogether, though these have not been universally accepted.

Adjunction and substitution: Chomsky's 1995 monograph The Minimalist Program outlines two methods of forming structure: adjunction and substitution. The standard properties of segments, categories, adjuncts, and specifiers are easily constructed. In the general form of a structured tree for adjunction and substitution, α is an adjunct to X, or α is substituted into the SPEC, X position. α can raise to target the Xmax position, building a new position that is either adjoined to [Y-X] or is SPEC, X; the position aimed at is termed the 'target'. At the bottom of the tree, the minimal domain includes SPEC Y and Z, along with the new position formed by the raising of α, which is either contained within Z or is Z.[18]

Adjunction: Before the introduction of bare phrase structure, adjuncts did not alter information about bar-level or category, or the head of the target (the adjoined-to structure).[19] An example of adjunction using X-bar theory notation is given below for the sentence Luna bought the purse yesterday. Observe that the adverbial modifier yesterday is sister to VP and dominated by VP; thus, the addition of the modifier does not change information about the bar-level—in this case the maximal projection VP. In the minimalist program, adjuncts are argued to exhibit a different, perhaps more simplified, structure. Chomsky (1995) proposes that adjunction of α to S, where S projects, forms a two-segment category whose label is L = <H(S), H(S)>, where H(S) is the head of S. This label L is not itself a term in the structure that is formed: it is not identical to the head, but is derived from it in an irrelevant way. The head is what projects, so it can itself be the label or can determine the label.

Substitution, by contrast, forms a new category consisting of a head (H), which supplies the label, and the element being projected. Ambiguities may arise if the raising element α contains the entire head and the head is also Xmax.

In the newer account developed in bare phrase structure, the properties of the head are no longer preserved in adjunction structures: the XP to which an adjunct attaches is non-maximal following adjunction, as illustrated in the figure for adjunction in BPS. Such an account extends to XPs involved in multiple adjunction.[20]

Notes and References

  1. Chomsky, Noam. 1993. "A minimalist program for linguistic theory." MIT Occasional Papers in Linguistics 1. Cambridge, MA: Distributed by MIT Working Papers in Linguistics.
  2. Boeckx, Cedric. 2006. Linguistic Minimalism: Origins, Concepts, Methods and Aims. Oxford: Oxford University Press. pp. 84 and 115.
  3. Chomsky, Noam. 2002. "An interview on minimalism." In Adriana Belletti and Luigi Rizzi (eds.), On Nature and Language. Cambridge University Press. doi:10.1017/cbo9780511613876. ISBN 9780521815482.
  4. Freidin, Robert and Howard Lasnik. 2011. "Some roots of minimalism in generative grammar." In The Oxford Handbook of Linguistic Minimalism. Oxford: Oxford University Press. doi:10.1093/oxfordhb/9780199549368.013.0001.
  5. Chomsky, Noam. 2001. "Beyond explanatory adequacy." MIT Working Papers in Linguistics 20: 1–22.
  6. For a full description of the checking mechanism, see Adger, David. 2003. Core Syntax: A Minimalist Approach. Oxford: Oxford University Press; and Carnie, Andrew. 2006. Syntax: A Generative Introduction, 2nd edition. Blackwell Publishers.
  7. For some conceptual and empirical advantages of the MP over the traditional view, see Bošković, Željko. 1994. "D-Structure, Θ-Criterion, and movement into Θ-positions." Linguistic Analysis 24: 247–286; for more detailed discussion, see Bošković, Željko and Howard Lasnik (eds.). 2006. Minimalist Syntax: The Essential Readings. Malden, MA: Blackwell.
  8. Hornstein, Norbert. 2018. "The Minimalist Program after 25 years." Annual Review of Linguistics 4: 49–65. doi:10.1146/annurev-linguistics-011817-045452.
  9. Fukui, Naoki. 2017. Merge in the Mind-Brain: Essays on Theoretical Linguistics and the Neuroscience of Language. New York: Routledge. doi:10.4324/9781315442808-2. ISBN 978-1-315-44280-8.
  10. Chomsky, Noam (with Howard Lasnik). 2015. The Minimalist Program. 20th anniversary edition. Cambridge, MA: MIT Press. ISBN 978-0-262-32728-2.
  11. Baltin, Mark R. and Chris Collins (eds.). 2007. The Handbook of Contemporary Syntactic Theory. Chichester: John Wiley & Sons. p. 357. ISBN 978-0-470-75635-5.
  12. Sportiche, Dominique, Hilda Koopman and Edward Stabler. 2013. An Introduction to Syntactic Analysis and Theory. Hoboken: Wiley-Blackwell. ISBN 978-1-118-47048-0.
  13. Bagchi, Tista. 2007. "On theta role assignment by feature checking." In Eric Reuland, Tanmoy Bhattacharya and Giorgos Spathas (eds.), Argument Structure. Amsterdam: John Benjamins. pp. 159–174. ISBN 978-90-272-3372-1.
  14. Zeijlstra, Hedde. 2020. "Labeling, selection, and feature checking." In Peter W. Smith, Johannes Mursell and Katharina Hartmann (eds.), Agree to Agree: Agreement in the Minimalist Programme. Berlin: Language Science Press. pp. 31–70. doi:10.5281/zenodo.3541745. ISBN 9783961102143.
  15. Fukui, Naoki. 2001. "Phrase structure." In The Handbook of Contemporary Syntactic Theory. Oxford: Blackwell Publishers. pp. 374–408. doi:10.1002/9780470756416.ch12.
  16. Epstein, Samuel David and T. Daniel Seely (eds.). 2002. Derivation and Explanation in the Minimalist Program. John Wiley & Sons. doi:10.1002/9780470755662.
  17. Lowe, John and Joseph Lovestrand. 2020. "Minimal phrase structure: a new formalized theory of phrase structure." Journal of Language Modelling 8(1). doi:10.15398/jlm.v8i1.247.
  18. Chomsky, Noam. 1995. The Minimalist Program. Cambridge, MA: MIT Press.
  19. Hornstein, Norbert and Jairo Nunes. 2008. "Adjunction, labeling, and bare phrase structure." Biolinguistics 2(1): 57–86. doi:10.5964/bioling.8621.