Recursive grammar

In computer science, a grammar is informally called a recursive grammar if it contains production rules that are recursive, meaning that expanding a non-terminal according to these rules can eventually lead to a string that includes the same non-terminal again. Otherwise it is called a non-recursive grammar.[1]

For example, a grammar for a context-free language is left recursive if there exists a non-terminal symbol A that can derive, through the production rules, a string with A as its leftmost symbol.[2][3] All types of grammars in the Chomsky hierarchy can be recursive, and it is recursion that allows a grammar to produce an infinite set of words.[1]
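As an illustration, consider the left-recursive grammar with the two rules A → A a and A → a (a hypothetical one-non-terminal grammar chosen for this sketch, not taken from the cited sources). The Python sketch below applies the recursive rule a chosen number of times and then the base rule; after every recursive step the non-terminal A reappears as the leftmost symbol, and the grammar generates the infinite language {a, aa, aaa, ...}.

    # A minimal sketch (hypothetical grammar, not from the cited sources):
    # the left-recursive grammar  A -> A a | a.  Applying the recursive
    # rule n times and then the base rule derives the word a^(n+1).

    def derive(n: int) -> str:
        """Derive a word by applying A -> A a  n times, then A -> a."""
        sentential_form = ["A"]                    # start symbol
        for _ in range(n):
            i = sentential_form.index("A")         # A is always leftmost here
            sentential_form[i:i + 1] = ["A", "a"]  # recursive rule
        i = sentential_form.index("A")
        sentential_form[i:i + 1] = ["a"]           # base rule ends the recursion
        return "".join(sentential_form)

    for n in range(4):
        print(derive(n))                           # a, aa, aaa, aaaa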

Properties

A non-recursive grammar can produce only a finite language, and every finite language can be produced by a non-recursive grammar.[1] For example, a straight-line grammar produces just a single word.
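The straight-line case can be made concrete with a short sketch (again a hypothetical grammar chosen for illustration): every non-terminal has exactly one rule and no non-terminal can reproduce itself, so expansion always terminates and yields exactly one word.

    # A minimal sketch (hypothetical grammar): a straight-line grammar,
    # i.e. a non-recursive grammar in which every non-terminal has exactly
    # one production rule.
    #
    #   S -> A B,   A -> a b,   B -> A c

    RULES = {
        "S": ["A", "B"],
        "A": ["a", "b"],
        "B": ["A", "c"],
    }

    def expand(symbol: str) -> str:
        """Fully expand a symbol; terminals are returned unchanged."""
        if symbol not in RULES:                    # terminal symbol
            return symbol
        return "".join(expand(s) for s in RULES[symbol])

    print(expand("S"))                             # ababc -- the single word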

A recursive context-free grammar that contains no useless rules necessarily produces an infinite language. This property forms the basis of an algorithm that can efficiently test whether a context-free grammar produces a finite or an infinite language.[4]
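One way such a test can work (a sketch under the assumption that useless rules have already been removed, as the property requires, and not necessarily the exact algorithm of reference [4]): once every symbol is useful, the language is infinite exactly when the graph with an edge from A to B, whenever B occurs on a right-hand side of a rule for A, contains a cycle. Depth-first search detects such a cycle in time linear in the size of the grammar.

    # A minimal sketch (assumes useless symbols and rules were already
    # removed). The grammar maps each non-terminal to a list of
    # right-hand sides, each a list of symbols; symbols that are keys of
    # the grammar count as non-terminals.

    def is_infinite(grammar):
        # Edge A -> B whenever non-terminal B occurs on a right-hand side of A.
        edges = {
            a: {s for rhs in rules for s in rhs if s in grammar}
            for a, rules in grammar.items()
        }
        WHITE, GRAY, BLACK = 0, 1, 2               # unvisited / on stack / finished
        color = {a: WHITE for a in grammar}

        def has_cycle(a):
            color[a] = GRAY
            for b in edges[a]:
                if color[b] == GRAY:               # back edge: a derivation cycle
                    return True
                if color[b] == WHITE and has_cycle(b):
                    return True
            color[a] = BLACK
            return False

        return any(color[a] == WHITE and has_cycle(a) for a in grammar)

    print(is_infinite({"A": [["A", "a"], ["a"]]}))          # True  (recursive)
    print(is_infinite({"S": [["A", "A"]], "A": [["a"]]}))   # False (non-recursive)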

Notes and References

  1. .
  2. Notes on Formal Language Theory and Parsing, http://www.cs.may.ie/~jpower/Courses/parsing/parsing.pdf
  3. .
  4. .