An L-system or Lindenmayer system is a parallel rewriting system and a type of formal grammar. An L-system consists of an alphabet of symbols that can be used to make strings, a collection of production rules that expand each symbol into some larger string of symbols, an initial "axiom" string from which to begin construction, and a mechanism for translating the generated strings into geometric structures. L-systems were introduced and developed in 1968 by Aristid Lindenmayer, a Hungarian theoretical biologist and botanist at the University of Utrecht.[1] Lindenmayer used L-systems to describe the behaviour of plant cells and to model the growth processes of plant development. L-systems have also been used to model the morphology of a variety of organisms[2] and can be used to generate self-similar fractals.
As a biologist, Lindenmayer worked with yeast and filamentous fungi and studied the growth patterns of various types of bacteria, such as the cyanobacteria Anabaena catenula. Originally, the L-systems were devised to provide a formal description of the development of such simple multicellular organisms, and to illustrate the neighbourhood relationships between plant cells. Later on, this system was extended to describe higher plants and complex branching structures.
The recursive nature of the L-system rules leads to self-similarity, and thereby fractal-like forms are easy to describe with an L-system. Plant models and natural-looking organic forms are easy to define, since increasing the recursion level makes the form slowly 'grow' and become more complex. Lindenmayer systems are also popular in the generation of artificial life.
L-system grammars are very similar to the semi-Thue grammar (see Chomsky hierarchy). L-systems are now commonly known as parametric L-systems, defined as a tuple
G = (V, ω, P),
where
V (the alphabet) is a set of symbols containing both elements that can be replaced (variables) and those which cannot be replaced ("constants" or "terminals"),
ω (start, axiom or initiator) is a string of symbols from V defining the initial state of the system, and
P is a set of production rules or productions defining the way variables can be replaced with combinations of constants and other variables. A production consists of two strings, the predecessor and the successor. For any symbol A in V which does not appear on the left-hand side of a production in P, the identity production A → A is assumed; these symbols are called constants or terminals.
The rules of the L-system grammar are applied iteratively starting from the initial state. As many rules as possible are applied simultaneously, per iteration. The fact that each iteration employs as many rules as possible differentiates an L-system from a formal language generated by a formal grammar, which applies only one rule per iteration. If the production rules were to be applied only one at a time, one would quite simply generate a string in a language, and all such sequences of applications would produce the language specified by the grammar. There are some strings in some languages, however, that cannot be generated if the grammar is treated as an L-system rather than a language specification. For example,[3] suppose there is a rule S→SS in a grammar. If productions are done one at a time, then starting from S, we can get first SS, and then, applying the rule again, SSS. However, if all applicable rules are applied at every step, as in an L-system, then we cannot get this sentential form. Instead, the first step would give us SS, but the second would apply the rule twice, giving us SSSS. Thus, the set of strings produced by an L-system from a given grammar is a subset of the formal language defined by the grammar, and if we take a language to be defined as a set of strings, this means that a given L-system is effectively a subset of the formal language defined by the L-system's grammar.
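The difference can be illustrated with a minimal Python sketch of parallel rewriting; the function name rewrite is illustrative and not part of any standard formulation.

# Apply every production rule simultaneously to each symbol (L-system style).
def rewrite(rules, string):
    # Symbols without a rule are left unchanged (identity production).
    return "".join(rules.get(symbol, symbol) for symbol in string)

rules = {"S": "SS"}
s = "S"
for step in range(3):
    s = rewrite(rules, s)
    print(step + 1, s)
# Prints: 1 SS, 2 SSSS, 3 SSSSSSSS.
# The intermediate form SSS never appears, unlike in a sequential
# (one-rule-per-step) derivation from the grammar rule S -> SS.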
An L-system is context-free if each production rule refers only to an individual symbol and not to its neighbours. Context-free L-systems are thus specified by a context-free grammar. If a rule depends not only on a single symbol but also on its neighbours, it is termed a context-sensitive L-system.
If there is exactly one production for each symbol, then the L-system is said to be deterministic (a deterministic context-free L-system is popularly called a D0L system). If there are several, and each is chosen with a certain probability during each iteration, then it is a stochastic L-system.
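The stochastic case amounts to choosing one of several successors at random each time a symbol is rewritten. A brief Python sketch of this idea follows; the rule table and the probabilities are purely illustrative, not taken from any particular system.

import random

# Each variable maps to a list of (successor, probability) pairs.
stochastic_rules = {
    "F": [("F[+F]F", 0.5), ("F[-F]F", 0.3), ("F", 0.2)],
}

def rewrite_stochastic(rules, string, rng=random):
    out = []
    for symbol in string:
        if symbol in rules:
            successors, weights = zip(*rules[symbol])
            out.append(rng.choices(successors, weights=weights)[0])
        else:
            out.append(symbol)  # identity production for other symbols
    return "".join(out)

random.seed(42)  # a fixed seed keeps the stochastic result reproducible
print(rewrite_stochastic(stochastic_rules, "F"))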
Using L-systems for generating graphical images requires that the symbols in the model refer to elements of a drawing on the computer screen. For example, the program Fractint uses turtle graphics (similar to those in the Logo programming language) to produce screen images. It interprets each constant in an L-system model as a turtle command.
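As a sketch of such an interpretation, the following Python fragment uses the standard turtle module to draw an already-expanded string. The symbol-to-command mapping (F, +, -) follows the common convention used in the examples below; the step length and angle are arbitrary choices, and this is not Fractint's actual command set.

import turtle

def draw(commands, step=5, angle=90):
    t = turtle.Turtle()
    t.speed(0)
    for symbol in commands:
        if symbol == "F":    # draw forward
            t.forward(step)
        elif symbol == "+":  # turn left by the given angle
            t.left(angle)
        elif symbol == "-":  # turn right by the given angle
            t.right(angle)
        # any other symbol is ignored by the drawing pass
    turtle.done()

# e.g. one expansion of the quadratic Koch rule F -> F+F-F-F+F
draw("F+F-F-F+F")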
Example 1: algae
Lindenmayer's original L-system for modelling the growth of algae.
variables : A B
constants : none
axiom : A
rules : (A → AB), (B → A)
which produces:
n = 0 : A
n = 1 : AB
n = 2 : ABA
n = 3 : ABAAB
n = 4 : ABAABABA
n = 5 : ABAABABAABAAB
n = 6 : ABAABABAABAABABAABABA
n = 7 : ABAABABAABAABABAABABAABAABABAABAAB
n=0:         A             start (axiom/initiator)
            / \
n=1:       A   B           the initial single A spawned into AB by rule (A → AB); rule (B → A) could not be applied
          /|    \
n=2:     A B     A         former string AB with all rules applied: A spawned into AB again, former B turned into A
        /| |     |\
n=3:   A B A     A B       note all A's producing a copy of themselves in the first place, then a B, which turns ...
      /| | |\   /|  \
n=4: A B A A B A B   A     ... into an A one generation later, starting to spawn/repeat/recurse then
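The derivation above can be reproduced in a few lines of Python; the function name iterate is illustrative.

rules = {"A": "AB", "B": "A"}

def iterate(axiom, n):
    s = axiom
    for _ in range(n):
        # rewrite every symbol in parallel
        s = "".join(rules.get(c, c) for c in s)
    return s

for n in range(8):
    print("n =", n, ":", iterate("A", n))
# n = 0 : A
# n = 1 : AB
# n = 2 : ABA
# ...
# n = 7 : ABAABABAABAABABAABABAABAABABAABAAB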
The result is the sequence of Fibonacci words. If we count the length of each string, we obtain the famous Fibonacci sequence of numbers (skipping the first 1, due to our choice of axiom):
1 2 3 5 8 13 21 34 55 89 ...
If we do not want to skip the first 1, we can use axiom B instead. That would place a B node before the topmost node (A) of the graph above.
For each string, if we count the k-th position from the left end of the string, the value is determined by whether a multiple of the golden ratio falls within the interval (k − 1, k).
This example yields the same result (in terms of the length of each string, not the sequence of As and Bs) if the rule (A → AB) is replaced with (A → BA), except that the strings are mirrored.
This sequence is a locally catenative sequence because G(n) = G(n−1)G(n−2), where G(n) is the n-th generation.
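Both properties, the Fibonacci string lengths and the locally catenative relation, can be checked mechanically; a brief Python sketch (the function name generation is illustrative):

rules = {"A": "AB", "B": "A"}

def generation(n, axiom="A"):
    s = axiom
    for _ in range(n):
        s = "".join(rules.get(c, c) for c in s)
    return s

G = [generation(n) for n in range(10)]
print([len(g) for g in G])   # [1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
# locally catenative: each generation is the previous two concatenated
assert all(G[n] == G[n - 1] + G[n - 2] for n in range(2, 10))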
Example 2: fractal tree
variables : 0 1
constants : [ ]
axiom : 0
rules : (1 → 11), (0 → 1[0]0)
The shape is built by recursively feeding the axiom through the production rules. Each character of the input string is checked against the rule list to determine which character or string to replace it with in the output string. In this example, a '1' in the input string becomes '11' in the output string, while '[' remains the same. Applying this to the axiom of '0', we get:
axiom: 0
1st recursion: 1[0]0
2nd recursion: 11[1[0]0]1[0]0
3rd recursion: 1111[11[1[0]0]1[0]0]11[1[0]0]1[0]0
...
We can see that this string quickly grows in size and complexity. This string can be drawn as an image by using turtle graphics, where each symbol is assigned a graphical operation for the turtle to perform. For example, in the sample above, the turtle may be given the following instructions:
0: draw a line segment ending in a leaf
1: draw a line segment
[: push position and angle, turn left 45 degrees
]: pop position and angle, turn right 45 degrees
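The bracket handling can be sketched in Python without any graphics library by keeping an explicit stack of saved (position, heading) states. The 45° turns follow the instruction list above; the function tree_segments and its return format are illustrative, not part of the original description.

import math

def tree_segments(commands, step=1.0, delta=45.0):
    x, y, heading = 0.0, 0.0, 90.0        # start at the origin, pointing up
    stack, segments = [], []
    for symbol in commands:
        if symbol in "01":                 # both symbols draw a line segment
            nx = x + step * math.cos(math.radians(heading))
            ny = y + step * math.sin(math.radians(heading))
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif symbol == "[":                # save state, then turn left 45°
            stack.append((x, y, heading))
            heading += delta
        elif symbol == "]":                # restore state, then turn right 45°
            x, y, heading = stack.pop()
            heading -= delta
    return segments

print(len(tree_segments("1[0]0")))         # 3 segments for the first recursion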
The push and pop refer to a LIFO stack (a more technical grammar would have separate symbols for "push position" and "turn left"). When the turtle interpretation encounters a '[', the current position and angle are saved, and are then restored when the interpretation encounters a ']'. If multiple values have been "pushed", then a "pop" restores the most recently saved values. Applying the graphical rules listed above to the earlier recursion produces successively more detailed renderings of the fractal tree.

Example 3: Cantor set
variables : A B
constants : none
start : A
rules : (A → ABA), (B → BBB)
Let A mean "draw forward" and B mean "move forward". This produces the famous Cantor fractal set on the real line R.

Example 4: Koch curve
A variant of the Koch curve which uses only right angles.
variables : F
constants : + -
start : F
rules : (F → F+F-F-F+F)
Here, F means "draw forward", + means "turn left 90°", and - means "turn right 90°" (see turtle graphics).
n = 0: F
n = 1: F+F-F-F+F
n = 2: F+F-F-F+F+F+F-F-F+F-F+F-F-F+F-F+F-F-F+F+F+F-F-F+F
n = 3: F+F-F-F+F+F+F-F-F+F-F+F-F-F+F-F+F-F-F+F+F+F-F-F+F+
       F+F-F-F+F+F+F-F-F+F-F+F-F-F+F-F+F-F-F+F+F+F-F-F+F-
       F+F-F-F+F+F+F-F-F+F-F+F-F-F+F-F+F-F-F+F+F+F-F-F+F-
       F+F-F-F+F+F+F-F-F+F-F+F-F-F+F-F+F-F-F+F+F+F-F-F+F+
       F+F-F-F+F+F+F-F-F+F-F+F-F-F+F-F+F-F-F+F+F+F-F-F+F

Example 5: Sierpinski triangle
The Sierpinski triangle drawn using an L-system.
variables : F G
constants : + -
start : F-G-G
rules : (F → F-G+F+G-F), (G → GG)
angle : 120°
Here, F and G both mean "draw forward", + means "turn left by angle", and - means "turn right by angle".
It is also possible to approximate the Sierpinski triangle using a Sierpiński arrowhead curve L-system.
variables : A B
constants : + -
start : A
rules : (A → B-A-B), (B → A+B+A)
angle : 60°
Here, A and B both mean "draw forward", + means "turn left by angle", and - means "turn right by angle" (see turtle graphics).

Example 6: dragon curve
The dragon curve drawn using an L-system.
variables : F G
constants : + −
start : F
rules : (F → F+G), (G → F-G)
angle : 90°
Here, F and G both mean "draw forward", + means "turn left by angle", and − means "turn right by angle".

Example 7: fractal plant
See also: Barnsley fern.
variables : X F
constants : + − [ ]
start : X
rules : (X → F+[[X]-X]-F[-FX]+X), (F → FF)
angle : 25°
Here, F means "draw forward", − means "turn right 25°", and + means "turn left 25°". X does not correspond to any drawing action and is used to control the evolution of the curve. The square bracket "[" corresponds to saving the current values for position and angle: the interpreter keeps an initially empty stack that follows the LIFO (last in, first out) method for adding and removing elements, and pushes the position and angle onto it; when the "]" token is encountered, the stack is popped and the position and angle are reset. Each "]" is matched by an earlier "[", so the stack is never popped while empty.

Variations
A number of elaborations on this basic L-system technique have been developed which can be used in conjunction with each other. Among these are stochastic grammars, context sensitive grammars, and parametric grammars.

Stochastic grammars
The grammar model we have discussed thus far has been deterministic: given any symbol in the grammar's alphabet, there has been exactly one production rule, which is always chosen, and always performs the same conversion. One alternative is to specify more than one production rule for a symbol, giving each a probability of occurring.
For example, in the grammar of Example 2, we could change the rule for rewriting "0" from:
0 → 1[0]0
to a probabilistic rule:
0 (0.5) → 1[0]0
0 (0.5) → 0
Under this production, whenever a "0" is encountered during string rewriting, there would be a 50% chance it would behave as previously described, and a 50% chance it would not change during production. When a stochastic grammar is used in an evolutionary context, it is advisable to incorporate a random seed into the genotype, so that the stochastic properties of the image remain constant between generations.

Context sensitive grammars
A context sensitive production rule looks not only at the symbol it is modifying, but also at the symbols appearing before and after it in the string. For instance, the production rule:
b < a > c → aa
transforms "a" to "aa", but only if the "a" occurs between a "b" and a "c" in the input string:
…bac…
As with stochastic productions, there are multiple productions to handle symbols in different contexts. If no production rule can be found for a given context, the identity production is assumed, and the symbol does not change on transformation. If context-sensitive and context-free productions both exist within the same grammar, the context-sensitive production is assumed to take precedence when it is applicable.

Parametric grammars
In a parametric grammar, each symbol in the alphabet has a parameter list associated with it. A symbol coupled with its parameter list is called a module, and a string in a parametric grammar is a series of modules. An example string might be:
a(0,1)[b(0,0)]a(1,2)
The parameters can be used by the drawing functions, and also by the production rules. The production rules can use the parameters in two ways: first, in a conditional statement determining whether the rule will apply, and second, the production rule can modify the actual parameters. For example, look at:
a(x,y) : x == 0 → a(1, y+1)b(2,3)
The module a(x,y) undergoes transformation under this production rule if the conditional x == 0 is met. For example, a(0,2) would undergo transformation, and a(1,2) would not. In the transformation portion of the production rule, the parameters as well as entire modules can be affected. In the above example, the module b(x,y) is added to the string, with initial parameters (2,3). Also, the parameters of the already existing module are transformed. Under the above production rule,
a(0,2)
becomes
a(1,3)b(2,3)
as the "x" parameter of a(x,y) is explicitly transformed to a "1" and the "y" parameter of a is incremented by one. Parametric grammars allow line lengths and branching angles to be determined by the grammar, rather than by the turtle interpretation methods. Also, if age is given as a parameter for a module, rules can change depending on the age of a plant segment, allowing animations of the entire life-cycle of the tree to be created.

Bi-directional grammars
The bi-directional model explicitly separates the symbolic rewriting system from the shape assignment. For example, the string rewriting process in Example 2 (fractal tree) is independent of how graphical operations are assigned to the symbols. In other words, an infinite number of draw methods are applicable to a given rewriting system. The bi-directional model consists of (1) a forward process, which constructs the derivation tree with production rules, and (2) a backward process, which realizes the tree with shapes in a stepwise manner (from leaves to the root). Each inverse-derivation step involves essential geometric-topological reasoning.
With this bi-directional framework, design constraints and objectives are encoded in the grammar-shape translation. In architectural design applications, the bi-directional grammar features consistent interior connectivity and a rich spatial hierarchy.[4]

Open problems
There are many open problems involving studies of L-systems. For example:
Types of L-systems
L-systems on the real line R:
Well-known L-systems on a plane R² are: