Grammatical evolution (GE) is an evolutionary computation technique and, more specifically, a genetic programming (GP) approach pioneered by Conor Ryan, JJ Collins and Michael O'Neill in 1998[1] at the BDS Group at the University of Limerick.
As in any other GP approach, the objective is to find an executable program, program fragment, or function that achieves a good fitness value for a given objective function. Whereas most published work on GP directly manipulates a LISP-style tree-structured expression, GE applies genetic operators to an integer string, which is subsequently mapped to a program (or similar) through the use of a grammar, typically expressed in Backus–Naur form. One of the benefits of GE is that this mapping simplifies the application of search to different programming languages and other structures.
In type-free, conventional Koza-style GP, the function set must meet the requirement of closure: every function must be able to accept as its arguments the output of every other function in the function set. Usually, this is implemented by restricting the system to a single data type, such as double-precision floating point. While modern genetic programming frameworks support typing, such type systems have limitations from which grammatical evolution does not suffer.
GE offers a solution to the single-type limitation by evolving solutions according to a user-specified grammar (usually a grammar in Backus–Naur form). The search space can therefore be restricted, and domain knowledge of the problem can be incorporated. The inspiration for this approach comes from a desire to separate the "genotype" from the "phenotype": in GP, the objects the search algorithm operates on and the objects the fitness function evaluates are one and the same. In contrast, GE's "genotypes" are ordered lists of integers which encode the selection of rules from the provided context-free grammar. The phenotype, however, is the same as in Koza-style GP: a tree-like structure that is evaluated recursively. This model is more in line with how genetics works in nature, where there is a separation between an organism's genotype and the final expression of its phenotype in proteins, etc.
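A minimal sketch of this mapping may clarify the idea; the grammar, codon values, and function name below are illustrative, not taken from any particular GE implementation. Each codon selects among the current non-terminal's production rules via a modulo operation, and if the codons run out, the genome may be "wrapped" and read again.

```python
# Minimal sketch of GE's genotype-to-phenotype mapping (illustrative, not
# from any particular implementation). Each codon selects a production
# rule via modulo; derivation expands the leftmost non-terminal first.

GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<var>"]],
    "<op>":   [["+"], ["-"], ["*"]],
    "<var>":  [["x"], ["1.0"]],
}

def map_genome(genome, start="<expr>", max_wraps=2):
    """Map a list of integer codons to an expression string."""
    derivation = [start]
    codons = genome * (max_wraps + 1)   # "wrapping": reuse codons if exhausted
    i = 0
    while True:
        # find the leftmost non-terminal, if any
        nts = [j for j, sym in enumerate(derivation) if sym in GRAMMAR]
        if not nts:
            return " ".join(derivation)         # fully expanded phenotype
        if i >= len(codons):
            return None                         # invalid individual
        rules = GRAMMAR[derivation[nts[0]]]
        choice = rules[codons[i] % len(rules)]  # codon mod rule-count
        derivation[nts[0]:nts[0] + 1] = choice
        i += 1

print(map_genome([0, 1, 0, 2, 1, 1]))  # → x * 1.0
```

Note that a genome that keeps choosing the recursive rule can exhaust its codons even after wrapping; such individuals are treated as invalid (here, `None`), which is one way GE implementations handle non-terminating derivations.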
Separating genotype and phenotype allows a modular approach. In particular, the search portion of the GE paradigm need not be carried out by any one particular algorithm or method. The objects GE performs search on are the same integer strings used in genetic algorithms. This means that, in principle, any existing genetic algorithm package, such as the popular GAlib, can be used to carry out the search, and a developer implementing a GE system need only worry about carrying out the mapping from the list of integers to a program tree. It is also possible in principle to perform the search using some other method, such as particle swarm optimization (see the remark below); the modular nature of GE creates many opportunities for hybrids, as the problem of interest dictates.
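This separation can be made concrete in a short sketch (all names, the toy grammar, and the fitness target below are illustrative): the search half is a plain integer-string genetic algorithm that knows nothing about grammars, and only the fitness function calls the grammar mapper.

```python
# Sketch of GE's modularity (illustrative names and grammar): a generic
# integer-string GA searches; only the fitness function touches the grammar.
import random

GRAMMAR = {"<e>": [["<e>", "+", "<e>"], ["x"], ["1"]]}

def map_genome(genome):
    """Decode integer codons into an arithmetic expression string."""
    out, i = ["<e>"], 0
    while "<e>" in out:
        if i >= len(genome):
            return None                     # codons exhausted: invalid
        j = out.index("<e>")
        rules = GRAMMAR["<e>"]
        out[j:j + 1] = rules[genome[i] % len(rules)]
        i += 1
    return " ".join(out)

def fitness(genome, target=3.0, x=1.0):
    """Squared error of the decoded expression against a target value."""
    phenotype = map_genome(genome)
    if phenotype is None:
        return float("inf")
    return (eval(phenotype) - target) ** 2  # eval sees the local x

def evolve(pop_size=30, length=12, generations=40, seed=0):
    """Truncation-selection GA over integer strings; mapper-agnostic."""
    rng = random.Random(seed)
    pop = [[rng.randrange(256) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(length)] = rng.randrange(256)  # point mutation
            children.append(child)
        pop = parents + children
    best = min(pop, key=fitness)
    return map_genome(best), fitness(best)

print(evolve())
```

Swapping in a library GA (or PSO, as below) would only replace `evolve`; `map_genome` and `fitness` stay the same, which is the modularity the paragraph describes.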
Brabazon and O'Neill have successfully applied GE to predicting corporate bankruptcy, forecasting stock indices, bond credit ratings, and other financial applications. GE has also been used with a classic predator-prey model to explore the impact of parameters such as predator efficiency, niche number, and random mutations on ecological stability.[2]
It is possible to structure a GE grammar that, for a given function/terminal set, is equivalent to genetic programming.
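For example, a grammar whose productions mirror a Koza-style function set {+, -, *, /} and terminal set {x, 1.0} (an illustrative choice) generates exactly the expression trees that tree-based GP would search over that set:

```bnf
<expr> ::= ( <expr> <op> <expr> ) | <term>
<op>   ::= + | - | * | /
<term> ::= x | 1.0
```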
Despite its successes, GE has been the subject of some criticism. One issue is that, as a result of its mapping operation, GE's genetic operators do not achieve high locality,[3][4] a highly regarded property of genetic operators in evolutionary algorithms.
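The locality issue can be seen directly in the mapping itself. In the hypothetical sketch below (illustrative grammar and codon values), mutating a single early codon changes which production is chosen at the root of the derivation, so two nearly identical genotypes map to radically different phenotypes.

```python
# Sketch of GE's low locality (illustrative grammar and codons): flipping
# one early codon changes the rule chosen at the root of the derivation,
# so the two phenotypes share almost no structure despite the genotypes
# differing in a single position.

GRAMMAR = {
    "<e>": [["(", "<e>", "+", "<e>", ")"], ["<v>"]],
    "<v>": [["x"], ["y"]],
}

def map_genome(genome):
    """Decode integer codons into an expression string (no wrapping)."""
    out, i = ["<e>"], 0
    while any(s in GRAMMAR for s in out):
        if i >= len(genome):
            return None
        j = next(k for k, s in enumerate(out) if s in GRAMMAR)
        rules = GRAMMAR[out[j]]
        out[j:j + 1] = rules[genome[i] % len(rules)]
        i += 1
    return " ".join(out)

a = map_genome([0, 1, 0, 1, 1, 1, 0])  # parent: "( x + y )"
b = map_genome([1, 1, 0, 1, 1, 1, 0])  # one-codon mutant: "y"
print(a, "->", b)
```

A small genotypic step (one codon) thus produces a large phenotypic step (an entire subtree collapses), which is the low-locality behaviour the criticism refers to.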
Although GE was originally described in terms of an evolutionary algorithm, specifically a genetic algorithm, other variants exist. For example, GE researchers have experimented with carrying out the search via particle swarm optimization instead of a genetic algorithm, with results comparable to those of normal GE; this variant is referred to as a "grammatical swarm". Using only the basic PSO model, it was found that PSO is probably as capable of carrying out the search process in GE as a simple genetic algorithm. (Although PSO is normally a floating-point search paradigm, it can be discretized, e.g., by rounding each vector component to the nearest integer, for use with GE.)
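The discretization step can be as simple as rounding and clamping; the function name and codon range below are illustrative, not from any specific grammatical-swarm implementation.

```python
# Sketch: turning a continuous PSO particle position into GE codons by
# rounding each component to the nearest integer and clamping it to the
# codon range (illustrative; the range 0..255 is an assumption).

def particle_to_codons(position, codon_max=255):
    """Round each real-valued component to the nearest integer codon."""
    return [min(codon_max, max(0, round(x))) for x in position]

print(particle_to_codons([3.7, -0.2, 255.9]))  # → [4, 0, 255]
```

The resulting integer list can then be fed to the same genotype-to-phenotype mapper that a GA-based GE system uses.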
Another variation that has been explored in the literature is encoding semantic information in the grammar in order to further bias the search process. Other work has shown that, with biased grammars that leverage domain knowledge, even random search can be used to drive GE.
GE was originally a combination of the linear representation used by the Genetic Algorithm for Developing Software (GADS) and Backus–Naur form grammars, which were originally used in tree-based GP by Wong and Leung[5] in 1995 and Whigham in 1996.[6] Other related work noted in the original GE paper includes that of Frederic Gruau, who used a conceptually similar "embryonic" approach, and that of Keller and Banzhaf,[7] who similarly used linear genomes.
There are several implementations of GE. These include the following.
Name | Language | Year | Location
GRAPE | Python | 2022 | https://github.com/bdsul/grape
GELab | Matlab | 2018 | https://github.com/adilraja/GELAB
PonyGE2 | Python | 2017 | https://arxiv.org/abs/1703.08535
gramEvol | R | 2016 | https://cran.r-project.org/web/packages/gramEvol/vignettes/ge-intro.pdf
PyNeurGen | Python | 2012 | http://pyneurgen.sourceforge.net/
Grammatical_evolution | Ruby | 2011 | http://www.cleveralgorithms.com/nature-inspired/evolution/grammatical_evolution.rb
AGE | C, Lua | 2011 | http://nohejl.name/age/pdf/AGE-Documentation-1.0.2.pdf
PonyGE | Python | 2010 | https://code.google.com/archive/p/ponyge/downloads
GERET | Ruby | 2010 | https://github.com/bver/GERET/
GEVA | Java | 2008 | http://ncra.ucd.ie/Site/GEVA.html
ECJ | Java | 2008 | https://cs.gmu.edu/~eclab/projects/ecj/
GENN | C++ | 2007 | https://ritchielab.org/research/past-research/52-grammatical-evolution-neural-networks
libGE | C++, S-Lang, tinycc | 2004 | http://bds.ul.ie/libGE/