Great deluge algorithm explained

The Great deluge algorithm (GD) is a generic algorithm applied to optimization problems. It is similar in many ways to the hill-climbing and simulated annealing algorithms.

The name comes from the analogy that, in a great deluge, a person climbing a hill will try to move in any direction that keeps their feet dry, in the hope of finding a way up as the water level rises.

In a typical implementation of the GD, the algorithm starts with a poor approximation, S, of the optimum solution. A numerical value called the badness is computed from S; it measures how undesirable the approximation is, with higher badness meaning a less desirable solution. Another numerical value, called the tolerance, is calculated based on a number of factors, often including the initial badness.

A new approximate solution S', called a neighbour of S, is derived from S. The badness of S', written b', is computed and compared with the tolerance. If b' is better than (that is, lower than) the tolerance, the algorithm restarts with S := S' and tolerance := decay(tolerance), where decay is a function that lowers the tolerance, representing the rising water level. If b' is worse than the tolerance, a different neighbour S* of S is chosen and the process is repeated. If all the neighbours of S produce approximate solutions whose badness exceeds the tolerance, the algorithm terminates and S is put forward as the best approximate solution obtained.
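
As a concrete illustration, the following is a minimal Python sketch of this loop under the description above. The names great_deluge, badness, neighbours, initial_tolerance and decay are illustrative choices, not part of any standard library, and the toy minimisation problem at the end is purely for demonstration.

```python
def great_deluge(initial_solution, badness, neighbours, initial_tolerance, decay):
    """Minimal sketch of the Great Deluge search described above.

    badness(s)    -> numerical undesirability of a solution s (lower is better)
    neighbours(s) -> iterable of candidate solutions derived from s
    decay(t)      -> a lower tolerance, representing the rising water level
    """
    current = initial_solution
    tolerance = initial_tolerance
    while True:
        for candidate in neighbours(current):
            if badness(candidate) <= tolerance:
                # The neighbour's badness is within tolerance: move to it
                # and let the "water level" rise (the tolerance shrinks).
                current = candidate
                tolerance = decay(tolerance)
                break
        else:
            # Every neighbour exceeded the tolerance: stop and return
            # the best approximation found.
            return current


# Toy usage (illustrative): minimise f(x) = (x - 3)^2 over the integers.
f = lambda x: (x - 3) ** 2
best = great_deluge(
    initial_solution=20,
    badness=f,
    neighbours=lambda x: [x - 1, x + 1],
    initial_tolerance=f(20),
    decay=lambda t: 0.95 * t,
)
print(best)  # ends at or very near the global minimum x = 3
```

Note that termination relies on decay actually lowering the tolerance; with a multiplicative decay such as the one above, the tolerance eventually drops below the badness of every neighbour, at which point the search stops.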
