Superiorization is an iterative method for constrained optimization. It improves the efficacy of an iterative algorithm whose convergence is resilient to certain kinds of perturbations. The perturbations are designed to "force" the perturbed algorithm to produce results that are more useful for the intended application than those produced by the original, unperturbed algorithm. The perturbed algorithm is called the superiorized version of the original unperturbed algorithm. If the original algorithm is computationally efficient and useful for the target application, and if the perturbations are inexpensive to calculate, the superiorized version can steer the iterates toward better outputs at little additional computational cost.
The superiorization methodology is very general and has been used successfully in many important practical applications, such as iterative reconstruction of images from their projections,[1] [2] [3] single-photon emission computed tomography,[4] radiation therapy[5] [6] [7] and nondestructive testing,[8] just to name a few. A special issue of the journal Inverse Problems[9] is devoted to superiorization, both theory[10] [11] [12] and applications.
An important case of superiorization is when the original algorithm is "feasibility-seeking" (in the sense that it strives to find some point in a feasible region that is compatible with a family of constraints) and the perturbations that are introduced into the original iterative algorithm aim at reducing (not necessarily minimizing) a given merit function. In this case, superiorization has a unique place in optimization theory and practice.
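A minimal sketch of this case, with hypothetical names, may help fix the idea. The feasibility-seeking algorithm below is cyclic projections onto half-space constraints, and the perturbations are small steps against the gradient of a merit function; the step sizes form a summable sequence, which is the kind of bounded perturbation that perturbation-resilient algorithms tolerate. All function names and parameter choices here are illustrative, not from the superiorization literature:

```python
import numpy as np

def project_halfspace(x, a, b):
    """Orthogonal projection of x onto the half-space {y : a.y <= b}."""
    violation = a @ x - b
    if violation > 0:
        x = x - violation * a / (a @ a)
    return x

def superiorized_projections(x, halfspaces, merit_grad, n_iters=100, beta=1.0):
    """Feasibility-seeking by cyclic projections, superiorized by summable
    perturbations that step in a nonascending direction of the merit function."""
    x = np.asarray(x, dtype=float)
    for k in range(n_iters):
        # Perturbation: a small step against the merit-function gradient.
        g = merit_grad(x)
        norm = np.linalg.norm(g)
        if norm > 0:
            # Step sizes beta * 0.5**k are summable, so the perturbations
            # do not destroy the convergence of the feasibility-seeking part.
            x = x - beta * (0.5 ** k) * g / norm
        # Feasibility-seeking step: one sweep of cyclic projections.
        for a, b in halfspaces:
            x = project_halfspace(x, a, b)
    return x
```

For example, seeking a point with x1 ≥ 1 and x2 ≥ 1 while using the squared Euclidean norm as merit function produces a feasible point of small norm, near (1, 1), whereas unperturbed cyclic projections from a distant starting point would stop at the first feasible point they reach. The perturbations reduce, but need not minimize, the merit function, which is the distinguishing feature of superiorization.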
Many constrained optimization methods are based on methods for unconstrained optimization that are adapted to deal with constraints. Such is, for example, the class of projected gradient methods, wherein the unconstrained minimization inner step "leads" the process and a projection onto the whole constraints set (the feasible region) is performed after each minimization step in order to regain feasibility. This projection onto the constraints set is in itself a non-trivial optimization problem, and the need to solve it in every iteration hinders projected gradient methods and limits their efficacy to feasible sets that are "simple to project onto". Barrier methods and penalty methods are likewise based on unconstrained optimization, combined with various "add-ons" that guarantee that the constraints are preserved. Regularization methods embed the constraints into a "regularized" objective function and proceed with unconstrained solution methods for the new regularized objective function.
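The projected gradient scheme described above can be sketched in a few lines. This is an illustrative toy, not an implementation from any particular library; the feasible set here is a box, which is one of the "simple to project onto" sets (its projection is coordinate-wise clipping):

```python
import numpy as np

def projected_gradient(grad_f, project, x0, step=0.1, n_iters=200):
    """Minimize f over a feasible set C: each iteration takes an
    unconstrained gradient step and then projects back onto C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad_f(x)  # unconstrained minimization step leads
        x = project(x)            # projection onto C regains feasibility
    return x
```

For instance, minimizing ||x - c||^2 with c = (2, -1) over the box [0, 1]^2, using np.clip as the projection, drives the iterates to the boundary point (1, 0). For a general feasible set, however, the projection called in every iteration is itself a non-trivial optimization problem, which is the limitation the paragraph above points out.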
In contrast to these approaches, the superiorization methodology can be viewed as an antipodal way of thinking. Instead of adapting unconstrained minimization algorithms to handle constraints, it adapts feasibility-seeking algorithms to reduce merit function values. This is done while retaining the feasibility-seeking nature of the algorithm and without paying a high computational price. Furthermore, general-purpose approaches have been developed for automatically superiorizing iterative algorithms for large classes of constraints sets and merit functions; these provide algorithms for many application tasks.
The superiorization methodology and the perturbation resilience of algorithms are reviewed in several papers;[13] [14] [15] see also.[16] Current work on superiorization can be followed from a continuously updated Internet page.[17] SNARK14[18] is a software package for the reconstruction of 2D images from their 1D projections that has a built-in capability of superiorizing any iterative algorithm for any merit function.