Kimeme Explained

Kimeme
Developer:Cyber Dyne S.r.l.
Latest Release Version:4.0
Latest Release Date:May 2018
Operating System:Cross-platform
Genre:Technical computing
License:Proprietary

Kimeme is an open platform for multi-objective optimization and multidisciplinary design optimization. It is intended to be coupled with external numerical software such as computer-aided design (CAD), finite element analysis (FEA), structural analysis and computational fluid dynamics tools. It was developed by Cyber Dyne S.r.l. and provides both a design environment for problem definition and analysis and a software network infrastructure to distribute the computational load.[1]

History

Cyber Dyne was founded in 2011 as a research start-up to turn its founders' expertise in numerical optimization and computational intelligence methods into a commercial product.

Features

The problem definition workflow is based on the data-flow paradigm: multiple nodes can be interconnected to describe the flow of data from the design variables to the desired objectives and constraints. Input/output nodes can be used to compute any part of the objective calculation, using internal (Java, Python or Bash/Batch) or external (third-party) processes. Any of these procedures can be distributed over a LAN or the cloud, exploiting all the available computational resources. The optimization core is open: following the memetic computing (MC) approach, an extension of the concept of a memetic algorithm, users can define their own optimization algorithms as sets of independent pieces of code called "operators", or "memes". Operators can be implemented either in Java or Python.
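
As a rough illustration of the data-flow idea, the following sketch chains a couple of plain Python functions from design variables to an objective and a constraint. The node names and structure are hypothetical and only meant to convey the concept; they are not Kimeme's actual interface.

  # Generic data-flow sketch (hypothetical node names, not Kimeme's API).
  def geometry_node(x):
      # Intermediate quantity derived from the design variables.
      return {"area": x["width"] * x["height"]}

  def objective_node(x, geo):
      # Objective and constraint derived from the upstream node.
      mass = 7850.0 * geo["area"] * x["length"]   # objective to minimize
      feasible = geo["area"] >= 1e-3              # simple constraint
      return {"mass": mass, "feasible": feasible}

  # Evaluate one candidate design by pushing it through the chain.
  design = {"width": 0.05, "height": 0.04, "length": 2.0}
  print(objective_node(design, geometry_node(design)))   # mass ~ 31.4, feasible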

Algorithm design

The no free lunch theorem of David Wolpert and William G. Macready, presented in the 1997 paper "No Free Lunch Theorems for Optimization",[2] states that, averaged over all possible problems, every optimization algorithm performs equally well.

This result implies that designing a new algorithm requires a specific effort, tailored to the problem to be optimized. Kimeme allows the design of and experimentation with new optimization algorithms through the paradigm of memetic computing, a subfield of computational intelligence which studies algorithmic structures composed of multiple interacting and evolving modules (memes).[3]
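
As a minimal sketch of this idea in plain Python, the loop below composes two hand-rolled memes, a global Gaussian mutation and a greedy local search, on a toy objective; it illustrates the modular structure only and does not use Kimeme's own operator interface.

  import random

  def sphere(x):                                  # toy objective to minimize
      return sum(v * v for v in x)

  def global_mutation(x, sigma=0.5):              # meme 1: global exploration
      return [v + random.gauss(0.0, sigma) for v in x]

  def local_search(x, f, step=0.05, iters=20):    # meme 2: local refinement
      best = list(x)
      for _ in range(iters):
          cand = [v + random.uniform(-step, step) for v in best]
          if f(cand) < f(best):
              best = cand
      return best

  # Compose the memes: explore globally, refine locally, keep the best.
  random.seed(1)
  best = [random.uniform(-5, 5) for _ in range(3)]
  for _ in range(200):
      cand = local_search(global_mutation(best), sphere)
      if sphere(cand) < sphere(best):
          best = cand
  print(sphere(best))                             # close to zero on this toy problem

Because each meme is an independent, self-contained piece of code, the modules can be swapped, reordered or tuned without touching the rest of the loop, which is the essence of the memetic computing approach.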

Design of experiments (DoE)

Different DoE strategies are available, including random sequence generators, factorial, orthogonal and iterative techniques, as well as D-optimal designs and cross-validation. Monte Carlo and Latin hypercube sampling are available for robustness analysis.
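
For illustration, a Latin hypercube sample of a design space can be drawn with a general-purpose library such as SciPy; this is a generic sketch with made-up variable bounds, independent of Kimeme's own DoE implementation.

  from scipy.stats import qmc

  # Latin hypercube sample of 8 points in a 2-variable design space.
  sampler = qmc.LatinHypercube(d=2, seed=0)
  unit_sample = sampler.random(n=8)               # points in the unit square

  # Rescale to the actual variable bounds (here: width and height in metres).
  lower, upper = [0.01, 0.01], [0.10, 0.05]
  designs = qmc.scale(unit_sample, lower, upper)
  print(designs)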

Sensitivity analysis

Local sensitivity measures, such as correlation coefficients and partial derivatives, are meaningful only when the relationship between input and output is linear. If the relationship is nonlinear, global sensitivity analysis has to be used, based on the variance relationship between the input and output distributions, for example via the Sobol indices. With sensitivity analysis, the system complexity can be reduced and the cause-effect chain can be explained.[4][5]
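
As a concrete illustration of a variance-based measure, the first-order Sobol index S_i = Var(E[Y | X_i]) / Var(Y) can be estimated with a pick-freeze Monte Carlo scheme. The sketch below uses NumPy and a made-up two-variable model, and is independent of Kimeme's implementation.

  import numpy as np

  def model(x):
      # Made-up nonlinear model: Y = X1 + 2*X2^2 on the unit square.
      return x[:, 0] + 2.0 * x[:, 1] ** 2

  rng = np.random.default_rng(0)
  n, d = 100_000, 2
  A = rng.uniform(size=(n, d))
  B = rng.uniform(size=(n, d))
  yA, yB = model(A), model(B)
  var_y = np.var(np.concatenate([yA, yB]))

  # Saltelli-style estimator: AB_i is A with column i taken from B.
  for i in range(d):
      ABi = A.copy()
      ABi[:, i] = B[:, i]
      S_i = np.mean(yB * (model(ABi) - yA)) / var_y
      print(f"S_{i + 1} ~ {S_i:.2f}")             # roughly 0.19 for X1, 0.81 for X2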

Multi-objective optimization

In the development process of technical products there are usually several evaluation goals or criteria to be met, e.g. low cost, high quality and low noise. These criteria often conflict with each other, in the sense that improving one entails worsening at least one other. Design parameters therefore have to be chosen to achieve the best trade-off among the criteria. Unlike the single-objective case, in multi-objective optimization there is no unique solution, but rather a front of Pareto optimal solutions. Multi-objective optimization aims at finding these Pareto optimal solutions automatically.
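
The Pareto notion can be made concrete with a small generic Python sketch on toy data (both objectives are minimized): a candidate dominates another if it is at least as good in every objective and strictly better in at least one, and the Pareto front is the set of non-dominated candidates.

  def dominates(a, b):
      # a dominates b: no worse in all objectives, strictly better in one.
      return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

  def pareto_front(points):
      return [p for p in points
              if not any(dominates(q, p) for q in points if q is not p)]

  # Toy bi-objective candidates: (cost, noise), both to be minimized.
  candidates = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 4.0), (5.0, 6.0)]
  print(pareto_front(candidates))   # [(1.0, 9.0), (2.0, 7.0), (4.0, 4.0)]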

Notes and References

  1. Web site: www.cyberdyne.it, Cyber Dyne S.r.l.
  2. Wolpert, D.H. and Macready, W.G. (1997). "No Free Lunch Theorems for Optimization". IEEE Transactions on Evolutionary Computation, 1(1), 67-82. http://ti.arc.nasa.gov/m/profile/dhw/papers/78.pdf
  3. Neri, F. and Cotta, C. (2011). "A primer on memetic algorithms". In F. Neri, C. Cotta and P. Moscato (Eds.), Handbook of Memetic Algorithms. Springer, Studies in Computational Intelligence.
  4. Saltelli, A., Chan, K. and Scott, E.M. (2000). Sensitivity Analysis. John Wiley & Sons, Chichester / New York.
  5. Oakley, J.E. and O'Hagan, A. (2004). "Probabilistic Sensitivity Analysis of Computer Models: a Bayesian Approach". Journal of the Royal Statistical Society, Series B, 66, 751-769.