In artificial intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation,^{[1]} a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function). Evolution of the population then takes place through the repeated application of the above operators.
Evolutionary algorithms often perform well at approximating solutions to all types of problems because they ideally do not make any assumption about the underlying fitness landscape. Techniques from evolutionary algorithms applied to the modeling of biological evolution are generally limited to explorations of microevolutionary processes and planning models based upon cellular processes. In most real applications of EAs, computational complexity is a prohibiting factor.^{[2]} This computational complexity is due mostly to fitness function evaluation; fitness approximation is one way to overcome the difficulty. However, a seemingly simple EA can often solve complex problems; there may therefore be no direct link between algorithm complexity and problem complexity.
Implementation
Step One: Generate the initial population of individuals randomly. (First generation)
Step Two: Evaluate the fitness of each individual in that population.
Step Three: Repeat the following regenerational steps until termination (e.g. time limit, sufficient fitness achieved):
 Select the best-fit individuals for reproduction. (Parents)
 Breed new individuals through crossover and mutation operations to give birth to offspring.
 Evaluate the individual fitness of the new individuals.
 Replace the least-fit individuals of the population with the new individuals.
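The steps above can be sketched in Python. This is a minimal illustration, not any particular published EA: the OneMax objective (maximize the number of 1-bits), the population size, and the operator rates are all assumptions chosen for the example.

```python
import random

# Illustrative parameters (assumptions, not canonical values).
POP_SIZE, GENOME_LEN, GENERATIONS, MUT_RATE = 30, 16, 50, 0.05

def fitness(ind):
    # Toy objective (OneMax): count the 1-bits in the genome.
    return sum(ind)

def crossover(a, b):
    # One-point crossover between two parent genomes.
    cut = random.randint(1, GENOME_LEN - 1)
    return a[:cut] + b[cut:]

def mutate(ind):
    # Flip each bit independently with probability MUT_RATE.
    return [bit ^ 1 if random.random() < MUT_RATE else bit for bit in ind]

# Step One: generate the initial population randomly.
pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # Step Two: evaluate fitness; keep the best-fit half as parents.
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP_SIZE // 2]
    # Step Three: breed offspring via crossover and mutation,
    # replacing the least-fit half of the population.
    offspring = [mutate(crossover(*random.sample(parents, 2)))
                 for _ in range(POP_SIZE - len(parents))]
    pop = parents + offspring

best = max(pop, key=fitness)
```

Because the parents are carried over unchanged each generation, the best fitness found never decreases (an elitist scheme); other replacement strategies trade this guarantee for more exploration.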
Types
Similar techniques differ in genetic representation, in other implementation details, and in the nature of the particular problem to which they are applied.
 Genetic algorithm – This is the most popular type of EA. One seeks the solution of a problem in the form of strings of numbers (traditionally binary, although the best representations are usually those that reflect something about the problem being solved),^{[2]} by applying operators such as recombination and mutation (sometimes one, sometimes both). This type of EA is often used in optimization problems. Another name for it is fetura, from the Latin for breeding.^{[3]}
 Genetic programming – Here the solutions are in the form of computer programs, and their fitness is determined by their ability to solve a computational problem.
 Evolutionary programming – Similar to genetic programming, but the structure of the program is fixed and its numerical parameters are allowed to evolve.
 Gene expression programming – Like genetic programming, GEP also evolves computer programs, but it explores a genotype-phenotype system, where computer programs of different sizes are encoded in linear chromosomes of fixed length.
 Evolution strategy – Works with vectors of real numbers as representations of solutions, and typically uses self-adaptive mutation rates.
 Differential evolution – Based on vector differences and therefore primarily suited for numerical optimization problems.
 Neuroevolution – Similar to genetic programming, but the genomes represent artificial neural networks by describing structure and connection weights. The genome encoding can be direct or indirect.
 Learning classifier system – Here the solution is a set of classifiers (rules or conditions). A Michigan-LCS evolves at the level of individual classifiers, whereas a Pittsburgh-LCS uses populations of classifier sets. Initially, classifiers were only binary, but now include real, neural-net, or S-expression types. Fitness is typically determined with either a strength- or accuracy-based reinforcement learning or supervised learning approach.
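To make one of the types above concrete, here is a minimal differential evolution sketch in the common DE/rand/1/bin form, applied to a toy numerical problem. The sphere objective and the control parameters F and CR are illustrative assumptions, not canonical settings.

```python
import random

# Illustrative problem size and DE control parameters (assumptions).
DIM, NP, F, CR, GENERATIONS = 5, 20, 0.8, 0.9, 100

def sphere(x):
    # Toy objective: minimize the sum of squares (optimum at the origin).
    return sum(v * v for v in x)

# Random initial population of real-valued vectors.
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(NP)]

for _ in range(GENERATIONS):
    for i in range(NP):
        # Mutant vector: base vector plus a scaled difference of two
        # other randomly chosen population members.
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(DIM)]
        # Binomial crossover between the target and the mutant; jrand
        # guarantees at least one component comes from the mutant.
        jrand = random.randrange(DIM)
        trial = [mutant[d] if (random.random() < CR or d == jrand) else pop[i][d]
                 for d in range(DIM)]
        # Greedy selection: the trial replaces the target if no worse.
        if sphere(trial) <= sphere(pop[i]):
            pop[i] = trial

best = min(pop, key=sphere)
```

Because variation is built from differences between current population members, the search step size shrinks automatically as the population converges, which is why DE is primarily suited to continuous parameter optimization.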
Comparison to biological processes
A possible limitation of many evolutionary algorithms is their lack of a clear genotype-phenotype distinction. In nature, the fertilized egg cell undergoes a complex process known as embryogenesis to become a mature phenotype. This indirect encoding is believed to make the genetic search more robust (i.e. to reduce the probability of fatal mutations), and may also improve the evolvability of the organism.^{[4]}^{[5]} Such indirect (also known as generative or developmental) encodings also enable evolution to exploit the regularity in the environment.^{[6]} Recent work in the field of artificial embryogeny, or artificial developmental systems, seeks to address these concerns. Gene expression programming, for example, successfully explores a genotype-phenotype system, where the genotype consists of linear multigenic chromosomes of fixed length and the phenotype consists of multiple expression trees or computer programs of different sizes and shapes.^{[7]}
Related techniques
Swarm algorithms include:
 Ant colony optimization – Based on the foraging behaviour of ant colonies, and primarily suited for combinatorial optimization problems.
 Particle swarm optimization – Based on the social behaviour of flocking birds and schooling fish, and primarily suited for numerical optimization problems.
Other population-based metaheuristic methods
 Runner-root algorithm – Inspired by the function of runners and roots of plants in nature.^{[8]}
 Hunting search – A method inspired by the group hunting of some animals, such as wolves, that organize their positions to surround the prey, each relative to the position of the others and especially that of their leader. It is a continuous optimization method^{[9]} that has been adapted as a combinatorial optimization method.^{[10]}
 Adaptive dimensional search – Unlike nature-inspired metaheuristic techniques, the adaptive dimensional search algorithm does not implement any metaphor as an underlying principle. Rather, it uses a simple performance-oriented method, based on the update of the search dimensionality ratio (SDR) parameter at each iteration.^{[11]}
 Firefly algorithm – Inspired by the behavior of fireflies, which attract each other by flashing light. This is especially useful for multimodal optimization.
 Harmony search – Based on the ideas of musicians' behavior in searching for better harmonies. This algorithm is suitable for combinatorial optimization as well as parameter optimization.
 Gaussian adaptation – Based on information theory. Used for maximization of manufacturing yield, mean fitness or average information. See for instance Entropy in thermodynamics and information theory.
 Memetic algorithm – A hybrid method, inspired by Richard Dawkins' notion of a meme. It commonly takes the form of a population-based algorithm coupled with individual learning procedures capable of performing local refinements. It emphasizes the exploitation of problem-specific knowledge and tries to orchestrate local and global search in a synergistic way.
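To make the memetic idea concrete, here is a minimal sketch: a population-based loop whose offspring are each refined by individual learning, here a greedy bit-flip hill climber. The OneMax objective and all parameters are illustrative assumptions.

```python
import random

# Illustrative parameters (assumptions, not canonical values).
GENOME_LEN, POP_SIZE, GENERATIONS = 20, 10, 15

def fitness(ind):
    # Toy objective (OneMax): count the 1-bits in the genome.
    return sum(ind)

def local_refine(ind, steps=5):
    # Individual learning: greedy single-bit hill climbing that keeps
    # a neighbour only if it strictly improves fitness.
    for _ in range(steps):
        i = random.randrange(GENOME_LEN)
        neighbour = ind[:i] + [ind[i] ^ 1] + ind[i + 1:]
        if fitness(neighbour) > fitness(ind):
            ind = neighbour
    return ind

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Global search: keep the best-fit half as parents.
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP_SIZE // 2]
    children = []
    for _ in range(POP_SIZE - len(parents)):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, GENOME_LEN)
        child = a[:cut] + b[cut:]
        # Memetic step: refine each offspring with local search before
        # it re-enters the global population.
        children.append(local_refine(child))
    pop = parents + children
best = max(pop, key=fitness)
```

The interplay is the point of the design: crossover moves the search between distant regions, while the local refinement exploits each region before the next selection round.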
Examples
The computer simulations Tierra and Avida attempt to model macroevolutionary dynamics.
Gallery
A two-population EA search over a constrained Rosenbrock function; the global optimum is not bounded.^{[12]}^{[13]}^{[14]}
References
 ^ Vikhar, P. A. "Evolutionary algorithms: A critical review and its future prospects". Proceedings of the 2016 International Conference on Global Trends in Signal Processing, Information Computing and Communication (ICGTSPICC). Jalgaon, 2016, pp. 261–265. ISBN 9781509004676.
 ^ ^{a} ^{b} Cohoon, J.; et al. Evolutionary algorithms for the physical design of VLSI circuits (PDF). Advances in Evolutionary Computing: Theory and Applications. Springer, pp. 683–712, 2003. ISBN 9783540433309.
 ^ Wayward World, by Jon Roland. A novel that uses fetura to select candidates for public office.
 ^ G. S. Hornby and J. B. Pollack. "Creating high-level components with a generative representation for body-brain evolution". Artificial Life, 8(3):223–246, 2002.
 ^ Jeff Clune, Benjamin Beckmann, Charles Ofria, and Robert Pennock. "Evolving Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding". Proceedings of the IEEE Congress on Evolutionary Computation, Special Section on Evolutionary Robotics, 2009. Trondheim, Norway.
 ^ J. Clune, C. Ofria, and R. T. Pennock. "How a generative encoding fares as problem-regularity decreases", in PPSN (G. Rudolph, T. Jansen, S. M. Lucas, C. Poloni, and N. Beume, eds.), vol. 5199 of Lecture Notes in Computer Science, pp. 358–367, Springer, 2008.
 ^ Ferreira, C. (2001). "Gene Expression Programming: A New Adaptive Algorithm for Solving Problems". Complex Systems, Vol. 13, issue 2: 87–129.
 ^ F. Merrikh-Bayat. "The runner-root algorithm: A metaheuristic for solving unimodal and multimodal optimization problems inspired by runners and roots of plants in nature", Applied Soft Computing, Vol. 33, pp. 292–303, 2015.
 ^ R. Oftadeh et al. (2010). "A novel metaheuristic optimization algorithm inspired by group hunting of animals: Hunting search", 60, 2087–2098.
 ^ A. Agharghor and M. E. Riffi (2017). "First Adaptation of Hunting Search Algorithm for the Quadratic Assignment Problem", 520, 263–267. doi:10.1007/978-3-319-46568-5_27.
 ^ Hasançebi, O.; Kazemzadeh Azad, S. (2015). "Adaptive Dimensional Search: A New Metaheuristic Algorithm for Discrete Truss Sizing Optimization", Computers and Structures, 154, 1–16.
 ^ Simionescu, P. A.; Beale, D. G.; Dozier, G. V. (2004). Constrained optimization problem solving using estimation of distribution algorithms (PDF). Proc. of the 2004 Congress on Evolutionary Computation – CEC2004, Portland, OR, pp. 1647–1653, doi:10.1109/CEC.2006.1688506, retrieved 2017.
 ^ Simionescu, P. A.; Dozier, G. V.; Wainwright, R. L. (2006). A Two-Population Evolutionary Algorithm for Constrained Optimization Problems (PDF). Proc. 2006 IEEE International Conference on Evolutionary Computation, Vancouver, Canada, pp. 1647–1653, doi:10.1109/CEC.2006.1688506, retrieved 2017.
 ^ Simionescu, P. A. (2014). Computer Aided Graphing and Simulation Tools for AutoCAD Users (1st ed.). Boca Raton, FL: CRC Press. ISBN 9781482252903.
Bibliography
 Ashlock, D. (2006), Evolutionary Computation for Modeling and Optimization, Springer, ISBN 0387221964.
 Bäck, T. (1996), Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms, Oxford Univ. Press.
 Bäck, T., Fogel, D., Michalewicz, Z. (1997), Handbook of Evolutionary Computation, Oxford Univ. Press.
 Banzhaf, W., Nordin, P., Keller, R., Francone, F. (1998), Genetic Programming – An Introduction, Morgan Kaufmann, San Francisco.
 Eiben, A.E., Smith, J.E. (2003), Introduction to Evolutionary Computing, Springer.
 Holland, J. H. (1992), Adaptation in Natural and Artificial Systems, The University of Michigan Press, Ann Arbor
 Michalewicz Z., Fogel D.B. (2004). How To Solve It: Modern Heuristics, Springer.
 Benkő, A., Dósa, G., Tuza, Z. (2010), Bin Packing/Covering with Delivery, Solved with the Evolution of Algorithms, Proc. 2010 IEEE 5th International Conference on Bio-Inspired Computing: Theories and Applications, BIC-TA 2010, pp. 298–302.
 Poli, R.; Langdon, W. B.; McPhee, N. F. (2008). A Field Guide to Genetic Programming. Lulu.com, freely available from the internet. ISBN 9781409200734.
 Price, K., Storn, R.M., Lampinen, J.A., (2005). "Differential Evolution: A Practical Approach to Global Optimization", Springer.
 Ingo Rechenberg (1971): Evolutionsstrategie – Optimierung technischer Systeme nach Prinzipien der biologischen Evolution (PhD thesis). Reprinted by Frommann-Holzboog (1973).
 Hans-Paul Schwefel (1974): Numerische Optimierung von Computer-Modellen (PhD thesis). Reprinted by Birkhäuser (1977).
 Simon, D. (2013): Evolutionary Optimization Algorithms, Wiley.
 Computational Intelligence: A Methodological Introduction by Kruse, Borgelt, Klawonn, Moewes, Steinbrecher, Held, 2013, Springer, ISBN 9781447150121