This paper introduces NEAT (NeuroEvolution of Augmenting Topologies), a genetic algorithm for evolving neural network topologies. NEAT evolves both the structure and the weights of the networks.
> The weight space is explored through the crossover of network weight vectors and through the mutation of single networks’ weights.
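To make the quoted exploration concrete, here is a minimal sketch of the two operators on flat weight vectors. The function names and the uniform-crossover choice are my own illustration; NEAT itself aligns genes by historical innovation number before crossing over, which is omitted here.

```python
import random

def crossover(weights_a, weights_b):
    # Uniform crossover: each offspring weight comes from either parent.
    # (NEAT first aligns genes by innovation number; not shown here.)
    return [a if random.random() < 0.5 else b
            for a, b in zip(weights_a, weights_b)]

def mutate(weights, rate=0.8, sigma=0.5):
    # Perturb each weight with probability `rate` by Gaussian noise.
    return [w + random.gauss(0.0, sigma) if random.random() < rate else w
            for w in weights]

child = mutate(crossover([0.1, -0.4, 0.7], [0.3, 0.2, -0.5]))
```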
## Evaluation
* XOR problem (see the fitness sketch after this list)
* Reinforcement learning: balance two poles attached to a cart by moving the cart in appropriate directions to keep the poles from falling
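As an illustration of the XOR task, a fitness function might look like the sketch below. The `network.activate(inputs)` call is an assumed interface for a candidate network, and the squared shaping follows the paper's XOR experiment.

```python
def xor_fitness(network):
    # Sum of absolute errors over the four XOR input/output cases.
    # `network.activate(inputs)` returning a float is an assumed interface.
    cases = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
             ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]
    error = sum(abs(network.activate(inputs) - target)
                for inputs, target in cases)
    # Squared shaping: fitness rises steeply as the error falls.
    return (4.0 - error) ** 2
```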
## Topology Encoding
* Direct Encodings
* Bit string: A bit string encodes the connection matrix / matrices. First used in Structured Genetic Algorithm (sGA). Limitations: crossover of bit strings is unlikely to produce meaningful offspring, and the encoding fixes the maximum network size in advance (see the sketch after this list).
* Graph encoding: First used in Parallel Distributed Genetic Programming (PDGP).
* Indirect Encodings
* Cellular Encoding (CE): genomes are programs written in a specialized graph transformation language
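For a feel of the bit-string direct encoding, the sketch below decodes a flat genome into a connection matrix. This illustrates the idea rather than sGA's exact scheme; `decode_bitstring` is a hypothetical name.

```python
import numpy as np

def decode_bitstring(bits, n_nodes):
    # Interpret the genome as a row-major adjacency matrix:
    # entry (i, j) == 1 means a connection from node i to node j.
    assert len(bits) == n_nodes * n_nodes, "genome length must be n_nodes^2"
    return np.array(bits, dtype=int).reshape(n_nodes, n_nodes)

# 3-node example: node 0 feeds nodes 1 and 2, node 1 feeds node 2.
matrix = decode_bitstring([0, 1, 1,
                           0, 0, 1,
                           0, 0, 0], n_nodes=3)
```

Note how the genome length hard-codes the number of nodes, which is exactly the fixed-size limitation mentioned above.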
## Glossary
For people (like me) who are new to genetic algorithms (GAs):
* [Neuroevolution](https://en.wikipedia.org/wiki/Neuroevolution) (NE): a form of machine learning that uses evolutionary algorithms to train artificial neural networks
* [crossover](https://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)): a genetic operator used to vary the programming of a chromosome or chromosomes from one generation to the next
* [Speciation](https://en.wikipedia.org/wiki/Speciation): the evolutionary process by which biological populations evolve to become distinct species
* TWEANNs: Topology and Weight Evolving Artificial Neural Networks
## Related
* 2009, K. O. Stanley, D. B. D’Ambrosio and J. Gauci: [A Hypercube-Based Indirect Encoding for Evolving Large-Scale Neural Networks](http://www.shortscience.org/paper?bibtexKey=journals/alife/StanleyDG09): introduces HyperNEAT