Jason J. Gauci and Kenneth O. Stanley (2007)
Generating Large-Scale Neural Networks Through Discovering Geometric Regularities
In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2007). New York, NY: ACM, 2007 (8 pages)

Connectivity patterns in biological brains exhibit many repeating motifs. This repetition mirrors inherent geometric regularities in the physical world. For example, stimuli that excite adjacent locations on the retina map to neurons that are similarly adjacent in the visual cortex. In this way, neural connectivity can exploit geometric locality in the outside world by employing local connections in the brain. If such regularities could be discovered by methods that evolve artificial neural networks (ANNs), then they could be similarly exploited to solve problems that would otherwise require optimizing over too many dimensions to be tractable. This paper introduces such a method, called Hypercube-based Neuroevolution of Augmenting Topologies (HyperNEAT), which evolves a novel generative encoding called connective Compositional Pattern Producing Networks (connective CPPNs) to discover geometric regularities in the task domain. Connective CPPNs encode connectivity patterns as concepts that are independent of the number of inputs or outputs, allowing functional large-scale neural networks to be evolved. In this paper, this approach is tested in a simple visual task for which it effectively discovers the correct underlying regularity, allowing the solution to both generalize and scale without loss of function to an ANN of over eight million connections.
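The core mechanism can be illustrated with a small sketch: a connective CPPN is a function that takes the coordinates of two substrate nodes, (x1, y1) and (x2, y2), and returns the weight of the connection between them, so the same pattern defines a network at any resolution. The CPPN below is a hypothetical hand-built example (a Gaussian of distance, favoring local connections, as the paper's retina example suggests); in HyperNEAT itself the CPPN's structure is evolved, not written by hand, and the function names and threshold value here are illustrative assumptions.

```python
import math

def cppn(x1, y1, x2, y2):
    # Hypothetical hand-built CPPN: a Gaussian of the distance between
    # the two endpoints, which expresses a locality regularity --
    # nearby nodes connect strongly, distant nodes weakly.
    d2 = (x2 - x1) ** 2 + (y2 - y1) ** 2
    return math.exp(-d2)

def build_substrate(resolution, threshold=0.2):
    # Query the CPPN for every pair of nodes on a resolution x resolution
    # grid in [0, 1]^2; keep only connections whose weight magnitude
    # exceeds the threshold (an assumed cutoff for illustration).
    coords = [(x / (resolution - 1), y / (resolution - 1))
              for x in range(resolution) for y in range(resolution)]
    connections = {}
    for src in coords:
        for dst in coords:
            w = cppn(*src, *dst)
            if abs(w) > threshold:
                connections[(src, dst)] = w
    return connections

# Because the CPPN is independent of the number of nodes, raising the
# resolution scales the substrate network without re-evolving the pattern.
net_small = build_substrate(3)
net_large = build_substrate(5)
```

This independence from substrate size is what lets the evolved solution in the paper scale to an ANN of over eight million connections: the same CPPN is simply queried over a denser grid of node coordinates.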