Sebastian Risi and Kenneth O. Stanley (2011)
Enhancing ES-HyperNEAT to Evolve More Complex Regular Neural Networks
In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2011). New York, NY: ACM (8 pages)

Nominated for Best Paper Award in Generative and Developmental Systems at GECCO-2011

Abstract  

The recently introduced evolvable-substrate HyperNEAT algorithm (ES-HyperNEAT) demonstrated that the placement and density of hidden nodes in an artificial neural network can be determined from implicit information in an infinite-resolution pattern of weights, thereby avoiding the need to evolve explicit placement. However, ES-HyperNEAT is computationally expensive because it must search the entire hypercube, and it was shown only to match the performance of the original HyperNEAT on a simple benchmark problem. Iterated ES-HyperNEAT, introduced in this paper, reduces computational costs by focusing the search on a sequence of two-dimensional cross-sections of the hypercube, thereby making it possible to search the hypercube at a finer resolution. A series of experiments and an analysis of the evolved networks show for the first time that iterated ES-HyperNEAT not only matches but outperforms the original HyperNEAT in more complex domains, because ES-HyperNEAT can evolve networks with limited connectivity, elaborate on existing network structure, and compensate for movement of information within the hypercube.
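To illustrate the core idea of deriving node placement from the weight pattern itself, the following is a minimal sketch of variance-driven quadtree subdivision over one two-dimensional cross-section. The weight function, variance threshold, and depth limit below are illustrative stand-ins, not the paper's actual CPPN or parameters: in ES-HyperNEAT the weights come from an evolved CPPN, whereas here a fixed toy function is used.

```python
import statistics

# Toy stand-in for an evolved CPPN: a smooth weight pattern over one
# 2D cross-section of the hypercube. Purely illustrative.
def weight(x, y):
    return x * y

def subdivide(x, y, width, depth, max_depth=4, var_threshold=0.01):
    """Recursively divide a square region of the cross-section.
    Where the local weight variance is low the region is considered
    uniform and contributes one representative point; where variance
    is high, the region is split further, yielding a denser sampling
    exactly where the weight pattern holds more information."""
    half = width / 2.0
    # Sample the centers of the four quadrants of this region.
    centers = [(x - half / 2, y - half / 2), (x + half / 2, y - half / 2),
               (x - half / 2, y + half / 2), (x + half / 2, y + half / 2)]
    samples = [weight(cx, cy) for cx, cy in centers]
    if depth >= max_depth or statistics.pvariance(samples) < var_threshold:
        return [(x, y)]  # uniform enough: keep a single point here
    points = []
    for cx, cy in centers:
        points.extend(subdivide(cx, cy, half, depth + 1,
                                max_depth, var_threshold))
    return points

# Candidate hidden-node positions for the cross-section [-1, 1] x [-1, 1].
points = subdivide(0.0, 0.0, 2.0, 0)
```

Running the sketch yields more candidate points in regions where the toy weight pattern varies strongly, mirroring how ES-HyperNEAT places hidden nodes where the hypercube's weight pattern carries the most information.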