Felix A. Sosa and Kenneth O. Stanley (2018)
Deep HyperNEAT: Evolving the Size and Depth of the Substrate
Evolutionary Complexity Research Group Undergraduate Research Report
University of Central Florida Department of Computer Science (9 pages).

Source code available at https://github.com/flxsosa/DeepHyperNEAT

Abstract  

This report describes DeepHyperNEAT, an extension of HyperNEAT that allows it to alter the topology of its indirectly-encoded neural network (called the substrate), so that the substrate can continue to grow and increase in complexity over the course of evolution.  This aim is accomplished by augmenting HyperNEAT's compositional pattern producing networks (CPPNs) with new information that allows them to represent substrate topology, and by adding three novel mutations to HyperNEAT that exploit this new information.  The purpose of this report is to detail the method and to validate its basic ability to evolve, which is demonstrated through a simple XOR-based test.  Significantly more work will be needed to analyze the algorithmic and scientific implications of these new capabilities, but releasing this report makes it possible for others to explore the opportunities that such an extension opens up.
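As a rough illustration of the general idea only (this is not the implementation from the repository above, and all names here are hypothetical), the sketch below shows a toy CPPN genome whose output nodes are keyed by pairs of substrate sheets, together with a mutation that adds a new sheet and corresponding output nodes so the encoded substrate can deepen over evolution.

```python
# Illustrative sketch only -- hypothetical names, not the DeepHyperNEAT API.
import random

class CPPNGenome:
    """Toy CPPN genome whose outputs are keyed by (source_sheet, target_sheet).

    Each output conceptually paints the weight pattern of one substrate
    connection, so adding an output node adds substrate structure.
    """

    def __init__(self):
        # Start with a two-sheet substrate: input sheet 0 -> output sheet 1.
        self.sheets = [0, 1]
        self.outputs = {(0, 1): random.uniform(-1.0, 1.0)}  # placeholder "pattern"

    def add_sheet_mutation(self):
        """Insert a new hidden sheet between the input and output sheets."""
        new_sheet = max(self.sheets) + 1
        self.sheets.append(new_sheet)
        # New CPPN outputs describe connectivity into and out of the new sheet.
        self.outputs[(0, new_sheet)] = random.uniform(-1.0, 1.0)
        self.outputs[(new_sheet, 1)] = random.uniform(-1.0, 1.0)

    def query(self, src, dst, x1, y1, x2, y2):
        """Return a (placeholder) weight for a connection between two sheets."""
        pattern = self.outputs.get((src, dst), 0.0)
        # A real CPPN would compute a function of the coordinates; here we
        # just scale a fixed value to keep the sketch self-contained.
        return pattern * (x1 * x2 + y1 * y2)

genome = CPPNGenome()
genome.add_sheet_mutation()
print(sorted(genome.outputs))  # e.g. [(0, 1), (0, 2), (2, 1)]
```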