Amy K. Hoover, Paul A. Szerlip, and Kenneth O. Stanley (2011) 
Interactively Evolving Harmonies through Functional Scaffolding
In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2011). New York, NY: ACM (8 pages)

Winner of the Best Paper Award in Digital Entertainment Technologies and Arts at GECCO-2011

Note: This paper is accompanied by a set of musical samples at http://eplex.cs.ucf.edu/fsmc/gecco2011/.

Abstract
While the real-time focus of today's automated accompaniment generators can benefit instrumentalists and vocalists in their practice, improvisation, or performance, an opportunity remains specifically to assist novice composers. This paper introduces such an approach, called functional scaffolding for musical composition (FSMC), which is based on evolutionary computation and helps the user explore potential accompaniments for existing musical pieces, or scaffolds. The key idea is to produce the accompaniment as a function of the scaffold, thereby inheriting its inherent style and texture. To implement this idea, accompaniments are represented by a special type of neural network called a compositional pattern producing network (CPPN), which produces harmonies by elaborating on and exploiting regularities in pitches and rhythms found in the scaffold. This paper focuses on how inexperienced composers can personalize accompaniments by first choosing any MIDI scaffold, then selecting which parts (e.g., piano, guitar, or bass guitar) the CPPN can hear, and finally customizing and refining the computer-generated accompaniment through an interactive process of selection and mutation of CPPNs called interactive evolutionary computation (IEC). The potential of this approach is demonstrated by following the evolution of a specific accompaniment and studying whether listeners appreciate the results.
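
To make the functional relationship concrete, the sketch below illustrates, in a deliberately simplified form that is not the authors' implementation, how an accompaniment can be computed as a function of a scaffold and then refined by selection and mutation. The fixed-topology network here stands in for a full CPPN (which in practice evolves its topology and activation functions), the random choice stands in for the human listener's selection in IEC, and all names (TinyNet, accompany, mutate) are hypothetical.

```python
import random
import math

# Toy stand-in for a CPPN: a tiny fixed-topology network mapping scaffold
# features at each time step (pitch, beat position) to an accompaniment pitch
# offset. A real CPPN evolves its topology and activation functions; this
# simplification only illustrates "accompaniment as a function of the scaffold."
class TinyNet:
    def __init__(self, n_in=2, n_hidden=4):
        self.w1 = [[random.gauss(0, 1) for _ in range(n_in)] for _ in range(n_hidden)]
        self.w2 = [random.gauss(0, 1) for _ in range(n_hidden)]

    def activate(self, inputs):
        hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs))) for row in self.w1]
        return math.tanh(sum(w * h for w, h in zip(self.w2, hidden)))

    def mutate(self, sigma=0.3):
        child = TinyNet()
        child.w1 = [[w + random.gauss(0, sigma) for w in row] for row in self.w1]
        child.w2 = [w + random.gauss(0, sigma) for w in self.w2]
        return child


def accompany(net, scaffold):
    """Map each scaffold note (MIDI pitch, beat position) to an accompaniment pitch."""
    out = []
    for pitch, beat in scaffold:
        offset = net.activate([pitch / 127.0, beat])   # normalized scaffold inputs
        out.append(int(pitch + round(offset * 12)))    # shift within roughly one octave
    return out


# A hypothetical scaffold: (MIDI pitch, position within the measure).
scaffold = [(60, 0.0), (64, 0.25), (67, 0.5), (72, 0.75)]

# Interactive evolution, with the human listener replaced here by a stub that
# picks a candidate at random; in IEC the user's ear supplies the selection.
population = [TinyNet() for _ in range(8)]
for generation in range(3):
    candidates = [(net, accompany(net, scaffold)) for net in population]
    chosen, harmony = random.choice(candidates)        # user would listen and choose
    print(f"generation {generation}: {harmony}")
    population = [chosen] + [chosen.mutate() for _ in range(7)]
```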