Amy K. Hoover, Paul A. Szerlip, Marie E. Norton, Trevor A. Brindle, Zachary Merritt, and Kenneth O. Stanley (2012)
Generating a Complete Multipart Musical Composition from a Single Monophonic Melody with Functional Scaffolding
In: Proceedings of the Third International Conference on Computational Creativity (ICCC-2012, Dublin, Ireland).

Note: This paper is accompanied by a set of musical samples at http://eplex.cs.ucf.edu/fsmc/iccc2012/.

Abstract  

This paper advances the state of the art in a computer-assisted approach to music generation called functional scaffolding for musical composition (FSMC), whose representation facilitates the creative combination, exploration, and transformation of musical concepts. Music in FSMC is represented as a functional relationship between an existing human composition, or scaffold, and a generated accompaniment. This relationship is encoded by a type of artificial neural network called a compositional pattern producing network (CPPN). A human user without any musical expertise can then explore how the accompaniment should relate to the scaffold through an interactive evolutionary process akin to animal breeding. While the power of such a functional representation has previously been shown to constrain the search to plausible accompaniments, this study goes further by showing that the user can tailor complete multipart arrangements from only a single monophonic track, thus enabling creativity without the need for musical expertise.
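To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of representing an accompaniment as a function of the scaffold. A toy CPPN-style network with mixed activation functions (sine, Gaussian, sigmoid, a hallmark of CPPNs) maps each scaffold note to an accompaniment pitch; the `make_cppn`, `accompany`, and weight details here are illustrative assumptions, and in FSMC the network would be shaped by the user's interactive evolution rather than fixed random weights.

```python
import math
import random

# Toy CPPN-style network: each hidden node applies a distinct activation,
# a hallmark of CPPNs (sine, Gaussian, sigmoid). Weights are random here;
# in FSMC they would instead be evolved interactively by the user.
ACTIVATIONS = [
    math.sin,
    lambda x: math.exp(-x * x),        # Gaussian
    lambda x: 1 / (1 + math.exp(-x)),  # sigmoid
]

def make_cppn(seed=0):
    """Build a tiny fixed-topology network with seeded random weights."""
    rng = random.Random(seed)
    w_in = [[rng.uniform(-1, 1) for _ in range(2)] for _ in ACTIVATIONS]
    w_out = [rng.uniform(-1, 1) for _ in ACTIVATIONS]

    def cppn(time, scaffold_pitch):
        hidden = [act(w[0] * time + w[1] * scaffold_pitch)
                  for act, w in zip(ACTIVATIONS, w_in)]
        return sum(w * h for w, h in zip(w_out, hidden))

    return cppn

def accompany(melody, cppn):
    """Map each (time, pitch) of the monophonic scaffold through the CPPN:
    the accompaniment is a *function* of the scaffold, so it inherits the
    scaffold's temporal structure."""
    return [round(60 + 12 * cppn(t / len(melody), (p - 60) / 12))
            for t, p in enumerate(melody)]

melody = [60, 62, 64, 65, 67, 65, 64, 62]  # simple MIDI scaffold in C major
print(accompany(melody, make_cppn(seed=42)))
```

Because the accompaniment is derived functionally from the scaffold rather than generated independently, the same network applied to a different melody yields a different but structurally related accompaniment, which is what constrains the search to plausible results.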