Amy K. Hoover, Paul A. Szerlip, and Kenneth O. Stanley (2011)
Generating Musical Accompaniment through Functional Scaffolding
In: Proceedings of the 8th Sound and Music Computing Conference (SMC-2011, Padova, Italy).

Note: This paper is accompanied by a set of musical samples at http://eplex.cs.ucf.edu/fsmc/smc2011/.

Abstract  

A popular approach to music generation in recent years is to extract rules and statistical relationships by analyzing a large corpus of musical data. The aim of this paper is to present an alternative to such data-intensive techniques. The main idea, called functional scaffolding for musical composition (FSMC), exploits a simple yet powerful property of multipart compositions: the patterns of notes and rhythms in different instrumental parts of the same song are functionally related. That is, in principle, one part can be expressed as a function of another. The utility of this insight is validated by an application that assists the user in exploring the space of possible accompaniments to preexisting parts through a process called interactive evolutionary computation. In effect, without the need for musical expertise, the user explores transforming functions that yield plausible accompaniments derived from preexisting parts. In fact, a survey of listeners shows that participants cannot distinguish songs with computer-generated parts from those that are entirely human composed. Thus this one simple mathematical relationship yields surprisingly convincing results even without any real musical knowledge programmed into the system. With future refinement, FSMC might lead to practical aids for novices aiming to fulfill incomplete musical visions.
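To make the core idea concrete, here is a minimal hypothetical sketch (not the authors' implementation) of expressing an accompaniment as a function of a preexisting part. The note representation and the particular transforming function below are illustrative assumptions; in FSMC the transforming function is evolved interactively by the user rather than fixed by hand.

```python
# Hypothetical sketch of FSMC's core insight: an accompaniment part
# derived by applying a transforming function to an existing part.

def transform(pitch, duration):
    """A stand-in transforming function: harmonize four semitones
    below the melody and halve each note's duration."""
    return pitch - 4, duration / 2

def accompany(melody):
    """Derive an accompaniment by mapping the transforming function
    over each (pitch, duration) event of the preexisting part."""
    return [transform(p, d) for p, d in melody]

# A toy melody as (MIDI pitch, duration in beats) pairs.
melody = [(60, 1.0), (64, 0.5), (67, 0.5)]
print(accompany(melody))  # [(56, 0.5), (60, 0.25), (63, 0.25)]
```

Because the accompaniment is a pure function of the existing part, it inherits that part's structure automatically; the user's role reduces to searching among candidate transforming functions rather than composing notes directly.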