INMAMUSYS: Intelligent multiagent music system

Published in Expert Systems with Applications, Vol. 36(3), pp. 4574-4580, 2009

Recommended citation: Delgado, M., Fajardo, W. & Molina-Solana, M. (2009), "INMAMUSYS: Intelligent multiagent music system", Expert Systems with Applications Vol. 36(3), pp. 4574-4580. http://doi.org/10.1016/j.eswa.2008.05.028

Download paper here

Abstract: Music generation is a complex task, even for human beings. This paper describes a two-level competitive/collaborative multiagent approach to autonomous, non-deterministic computer music composition. Our aim is to build a highly modular system that composes music on its own, using Expert Systems technology and rule-based system principles. To do so, rules derived from musical knowledge are used and emotional inputs from the users are introduced; in fact, users are not allowed to directly control the composition process. Two main goals are pursued: investigating the relationship between computers and emotions and how the latter can be represented in the former, and developing a framework for music composition that can be useful for future experiments. The system has been successfully tested by asking several people to match compositions with suggested emotions.
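
To give a flavour of the kind of architecture the abstract describes (rule-based composer agents guided by an emotional input, plus a second level that selects among their outputs), here is a minimal, hypothetical sketch in Python. All class names, rules, and emotion-to-parameter mappings below are illustrative assumptions for this page only; they are not taken from the INMAMUSYS paper.

```python
# Hypothetical sketch of a two-level, rule-based multiagent composer.
# The rules and mappings are illustrative assumptions, not the paper's.
import random
from dataclasses import dataclass

# Toy emotion-to-parameter rules (MIDI pitch sets and tempo), purely illustrative.
EMOTION_RULES = {
    "joy":     {"scale": [60, 62, 64, 65, 67, 69, 71], "tempo": 140},  # C major
    "sadness": {"scale": [60, 62, 63, 65, 67, 68, 70], "tempo": 70},   # C minor
}

@dataclass
class Phrase:
    notes: list
    tempo: int

class ComposerAgent:
    """Lower level: generates candidate phrases from emotion-driven rules."""
    def __init__(self, seed: int):
        self.rng = random.Random(seed)  # each agent has its own randomness

    def compose(self, emotion: str, length: int = 8) -> Phrase:
        rules = EMOTION_RULES[emotion]
        notes = [self.rng.choice(rules["scale"]) for _ in range(length)]
        return Phrase(notes=notes, tempo=rules["tempo"])

class CriticAgent:
    """Upper level: scores candidates so that agents compete for selection."""
    def score(self, phrase: Phrase) -> float:
        # Toy heuristic: prefer stepwise motion (small intervals between notes).
        jumps = [abs(a - b) for a, b in zip(phrase.notes, phrase.notes[1:])]
        return -sum(jumps)

def compose_piece(emotion: str, n_agents: int = 4) -> Phrase:
    composers = [ComposerAgent(seed=i) for i in range(n_agents)]
    critic = CriticAgent()
    candidates = [c.compose(emotion) for c in composers]  # agents generate in parallel
    return max(candidates, key=critic.score)              # best candidate is selected

if __name__ == "__main__":
    best = compose_piece("sadness")
    print(best.tempo, best.notes)
```

Note that the user only supplies an emotion label; the composition itself emerges from the rules and the interaction between the two agent levels, which mirrors the "no direct user control" idea stated in the abstract.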

BibTeX: @article{Delgado2009a, author = {Miguel Delgado and Waldo Fajardo and Miguel Molina-Solana}, title = {INMAMUSYS: Intelligent multiagent music system}, journal = {Expert Systems with Applications}, year = {2009}, volume = {36}, number = {3}, pages = {4574--4580}, doi = {10.1016/j.eswa.2008.05.028} }