This project explores interactivity in networked music, exploiting innovative mapping schemes to extend common musical instruments into acoustic controllers of digital sound. Sensing was limited to piezo-element contact microphones, which provide a simple and inexpensive way to capture an acoustic signal largely free of environmental noise and crosstalk. The main idea was to employ the feedback that results from processing a particular system as a control signal, allowing interaction with different musical systems over a network. A further goal was to preserve the traditional techniques with which casual musicians already play their instruments, so as to minimize the learning process that new interfaces require and to maximize the usability of the instruments' extensions. The extension itself is both reactive and transformative. We chose to extend a guitar and a djembe, as an example of a common ensemble in casual social contexts.
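The mapping described above can be illustrated with a minimal sketch: a one-pole envelope follower turns the amplitude of a contact-microphone signal into a slowly varying control value, which can then be packed into a network message. The function names, time constants, and binary packing below are illustrative assumptions, not the project's actual implementation:

```python
import math
import struct

def envelope(samples, attack=0.01, release=0.1, sr=44100):
    """One-pole envelope follower: fast attack, slower release.

    Returns a control signal that tracks the amplitude of the
    input, suitable for driving a parameter of a digital process.
    """
    a = math.exp(-1.0 / (attack * sr))
    r = math.exp(-1.0 / (release * sr))
    env = 0.0
    out = []
    for s in samples:
        x = abs(s)
        coeff = a if x > env else r   # rise quickly, fall slowly
        env = coeff * env + (1.0 - coeff) * x
        out.append(env)
    return out

# Simulated piezo burst: a decaying 440 Hz tone standing in for a
# plucked string or a drum hit picked up by the contact microphone.
sr = 44100
burst = [math.sin(2 * math.pi * 440 * n / sr) * math.exp(-n / (0.05 * sr))
         for n in range(4096)]
env = envelope(burst, sr=sr)

# Pack the latest control value as a 4-byte float, e.g. for a UDP
# datagram addressed to another musician's system on the network.
packet = struct.pack("!f", env[-1])
```

In a real-time setting the same follower would run block by block on the live input, and the packed value would be sent periodically rather than once.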
This journal is licensed under the terms of the CC BY 4.0 license (https://creativecommons.org/licenses/by/4.0/legalcode).
Hsu, W. (2006). Managing gesture and timbre for analysis and instrument control in an interactive environment. In Proceedings of the 2006 International Conference on New Interfaces for Musical Expression, NIME-2006, Paris, France (pp. 376-379). Retrieved from http://recherche.ircam.fr/equipes/temps-reel/nime06/proc/nime2006_376.pdf.
Jordà, S. (2003). Interactive music systems for everyone: Exploring visual feedback as a way for creating more intuitive, efficient and learnable instruments. In Proceedings of the Stockholm Music Acoustics Conference, August 6-9, SMAC'03. Stockholm, Sweden: KTH. Retrieved from http://mtg.upf.edu/files/publications/smac03jorda.pdf.
Lippe, C. (1993). A composition for clarinet and real-time signal processing: Using Max on the IRCAM signal processing workstation. In Proceedings of the 10th Italian Colloquium on Computer Music, Milan, Italy (pp. 428-432).
Machover, T. (1992). Hyperinstruments: A progress report 1987-1991 [Technical report]. Cambridge, MA: Massachusetts Institute of Technology.
O'Callaghan, C. (2007). Sounds. New York, NY: Oxford University Press.
Puckette, M. (2011). Re-purposing musical instruments as synthesizer controllers [Video]. Retrieved from http://www.mat.ucsb.edu/595M/?p=200.
Puckette, M., Apel, T., & Zicarelli, D. (1998). Real-time audio analysis tools for Pd and MSP. In Proceedings of the International Computer Music Conference, ICMC'98 (pp. 109-112). San Francisco, CA: International Computer Music Association.
Riikonen, T. (2003). Shaken or stirred: Virtual reverberation spaces and transformative gender identities in Kaija Saariaho's NoaNoa (1992) for flute and electronics. Organised Sound, 8(1), 109-115.
Rowe, R. (1993). Interactive music systems. Cambridge, MA: MIT Press.
Sethares, W.A. (1997). Tuning, timbre, spectrum, scale. New York, NY: Springer.
O'Connell, J., Román, C., & Schemele, T. (2012). Interactive feedback on extended instruments [Video]. Retrieved from http://youtu.be/lyzwpoleNPQ.