Robert Wechsler. MotionComposer
Keywords: music technology and therapy, motion capture, dance therapy, interactive music, interactive dance


Video-based motion-capture technology makes it possible to create music without touching a musical instrument, simply by gesturing in space. With a focus on therapeutic and pedagogical contexts, the MotionComposer (MC) is one of the tools currently under development that allows users of all kinds to participate, including persons with diverse abilities. The aim of the MotionComposer is not to provide easy play, but to investigate the muscular-musical relationships that underlie healthy expression in dance and music. After briefly describing the device, this article discusses some therapeutic and pedagogical aspects of its use, including its ease of use; the possibility of adapting the device to the movement of different parts of the body; the possibility of programming different modes of use; the causality that allows users to understand the relationship between gesture and sound; the possibility of playing aesthetically interesting music regardless of a person's abilities; and the possibility of use with multiple users.
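The MotionComposer's internal mapping between gesture and sound is not detailed in this abstract. As a purely illustrative sketch of the general principle behind video-based motion-to-music devices — estimating quantity of motion by frame differencing and mapping it to MIDI-style sound parameters — one might write (all function names, thresholds, and ranges here are assumptions, not the MC's actual design):

```python
import numpy as np

def motion_amount(prev_frame, frame, threshold=30):
    """Fraction of pixels whose brightness changed by more than `threshold`
    between two grayscale video frames (a crude quantity-of-motion measure)."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return float((diff > threshold).mean())

def map_to_sound(amount, base_pitch=60, pitch_range=24):
    """Map a motion amount in [0, 1] to a MIDI-style (pitch, velocity) pair:
    more movement -> higher pitch and louder note."""
    pitch = base_pitch + int(round(amount * pitch_range))
    velocity = int(round(amount * 127))
    return pitch, velocity

# Example: a still scene produces silence, a large change a loud high note.
prev = np.zeros((4, 4), dtype=np.uint8)       # dark frame
cur = np.full((4, 4), 255, dtype=np.uint8)    # bright frame
print(map_to_sound(motion_amount(prev, prev)))  # no motion
print(map_to_sound(motion_amount(prev, cur)))   # maximal motion
```

This kind of direct, low-latency mapping is what gives such systems the causal transparency the article emphasizes: the user can immediately hear the consequence of a gesture.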




