Research in VR music and interaction falls into several definable categories. The list below is not a comprehensive account of all work in the field, but rather a set of representative exemplars:
+ Controlling musical characteristics of a pre-existing composition [1]
+ Mixer-style control of spatial audio characteristics for pre-recorded sound sources, with optional control of effects [2]
+ Virtual audio environments with multi-process 3D instruments [4, 5]
+ Virtual instruments: virtual representations of instruments and synthesis control [3, 6, 7]
+ Virtual object manipulation with parameterised sound output [8, 9]
Many of these implementations offer novel interaction methods coupled with creative feedback and visualisation. However, many of the systems require considerable training before a user can perform with them, even though the basics of control can reportedly be learned quite quickly. This presents a problem for the target audience of the Objects project, where more immediate control and enjoyment is required. A combination of musical composition control and spatial audio will therefore be explored, using simplified musical interaction that allows novice users to learn within the experience itself, although the control method and interaction metaphors differ considerably from the work presented in [1] and [2].
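As a rough illustration of the kind of simplified, learn-as-you-play mapping intended here, the sketch below maps a virtual object's position to a scale-quantised pitch and to basic spatial audio parameters (stereo pan and distance-based gain). It is a minimal sketch only: the names `object_to_sound` and `C_MAJOR_SCALE`, and the specific mappings, are hypothetical assumptions for this example and are not taken from the cited systems or from the final Objects implementation.

```python
import math

# Hypothetical scale used to quantise pitch, so that any position a novice
# chooses still yields an in-key note (an assumption for this sketch).
C_MAJOR_SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers

def object_to_sound(position, listener=(0.0, 0.0, 0.0)):
    """Map a virtual object's 3D position to simple musical and spatial
    parameters: height -> quantised pitch, azimuth -> stereo pan,
    distance -> gain. All mappings are illustrative assumptions."""
    x, y, z = position
    lx, _ly, lz = listener

    # Height (assumed 0..2 m reach) selects a scale degree rather than a raw
    # frequency, keeping the interaction forgiving for novice users.
    degree = min(len(C_MAJOR_SCALE) - 1,
                 max(0, int((y / 2.0) * len(C_MAJOR_SCALE))))
    midi_note = C_MAJOR_SCALE[degree]

    # Horizontal angle relative to the listener gives a stereo pan in [-1, 1].
    pan = math.sin(math.atan2(x - lx, z - lz))

    # Simple inverse-distance attenuation, clamped to avoid silence/clipping.
    distance = math.dist(position, listener)
    gain = max(0.05, min(1.0, 1.0 / (1.0 + distance)))

    return {"midi_note": midi_note, "pan": pan, "gain": gain}

# Example: an object held at head height, slightly to the right of the listener.
print(object_to_sound((0.5, 1.5, 1.0)))
```

Quantising pitch to a scale is one way of keeping the musical side immediately rewarding for novices, while the spatial parameters remain continuous and directly tied to object manipulation.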
[1] Xavier Rodet, Jean-Philippe Lambert, Thomas Gaudy, and Florian Gosselin. Study of haptic and visual interaction for sound and music control in the Phase project. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 109–114, 2005.
[2] Mike Wozniewski, Zack Settel, and Jeremy R. Cooperstock. A spatial interface for audio and music production. Proceedings of the International Conference on Digital Audio Effects (DAFx), pages 18–21, 2006.
[3] Teemu Mäki-Patola, Juha Laitinen, Aki Kanerva, and Tapio Takala. Experiments with virtual reality instruments. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 11–16, 2005.
[4] Leonel Valbom and Adérito Marcos. WAVE: Sound and music in an immersive environment. Computers & Graphics, 29(6):871–881, 2005.
[5] Florent Berthaut, Myriam Desainte-Catherine, and Martin Hachet. DRILE: An immersive environment for hierarchical live-looping. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 192–197, 2010.
[6] S. Gelineck. Virtual reality instruments capable of changing dimensions in real-time. 2005.
[7] Axel Mulder, Sidney Fels, and Kenji Mase. Design of virtual 3D instruments for musical interaction. Proceedings of Graphics Interface, 1999.
[8] Axel Mulder. Getting a GRIP on alternate controllers: Addressing the variability of gestural expression in musical instrument design. Leonardo Music Journal, pages 33–40, 1996.
[9] Axel Mulder, Sidney S. Fels, and Kenji Mase. Mapping virtual object manipulation to sound variation. IPSJ SIG Notes, 97(122):63–68, 1997.