This post mops up some of the more interesting research and stimuli used in the development of a VR music interface. The items below range from literal transpositions of existing composition environments into VR to more experimental concepts in immersive music composition.
The Pensato Fissure project has achieved wide recognition within dance music communities and internet publications for its immersive VR approach to performing and composing with Ableton Live. The project has undergone many developments in interface and input methods. It began as a Masters project in Design and is still active; the author's Showtime GitHub repository is particularly useful for syncing with Ableton, as it effectively exposes all available Live Objects to another system, enabling rapid development of music interfaces that draw on the underlying power and flexibility of the Ableton Live API.
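To make the idea concrete, here is a minimal sketch of what "exposing Live Objects to another system" can look like: every parameter of a (mocked) Live set is flattened into an addressable path that a remote client could list, read, and write. Showtime's actual API differs; the class name, path scheme, and mock data below are all illustrative assumptions, not taken from the project.

```python
# Illustrative sketch (not Showtime's real API): flatten a nested
# description of a Live set into OSC-style address paths so an
# external interface can enumerate and control every parameter.

class LiveObjectRegistry:
    """Flattens a nested dict describing a Live set into address paths."""

    def __init__(self, live_set):
        self._params = {}
        self._walk("/live", live_set)

    def _walk(self, prefix, node):
        # Recurse through nested dicts, recording leaf values as parameters.
        for key, value in node.items():
            path = f"{prefix}/{key}"
            if isinstance(value, dict):
                self._walk(path, value)
            else:
                self._params[path] = value

    def list_paths(self):
        return sorted(self._params)

    def get(self, path):
        return self._params[path]

    def set(self, path, value):
        if path not in self._params:
            raise KeyError(path)
        self._params[path] = value


# A toy Live set: two tracks, each with a volume and one device parameter.
mock_set = {
    "tracks": {
        "0": {"volume": 0.85, "devices": {"0": {"cutoff": 0.5}}},
        "1": {"volume": 0.70, "devices": {"0": {"cutoff": 0.9}}},
    }
}

registry = LiveObjectRegistry(mock_set)
registry.set("/live/tracks/0/volume", 0.6)
```

In a real bridge the registry would proxy reads and writes through to the Live Object Model rather than a dict, but the addressing idea is the same: once every object is enumerable by path, any client (a VR interface included) can bind controls to parameters without bespoke plumbing.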
The research conducted in VR music and interaction falls into some definable categories. The following is not a comprehensive account of all research, just some good exemplars:
+ Controlling musical characteristics of pre-existing composition 
+ Mixer style control of spatial audio characteristics for pre-recorded sound sources, with optional control of effects 
+ Virtual audio environment, multi-process 3D instruments [4, 5]
+ Virtual instruments, virtual representations of instruments and synthesis control [3,6,7]
+ Virtual object manipulation with parameterised sound output [8,9]
Many of these implementations offer novel interaction methods coupled with creative feedback and visualisation. Many of the systems require considerable training before a user can perform with them, though reportedly the basics of control can be learned quite quickly. This presents a problem for the target audience of the Objects project, where more immediate control and enjoyment are required. Therefore a combination of musical composition control and spatial audio will be explored, using simplified musical interaction that novice users can learn within the experience itself, though the control method and interaction metaphors differ considerably from the work cited above.
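One common way to give novices immediate musical results is pitch quantization: snap free-form input, such as a hand height in VR, to the nearest note of a scale so every gesture lands on a "correct" pitch. This is offered here as a generic sketch of simplified musical interaction, not as the Objects project's actual method; the function names and the C major scale are assumptions.

```python
# Hypothetical sketch: quantize continuous gestural input to a scale
# so that untrained users always produce in-key notes.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within an octave

def quantize_to_scale(midi_note, scale=C_MAJOR):
    """Return the scale note nearest to the given MIDI note."""
    octave = midi_note // 12
    # Candidate notes in this octave, plus the nearest notes just
    # below and above it, so boundaries quantize correctly.
    candidates = [octave * 12 + s for s in scale]
    candidates.append((octave - 1) * 12 + scale[-1])
    candidates.append((octave + 1) * 12 + scale[0])
    return min(candidates, key=lambda n: abs(n - midi_note))

def gesture_to_note(normalized_height, low=48, high=72):
    """Map a 0..1 controller height onto a quantized note in a range."""
    raw = low + normalized_height * (high - low)
    return quantize_to_scale(round(raw))
```

The appeal for a novice-facing interface is that the gesture-to-sound mapping stays continuous and expressive while the output is constrained to musically safe choices, so exploration is rewarded from the first interaction.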
[1] X. Rodet, J.-P. Lambert, T. Gaudy, and F. Gosselin. Study of haptic and visual interaction for sound and music control in the Phase project. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 109–114, 2005.
[2] M. Wozniewski, Z. Settel, and J. Cooperstock. A spatial interface for audio and music production. Proceedings of the International Conference on Digital Audio Effects (DAFx), pages 18–21, 2006.
[3] T. Mäki-Patola, J. Laitinen, A. Kanerva, and T. Takala. Experiments with virtual reality instruments. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 11–16, 2005.
[4] L. Valbom and A. Marcos. WAVE: Sound and music in an immersive environment. Computers & Graphics, 29(6):871–881, 2005.
[5] F. Berthaut, M. Desainte-Catherine, and M. Hachet. DRILE: an immersive environment for hierarchical live-looping. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 192–197, 2010.
[6] S. Gelineck. Virtual Reality Instruments capable of changing Dimensions in Real-time. 2005.
[7] A. Mulder, S. Fels, and K. Mase. Design of Virtual 3D Instruments for Musical Interaction. Proceedings of Graphics Interface, 1999.
[8] A. Mulder. Getting a GRIP on alternate controllers: Addressing the variability of gestural expression in musical instrument design. Leonardo Music Journal, pages 33–40, 1996.
[9] A. Mulder, S. Fels, and K. Mase. Mapping virtual object manipulation to sound variation. IPSJ SIG Notes, 97(122):63–68, 1997.