Media comparison for collaborative music making

Image credit: Nicolas Ulloa

Do you create electronic music? Are you a musician, producer, artist or DJ? Or are you a student or professional in Music / Music Technology? If so, I am running a study over the next few weeks (July & August) and would love your participation!

You will be invited to use and compare two different interfaces: one in virtual reality and another screen-based. You will be asked to create some drum loops collaboratively with another person using the provided interfaces, and then to complete a survey about your experience.

The study will take two hours to complete, and you will be paid £25 for your participation. All sessions will take place in the Computer Science building on the Mile End Campus of Queen Mary University of London.

Study slots are available from 25/07/17 to 18/08/17, Monday–Friday, with time slots at 10 am, 12.30 pm, 3.30 pm, and 6 pm. If none of these are suitable for you, alternative arrangements can easily be made.

If you would like to take part, please sign up using the button below. It will ask you some qualifier questions, take your contact details, and ask you to book a slot. I will then contact you directly to confirm your date and time slot.

Schedule Appointment

Feel free to get in contact using the form below.


If you are interested in the context of the research, I have some resources here:

  • Polyadic: design and research of an interface for collaborative music making on a desktop or in VR.
  • Design for collaborative music making: some earlier work on the user-centred design cycle undertaken over the course of my PhD.

Virtual Reality Music pt 2: Interface and Instruments

This post mops up some more interesting research and stimuli used in the development of a VR music interface. The items below range from literal transpositions of composition environments to more experimental concepts in music composition in immersive environments.

The Pensato Fissure project has achieved wide recognition within dance music communities and internet publications for its immersive VR approach to performing and composing with Ableton Live. The project has undergone many developments in interface and input methods. It began as a Master's project in Design and is still active; the author's Showtime GitHub repository is particularly useful for syncing with Ableton, as it effectively exposes all available Live Objects to another system, enabling rapid development of music interfaces that draw on the underlying power and flexibility of the Ableton Live API.

The research conducted on VR music and interaction falls into some definable categories. The list below is not a comprehensive account of all the research, just some good exemplars:


+ Controlling musical characteristics of pre-existing composition [1]

+ Mixer style control of spatial audio characteristics for pre-recorded sound sources, with optional control of effects [2] 

+ Virtual audio environment, multi-process 3D instruments [4, 5] 

+ Virtual instruments, virtual representations of instruments and synthesis control [3, 6, 7]

+ Virtual object manipulation with parameterised sound output [8, 9]


Many of these implementations offer novel interaction methods coupled with creative feedback and visualisation. Many systems require considerable training and learning before a user can perform with them, though reportedly the basics of user control can be learned quite quickly. This presents a problem for the target audience of the Objects project, where more immediate control and enjoyment are required. A combination of musical composition control and spatial audio will therefore be explored, using simplified musical interaction that allows novice users to learn within the experience. The control method and interaction metaphors, however, differ considerably from the work presented in [1] and [2].

[1] Xavier Rodet, Jean-Philippe Lambert, Thomas Gaudy, and Florian Gosselin. Study of haptic and visual interaction for sound and music control in the Phase project. International Conference on New Interfaces for Musical Expression, pages 109–114, 2005.

[2] Mike Wozniewski, Zack Settel, and Jeremy Cooperstock. A spatial interface for audio and music production. Digital Audio Effects (DAFx), pages 18–21, 2006.

[3] Teemu Mäki-Patola, Juha Laitinen, Aki Kanerva, and Tapio Takala. Experiments with virtual reality instruments. Proceedings of the 2005 International Conference on New Interfaces for Musical Expression, pages 11–16, 2005.

[4] Leonel Valbom and Adérito Marcos. WAVE: Sound and music in an immersive environment. Computers & Graphics, 29(6):871–881, 2005.

[5] Florent Berthaut, Myriam Desainte-Catherine, and Martin Hachet. Drile: an immersive environment for hierarchical live-looping. Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME '10), pages 192–197, 2010.

[6] S. Gelineck. Virtual Reality Instruments capable of changing Dimensions in Real-time. 2005.

[7] Axel Mulder, Sidney Fels, and Kenji Mase. Design of Virtual 3D Instruments for Musical Interaction. Graphics Interface, 1999.

[8] Axel Mulder. Getting a GRIP on alternate controllers: Addressing the variability of gestural expression in musical instrument design. Leonardo Music Journal, pages 33–40, 1996.

[9] Axel Mulder, Sidney Fels, and Kenji Mase. Mapping virtual object manipulation to sound variation. IPSJ SIG Notes, 97(122):63–68, 1997.

Virtual Reality Music pt 1: A wee taster…

At the start of what will likely be a lengthy series of posts, I'll be presenting some of the internet's offerings in VR music making and experiences. This series will run in parallel with a "Tangible Music Interfaces" set of posts covering alternative interaction methods for audio and music applications. For this first one I'm being generous and sharing two things that caught my attention recently.

Squarepusher's 360° music video for "Stor Eiglass" is a journey through a bubblegum world of bizarre 8-bit escapism. It highlights the possible dangers of VR addiction and is also hilarious/scary. It reminds me a bit of Osamu Sato's early psychedelic computer game Eastern Mind: The Lost Souls of Tong-Nou.

Next up is a bit of research from the NIME community on a hierarchical live-looping VR music app, Drile. This technique consists of creating musical trees whose nodes are composed of sound effects applied to musical content. The system requires considerable training and learning to perform with, though the basics of user control can be learned quite quickly. The first video gives a good overview of the system, while the second explores a more structured performance. Further research indicates how interaction metaphors are utilised in the creation of user control and system design.
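The tree idea described above can be sketched roughly in code. This is a minimal, hypothetical Python illustration of a hierarchical looping tree in that spirit, not Drile's actual implementation: all class and method names here are my own, and "samples" are just lists of numbers standing in for audio.

```python
# Illustrative sketch only: each node mixes its children (or holds raw
# loop samples at a leaf) and applies an optional effect to the result.

class LoopNode:
    def __init__(self, samples=None, effect=None, children=None):
        self.samples = samples or []    # raw loop content (leaves)
        self.effect = effect            # per-sample function, e.g. gain
        self.children = children or []  # sub-loops (inner nodes)

    def render(self):
        """Mix children (or own samples), then apply this node's effect."""
        if self.children:
            rendered = [c.render() for c in self.children]
            length = max(len(r) for r in rendered)
            mix = [0.0] * length
            for r in rendered:
                for i, s in enumerate(r):
                    mix[i] += s
        else:
            mix = list(self.samples)
        return [self.effect(s) for s in mix] if self.effect else mix

# Two raw loops, one attenuated at its own node, mixed under a parent
# node whose effect is a simple overall gain.
kick = LoopNode(samples=[1.0, 0.0, 1.0, 0.0])
hat = LoopNode(samples=[0.0, 0.5, 0.0, 0.5], effect=lambda s: s * 0.5)
root = LoopNode(children=[kick, hat], effect=lambda s: s * 0.8)
print(root.render())
```

The point of the hierarchy is that an effect applied at any node transforms the entire subtree beneath it, so a performer can manipulate whole groups of loops through a single node rather than one loop at a time.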