Goodbye is a live vocal performance of an original song featuring real-time augmented reality (AR) and hand tracking. Hand-tracked AR effects illuminate the act of waving goodbye: the hands manipulate the live music, while the music in turn drives the AR visuals in real time. This back-and-forth dance creates a unique performance and combination of instruments.
CONTEXT
As a musician dabbling in AR/VR, I wanted to combine these creative mediums and explore the possibilities of AR in live music performance, and how it might influence my artistic practice.
The performance ran for two days, with an audience of about 100 people each day, and was created with the venue and audience in mind.
DECIDING THE INPUT
Hand tracking: I used Meta Spark to create an AR instrument that uses hand tracking to control the sound while the sound controls the scale and rotation of the “performing kirby” model.
Try out the AR instrument:
Physical controller: I also considered building a physical controller for the music in Ableton, and tested this interaction by mapping an Arduino's accelerometer to different Ableton controls.
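The accelerometer test could be sketched roughly as below. This is a minimal sketch under stated assumptions, not the actual patch: the serial port, MIDI port name, data format, and CC number are all placeholders, with pyserial reading the Arduino and mido emitting MIDI control-change messages that Ableton can MIDI-map to any knob.

```python
def accel_to_cc(g, g_range=2.0):
    """Scale an accelerometer reading in [-g_range, +g_range] g to a MIDI CC value 0-127."""
    g = max(-g_range, min(g_range, g))              # clamp out-of-range spikes
    return round((g + g_range) / (2 * g_range) * 127)

def stream_tilt_to_ableton():
    """Read "x,y,z" lines from the Arduino and forward the x axis as CC 1 (not invoked here)."""
    import serial  # pyserial; imports kept local so accel_to_cc stays dependency-free
    import mido    # MIDI output; Ableton listens on the (hypothetical) virtual port below
    out = mido.open_output("Arduino Tilt", virtual=True)
    with serial.Serial("/dev/ttyUSB0", 115200) as arduino:
        while True:
            # assumes the Arduino sketch prints one "x,y,z" line per reading, in g-units
            x, _, _ = (float(v) for v in arduino.readline().split(b","))
            out.send(mido.Message("control_change", control=1, value=accel_to_cc(x)))
```

In Ableton, enabling MIDI-map mode and wiggling the board would then bind the tilt axis to whichever control is selected.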
I received feedback that using hand tracking in a performance would feel magical, and that the seamlessness of controlling sound with the hands, combined with AR, would be extremely innovative. Hence, I decided on hand tracking as my performance input.
PROTOTYPE
I used Meta Spark to create a first prototype. I wanted to explore using AR to affect the music more, but was unsure how to keep the song from becoming too distorted while retaining some control over the music. Hence, for the first prototype, I focused on having the music affect the AR.
I consulted some experts in music technology, and they suggested using AR to vary the delay, echo, reverb, pitch, timbre, and volume, keeping the song structure intact while still letting AR shape the music. I performed this initial prototype in front of an audience and asked whether they would prefer seeing how the technology works (i.e. what is affecting what) or leaving it up to interpretation. The majority felt it was more magical not knowing what was affecting what while still being able to hear or see its impact.
I faced some challenges linking Ableton with an AR platform and had to make a couple of pivots. I first tried to link Meta Spark to Ableton, but that connection is unfortunately not supported. Next, I tried connecting Ableton Live with UnityOSC, but ran into multiple issues and was still unable to link them successfully. I went back to the drawing board to find other ways to make this important connection.
Solution: Connecting Ableton Live to webcam hand tracking over MIDI finally worked, and I combined it with hand tracking in Snap Camera to apply the AR effects.
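In outline, that working link could look something like the sketch below. It assumes MediaPipe for the webcam hand tracking (the post doesn't name the tracking library) and mido for the MIDI side; the CC number and virtual port name are placeholders that Ableton would be MIDI-mapped to.

```python
def hand_y_to_cc(y):
    """Map a normalized hand y (0 = top of frame, 1 = bottom) to MIDI CC 0-127,
    inverted so that raising the hand raises the value."""
    y = max(0.0, min(1.0, y))       # clamp to the frame
    return round((1.0 - y) * 127)

def track_and_send():
    """Webcam loop: track the first hand and forward its wrist height as CC 20 (not invoked here)."""
    import cv2                      # imports kept local so hand_y_to_cc stays dependency-free
    import mido
    import mediapipe as mp
    out = mido.open_output("Hand Tracker", virtual=True)   # hypothetical virtual MIDI port
    cam = cv2.VideoCapture(0)
    with mp.solutions.hands.Hands(max_num_hands=2) as hands:
        while True:
            ok, frame = cam.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                wrist = results.multi_hand_landmarks[0].landmark[0]  # landmark 0 = wrist
                out.send(mido.Message("control_change", control=20,
                                      value=hand_y_to_cc(wrist.y)))
```

Ableton then sees "Hand Tracker" as an ordinary MIDI input, so the effect parameters (reverb, delay, volume, etc.) can be bound with its built-in MIDI mapping rather than any custom integration.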
Once the link was established, I was able to bring my vision to life.
Full dress rehearsal, with a focus on testing stage placement and lighting.
Challenge: Getting enough light for the webcam's hand tracking to work.
During the dress rehearsal, the webcam was pointed at me with the projector behind me, creating a really cool effect*. However, the lighting made it hard for my hands to be identified. Hence, I built a debug preview that let me test the lighting in real time should anything happen on the day of the performances.
*I also added the "cool effect" of the projector by layering multiple cameras on top of each other for certain sections of my performance. It was a nice little accident :)
Real-time debugging: The hand joints are displayed to confirm that the hands can be tracked and that each hand-motion trigger fires.
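A debug preview along these lines could be sketched as follows, again assuming MediaPipe (its drawing helpers overlay the joint skeleton on the feed); the brightness threshold is an arbitrary placeholder for "too dark to track", not a measured value.

```python
def too_dark(mean_brightness, threshold=60):
    """Flag a frame whose average grayscale level (0-255) falls below a placeholder threshold."""
    return mean_brightness < threshold

def debug_preview():
    """Show the webcam feed with hand joints overlaid and a low-light warning (not invoked here)."""
    import cv2                      # imports kept local so too_dark stays dependency-free
    import mediapipe as mp
    mp_hands = mp.solutions.hands
    draw = mp.solutions.drawing_utils
    cam = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=2) as hands:
        while cv2.waitKey(1) != 27:                       # Esc closes the preview
            ok, frame = cam.read()
            if not ok:
                break
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for lm in results.multi_hand_landmarks:   # overlay each tracked hand's joints
                    draw.draw_landmarks(frame, lm, mp_hands.HAND_CONNECTIONS)
            if too_dark(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean()):
                cv2.putText(frame, "LOW LIGHT", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
            cv2.imshow("debug preview", frame)
```

Seeing the joints disappear, or the low-light warning appear, gives an immediate cue on stage to adjust the lighting before a trigger fails mid-song.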
The performances were a huge success! Many people came up to me afterwards, impressed by the innovative combination of AR with live vocal music. It was a surreal endeavor to combine my passions for music and AR in a live performance. I have learnt so much, from connecting different kinds of technology to stage production and more. This has transformed my practice as a musician, and I feel I have built myself a "template" that I can use for future live performances. I look forward to continuing to build upon this and expanding the uses of AR in music and live performance!