Working in Spatial Audio and Music Production, I have always wanted to be able to interact with sounds in a physical manner. Motion Capture makes that dream a reality.
The goal here was to interact with sound using more than just our ears; I wanted to connect each sound to a visual trigger, a physical trigger, and a spatial trigger. Each stem of my track is attached to an object in Unreal Engine, which is in turn assigned to a rigged physical object uniquely identified by asymmetrical infrared markers taped onto it. The OptiTrack mocap system recognizes each object and streams the camera data from the space into Unreal Engine using LiveLink, which lets us move the sounds around with physical objects. If you listen closely, there is an (exaggerated) Doppler effect on each object to mimic how it would sound in motion, like an ambulance driving by. An included blueprint also engages a low pass filter at 200 Hz whenever an object drops below a certain height, effectively "muting" that stem.
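The height gate lives in a Blueprint in the project, but a minimal C++ sketch of the same logic might look like the fragment below. AStemActor, AudioComp, and MuteHeight are hypothetical names; only the 200 Hz cutoff comes from the actual setup.

```cpp
// StemActor.cpp (fragment) -- assumes an AActor subclass with a
// UAudioComponent* AudioComp playing one stem and a float MuteHeight (cm).
#include "Components/AudioComponent.h"

void AStemActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Engage the per-component low pass filter whenever the tracked
    // object drops below the threshold height.
    const bool bBelowThreshold = GetActorLocation().Z < MuteHeight;
    AudioComp->SetLowPassFilterEnabled(bBelowThreshold);
    if (bBelowThreshold)
    {
        // 200 Hz cutoff, low enough to effectively "mute" the stem.
        AudioComp->SetLowPassFilterFrequency(200.0f);
    }
}
```

UAudioComponent's built-in low pass filter is enough here; no custom DSP is needed.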
I hope this interface can inspire different kinds of interaction with sound: interactions that encourage us to think more about the space around us and how movement changes the way things sound, and that help us gain a deeper understanding of how interconnected all our senses are.

Ableton Live Session: 74 BPM Chord Progression in Minor
Guitars
Kick+Bass
Congas
Keys
Shaker

Rigging objects with trackers in Motive so they can be recognized as rigid bodies in Unreal Engine
Streaming location data out of Motive into Unreal Engine using Live Link
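In the project this binding happens through the Live Link plugin in the editor, but for reference, evaluating a Live Link transform subject from C++ looks roughly like the sketch below. The subject name "Guitars" and the AStemActor class are assumptions; the ILiveLinkClient calls are Unreal's API, but treat the details as a sketch rather than drop-in code.

```cpp
// StemActor.cpp (fragment) -- pull the latest Motive transform for one
// rigid body each frame and apply it to the actor carrying that stem.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Roles/LiveLinkTransformRole.h"
#include "Roles/LiveLinkTransformTypes.h"

void AStemActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link plugin not loaded
    }
    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // The subject name must match the rigid body's streaming name in Motive.
    FLiveLinkSubjectFrameData FrameData;
    if (Client.EvaluateFrame_AnyThread(FLiveLinkSubjectName(TEXT("Guitars")),
                                       ULiveLinkTransformRole::StaticClass(),
                                       FrameData))
    {
        if (const FLiveLinkTransformFrameData* Pose =
                FrameData.FrameData.Cast<FLiveLinkTransformFrameData>())
        {
            SetActorTransform(Pose->Transform);
        }
    }
}
```

In practice, adding a LiveLinkComponentController to the actor does this same evaluation without any code, which is the usual editor workflow.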

Bare-bones Unreal Engine scene with the objects in a nice material, before I added the water

Sound attenuation profiles in Unreal Engine with a custom distance falloff curve; Unreal's default logarithmic curve does not change the sounds fast enough as objects move away from the center of the room.
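For anyone wiring the same thing up from code rather than the editor UI, the custom curve corresponds to switching the attenuation settings' distance model. This is an illustrative sketch with made-up curve values, not the profile used in the project:

```cpp
#include "Sound/SoundAttenuation.h"

// Build attenuation settings whose falloff is a hand-shaped curve instead
// of Unreal's default model. The curve maps normalized distance
// (0 = at the source, 1 = FalloffDistance) to volume (1 = full, 0 = silent).
FSoundAttenuationSettings MakeRoomFalloff()
{
    FSoundAttenuationSettings Settings;
    Settings.bAttenuate = true;
    Settings.AttenuationShape = EAttenuationShape::Sphere;
    Settings.FalloffDistance = 500.0f; // cm; tune to the capture volume

    Settings.DistanceAlgorithm = EAttenuationDistanceModel::Custom;
    FRichCurve* Curve = Settings.CustomAttenuationCurve.GetRichCurve();
    Curve->AddKey(0.0f, 1.0f); // full volume at the object
    Curve->AddKey(0.3f, 0.4f); // steep early drop so small moves are audible
    Curve->AddKey(1.0f, 0.0f); // silent at the edge of the room
    return Settings;
}
```

The steep early segment is the point: the default logarithmic model changes too slowly near the center of the room, so a custom curve front-loads the falloff.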
Mixer with Quad Audio so the sounds can be directionally spatialized
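As an aside on what "directionally spatialized" means in quad: the engine pans each source across the four speakers with (roughly) equal-power weighting by azimuth. This generic sketch shows the underlying math, not Unreal's actual panner:

```cpp
#include <array>
#include <cmath>

// Equal-power pan of a mono source across a square quad layout
// (FL, FR, RL, RR). Azimuth is in radians: 0 = front, positive = right.
std::array<float, 4> QuadGains(float Azimuth)
{
    constexpr float TwoPi = 6.2831853f;
    // Speaker azimuths at +/-45 and +/-135 degrees.
    const float Speakers[4] = {-0.7853982f, 0.7853982f, -2.3561945f, 2.3561945f};

    std::array<float, 4> Gains{};
    float Power = 0.0f;
    for (int i = 0; i < 4; ++i)
    {
        // Cosine weight by angular distance, zeroed beyond +/-90 degrees.
        const float Diff = std::fabs(std::remainder(Azimuth - Speakers[i], TwoPi));
        Gains[i] = std::fmax(std::cos(Diff), 0.0f);
        Power += Gains[i] * Gains[i];
    }
    // Normalize so summed power is constant: loudness stays the same as an
    // object orbits the listener; only the apparent direction changes.
    const float Norm = 1.0f / std::sqrt(std::fmax(Power, 1e-6f));
    for (float& G : Gains) { G *= Norm; }
    return Gains;
}
```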
Full beat in stereo