InSpace with the Otherness is a co-located collaboration with a deep learning algorithm, offering exhibition visitors embodied, spatial opportunities for musical exploration.
InSpace with the Otherness explores what is possible when the visitor follows the music as a subordinate companion, clarifying and densifying the atmosphere around it through highly abstract sound creation at the ever-shifting border between new sounds and a constantly reinvented composition with the “otherness”: artificial intelligence.
Sharing a performance space brings awareness of others and their presence, reinforced here by the visitor’s ability to control the composition’s sounds through spatial interaction. Space and interaction with the AI as the other are therefore deeply interconnected in the performance of this composition.
The visitor’s position in the space is tracked with a depth camera and an object detection system. The position data controls the interpolation between sounds generated as samples by the Google Magenta team’s NSynth machine learning synthesis algorithm.
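The text does not specify the exact mapping from position to interpolation (NSynth can also interpolate in its latent space before rendering samples); as a minimal sketch, assuming pregenerated sample buffers and a linear crossfade driven by the visitor’s normalised x position:

```python
def interpolate_samples(pos_x, room_width, sample_a, sample_b):
    """Crossfade between two pregenerated sample buffers based on the
    visitor's x position. A hypothetical illustration: the actual piece
    may instead map position to interpolation points in NSynth's
    latent space before audio is rendered."""
    # Normalise the tracked position to a weight in [0, 1].
    w = min(max(pos_x / room_width, 0.0), 1.0)
    # Linear crossfade, sample by sample.
    return [(1.0 - w) * a + w * b for a, b in zip(sample_a, sample_b)]
```

Standing at the centre of a 4-metre-wide space would thus blend the two sound sources equally.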
The sound world of the piece is largely determined by digital samples manipulated through a granular synthesiser and by the harmonic structures of waveshaping synthesis, both running in the Pure Data environment.
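The Pure Data patch itself is not shown; to illustrate the waveshaping principle behind those harmonic structures, here is a sketch of a classic Chebyshev waveshaper, which turns a pure cosine into one of its harmonics (the actual transfer functions used in the piece are not stated):

```python
import math

def waveshape_t3(sample):
    """Third-order Chebyshev waveshaper: T3(x) = 4x^3 - 3x.
    Driving it with cos(wt) yields cos(3wt), i.e. the third
    harmonic of the input; summing weighted T_k shapers builds
    arbitrary harmonic spectra from a single oscillator."""
    return 4 * sample ** 3 - 3 * sample

# Feeding in a cosine at phase wt returns the cosine at phase 3*wt.
wt = 0.7
shaped = waveshape_t3(math.cos(wt))
```

In Pure Data the same idea is typically realised by reading an oscillator through a transfer-function table.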
InSpace with the Otherness uses:
- Kinect SDK and Processing for position data
- NSynth Neural Audio Synthesis for sample generation with a deep learning algorithm
- Pure Data for real-time processing and synthesis of the sounds in the performance of the composition.
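The text does not say how the Processing tracker hands position data to Pure Data; a common route is Open Sound Control (OSC) over UDP. As a minimal sketch, assuming a hypothetical `/position` address with two float arguments, an OSC 1.0 message can be encoded with the standard library alone:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad an OSC string to a multiple of 4 bytes,
    always appending at least one NUL terminator."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_position_message(x: float, y: float) -> bytes:
    """Encode a hypothetical /position message with two float32
    arguments, as a tracker might send it to Pure Data's [netreceive]
    / OSC objects over UDP. Layout: padded address, padded type-tag
    string ",ff", then big-endian float arguments."""
    return (osc_pad(b"/position")
            + osc_pad(b",ff")
            + struct.pack(">ff", x, y))
```

Sending such packets to the port Pure Data listens on would update the patch once per tracking frame.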