Shape: an adaptive musical interface that optimizes the correlation between gesture and sound

Main Authors: Brandtsegg, Øyvind; Tidemann, Axel
Format: Proceeding
Published: 2020
Subjects:
HCI
Online Access: https://zenodo.org/record/3932892
Table of Contents:
  • The development of musical interfaces has moved from static to malleable, where the interaction mode can be designed by the user. However, the user still has to specify which input parameters to adjust and how they affect the generated sound. We propose a novel way to learn mappings from movements to sound generation parameters, based on inherent features in the control inputs. An assumption is that any correlation between input features and output characteristics is an indication of a meaningful mapping. The goal is to make the user interface evolve with the user, creating a unique, tailor-made interaction mode with the instrument.
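A minimal sketch of the correlation assumption described in the abstract: candidate gesture-to-sound mappings are ranked by the absolute Pearson correlation between each control-input feature stream and each sound-parameter stream. This is an illustrative example only, not the authors' implementation; the function name rank_mappings and all data shapes are assumptions.

    # Illustrative sketch (not the authors' method): score candidate
    # gesture-to-sound mappings by the absolute Pearson correlation
    # between each input feature and each sound generation parameter.
    import numpy as np

    def rank_mappings(gesture_features: np.ndarray, sound_params: np.ndarray):
        """gesture_features: (frames, n_features); sound_params: (frames, n_params).
        Returns (feature_idx, param_idx, |r|) triples, strongest correlation first."""
        scores = []
        for i in range(gesture_features.shape[1]):
            for j in range(sound_params.shape[1]):
                r = np.corrcoef(gesture_features[:, i], sound_params[:, j])[0, 1]
                if not np.isnan(r):
                    scores.append((i, j, abs(r)))
        return sorted(scores, key=lambda t: t[2], reverse=True)

    # Hypothetical usage: 500 frames of 4 gesture features and 3 sound parameters,
    # with one deliberately correlated pair that should surface as the top mapping.
    rng = np.random.default_rng(0)
    gestures = rng.normal(size=(500, 4))
    sounds = rng.normal(size=(500, 3))
    sounds[:, 0] += 2.0 * gestures[:, 2]
    print(rank_mappings(gestures, sounds)[:3])

Under this sketch, feature-parameter pairs with high |r| would be offered to the user as candidate mappings, letting the interface adapt to whichever input features the user's movements actually vary.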