An open-source prototype designed as a sound-centered Tangible User Interface (TUI) for networked, interactive music systems. It enables real-time sensor-based interaction for embodied sound exploration.
Think of it as a "Lego" set for your musical ideas. Each module is designed with simplicity and versatility in mind, where core functionalities and patching are housed within a compact and user-friendly interface. You can mix, match, customize, and extend your own musical tools in a modular and multichannel workflow. To get started, search for "modulo" in the Max Package Manager or find it under the Featured Package section. Happy patching!
The use of physiological signals from the human brain and body in human-computer interaction has become a major research area in physiological computing. Musicians have been at the forefront of the development of digital musical instruments adopting physiological interfaces, but their efforts have remained at the experimental level, often isolated from the relevant scientific communities.
The Body Brain Digital Musical Instrument (BBDMI) project aims to develop a digital musical instrument using electrophysiological signals from the human body: the electromyogram (EMG) from muscle contraction and the electroencephalogram (EEG) from brain activity. It combines musical organology with biomedical technologies in a modular and accessible manner to make a system that can be used by musicians, artists and educators without specialised knowledge in neuroscience and signal analysis. The instrument will represent a new interface for musical creation that can benefit several distinct user communities: experimental musicians, instrumentalists, and cultural mediation practitioners working in the world of autism and disabled audiences.
GripBeats® isn't just any ordinary musical instrument; it's wearable. And because it's wearable, it uses your hands' movements and touch to make music. Everything you thought you could do with an instrument before you can do now, but without it!
A new pole-dancing instrument and performance combined, exploring technical innovation and sensory experience through sound and movement.
The death of a beautiful woman is the most poetical topic in the world.
– Edgar Allan Poe
In A Pole Tragedy I search for female autonomy and sexual self-determination in a male-dominated society. Fascinated by Euripides' famous tragedy Iphigenia in Aulis, about the sacrifice of a young girl’s body in a men’s war, I explore the thin line between violence and the erotic. In a sweaty duet between my body and a sky high pole I dive into my darkest desires, with Iphigenia as my heroic alter ego.
A swarm of artificial creatures that make music in response to the sun, clouds and shadows of trees moving in the wind. Komorebi is a Japanese word meaning "sunlight shining through trees". Together, the creatures form an immersive sensory experience. The work suggests that "life" is not an exceptional property of organic life forms, but also a property of complex systems beyond biological life as we understand it.
A Max object that provides a set of supervised machine learning algorithms for gesture-sound interaction design. It can train models such as multilayer perceptrons, k-nearest neighbours and dynamic time warping to perform regression or classification tasks. By pairing multidimensional input data with target outputs during training, the object then predicts output values from new input data.
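To illustrate the idea behind the regression mode, here is a minimal sketch of k-nearest-neighbour regression in plain Python. It is not the object's actual implementation; the feature values, synthesis parameters, and function name are hypothetical, and the object's real algorithms and distance metrics may differ.

```python
import math

def knn_regress(examples, query, k=3):
    """Predict an output vector for `query` by averaging the outputs
    of the k training examples nearest to it (Euclidean distance)."""
    ranked = sorted(examples, key=lambda ex: math.dist(ex[0], query))
    nearest = ranked[:k]
    dims = len(nearest[0][1])
    return [sum(out[d] for _, out in nearest) / k for d in range(dims)]

# Hypothetical training set: (gesture features) -> (synthesis parameters)
examples = [
    ((0.0, 0.0), (100.0, 0.1)),   # rest posture   -> low pitch, quiet
    ((1.0, 0.0), (440.0, 0.5)),   # raised arm     -> mid pitch
    ((1.0, 1.0), (880.0, 0.9)),   # full extension -> high pitch, loud
]

# A new gesture close to the last two examples interpolates between them.
print(knn_regress(examples, (0.9, 0.2), k=2))  # ≈ [660.0, 0.7]
```

Averaging over neighbours is what gives this kind of mapping its continuous, exploratory feel: gestures between recorded examples produce sounds between the recorded targets.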
My master's research explored how to give the performer an active and engaging role in live electronic music. This artistic research led to the creation of an interactive music system capable of capturing meaningful musical information from the performer's bodily gestures and physiological data to control digital sound processes running on the computer. By coupling a custom-made wearable interface with a set of supervised machine learning algorithms, the system allows the performer to record example interactions, associating states, postures and gestures with sounds to build an exploratory and performative gesture-timbre space.
A custom-made wearable digital musical instrument for experimental electronic music improvisation in live performance. The instrument consists of two soft balls, each containing five conductive buttons and a motion sensor that tracks the orientation of my arms in space. This data is used to train a neural network capable of shaping sound and space in real time.
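As a rough sketch of how a trained network might map arm orientation to sound parameters, here is a single-hidden-layer forward pass in plain Python. The weights, layer sizes, and input/output meanings are purely illustrative assumptions (in the real instrument the weights would be learned from recorded gestures, and the architecture may differ).

```python
import math

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden layer with tanh activation: a minimal network of the
    kind that could map arm orientation to sound-control parameters."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Hypothetical weights (in practice these come from training).
w1 = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.4]]   # 3 inputs -> 2 hidden units
b1 = [0.0, 0.1]
w2 = [[1.0, -1.0], [0.5, 0.5]]              # 2 hidden -> 2 outputs
b2 = [0.0, 0.0]

orientation = [0.2, -0.1, 0.4]              # e.g. yaw, pitch, roll
print(mlp_forward(orientation, w1, b1, w2, b2))
```

Because the forward pass is a smooth function of the inputs, small movements of the arms produce small, continuous changes in the sound parameters, which suits real-time shaping of sound and space.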