Infinite Loops – Smartphone IMU Technology for Real-Time Music Control


Infinite Loops is an original composition and performance that explores the integration of wearable technology into live music creation. The project uses a smartphone’s Inertial Measurement Unit (IMU) embedded within a wearable sleeve to enable real-time manipulation and control of musical elements through hand gestures and movements. This interface transforms physical motion into expressive musical parameters, enhancing both the creative and performative dimensions of the music.

Key Features
  • Real-Time Interaction: Data from the IMU (gyroscope and accelerometer) is transmitted to Max/MSP, enabling responsive audio processing based on movement.

  • Expressive Control: Gestures such as tilts, sways, and rapid motions shape audio effects, loops, and parameters.

  • Composition: The wearable interface is central to the compositional process, allowing movements to be choreographed so that they integrate seamlessly with the music.

This project demonstrates how wearable technology can extend a performer’s musical expression in live performance.
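The description above notes that gyroscope and accelerometer readings are mapped onto musical parameters. A minimal sketch of that idea is a scaling function that conditions a raw tilt angle into a normalized control value before it drives an effect; the function name, input range, and clamping behavior here are illustrative assumptions, not the project's actual mapping.

```python
# Hypothetical sketch: conditioning a raw IMU tilt angle into a normalized
# control value, in the spirit of scaling sensor data before it reaches
# an audio parameter in Max/MSP. Ranges and names are assumptions.

def scale_tilt(angle_deg, in_min=-90.0, in_max=90.0, out_min=0.0, out_max=1.0):
    """Linearly map a tilt angle (degrees) to a control value,
    clamping to the output range so extreme gestures stay safe."""
    t = (angle_deg - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return out_min + t * (out_max - out_min)

print(scale_tilt(0.0))    # level arm -> midpoint of the control range (0.5)
print(scale_tilt(120.0))  # over-rotation is clamped to 1.0
```

The clamp matters in practice: a sensor glitch or an over-rotated arm should saturate a parameter rather than push an effect out of range.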

paper

video



Volviendo de un Sueño – Gesture-Controlled Composition with Reactive Visuals

Volviendo de un Sueño is a performance-composition that fuses gesture-controlled music creation with live-coded, reactive visuals by Sasha Semina. Using a smartphone’s IMU embedded in a wearable sleeve, this piece exemplifies the synergy of technology, movement, and sound. Motion data from the IMU’s gyroscope and accelerometer is transmitted to Max/MSP, enabling dynamic control over musical loops, effects, and pre-programmed elements in real time.
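The text says motion data is transmitted from the phone to Max/MSP but does not specify the transport. A common approach for this kind of setup (an assumption here, not a documented detail of the piece) is Open Sound Control (OSC) over UDP, which Max can parse with its `udpreceive` object. The sketch below hand-builds a standards-compliant OSC message and sends it; the address `/imu/gyro` and port 7400 are hypothetical.

```python
import socket
import struct

def osc_message(address, *floats):
    """Build a minimal OSC message with float arguments: null-padded
    strings aligned to 4 bytes, followed by big-endian 32-bit floats."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())                      # address pattern
    msg += pad(("," + "f" * len(floats)).encode())   # type tag, e.g. ",fff"
    for f in floats:
        msg += struct.pack(">f", f)                  # big-endian float32
    return msg

# Hypothetical usage: send gyro x/y/z to a Max patch listening on port 7400.
packet = osc_message("/imu/gyro", 0.1, -0.3, 9.8)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 7400))
```

UDP suits this use case: a dropped sensor frame is harmless because the next reading arrives milliseconds later, and the low latency keeps gesture and sound tightly coupled.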

Key Features

  • Dynamic Visuals: The live-coded visuals respond dynamically to the music, creating an immersive, audiovisual experience.

  • Gesture-Driven Interaction: The IMU detects arm and hand movements, translating them into precise audio manipulations.

  • Layered Structure: The piece unfolds in two distinct parts, each showcasing unique musical and gestural interactions:

    • Part One: Live Looping with Accordion
      • Accordion motifs are layered in real time to create loop stacks.
      • Movements control frequency detuning and tremolo effects, adding a gestural dimension to the sound.

    • Part Two: MIDI-Controlled Piano and Sample Manipulation
      • Rapid gestures trigger pre-programmed piano loops and audio samples (nature sounds, drones, ambient textures).
      • Dynamics, articulation, and loop playback are manipulated for nuanced, layered compositions.
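Part Two hinges on rapid gestures triggering loops and samples. One plausible way to detect such gestures (a sketch under assumptions, not the piece's actual detector) is to watch the accelerometer's vector magnitude for spikes and suppress re-triggers for a short refractory window, so a single sharp motion fires exactly one event.

```python
import math

def detect_triggers(samples, threshold=15.0, refractory=5):
    """Flag accelerometer spikes as trigger events. A sample whose vector
    magnitude exceeds `threshold` (m/s^2) fires a trigger; the detector then
    ignores the next `refractory` samples to avoid double-firing on one
    gesture. All numbers are illustrative assumptions."""
    triggers = []
    cooldown = 0
    for i, (x, y, z) in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1
            continue
        if math.sqrt(x * x + y * y + z * z) > threshold:
            triggers.append(i)       # e.g. start a piano loop or sample
            cooldown = refractory
    return triggers

# A resting arm reads roughly gravity (~9.8 m/s^2); a sharp flick spikes well above it.
rest = [(0.0, 0.0, 9.8)]
print(detect_triggers(rest * 3 + [(20.0, 0.0, 9.8)] + rest * 4))  # [3]
```

Each returned index could then be mapped to a trigger message for a specific loop or sample bank; the refractory window plays the role a debounce does in hardware.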

This performance highlights the expressive potential of wearable IMU technology in transforming physical gestures into rich musical textures, while seamlessly integrating reactive visuals.

video