About Gesture Music Generator
Create music with just the movement of your hands and mouth! The Gesture Music
Generator turns hand and facial gestures into sounds and melodies in real time. Tip: Experiment with
different hand and mouth gestures to discover new sounds!
Current Features
- Real-time hand and face gesture detection using your webcam (no data is sent to a server; a setup sketch follows this list).
- Motion and gestures are mapped to musical notes, volume, and duration.
- Separate instrument selection for left hand, right hand, and mouth gestures.
- Visual feedback on the canvas for detected gestures and notes played.
- Toggleable video overlay and note event display.
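For context on the first point, here is a minimal sketch of how in-browser hand tracking can be set up with MediaPipe's JavaScript Hands solution (face tracking with FaceMesh follows the same pattern); the app's actual wiring and library version may differ.

```javascript
import { Hands } from "@mediapipe/hands";
import { Camera } from "@mediapipe/camera_utils";

const video = document.querySelector("video");

// Hand tracking runs entirely in the browser: frames go straight from the
// <video> element to the locally loaded model, so nothing is uploaded.
const hands = new Hands({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});
hands.setOptions({ maxNumHands: 2, minDetectionConfidence: 0.5 });
hands.onResults((results) => {
  // results.multiHandLandmarks holds 21 normalized landmarks per detected hand.
});

// Pump webcam frames into the model on every frame.
const camera = new Camera(video, {
  onFrame: async () => { await hands.send({ image: video }); },
  width: 640,
  height: 480,
});
camera.start();
```

Because inference happens on the frames inside the page, detection works without any server round trip.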
Instrument Selection
You can choose a different instrument for each input: left hand, right
hand, and mouth. The available instruments are:
- synth: A classic synthesizer sound (default for left hand).
- fm: Frequency Modulation synth, for richer, bell-like tones (default for right hand).
- am: Amplitude Modulation synth, for metallic or tremolo effects (default for mouth).
- duo: Two synths played together for a fuller sound.
- off: Disables sound for that input.
Just click the radio buttons in the control panel to change the instrument for each gesture input.
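As an illustration of how these options could map onto Tone.js instruments, here is a minimal sketch; the factory map, default assignments, and setInstrument helper are hypothetical, not the app's actual code.

```javascript
import * as Tone from "tone";

// Map each radio-button value to a Tone.js instrument factory.
const instrumentFactories = {
  synth: () => new Tone.Synth().toDestination(),
  fm:    () => new Tone.FMSynth().toDestination(),
  am:    () => new Tone.AMSynth().toDestination(),
  duo:   () => new Tone.DuoSynth().toDestination(),
  off:   () => null, // no sound for this input
};

// One instrument per gesture input, initialised to the defaults listed above.
const instruments = {
  leftHand:  instrumentFactories.synth(),
  rightHand: instrumentFactories.fm(),
  mouth:     instrumentFactories.am(),
};

// Called when a radio button changes, e.g. setInstrument("mouth", "duo").
function setInstrument(input, choice) {
  instruments[input]?.dispose(); // free the old synth's Web Audio nodes
  instruments[input] = instrumentFactories[choice]();
}
```

Disposing the previous synth before swapping in the new one avoids leaking Web Audio nodes when instruments are changed repeatedly.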
How does it work?
When you move your hands or open your mouth in front of the webcam, the
app uses AI models (MediaPipe) to track your hand and face positions. The code then translates these motions
into music:
- Hand X position: Determines which note is played (moving left to right changes the pitch).
- Hand Y position: Controls the volume (higher is louder, lower is softer).
- Distance between thumb and index finger: Sets the note duration (the wider the gap, the longer the note).
- Mouth opening: Triggers a note when you open your mouth wide enough, with the opening size affecting
volume and duration.
In short, your gestures are mapped to musical parameters, letting you "play" music with motion alone!
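For illustration, here is a minimal sketch of that mapping, assuming MediaPipe's normalized landmarks (thumb tip at index 4, index-finger tip at index 8) and a Tone.js synth; the scale, thresholds, and scaling factors are arbitrary example values rather than the app's exact ones.

```javascript
import * as Tone from "tone";

const SCALE = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"]; // example scale

// `landmarks` is one MediaPipe hand: 21 points with normalized x/y in [0, 1].
function playHand(synth, landmarks) {
  const wrist = landmarks[0];
  const thumbTip = landmarks[4];
  const indexTip = landmarks[8];

  // X position -> pitch: sweeping left to right walks across the scale.
  const step = Math.min(SCALE.length - 1, Math.floor(wrist.x * SCALE.length));
  const note = SCALE[step];

  // Y position -> volume: image y grows downward, so a raised hand (small y) is louder.
  const velocity = 1 - wrist.y;

  // Thumb-index distance -> duration: the wider the gap, the longer the note.
  const spread = Math.hypot(thumbTip.x - indexTip.x, thumbTip.y - indexTip.y);
  const duration = 0.1 + spread * 2; // seconds

  synth?.triggerAttackRelease(note, duration, Tone.now(), velocity);
}

// Mouth: trigger a note once the lip gap passes a threshold; the gap also sets
// volume and duration. Landmarks 13/14 (inner upper/lower lip) are a common
// FaceMesh choice, assumed here.
function playMouth(synth, faceLandmarks) {
  const gap = Math.abs(faceLandmarks[13].y - faceLandmarks[14].y);
  if (gap > 0.05) {
    synth?.triggerAttackRelease("C4", 0.1 + gap * 4, Tone.now(), Math.min(1, gap * 10));
  }
}
```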
Current Issues
- Occasional missed gesture detections, especially in low light or with fast movements.
- Audio may not play on some browsers until the user interacts with the page (browser autoplay policy); see the sketch after this list.
- Mobile device support is experimental and may have performance issues.
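On the autoplay point above: browsers only allow sound after a user gesture, and Tone.js exposes Tone.start() to resume its audio context at that moment. A minimal sketch (the button id is hypothetical):

```javascript
import * as Tone from "tone";

// Browsers block audio until the user interacts with the page; resume the
// audio context from a click handler so Tone.js can make sound.
document.getElementById("start-button").addEventListener("click", async () => {
  await Tone.start(); // resumes the underlying AudioContext
  console.log("Audio is ready");
});
```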
Future Improvements
- More instrument options and sound effects.
- Customizable gesture-to-sound mappings.
- Recording and sharing of generated music.
- Improved gesture recognition and feedback.
Credits
- Developed by BioniChaos. Powered by MediaPipe and Tone.js.