MicBrain: Audio-Driven Brain Activation Visualization

This visualization maps live microphone audio to brain activation colors. Use the slider to choose how many regions are active, from just a few to the full set of Destrieux atlas regions. Very low frequencies target medial frontal regions, low frequencies target motor and frontal regions, mid frequencies target temporal/auditory regions, high frequencies target parietal regions, and very high frequencies target occipital/visual regions.

About MicBrain

MicBrain is an audio-responsive 3D brain visualization designed to demonstrate how sound can be mapped onto broad functional brain regions. It is intended for education and illustration, rather than as a clinical diagnostic tool. The interface is simplified so anyone can see how live microphone input changes color activations on the brain model.

How It Works

MicBrain uses a 3D brain surface model based on the Destrieux atlas. When the microphone is active, the browser captures live audio and performs real-time frequency analysis using the Web Audio API's AnalyserNode.
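Each FFT bin produced by an AnalyserNode covers sampleRate / fftSize Hz, and getByteFrequencyData() reports per-bin amplitudes as bytes in 0..255. A minimal sketch of that bookkeeping in plain JavaScript (the helper names and the 48 kHz / 2048-sample figures are illustrative, not taken from the MicBrain source):

```javascript
// Center frequency (Hz) of FFT bin `i` for a given sample rate and FFT size.
// This is the standard Web Audio relationship: bin width = sampleRate / fftSize.
function binToFrequency(i, sampleRate, fftSize) {
  return (i * sampleRate) / fftSize;
}

// Example: with the Web Audio default fftSize of 2048 at a 48 kHz sample
// rate, each bin is 48000 / 2048 = 23.4375 Hz wide, so bin 1 sits at
// roughly 23.4 Hz and bin 100 at roughly 2344 Hz.
console.log(binToFrequency(1, 48000, 2048));   // 23.4375
console.log(binToFrequency(100, 48000, 2048)); // 2343.75
```

In the browser, these bin frequencies are what the band boundaries below are compared against on every animation frame.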

The audio is divided into five frequency bands, each mapped to a broad region group:

- Very low frequencies → medial frontal regions
- Low frequencies → motor and frontal regions
- Mid frequencies → temporal/auditory regions
- High frequencies → parietal regions
- Very high frequencies → occipital/visual regions

Frequency vs. Volume: Frequency determines which brain regions are activated—very low pitches light up medial frontal areas, low pitches motor areas, mid pitches auditory areas, high pitches parietal areas, and very high pitches visual areas. Volume (amplitude) determines how brightly those regions light up—the louder the sound in that frequency band, the more intense the color activation. For example, a loud bass note will brightly illuminate frontal regions, while a soft high-pitched sound will faintly highlight occipital areas.
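The frequency-to-region and volume-to-brightness mapping can be sketched as two small functions. The band boundaries below (in Hz) are illustrative guesses for the demonstration, not the values used by MicBrain itself:

```javascript
// Five frequency bands, each covering frequencies below `maxHz`
// (boundaries are assumed for this sketch, not MicBrain's actual cutoffs).
const BANDS = [
  { name: "medial frontal",    maxHz: 100 },      // very low
  { name: "motor/frontal",     maxHz: 400 },      // low
  { name: "temporal/auditory", maxHz: 2000 },     // mid
  { name: "parietal",          maxHz: 6000 },     // high
  { name: "occipital/visual",  maxHz: Infinity }, // very high
];

// Frequency picks the region group...
function frequencyToBand(hz) {
  return BANDS.find((b) => hz < b.maxHz).name;
}

// ...and amplitude (0..255, as from getByteFrequencyData) picks brightness.
function amplitudeToBrightness(byte) {
  return byte / 255;
}

console.log(frequencyToBand(80));   // "medial frontal" (a bass note)
console.log(frequencyToBand(1000)); // "temporal/auditory" (a mid-range tone)
```

A loud bass note (e.g. 80 Hz at amplitude 230) would therefore brightly light the frontal group, while a soft 8 kHz tone would faintly tint the occipital group.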

The slider controls how many regions are active at once, distributing them roughly equally across the five frequency bands. With the slider at maximum, all available Destrieux atlas regions can be activated, each responding to the dominant frequency band in the audio. The 3D brain is rendered using Three.js with transparent materials, making active regions stand out clearly.
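A "roughly equal" split of the slider's region count across five bands can be sketched as follows (the function name and tie-breaking rule are assumptions; MicBrain's exact scheme may differ):

```javascript
// Split `total` active regions as evenly as possible across `bandCount`
// frequency bands; when the count doesn't divide evenly, the earliest
// bands receive one extra region each.
function distributeRegions(total, bandCount = 5) {
  const base = Math.floor(total / bandCount);
  const extra = total % bandCount;
  return Array.from({ length: bandCount }, (_, i) => base + (i < extra ? 1 : 0));
}

console.log(distributeRegions(12)); // [ 3, 3, 2, 2, 2 ]
```

With the slider at 12, for example, the very-low and low bands would each drive three regions and the remaining bands two apiece.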

Scientific Accuracy

This visualization is intentionally approximate and educational. It does not represent real neural activity measured from the brain, nor does it infer true EEG source localization from microphone audio. The frequency-to-region mappings are simplified analogies: very low frequencies are associated with medial frontal processing, low frequencies with motor control (frontal/precentral), mid frequencies with auditory processing (temporal), high frequencies with parietal processing, and very high frequencies with visual processing (occipital). These are not medically validated models but chosen for intuitive demonstration.

In real neuroscience, brain activation is complex and context-dependent, involving precise physiological signals and functional pathways. Here, microphone audio serves as a proxy stimulus to animate the brain model, highlighting how different sound properties (frequency and amplitude) can be conceptually mapped to regional patterns. The slider allows exploration of varying numbers of active regions, but this is for illustrative purposes only.

Educational Purpose & Disclaimer

MicBrain is intended strictly for educational and informational purposes. It serves as a visual aid to foster curiosity about how audio frequency and amplitude can be represented in a brain-inspired visualization.

Crucial Disclaimer: This visualization is not a substitute for professional medical advice, diagnosis, or treatment. It does not measure actual brain activity. If you have medical concerns, please consult a qualified healthcare professional.

Technology & Data Sources

MicBrain captures live microphone audio using the browser's Web Audio API and performs frequency analysis with an AnalyserNode. The 3D brain surface is rendered in real time using Three.js, and the region geometry is loaded from Destrieux atlas OBJ files.

The written descriptions and educational framing are created to explain the visualization clearly, but they are not clinical advice. This project is meant to illustrate concepts and make the audio-driven visualization accessible.

Future Improvements

Future versions of MicBrain can expand both interaction and realism. Possible improvements include more precise atlas-based functional grouping, better region-specific frequency tuning, richer color grading, and additional controls for choosing which brain networks are shown.

The BioniChaos Project, Acknowledgments & Feedback

MicBrain is a proud initiative of the BioniChaos project. Our overarching mission at BioniChaos is to explore the intersections of biology, technology, and art to create innovative educational tools and raise awareness about complex scientific and medical topics.

We extend our sincere gratitude to the creators of the Destrieux atlas and the resources available at Brainder.org, which were invaluable for obtaining the brain model data used in this application.

Your experience and insights are vital for the continued improvement of MicBrain. We are committed to making this tool as effective and user-friendly as possible. We warmly welcome your feedback, suggestions for new features, or reports of any issues encountered. Please feel free to contact us with your thoughts.

We are continuously working on enhancing MicBrain, with plans for more detailed regional information and additional interactive features in the future.