Live EEG Brain Data Meets Creative Coding
An interactive web platform that combines live EEG brain data from Muse headsets with P5.js creative coding, enabling anyone to create brain-controlled art and visualizations in real time.
BrainImation was developed as part of Kyle Mathewson's neuroscience and brain-computer interface course to provide students with hands-on experience in:
- Real-time EEG data processing and visualization
- Brain-computer interface (BCI) development
- Creative applications of neuroscience
- Interactive programming and data visualization
The platform builds upon the open-source muse-js library and P5.js creative coding framework, making brain-computer interfaces accessible to students, artists, and researchers.
BrainImation is a zero-install web application that lets you:
- Connect to a Muse EEG headset via Bluetooth (Web Bluetooth API)
- Visualize live brain activity in real-time (alpha, beta, theta, delta, gamma waves)
- Code creative animations that respond to your brain state using P5.js
- Learn neuroscience concepts through interactive, visual feedback
- Create brain-controlled art, games, music, and experiments
No installation, no setup: just open it in a browser and start coding with your brain.
- Real-time EEG streaming from Muse 2016, Muse 2, and Muse S headsets
- 5 frequency bands: Alpha (relaxation), Beta (focus), Theta (creativity), Delta (deep states), Gamma (cognition)
- Derived metrics: Attention and meditation levels
- Raw EEG access: Time-series data from all 4 electrodes (TP9, AF7, AF8, TP10)
- 256 Hz sampling rate for research-grade temporal resolution
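As a flavor of what a sketch driven by these band powers might look like, here is a minimal, hedged example. The `eeg` object and its `alpha`/`beta` fields are illustrative stand-ins for however the platform actually exposes band powers (assumed normalized to 0–1); check the built-in Reference Panel for the real names.

```javascript
// Hypothetical sketch: map relaxation (alpha) to background brightness.
// `eeg` is an illustrative stand-in for the platform's EEG globals,
// with band powers assumed normalized to the range 0..1.

// Pure helper: clamp a normalized band power to an 8-bit brightness.
function bandToBrightness(power) {
  const clamped = Math.min(1, Math.max(0, power));
  return Math.round(clamped * 255);
}

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(bandToBrightness(eeg.alpha)); // brighter = more relaxed
  fill(255, 0, 0);
  circle(width / 2, height / 2, 50 + eeg.beta * 100); // focus grows the circle
}
```

Keeping the mapping in a small pure function like `bandToBrightness` makes the sketch easy to test and tune without a headset connected.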
- Monaco Editor (VS Code engine) with syntax highlighting
- Intelligent autocomplete for P5.js, p5.sound, and EEG data
- Auto-run on save for instant visual feedback
- Error detection with helpful debugging messages
- 20+ example animations to learn from
- Full access to P5.js drawing functions (2D shapes, colors, transforms)
- p5.sound library for brain-controlled music and synthesis
- Mouse, keyboard, and touch event support
- Webcam and media input capabilities
- Automatic function binding: all P5.js features "just work"
- Brain-controlled synthesis (FM synthesis, oscillators, effects)
- Generative music based on brain states
- Audio analysis (FFT, waveform visualization)
- Effects processing (reverb, delay, filters)
- Built-in sound testing and diagnostics
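A sketch along these lines, using p5.sound's real `p5.Oscillator` class; the `eeg.attention` global is an illustrative name, not a confirmed API. Browsers require a user gesture before audio can start, so a real sketch would typically start the oscillator on click.

```javascript
// Hypothetical sketch: drive an oscillator's pitch with attention.
// `eeg.attention` (assumed 0..1) is an illustrative name for the
// platform's attention metric; p5.Oscillator is standard p5.sound.

// Pure helper: map a 0..1 attention level onto a frequency range in Hz.
function attentionToFreq(attention, lo = 220, hi = 880) {
  const a = Math.min(1, Math.max(0, attention));
  return lo + a * (hi - lo);
}

let osc;

function setup() {
  createCanvas(400, 100);
  osc = new p5.Oscillator('sine');
  osc.start(); // in practice, gate this behind a click for autoplay policies
}

function draw() {
  osc.freq(attentionToFreq(eeg.attention), 0.1); // 0.1 s glide between pitches
}
```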
- Built-in documentation for P5.js functions
- Sound library reference with usage examples
- EEG data API documentation
- Click-to-insert code snippets
- Expandable tooltips with function signatures
- No headset required for learning and testing
- Adjustable sliders to simulate attention and meditation
- Realistic brain wave patterns for development
- Perfect for students without hardware access
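To give a sense of how "realistic brain wave patterns" can be faked in software, here is one simple approach (purely illustrative; the app's built-in sliders and simulator replace this): a slow sinusoidal drift plus random jitter, clamped to a normalized range.

```javascript
// Illustrative simulator: synthesize a plausible band-power signal
// without hardware. A slow sine drift models gradual state changes,
// random jitter models noise, and the result is clamped to 0..1.
function simulatedBandPower(tSeconds, driftHz = 0.1, noise = 0.1) {
  const base = 0.5 + 0.4 * Math.sin(2 * Math.PI * driftHz * tSeconds);
  const jitter = (Math.random() - 0.5) * 2 * noise;
  return Math.min(1, Math.max(0, base + jitter));
}
```

Feeding a sketch from a function like this lets you develop and debug animations before ever touching a headset.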
- Save/load your sketches as `.js` files
- URL parameters for sharing sketches (`?sketch=myfile.js`)
- Embedded code URLs (data URIs) for self-contained sharing
- Auto-reload functionality for iterative development
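A sketch of how the data-URI sharing style described above could work, assuming the `sketch` query parameter accepts a data URI as well as a filename (an assumption; the app's exact parameter handling may differ):

```javascript
// Illustrative helpers for self-contained sharing: pack sketch source
// into a URL so it can be loaded without hosting a separate .js file.
// The `sketch` parameter name comes from the docs; data-URI support
// in that parameter is an assumption for this example.
function makeShareUrl(baseUrl, code) {
  const dataUri = 'data:text/javascript,' + encodeURIComponent(code);
  return baseUrl + '?sketch=' + encodeURIComponent(dataUri);
}

function extractSketchParam(url) {
  // Mirror of makeShareUrl: recover the sketch value from a share URL.
  const query = url.split('?')[1] || '';
  const pair = query.split('&').find(p => p.startsWith('sketch='));
  return pair ? decodeURIComponent(pair.slice('sketch='.length)) : null;
}
```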
BrainImation makes neuroscience tangible by transforming abstract brain signals into immediate, visual, and interactive experiences.
- Learn EEG concepts through experimentation, not just lectures
- Develop programming skills while exploring neuroscience
- Create a portfolio of brain-controlled art projects
- Understand signal processing through visual feedback
- Rapid prototyping of BCI experiments and paradigms
- Real-time neurofeedback protocol development
- Accessible platform for participant demonstrations
- Open-source foundation for custom research tools
- New medium for expression: your brain as creative input
- Brain-controlled music, visuals, and interactive installations
- Perform live with your mind as the instrument
- Explore the intersection of neuroscience and art
- Engage students with hands-on brain-computer interfaces
- No complex setup or installation required
- Built-in examples and reference documentation
- Accessible to students with varying programming backgrounds
- Frontend: Pure HTML5 + CSS3 + JavaScript (ES6+)
- EEG Library: muse-js (WebBluetooth)
- Graphics: P5.js (canvas-based creative coding)
- Audio: p5.sound (Web Audio API)
- Editor: Monaco Editor (VS Code engine)
- Deployment: Static hosting (Netlify, GitHub Pages, etc.)
No server required. No dependencies to install. No build process needed.
Simply visit the hosted URL (when deployed to Netlify) and start creating!
- Download `index.html` and `muse-browser.js`
- Open `index.html` in Chrome, Edge, or Opera (Web Bluetooth required)
- That's it! The app loads all dependencies from CDNs.
- Click "Simulate Data" to test without a headset
- Select an example from the dropdown menu
- Modify the code in the editor and see changes instantly
- Connect your Muse when ready for real brain data
- Neuroscience Labs: Demonstrate brain wave patterns in real-time
- Signal Processing: Visualize frequency decomposition and filtering
- BCI Courses: Hands-on experience with brain-computer interfaces
- Psychology: Explore attention, meditation, and cognitive states
- Generative Art: Brain-controlled particle systems and fractals
- Live Performance: Use your brain as a musical instrument
- Interactive Installations: Public art that responds to viewer brain states
- Meditation Tools: Visual feedback for mindfulness practice
- Neurofeedback: Real-time feedback protocols
- ERP Experiments: Epoch visualization and averaging
- Cognitive Load: Attention monitoring during tasks
- Relaxation Studies: Alpha wave meditation training
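The epoch averaging mentioned above is the core of ERP analysis: slice the raw signal into fixed-length windows around event markers, then average sample-by-sample so time-locked activity survives while random noise cancels. A hedged reimplementation (illustrative, not the app's actual code):

```javascript
// Average fixed-length epochs of a raw EEG channel around event markers.
// `signal` is an array of samples, `eventIndices` marks epoch onsets,
// and `epochLength` is the window size in samples.
function averageEpochs(signal, eventIndices, epochLength) {
  const avg = new Array(epochLength).fill(0);
  let count = 0;
  for (const start of eventIndices) {
    if (start + epochLength > signal.length) continue; // skip truncated epochs
    for (let i = 0; i < epochLength; i++) avg[i] += signal[start + i];
    count++;
  }
  return count ? avg.map(v => v / count) : avg;
}
```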
Fully Supported:
- ✅ Chrome 56+ (Desktop & Android)
- ✅ Edge 79+
- ✅ Opera 43+
Not Supported (no Web Bluetooth):
- ❌ Firefox (Web Bluetooth not yet supported)
- ❌ Safari (Web Bluetooth not yet supported)
- ❌ iOS browsers (Web Bluetooth not available)
Recommendation: Use Chrome for the best experience.
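An app can detect support before attempting to connect: Web Bluetooth lives at `navigator.bluetooth` in supporting browsers. Written as a pure function here so it can be exercised with a stand-in navigator object (the fallback handler named in the comment is hypothetical):

```javascript
// Feature-detect Web Bluetooth. Takes the navigator object as a
// parameter so the check is testable outside a browser.
function supportsWebBluetooth(nav) {
  return Boolean(nav && nav.bluetooth);
}

// In the app you would call it with the real navigator, e.g.:
// if (!supportsWebBluetooth(navigator)) showSimulateDataFallback();
```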
Compatible Muse Devices:
- ✅ Muse 2016 (original Muse)
- ✅ Muse 2
- ✅ Muse S
Note: The software automatically detects and adapts to different Muse models, handling device-specific characteristics gracefully.
All documentation is located in the /docs folder (development only):
- `STUDENT_SOUND_GUIDE.md` - Comprehensive sound system guide
- `USING_REFERENCE_PANEL.md` - How to use the interactive reference
- `MUSE_CONNECTION_FIXES.md` - Troubleshooting connection issues
- `BRAINIMATION_URL_LOADING.md` - Sharing and loading sketches via URLs
The application itself includes built-in interactive documentation accessible via the Reference Panel.
If you want to modify the muse-js bundle:
    npm install
    npm run build   # Rebuilds muse-browser.js from src/muse-bundle.js

Project files:

    index.html          # Main application (all-in-one)
    muse-browser.js     # Bundled muse-js library (required)
    netlify.toml        # Netlify deployment configuration
    .gitignore          # Excludes docs/ from deployment
    package.json        # Development dependencies (optional)
    src/muse-bundle.js  # Source for building muse-browser.js (optional)
    docs/               # Documentation (excluded from deployment)
- Connect your Git repository to Netlify, or
- Drag and drop `brainimation.html` and `muse-browser.js` into Netlify
The netlify.toml configuration automatically:
- Excludes `/docs` from being served
- Sets proper caching headers
- Configures the site for optimal performance
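As a sketch of what such a configuration might look like (illustrative only; the repo's actual `netlify.toml` may differ), Netlify can keep `/docs` out of the served site with a redirect rule and set caching via a headers block:

```toml
# Illustrative netlify.toml, not the repo's actual file.
[build]
  publish = "."

# Keep /docs out of the served site by redirecting it to a 404.
[[redirects]]
  from = "/docs/*"
  to = "/404.html"
  status = 404

# Cache assets for an hour; tune per-path as needed.
[[headers]]
  for = "/*"
  [headers.values]
    Cache-Control = "public, max-age=3600"
```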
- All data processing happens locally in your browser
- No data is sent to servers (except optional AI features with your own API key)
- EEG data never leaves your device
- Bluetooth connections are direct device-to-browser
- Optional AI features use your own OpenAI/Anthropic API keys (stored locally)
[Add your chosen license here - e.g., MIT, GPL, etc.]
- muse-js by Uri Shaked - EEG data streaming
- P5.js - Creative coding framework
- p5.sound - Audio synthesis and analysis
- Monaco Editor - Code editor (VS Code engine)
Developed for teaching brain-computer interfaces and computational neuroscience.
Instructor: Kyle Mathewson
Institution: [Add your institution]
Course: [Add course name/number]
- Web Bluetooth required: Not available in Firefox or Safari
- Desktop recommended: Mobile browsers have limited Web Bluetooth support
- Muse 2016 compatibility: Some features (PPG, gyroscope) only work with newer models
- Frequency analysis: Simplified band calculation (not full FFT) for performance
- Single user: One Muse connection at a time
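One common way to do a "simplified band calculation" without a full FFT is the Goertzel algorithm, which computes power at a single target frequency; probing each band's center frequency (e.g. ~10 Hz for alpha) over a sliding window is cheap enough for real-time use. This is an illustrative reimplementation, not the app's actual code:

```javascript
// Goertzel algorithm: power of `samples` at one target frequency,
// given the sampling rate. Rounds the target to the nearest DFT bin.
function goertzelPower(samples, targetHz, sampleRateHz) {
  const k = Math.round(samples.length * targetHz / sampleRateHz);
  const w = (2 * Math.PI * k) / samples.length;
  const coeff = 2 * Math.cos(w);
  let s1 = 0, s2 = 0;
  for (const x of samples) {
    const s0 = x + coeff * s1 - s2; // second-order recurrence
    s2 = s1;
    s1 = s0;
  }
  return s1 * s1 + s2 * s2 - coeff * s1 * s2; // squared magnitude at bin k
}
```

At the Muse's 256 Hz sampling rate, a one-second window of 256 samples gives 1 Hz bin resolution, which is plenty for distinguishing the classic EEG bands.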
For issues, questions, or contributions:
- GitHub Issues: [Add your repository URL]
- Course Forum: [Add course discussion forum]
- Email: [Add contact email]
- "No device selected": Make sure Muse is charged and in pairing mode
- "Library not loaded": Check internet connection for CDN resources
- No sound: Click canvas first, check browser permissions
- Connection fails: Try turning Muse off/on, move closer to computer
- Visit the hosted site: [Add your Netlify URL]
- Click "Simulate Data" to try without a headset
- Explore the example animations in the dropdown
- Modify the code and see changes in real-time
- Connect your Muse headset when ready for real brain control
No installation. No configuration. Just start creating!
Check out what's possible with BrainImation:
- Neural Networks: Particle systems that respond to alpha waves
- Brain Music: Generative melodies controlled by attention and meditation
- EEG Traces: Real-time visualization of raw brain signals
- Mandala Generator: Meditation-driven geometric patterns
- Brain Drawing: Draw with your mind instead of your mouse
- Frequency Bands: Live spectral analysis and decomposition
- 3D Visualizations: Rotating shapes controlled by brain states
All examples included in the application!
Special thanks to:
- Uri Shaked for creating muse-js and making EEG accessible on the web
- The P5.js community for the incredible creative coding tools
- Students and beta testers who provided feedback and bug reports
- The open-source community for making projects like this possible
Made with 🧠 + ❤️ for curious minds exploring the intersection of neuroscience, art, and technology.