Mick Grierson has created a real-time EEG-based brain-computer interface for music synthesis. You can watch a video here.
We’ve been designing experiments to test how classic ERPs (P300, P600, N400, etc.) may emerge from user interactions with this system, given previous demonstrations that the P600 and N400 are sensitive to the “grammar” and “meaning” of musical harmonies, respectively.
What waveforms would you look for in this system?
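Whatever the answer, the standard way to make such waveforms visible is epoch averaging: EEG segments time-locked to each stimulus are averaged so that random background activity cancels while the event-locked component (e.g. a P300) remains. Here is a minimal sketch of that idea; all parameter values and the simulated signal are illustrative assumptions, not details of Grierson's system.

```python
import numpy as np

# Illustrative sketch of ERP extraction by epoch averaging.
# The "P300" template, sampling rate, and noise level below are
# assumptions chosen for demonstration only.

FS = 250                 # sampling rate in Hz (assumed)
N_TRIALS = 100           # number of stimulus presentations
EPOCH = int(0.8 * FS)    # 800 ms epoch following each event

rng = np.random.default_rng(0)

# Template "P300": a positive deflection peaking ~300 ms post-stimulus
t = np.arange(EPOCH) / FS
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # microvolts

# Simulated single trials: the same template buried in heavy noise
trials = p300 + rng.normal(0.0, 10.0, size=(N_TRIALS, EPOCH))

# Averaging across trials recovers the event-locked waveform
erp = trials.mean(axis=0)

peak_ms = 1000 * t[np.argmax(erp)]
print(f"Recovered peak latency: {peak_ms:.0f} ms")
```

On any single trial the component is invisible (noise is twice the signal amplitude), but averaging 100 trials shrinks the noise by a factor of 10, and the peak re-emerges near 300 ms.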
Meaning From Melody: Music as Language
Harmony in Grammar: Music as Language
Dynamic Gating in Long-Term Memory (and the N400)
The Attentional Doughnut (and the P300 in SSVEPs)
From Perception To Action: The Role of the P3b in Binding
ERPs of Monitoring and Retrieval in Prospective Memory Tasks (and the P300)
Novelty in Adaptive Information Processing (and the N2 and P300)