Here’s an interesting bit of neurological research on speech processing in the brain. The researchers were able to decode the brain activity associated with speech recognition and formulation, so they could ‘hear’ what the subject heard or was about to say.
This is obviously a great breakthrough with many real-world applications for people with disabilities, but for me it raises the question of whether the same technique could be applied to the regions of the brain that process music, in order to ‘hear’ the music in your head. Most musicians have had the experience of coming up with a melody while out and about, with no way to record or transcribe it, and losing that piece of music forever. A handy iPhone app that plugged into your brain would make life much easier for composers!