04-08-2023 17:39 via technocracy.news

Brain2Music: Reconstructing Music From Human Brain Activity

A new study posted to the arXiv preprint server (hosted by Cornell University) demonstrates that scientists can approximately reconstruct music that a person is listening to. AI models decode the changes in brain activity recorded while the subject listens to a music recording and then generate audio resembling what was heard, in effect reading the listener's mind.
Figure caption (from the paper): An overview of our Brain2Music pipeline: high-dimensional fMRI responses are condensed into the semantic, 128-dimensional music embedding space of MuLan (Huang et al., 2022). Subsequently, the music generation model, MusicLM (Agostinelli et al., 2023), is conditioned on this embedding to generate music resembling the original stimulus.
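As a rough illustration only (not the study's released code), the decoding stage described in that caption can be sketched as a regularized linear regression that maps flattened fMRI voxel responses to a 128-dimensional music embedding; the predicted embedding would then condition a generative model such as MusicLM. The data arrays, variable names, and choice of ridge regression below are assumptions made for illustration.

```python
# Illustrative sketch: map fMRI responses to a 128-dim music embedding.
# All data here is randomly generated placeholder data, and the ridge
# regression decoder is an assumption for illustration, not the authors' code.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

EMBED_DIM = 128  # dimensionality of the MuLan music embedding space

# Hypothetical dataset: one row of voxel activations per music clip,
# paired with the embedding of the clip the subject heard.
n_clips, n_voxels = 480, 5000
rng = np.random.default_rng(0)
fmri = rng.standard_normal((n_clips, n_voxels))                 # placeholder fMRI features
target_embeddings = rng.standard_normal((n_clips, EMBED_DIM))   # placeholder music embeddings

X_train, X_test, y_train, y_test = train_test_split(
    fmri, target_embeddings, test_size=0.2, random_state=0
)

# L2-regularized linear decoder: brain activity -> music embedding.
decoder = Ridge(alpha=1000.0)
decoder.fit(X_train, y_train)

predicted_embeddings = decoder.predict(X_test)  # shape: (n_test, 128)

# In the full pipeline, each predicted embedding would condition a music
# generator (MusicLM in the paper) to synthesize audio resembling the
# original stimulus; that generative step is outside this sketch.
print(predicted_embeddings.shape)
```

In this sketch the decoder is purely linear; the heavy lifting of turning a compact embedding back into audio is left to the pretrained music generator, which is the division of labor the caption describes.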
Read more »