Music identification using brain responses to initial snippets
Pankaj Pandey (Indian Institute of Technology Gandhinagar)
Gulshan Sharma (Indian Institute of Technology Ropar)
Krishna Prasad Miyapuram (Indian Institute of Technology Gandhinagar)
Ramanathan Subramanian (University of Canberra)
J. Derek Lomas (TU Delft - Form and Experience)
Abstract
Naturalistic music typically contains repetitive musical patterns that recur throughout a song. These patterns form a signature that enables effortless song recognition. We investigate whether neural responses to these repetitive patterns likewise serve as a signature, enabling recognition of later song segments after exposure to the initial segments. We examine EEG encoding of naturalistic musical patterns using the NMED-T and MUSIN-G datasets. Experiments reveal that (a) training machine learning classifiers on the initial 20 s song segment enables accurate prediction of the song from the remaining segments; (b) β and γ band power spectra yield the best song classification; and (c) listener-specific EEG responses are observed for the same stimulus, characterizing individual differences in music perception.
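As a rough illustration of the kind of pipeline the abstract describes, the sketch below extracts β- and γ-band power features from EEG segments and fits a classifier on the initial segments to predict song identity from later ones. The sampling rate, band edges, channel count, classifier choice, and all function names are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of band-power song classification; not the authors' exact pipeline.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

FS = 125                                          # assumed EEG sampling rate (Hz)
BANDS = {"beta": (13, 30), "gamma": (30, 45)}     # assumed band edges (Hz)

def band_power_features(epoch):
    """epoch: (n_channels, n_samples) EEG segment -> log band-power feature vector."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=-1)))  # mean power per channel
    return np.concatenate(feats)

def fit_song_classifier(train_epochs, song_labels):
    """Fit a classifier on features from the initial (e.g., first 20 s) song segments."""
    X = np.stack([band_power_features(e) for e in train_epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    clf.fit(X, song_labels)
    return clf

# Usage with synthetic data standing in for preprocessed NMED-T / MUSIN-G epochs:
rng = np.random.default_rng(0)
train = [rng.standard_normal((32, FS * 20)) for _ in range(10)]   # ten 20 s "songs"
labels = np.arange(10)
model = fit_song_classifier(train, labels)
later_segment = rng.standard_normal((32, FS * 20))                # a later segment
print(model.predict(np.stack([band_power_features(later_segment)])))
```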