The IMAC dataset was proposed in the following paper: Learning Affective Correspondence between Music and Image. Gaurav Verma, Eeshan Gunesh Dhekane, Tanaya Guha. In 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). License: Attribution-NonCommercial 4.0 International (CC BY-NC 4.0). If you use the IMAC dataset (which comprises both the Music and Image Emotion Datasets), please cite the above paper along with the paper that proposed the Image Emotion Dataset. If you use only the Image Emotion Dataset, please cite the paper that originally proposed that dataset (You et al., 2016). If you use only the Music Emotion Dataset, please cite the above paper.
A sample script to download audio from YouTube is available here: View Script. For comments or questions, you may write to the authors.
To facilitate the study of crossmodal emotion analysis, we constructed a large-scale database, which we call the Image-Music Affective Correspondence (IMAC) database. It consists of more than 85,000 images and 3,812 songs (approximately 270 hours of audio). The IMAC database is constructed by combining an existing image emotion database (You et al., 2016) with a new music emotion database curated by us (Verma et al., 2019). Each data sample is labeled with one of three emotions: positive, neutral, or negative. Note: the images and audio samples are provided as URLs.
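Since the samples are distributed as URLs with one of three emotion labels, a typical first step is to parse a manifest of (URL, label) pairs before downloading anything. The sketch below is illustrative only: the CSV column names (`url`, `emotion`) and the label-to-integer mapping are assumptions, not the official release format.

```python
import csv
import io

# Assumed label set from the dataset description; the integer ids are arbitrary.
LABEL_TO_ID = {"positive": 0, "neutral": 1, "negative": 2}

def load_manifest(fp):
    """Parse a CSV of (url, emotion) rows into (url, label_id) pairs."""
    reader = csv.DictReader(fp)
    samples = []
    for row in reader:
        emotion = row["emotion"].strip().lower()
        if emotion not in LABEL_TO_ID:
            raise ValueError(f"unexpected label: {emotion!r}")
        samples.append((row["url"], LABEL_TO_ID[emotion]))
    return samples

# Tiny in-memory example standing in for a downloaded manifest file.
example = io.StringIO(
    "url,emotion\n"
    "http://example.com/a.jpg,positive\n"
    "http://example.com/b.mp3,negative\n"
)
print(load_manifest(example))
# → [('http://example.com/a.jpg', 0), ('http://example.com/b.mp3', 2)]
```

Validating labels up front keeps a later download loop from silently mixing in malformed rows.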