Emotion-Based Music Recommendation from Quality Annotations and Large-Scale User-Generated Tags

Marta Moscati, Hannah Strauß, Peer-Ole Jacobsen, A. Peintner, E. Zangerle, Marcel Zentner, Markus Schedl

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

Emotions constitute an important aspect when listening to music. While manual annotations from user studies grounded in psychological research on music and emotions provide a well-defined and fine-grained description of the emotions evoked when listening to a music track, user-generated tags provide an alternative view stemming from large-scale data. In this work, we examine the relationship between these two emotional characterizations of music and analyze their impact on the performance of emotion-based music recommender systems, individually and jointly. Our analysis shows that (i) the agreement between the two characterizations, as measured with Cohen’s κ coefficient and Kendall rank correlation, is often low; (ii) leveraging the emotion profile based on the intensity of evoked emotions from high-quality annotations leads to performance that is stable across different recommendation algorithms; and (iii) simultaneously leveraging the emotion profiles based on high-quality and large-scale annotations yields recommendations that are less exposed to the low accuracy that algorithms may reach when relying on only one type of data.
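As a minimal illustration of the agreement measures named in the abstract (not the authors' code), the sketch below compares two hypothetical emotion characterizations of music with Cohen's κ and Kendall's rank correlation, using standard scikit-learn and SciPy routines; all labels, scores, and variable names are made up for the example.

```python
# Minimal sketch: agreement between two emotion characterizations of music,
# measured with Cohen's kappa and Kendall's rank correlation.
# Categories and values are hypothetical, for illustration only.
from sklearn.metrics import cohen_kappa_score
from scipy.stats import kendalltau

# Hypothetical per-track dominant emotion labels from the two sources
# (user-study annotations vs. user-generated tags).
labels_user_study = ["joy", "sadness", "tension", "joy", "wonder"]
labels_tags       = ["joy", "tension", "tension", "sadness", "wonder"]

# Cohen's kappa: chance-corrected agreement on categorical labels.
kappa = cohen_kappa_score(labels_user_study, labels_tags)

# Hypothetical emotion-intensity profiles for a single track (one value per
# emotion category, from each source); Kendall's tau compares their rankings.
intensity_user_study = [0.9, 0.2, 0.4, 0.7, 0.1]
intensity_tags       = [0.8, 0.3, 0.1, 0.6, 0.2]
tau, p_value = kendalltau(intensity_user_study, intensity_tags)

print(f"Cohen's kappa: {kappa:.2f}, Kendall's tau: {tau:.2f} (p={p_value:.2f})")
```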
Original language: English
Title of host publication: Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization (UMAP), 2024
Number of pages: 6
Publication status: Published - 2024

Fields of science

  • 202002 Audiovisual media
  • 102 Computer Sciences
  • 102001 Artificial intelligence
  • 102003 Image processing
  • 102015 Information systems

JKU Focus areas

  • Digital Transformation
