

Tagging Online Music for Emotion

Tagging online music contents for emotion. A systematic approach based on contemporary emotion research (ESRC grant)

Project Overview:

With the growing accessibility of music through online catalogues (e.g., Google Music, Spotify, last.fm) and portable playback devices (smartphones, iPods), music's pervasiveness in everyday life has reached an unprecedented scale. The project proposes to incorporate the new understanding of music-evoked emotion into the growing number of online musical databases and catalogues, as well as into media-player features for recommending music and creating playlists. Specifically, the aim is to develop an innovative conceptual and technical tool enabling suitable tagging of online musical content for emotion. This will draw on a state-of-the-art conceptual model from the field of music psychology, the Geneva Emotional Music Scales (GEMS; Zentner et al., 2008), while also revising the model in light of new music genres and the information provided by online social tags.

In a first step, we will examine to what extent the GEMS provides valid emotion descriptors for a wider range of musical genres than those we originally used in its development. In a second step, we will use advanced data reduction techniques to select the most recurrent and important labels for describing music-evoked emotion. In a third step, we will examine the added benefit of the new GEMS compared with conventional approaches to tagging music.
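To make the second step more concrete, the sketch below illustrates the general kind of data reduction involved: collapsing a track-by-tag matrix of listener-applied emotion labels into a few underlying dimensions and inspecting which tags load on them. The tags, the synthetic data, and the choice of PCA are assumptions for illustration only; they are not the project's actual data or method.

```python
# Illustrative sketch of a data reduction step over listener emotion tags.
# All tags and counts below are hypothetical toy data, not project data.
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical emotion tags applied to online tracks by listeners.
tags = ["tender", "joyful", "sad", "tense", "peaceful", "powerful"]

# Toy track-by-tag matrix: rows are tracks, columns are tag frequencies.
rng = np.random.default_rng(0)
tag_counts = rng.poisson(lam=2.0, size=(100, len(tags)))

# Reduce the tag space to two latent dimensions and report which tags
# load most strongly on each component.
pca = PCA(n_components=2)
scores = pca.fit_transform(tag_counts)

for i, component in enumerate(pca.components_):
    top = [tags[j] for j in np.argsort(np.abs(component))[::-1][:3]]
    print(f"Component {i + 1}: strongest tags -> {top}")
```

In practice, a factor-analytic or clustering approach over much larger social-tag corpora would serve the same purpose: identifying a compact set of recurrent labels for music-evoked emotion.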

PI: Tuomas Eerola, Durham University, UK
Co-PI: Marcel Zentner, University of Innsbruck, Austria