The 13 Emotions Evoked by Music

Music has long been recognized as a catalyst of emotional response and has been used as a therapeutic tool by many in the medical and healing professions. Only recently, however, have the tools become available to measure brain activity and emotional response in a scientific way.

Researchers at the University of California, Berkeley mapped the emotional responses of more than 2500 people of different ages and cultures (participants were recruited from among American and Chinese candidates) while they listened to thousands of songs spanning a wide range of genres: rock, folk, jazz, classical, commercial pop, experimental music, and heavy metal.

The result was that the subjective experience of listening can be captured by 13 emotions:

  • amusement,
  • joy,
  • eroticism,
  • beauty,
  • relaxation,
  • sadness,
  • dreaminess,
  • triumph,
  • anxiety,
  • scariness,
  • annoyance,
  • defiance, and
  • feeling pumped up

The aim of the research

The goal of the study was to create a “library of emotions”: a map that captures and combines the feelings associated with each piece of music played.

The research documents how emotions are perceived universally through the language of music.

It turned out that the people who took part in the experiment associated similar emotions with the same songs. For example, participants tended to report feelings of defiance and provocation when listening to heavy metal.

Some differences did emerge, however, in the value assigned to the perceived emotion.
People from different cultures agreed, for example, that a song elicited anger, but they might disagree on whether that feeling was positive or negative for them, whether it produced a sense of well-being or discomfort, and how intense it was.

This difference arises because the instinctive emotion triggered by listening is shaped by cultural influences.

How the research took place

Participants were recruited through the crowdsourcing platform Amazon Mechanical Turk, a service through which researchers can enlist people, typically in exchange for a small payment, to carry out the tasks a project requires, allowing the researchers to coordinate human input at scale.

The participants recruited in this way were asked to select music videos on YouTube that evoked different emotions in them.

The research sample was made up of 1591 American subjects and 1258 Chinese subjects, and 2168 music samples were evaluated in total.

A first experiment involved a subgroup of 1841 participants, Americans and Chinese, who were asked to rate 40 musical excerpts across 28 different categories of emotion and their degree of intensity. This gave researchers an initial, long list of the emotional responses that different types of music can evoke, from which the 13 most frequently chosen categories were distilled. It also made it possible to verify that participants from different cultures had the same subjective experiences when listening to the same songs.

To ensure that participants from different continents really felt the same feelings, the researchers conducted a “confirmatory experiment” designed to rule out possible cultural influences. Participants listened to more than 300 songs performed on traditional instruments and drawn from both Western and Chinese musical traditions.

The answers confirmed the findings, showing that these pieces evoked the same emotions in all participants, both American and Chinese.

Interactive audio map

The study’s results have been translated into an interactive audio map on which visitors can test their own reactions by moving the cursor among thousands of music samples. Each letter on the map corresponds to an emotional category, letting anyone check the study’s conclusions by seeing whether their emotional response matches that of the people who took part in the research.

Click here to experience the Interactive Audio Map: https://www.ocf.berkeley.edu/~acowen/music.html. (Editor’s note: This is VERY interesting. Be sure to let your mouse hover over the letters rather than the emotional categories).

Practical use of research

The results of this research may find practical applications in psychological and psychoanalytic therapy: a therapist could, for example, select music suited to eliciting particular emotions in patients. A more playful use would be for streaming platforms, which could draw on this research to tune the algorithms behind their playlists and make them more responsive to users’ requests. A minimal sketch of how emotion tags might feed a playlist is shown below.
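To make the playlist idea concrete, here is a minimal, hypothetical sketch in Python: it assumes each track carries scores for some of the 13 emotion categories and simply filters and ranks tracks by a requested emotion. The track data, scores, and threshold are invented for illustration; real streaming services use far more elaborate recommendation systems.

```python
from typing import Dict, List

# Hypothetical catalogue: each track has scores (0.0-1.0) for some of the
# 13 emotion categories identified by the Berkeley study.
TRACKS: List[Dict] = [
    {"title": "Track A", "emotions": {"joy": 0.9, "triumph": 0.6}},
    {"title": "Track B", "emotions": {"sadness": 0.8, "dreaminess": 0.7}},
    {"title": "Track C", "emotions": {"relaxation": 0.9, "beauty": 0.5}},
    {"title": "Track D", "emotions": {"defiance": 0.8, "feeling pumped up": 0.9}},
]

def build_playlist(tracks: List[Dict], target_emotion: str, min_score: float = 0.5) -> List[str]:
    """Return titles whose score for the target emotion meets the threshold,
    sorted from strongest to weakest match."""
    matches = [
        (track["emotions"].get(target_emotion, 0.0), track["title"])
        for track in tracks
        if track["emotions"].get(target_emotion, 0.0) >= min_score
    ]
    return [title for _, title in sorted(matches, reverse=True)]

if __name__ == "__main__":
    # e.g. a listener asks for something relaxing
    print(build_playlist(TRACKS, "relaxation"))  # -> ['Track C']
```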

The research is presented in full in the Proceedings of the National Academy of Sciences at the following link: https://www.pnas.org/content/early/2020/01/01/1910704117
