Rock It Science: New Study Equates Musical Tastes to Personality Traits

"wbeem", Image by Scott Diussa

“wbeem”, Image by Scott Diussa

Many people have had the experience of hearing a new song for the first time and instantly being thunderstruck by it. They ask out loud or else think to themselves “Who and what was that I just heard?!” Then they quickly race off to Google in an attempt to identify the tune that so totally captured their senses.

But what was it about the listeners’ musical tastes that led to this? Moreover, are their affinities for certain musical styles and artists a reliable indicator of their personalities – and vice versa?

Researchers at the University of Cambridge in the UK and Stanford University in the US have recently devised a new method for predicting musical tastes. Their study was published in PLOS ONE in a fascinating paper entitled Musical Preferences Are Linked to Cognitive Styles, by David M. Greenberg, Simon Baron-Cohen, David J. Stillwell, Michal Kosinski and Peter J. Rentfrow.

These findings were written up in an interesting article in the August 8, 2015 edition of The Wall Street Journal entitled If You’re Empathetic, You Probably Aren’t Into AC/DC by Daniel Akst. I will sum up, annotate and, well, orchestrate a few questions of my own.

Until the introduction of this new method, researchers had traditionally sought to correlate musical tastes with the Big Five personality traits. These include:

  • Extroversion
  • Agreeableness
  • Conscientiousness
  • Neuroticism
  • Openness to new experiences

For instance, extroverts tend to prefer pop and funk. However, test subjects were asked to rate their preferences according to genre which, in turn, can have many gradations and variances. The article mentioned that rock music can include everyone from Elton John to AC/DC, the latter of whom appear in an accompanying photo to the story. (For that matter, who would have ever expected Springsteen to cover “Highway to Hell” during his tour of Australia in 2014?)

The five traits were also previously covered in a different context in the March 20, 2015 Subway Fold Post entitled Studies Link Social Media Data with Personality and Health Indicators.

Using this new methodology, the researchers turned to whether a person is an empathizer “who detects and responds to other people’s mental states” or a systemizer “who detects and responds to systems by analyzing their rules”. The article includes a link for readers to test themselves to assess their own musically influenced leanings towards one personality type or the other.

Leading the research was David M. Greenberg, himself a sax player, who is currently pursuing a doctorate in psychology at the University of Cambridge. The combination of his academic and musical interests is what led him to this line of research. He and his team were seeking more precise and measurable “sonic and psychological” factors in their efforts to develop a system to predict musical tastes.

The data was gathered from 4,000 volunteers who were tested for empathy and then were asked to rate 50 songs. The findings showed that:

  • Empathizers preferred R&B and soft rock (“mellow” music), folk and country (“unpretentious” music), Euro pop (“contemporary” music), but not heavy metal. Within genres, they preferred “gentler jazz”, as well as “sadder, low energy music”.
  • Systemizers preferred “more intense music” including “punk and heavy metal”.

In the future, this research might be helpful to music streaming services like Spotify to further improve their song recommendation engines. (See also the August 14, 2014 Subway Fold post entitled Spotify Enhances Playlist Recommendations Processing with “Deep Learning” Technology.)

Mr. Greenberg is also interested in researching the reverse of his findings: whether particular types of music can raise empathy or systemizing levels.

My questions are as follows:

  • Might this research also be helpful to a startup like Reify which is developing augmented reality apps for music as covered in the July 21, 2015 Subway Fold post entitled Prints Charming: A New App Combines Music With 3D Printing?
  • Is this research applicable to the composition of music scores for movies, plays and TV shows as storytellers and producers seek to heighten the emotional impacts of certain scenes? (The December 19, 2014 Subway Fold post entitled Applying MRI Technology to Determine the Effects of Movies and Music on Our Brains discussed Flicker: Your Brain on Movies, a book by Dr. Jeffrey Zacks that, among many other things, covered this type of effect.)
  • Is this research applicable to marketers in developing their ad campaigns aimed at specific demographic groups?

Prints Charming: A New App Combines Music With 3D Printing

"Totem", Image by Brooke Novak

“Totem”, Image by Brooke Novak

What does a song actually look like in 3D? Everyone knows that music has always been evocative of all kinds of people, memories, emotions and sensations. In a Subway Fold post back on November 30, 2014, we first looked at Music Visualizations and Visualizations About Music. But can a representation of a tune now be taken further and transformed into a tangible object?

Yes, and it looks pretty darn cool. A fascinating article was posted on Wired.com on July 15, 2015, entitled What Songs Look Like as 3-D Printed Sculptures by Liz Stinson, about a new Kickstarter campaign to raise funding for the NYC startup called Reify working on this. I will sum up, annotate and try to sculpt a few questions of my own.

Reify’s technology uses sound waves in conjunction with 3D printing¹ to shape a song into a physical “totem”. (The Wired article and the Reify website contain pictures of samples.) Then an augmented reality² app in a mobile device will provide an on-screen visual experience accompanying the song when the camera is pointed towards the totem. This page on their website contains a video of a demo of their system.

The firm is led by Allison Wood and Kei Gowda. Ms. Wood founded it in order to study “digital synesthesia”. (Synesthesia is a rare condition where people can use multiple senses in unusual combinations to, for example, “hear” colors, and was previously covered in the Subway Fold post about music visualization linked to above.) She began to explore how to “translate music’s ephemeral nature” into a genuine object and came up with the concept of using a totem.

Designing each totem is an individualized process. It starts with analyzing a song’s “structure, rhythm, amplitude, and more” by playing it through the Echo Nest API.³ In turn, the results generated correspond to measurements including “height, weight and mass”. The tempo and genre of a song also have a direct influence on the shaping of the totem. As well, the musical artists themselves have significant input into the final form.
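To make this mapping a bit more concrete, here is a brief hypothetical sketch of how song features might be translated into totem measurements. The feature names, scaling factors and formulas below are purely illustrative assumptions on my part, not Reify’s actual algorithm or the Echo Nest API’s real output:

```python
# Hypothetical sketch: map audio-analysis features (of the general kind an
# analysis service might report, such as tempo, duration and energy) onto
# the dimensions of a 3D-printable totem. All names and scaling factors
# here are illustrative assumptions, not Reify's actual process.

def totem_dimensions(features):
    """Translate song features into rough totem measurements."""
    # Faster tempos stretch the totem taller; longer songs widen it.
    height = 40 + features["tempo"] * 0.5          # millimeters
    width = 20 + features["duration_sec"] * 0.1    # millimeters
    # Louder, more energetic songs get more mass.
    mass = 10 + features["energy"] * 50            # grams
    return {"height_mm": round(height, 1),
            "width_mm": round(width, 1),
            "mass_g": round(mass, 1)}

# Example: a mid-tempo, four-minute song with moderate energy.
song = {"tempo": 120.0, "duration_sec": 240.0, "energy": 0.6}
print(totem_dimensions(song))
```

A real pipeline would of course work from a far richer analysis of a song’s structure and rhythm, and, as noted above, the artists themselves shape the final form.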

The mobile app comes into play when it is used to “read” the totem and interpret its form “like a stylus on a record player or a laser on a CD”. The result is that, while the music is playing, the augmented reality component of the app captures and then generates an animated visualization incorporating the totem on-screen. The process is vividly shown in the demo video linked above.

Reify’s work can also be likened to information design, in the form of data visualization⁴. According to Ms. Wood, the process involves “translating data from one form into another”.

My questions are as follows:

  • Is Reify working with, or considering working with, Microsoft on its pending HoloLens augmented reality system and/or companies such as Oculus, Samsung and Google on their virtual reality platforms as covered in the posts linked to in Footnote 2 below?
  • How might Reify’s system be integrated into the marketing strategies of musicians? For example, perhaps printing up a number of totems for a band and then distributing them at concerts.
  • Would long-established musicians and performers possibly use Reify to create totems of some of their classics? For instance, what might a totem and augmented reality visualization for Springsteen’s anthem, Born to Run, look like?

1.  See these two Subway Fold posts mentioning 3D printing.

2.  See these eight Subway Fold posts covering some of the latest developments in virtual and augmented reality.

3.  APIs in a medical and scientific context were covered in a July 2, 2015 Subway Fold post entitled The Need for Specialized Application Programming Interfaces for Human Genomics R&D Initiatives.

4.  This topic is covered extensively in dozens of Subway Fold posts in the Big Data and Analytics and Visualization categories.