Applying MRI Technology to Determine the Effects of Movies and Music on Our Brains

By a fortuitous coincidence, on August 28, 2014, two articles appeared online in very different publications but with very similar facts and implications about using MRI technology to research the neurological effects of movies and music upon their audiences. Let's, well, scan these features together and see what we find.

First, everyone loves watching movies, and nowadays they can be viewed on screens everywhere, in theaters and on televisions, computers, mobile devices and gaming systems, whenever it is convenient for the viewer. The work of psychologist Uri Hasson was the subject of a fascinating article by Greg Miller entitled How Movies Synchronize the Brains of an Audience. As reported there, Hasson employed MRIs to scan viewers of the same scenes in a series of films from different genres. He recently presented his findings to a group of film industry professionals.

He was surprised to find that highly similar regions of the brain showed specifically increased activity among viewers of clips from the same films. That is, discernible patterns emerged in the scans while the subjects viewed westerns, action movies, mysteries and so on. However, a comedy on cable produced a much lower level of synchronicity among the test subjects. (There are two very informative graphics of the MRIs' outputs accompanying this story.) In effect, some films and genres produced more highly correlated levels of such synchronicity than others. One Hollywood director is quoted about his concern that movie studios might soon be using MRIs to test movies at pre-release screenings.

Furthermore, I think it would be interesting to know whether Netflix might also be able to apply this research. This is because, starting in 2006 and concluding with the "Netflix Prize" being awarded in 2009, the company ran a contest challenging contestants to devise an algorithm that would improve its movie rating and recommendation system. That is, when a subscriber orders film A for viewing, Netflix will additionally recommend films B, C and D based on the ratings of the user base. So, would the added application of MRI data and analyses possibly improve the recommendation algorithm currently being used at Netflix?
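To make that "based on the ratings of the user base" idea concrete, here is a minimal sketch of user-based collaborative filtering, the general family of techniques behind such recommenders. The ratings data and user names are entirely hypothetical, and Netflix's actual algorithm is far more sophisticated; this only illustrates the basic principle of recommending unseen films by weighting other users' ratings by taste similarity.

```python
from math import sqrt

# Hypothetical ratings: user -> {film: rating on a 1-5 scale}.
ratings = {
    "alice": {"A": 5, "B": 4, "C": 1},
    "bob":   {"A": 4, "B": 5, "D": 2},
    "carol": {"A": 2, "C": 5, "D": 4},
    "dave":  {"B": 4, "C": 1, "D": 5},
}

def cosine_sim(u, v):
    """Cosine similarity computed over the films two users rated in common."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[f] * v[f] for f in common)
    norm_u = sqrt(sum(u[f] ** 2 for f in common))
    norm_v = sqrt(sum(v[f] ** 2 for f in common))
    return dot / (norm_u * norm_v)

def recommend(user, k=2):
    """Rank films the user has not seen by the similarity-weighted
    average of other users' ratings for those films."""
    seen = ratings[user]
    scores, weights = {}, {}
    for other, their_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine_sim(seen, their_ratings)
        for film, r in their_ratings.items():
            if film not in seen and sim > 0:
                scores[film] = scores.get(film, 0.0) + sim * r
                weights[film] = weights.get(film, 0.0) + sim
    ranked = sorted(scores, key=lambda f: scores[f] / weights[f], reverse=True)
    return ranked[:k]

print(recommend("alice"))  # alice has not yet rated film D
```

One could imagine MRI-derived engagement scores entering a scheme like this as an additional signal alongside the star ratings.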

Second, is there actually anyone out there who still doesn't get chills up and down their spine whenever they hear the opening bars of Born to Run? This likely happens even though you have heard it 10,000 times before. Do you recall the first time you ever heard it come blasting out of the radio?

Another engaging report, entitled Why Your Favorite Song Takes You Down Memory Lane, used MRI technology to research why tunes have such a strongly evocative effect upon our brains. According to this story, the test subjects in a study were each played six songs (four were "iconic," one was a personal favorite and one was unfamiliar), of five minutes each, drawn from very different types of music. The scientists conducting the study found distinct patterns of brain activity depending on whether a subject liked or disliked a song, and yet another pattern for the favorite among the group.

Moreover, the favorite increased activity in the hippocampus*, the brain region that controls memory and emotion, thus forging the connection between music and memory.

I highly recommend clicking through and reading both of these articles together for all of the scientific details of how these studies were done and how their conclusions were reached.

Also, for a terrific, thoroughly engaging and detailed analysis of the neuroscience of music, I recommend This Is Your Brain on Music: The Science of a Human Obsession by Daniel J. Levitin (Plume/Penguin, 2007).

December 19, 2014 Update:

The next set of analyses and enhancements to our cinematic experience can be found in a newly published book that explains the science of how movies affect our brains, entitled Flicker: Your Brain on Movies (Oxford University Press, 2014), by Dr. Jeffrey Zacks. The author was interviewed during a fascinating segment of the December 18, 2014 broadcast of The Brian Lehrer Show on WNYC radio. Among other things, he spoke about why audiences cry during movies (even when the films are not very good), sometimes root for the villain, and duck out of the way when an object on the screen seems to be coming right at them, such as the giant boulder rolling after Indiana Jones at the start of Raiders of the Lost Ark. Much of this is done intentionally by the filmmakers to manipulate audiences into heightened emotional responses to key events as they unfold on the big screen.

* Isn’t that also what they call the place where hippos go to school?

IBM’s New TrueNorth Chip Mimics Brain Functions

To borrow a title made famous by Monty Python in order to characterize a development announced in the August 7, 2014 edition of the New York Times: now for something completely different in, well, computing architecture, as IBM has created a chip called TrueNorth that mimics some of the operations of the human brain. As covered in the report entitled IBM Develops a New Chip That Functions Like a Brain, this chip uses far less power than chips built on more traditional technologies and, it is hoped, may enable faster and more extensible processing and interpretation of certain classes of data. The article contains a link to the IBM researchers' technical paper, with the full details of their accomplishments, in the August 8, 2014 issue of Science. In addition to reading this fascinating article in full, I also suggest a click-through to IBM Research's own website for an article entitled Introducing a Brain-inspired Computer.

This is one of those remarkable developments where the inspiration for a unique technological advancement has been derived from human biology. The field of biomimetics has likewise produced innovative systems, designs and materials in many diverse fields such as, among others, aeronautics, pharmaceuticals and robotics.

As reported in the NYTimes story, TrueNorth is being termed a "neuromorphic" chip because it imitates the functions of the brain's neurons in order to better recognize patterns, such as changes in the intensity and color of light or particular physical movements made by a person. The May/June 2014 edition of MIT's Technology Review, in its annual report on the Top 10 Breakthrough Technologies, carried a highly informative article entitled Neuromorphic Chips, naming them as one of 2014's breakthrough areas.

The report further states that the chip's "neurons" all run in parallel and can compute 46 million operations per second. While not as fast as many of today's other chips, by its very nature it is better able to handle certain types of operations that faster conventional chips process less efficiently. Moreover, scientists believe that the speed of these chips will continue to scale up.
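For readers curious what a chip "neuron" even means, here is a minimal sketch of a leaky integrate-and-fire neuron, the classic simplified model that neuromorphic designs are generally based on. To be clear, this is a textbook abstraction, not TrueNorth's actual (and far more elaborate) circuit design, and the threshold and leak values below are arbitrary illustrative choices: the unit accumulates incoming signal, steadily "leaks" charge, and emits a spike only when its potential crosses a threshold.

```python
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: accumulate input each time step,
    decay the membrane potential by the leak factor, and emit a spike (1)
    when the potential reaches the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # → [0, 0, 1, 0, 0, 1]
```

Because each such unit only does work when spikes arrive, large arrays of them running in parallel can stay mostly idle, which is the intuition behind the chip's very low power consumption.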

I am certain that, as with many other strikingly original advances such as this, other applications will continue to emerge for these chips that no one has yet anticipated. I am greatly looking forward to seeing what they are and where they occur.