In the near future, film and other media will respond to our moods (Image: Luke R. Smith/Millennium)
What happens when we link films and music to devices that capture small changes in our emotions? Welcome to the world of reactive media
I AM an inveterate channel hopper. It has got worse since I switched to streaming media online instead of watching TV because now I have an infinite number of channels. Depending on how I feel, I select a TV episode, film or YouTube video. If it doesn't grab me fast, I choose another. And another.
Soon I may not even have to do that much work. The technology to tune your music to your running rate has been around for a while, with devices such as the Nike+, and adaptive playlisters like Songza can select music according to your mood – even letting you give a "thumbs down" to stuff that doesn't match. There are also headphones, made by Japanese company Neurowear, that scan your brainwaves to select the music on your smartphone that matches your frame of mind.
On the horizon, however, is something altogether different. For good or ill, we are about to find out what happens if you have media built around you, remixed in real time as your mood and engagement change. The BBC's newly launched Perceptive Radio will use sound, light and proximity sensors in as-yet-unspecified ways to assess how much attention you are paying at any time. And this year the broadcaster plans to launch content that reorganises itself into shorter or longer versions in real time, depending on how attentively you are listening.
But the technology can go further than just monitoring and responding to attention levels. Biosensors such as heart rate monitors and EEG headsets, which measure brainwaves, make it possible for emotive media such as film and music to actively shape an audience's emotions.
Alongside Eduardo Miranda of Plymouth University, UK, I have spent the past two years working with the University of Reading's cybernetics group on a project that melds machine learning and biosensors with music delivery. We are currently experimenting with ways of driving participants' emotional states in a direction they have chosen. In our design, a computer with automated composition algorithms performs music aimed at eliciting a specific emotional response, while the biosensors measure emotional impact and adjust the music accordingly.
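The closed loop described above can be sketched in a few lines. This is purely illustrative and not the project's actual system: the sensor model, the single "arousal" value and the proportional tempo adjustment are all assumptions standing in for the real machine-learning pipeline.

```python
import random

def read_arousal():
    """Stand-in for a biosensor reading (0.0 = calm, 1.0 = excited).
    A real system would derive this from EEG or heart rate data."""
    return random.uniform(0.0, 1.0)

def adapt_tempo(tempo_bpm, target_arousal, measured_arousal, gain=40.0):
    """Proportional adjustment: speed the music up if the listener is
    below the chosen target arousal, slow it down if above, clamped
    to a plausible tempo range."""
    error = target_arousal - measured_arousal
    return max(60.0, min(180.0, tempo_bpm + gain * error))

# The participant chooses a target emotional state; the composition
# engine nudges one musical parameter per phrase to drive them there.
tempo = 120.0
target = 0.8
for _ in range(10):
    tempo = adapt_tempo(tempo, target, read_arousal())
```

A real system would adjust many parameters at once (mode, timbre, dynamics) and learn each listener's responses rather than using a fixed gain, but the measure-compare-adjust cycle is the same.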
A related technique has been used for films. In 2011, tech firm Sensum debuted its interactive horror film Unsound at the South by Southwest festival in Austin, Texas. The firm uses galvanic skin response sensors to alter film content to heighten – or reduce – the emotional reaction to it. For my own short film, Many Worlds, algorithms use brainwaves, muscle tension, perspiration and heart rate readings from a selection of the audience to adjust the story in real time, choosing the most appropriate of the film's four narratives to maximise intensity.
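Picking a branch to maximise intensity amounts to scoring each candidate narrative against the audience's current readings and taking the best one. The sketch below is a hypothetical illustration, not the film's actual algorithm: the feature names, weights and linear scoring function are all assumptions.

```python
def predicted_intensity(branch, sensors):
    """Toy linear score: how intense this branch is predicted to feel,
    given pooled audience sensor readings (all illustrative)."""
    weights = branch["weights"]
    return sum(weights[k] * sensors[k] for k in weights)

def choose_branch(branches, sensors):
    """Select the narrative branch with the highest predicted intensity."""
    return max(branches, key=lambda b: predicted_intensity(b, sensors))

branches = [
    {"name": "calm",  "weights": {"heart_rate": -1.0, "perspiration": -0.5}},
    {"name": "tense", "weights": {"heart_rate":  1.0, "perspiration":  0.8}},
]
sensors = {"heart_rate": 0.9, "perspiration": 0.7}
print(choose_branch(branches, sensors)["name"])  # prints "tense"
```

In practice the scoring model would be learned from viewers' reactions rather than hand-weighted, and the selection would respect narrative continuity, but the select-by-predicted-response step is the core of the idea.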
Writing this kind of multiple narrative is a challenge, and one that science fiction author Hannu Rajaniemi has tackled with technologist Samuel Halliday for an e-book project called Neurofiction. The first short story, Snow White is Dead, has 48 different narrative paths, selected automatically based on the results of an EEG brainwave monitor attached to the computer on which the e-book is being read. "Readers have been enthusiastic – several commented that they found the story very moving," says Rajaniemi.
So reactive media is getting better at adapting to emotions all the time, but there are potential pitfalls. For example, if we can use machine learning to effectively drive the audience's emotional response to enhance their pleasure, it could also be used to enhance pain. Unpleasant music has long been used during interrogations or as a political weapon, but now it could be tailored automatically to the individual's psychophysical reactions.
Other issues lie closer to home – literally. In 2012, Microsoft filed a patent to use Kinect-type motion sensing input devices to scan a room and charge viewing fees for all sorts of media. There are also patents filed by other companies involving in-home technology that listens in to what people are saying.
Imagine you are sitting on your sofa watching a show on a television or Xbox connected to a Kinect sensor, perhaps while wearing headphones with a built-in brainwave scanner. You no longer have to engage consciously with what you are viewing: you are constantly leaking emotional information about yourself, and it is being used to change your experience.
As most of these devices are linked to the internet, you could find your emotions, conversations and movements being analysed for who knows what purpose. Unlike social networks, where we have to interact to experience the technology, reactive media's invisible power means we need to be very careful to tick the right privacy boxes when we are installing the kit.