How The Next Generation Of Content Will Be Mind Readers [Future Of Entertainment]
In the fourth week of our ten-week series with iQ by Intel, we assess how biometric input is affecting play.
The Future of Entertainment series by iQ by Intel and PSFK Labs highlights the latest in entertainment innovation. Over the course of 10 weeks at iq.intel.com, we are showcasing new products, services, and technologies; exploring the changing face of how we consume, share, and create content; and getting reactions from Intel experts.
We look at a new frontier for interface control that has the potential to once again revolutionize the way we use our devices. This week's trend explores how technology is being developed to respond to more innate and natural forms of human input, such as eye movements and even brain waves.
The Kinect 2.0 goes a step further than its predecessor: it can determine where in the room you're looking and even read your facial expressions to gauge your mood. What's more, the sensor can estimate your heart rate from tiny fluctuations in your skin color, which could usher in a whole new era of 'emotional gameplay.'
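The idea of reading a pulse from skin color is known as remote photoplethysmography: blood flow causes tiny periodic shifts in skin tone, and the dominant frequency of that signal in the plausible heart-rate band is the pulse. The sketch below is our own minimal illustration of the principle (not Microsoft's actual algorithm), assuming the camera has already produced a per-frame average of the green channel over a patch of skin:

```python
import numpy as np

def estimate_bpm(green_means, fps):
    """Estimate heart rate from per-frame mean green-channel intensity.

    Illustrative remote-photoplethysmography sketch: find the dominant
    frequency in a physiological band (0.7-4 Hz, i.e. 42-240 BPM).
    """
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)       # plausible heart-rate band
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                           # Hz -> beats per minute

# Synthetic demo: a 1.2 Hz (72 BPM) "pulse" sampled at 30 fps for 10 s
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
fake_green = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t)
print(round(estimate_bpm(fake_green, fps)))  # 72
```

A real system would also need face tracking, motion compensation, and filtering for lighting changes, which is where the engineering difficulty actually lies.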
Mico headphones play music based on a user's emotional state. A forehead sensor reads the wearer's brain activity to determine one of three possible mood states: focused, drowsy, or stressed. Based on this analysis, the headphones select music from a database that corresponds with that feeling.
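At its simplest, the selection step described above is a lookup from a detected mood state to a pool of matching tracks. The snippet below is a hypothetical sketch of that mapping; the mood labels come from the article, but the track names and pairings are invented for illustration:

```python
import random

# Hypothetical mood-to-track database; the three mood states mirror
# those the Mico sensor reportedly distinguishes. Track names and the
# mood/music pairings are made up for this example.
MUSIC_DB = {
    "focused": ["minimal_ambient_01", "instrumental_04"],
    "drowsy": ["upbeat_pop_02", "drum_and_bass_07"],
    "stressed": ["calm_piano_03", "lofi_beats_09"],
}

def pick_track(mood, db=MUSIC_DB):
    """Return a track from the pool matching the detected mood state."""
    if mood not in db:
        raise ValueError(f"unknown mood state: {mood}")
    return random.choice(db[mood])

print(pick_track("focused"))
```

The hard part, of course, is the classification upstream of this lookup: turning noisy forehead-sensor readings into one of the three labels reliably.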
To get an expert perspective on what biometrics could mean for our future entertainment experiences, we caught up with Dr. Jennifer Healey, a research scientist at Intel Labs. She tells us how emotional and physiological data could enable better recommendations and even tailor our content in real time.
Stay tuned to iQ by Intel and PSFK or subscribe to the Future of Entertainment series on Flipboard to stay on top of the latest content.