The iMotions Biometric Research Platform
We recently wrote an article about how eye tracking technology enables a technique called “foveated rendering,” which can significantly reduce the rendering and bandwidth demands of fully immersive virtual reality simulations. The idea is that you only need to display high resolution exactly where the eye is looking, because that’s how human vision works. That’s just one application of eye tracking technology. You can also use eye tracking in marketing studies to see how people react to certain visual stimuli. This sort of biometric research involves not only eyesight but also how stressed you feel, what your brain is doing, and what your facial expressions say about your reaction. One Danish firm that provides a biometric research platform capable of measuring all of these variables at the same time is iMotions.
Founded in 2007, Danish startup iMotions has taken in $4.3 million in funding to build their biometric research platform, which is presently used by clients that include FMCG companies (Procter & Gamble, Unilever), drug companies (Eli Lilly, GlaxoSmithKline), and leading academic institutions (Harvard, Stanford). In order to help you understand how the iMotions platform works, we’ll explain it through a fictional use case.
Let’s say you’re the Director of Marketing at Coca-Cola and your team has put together a commercial for the Super Bowl that will cost you $4 million for a single airing. With that much money riding on one commercial, you need to be absolutely certain that viewers will engage with the content and message. In order to see exactly how people will react when watching this commercial, you could use the iMotions biometric research platform to conduct a study. You’ll want a sample of around 30 people, a common rule of thumb for a study like this, and you’ll need to pay them. Let’s say the total cost of paying the study participants and iMotions’ consulting fees comes in around $25,000. You’d easily pay that fee given that it represents just 0.6% of the cost to air the commercial.
You hire iMotions to conduct the test, and they hook up each participant to the platform, which looks something like this:
On the left-hand side you see a monitor which the researcher uses, and on the right-hand side is the person participating in the study. While you can make observations in real time during the study, all the data is being stored in a central database (presumably in “the cloud”) so you can analyze it later. After a few weeks, the study is complete and you can now see how people reacted to your commercial. Here are some of the types of data you were able to extract from this study.
Facial Expressions – Remember when we wrote about a company called Affectiva that can analyze your facial expressions in real time? The iMotions platform supports Affectiva’s emotion analysis engine, so we can see how people are reacting at any given moment while our commercial is playing. This data is captured via a webcam.
Eye Tracking – In a recent article we discussed how eye tracking can be measured, and this data can tell us exactly which elements of our commercial people are looking at the most. This data is captured either by hardware that affixes to a monitor (similar to a webcam) or by special glasses that the subject wears while viewing the commercial.
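Eye tracking hardware typically streams raw gaze coordinates, and turning those into “fixations” (moments when the eye dwells on one spot) is commonly done with a dispersion-threshold algorithm known as I-DT. Here’s a minimal sketch of that idea in Python; this is a general illustration, not the algorithm iMotions itself uses, and the threshold values are made-up assumptions:

```python
def detect_fixations(gaze, max_dispersion=25, min_samples=5):
    """Dispersion-threshold (I-DT) fixation detection.

    gaze: list of (x, y) pixel coordinates sampled at a fixed rate.
    Returns a list of (start_index, end_index, centroid) tuples.
    """
    fixations = []
    start = 0
    while start + min_samples <= len(gaze):
        end = start + min_samples
        xs = [p[0] for p in gaze[start:end]]
        ys = [p[1] for p in gaze[start:end]]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while end < len(gaze):
                xs.append(gaze[end][0])
                ys.append(gaze[end][1])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop()
                    ys.pop()
                    break
                end += 1
            centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
            fixations.append((start, end, centroid))
            start = end
        else:
            start += 1  # no fixation here; slide the window forward
    return fixations

# Ten samples dwelling near (100, 100), then a saccade out to (400, 300).
samples = [(100 + i % 3, 100 + i % 2) for i in range(10)] + [(400, 300)]
fixations = detect_fixations(samples)
print(fixations)  # one fixation covering samples 0 through 9
```

Each fixation’s centroid can then be mapped onto the frame of the commercial being shown at that moment to see what drew the viewer’s attention.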
Electroencephalography (EEG) – That may be the biggest word we’ve ever used on Nanalyze, and what it actually refers to is the detection of electrical activity in your brain using small, flat metal discs (electrodes) attached to your scalp. The measured electrical activity can indicate engagement/boredom, frustration, excitement, distraction, and even drowsiness. This data is captured via a special type of “hat” that the subject wears.
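One common way researchers turn raw EEG into an engagement score (we’re not claiming this is the exact method iMotions uses) is to compare power in the fast “beta” frequency band against the slower “alpha” and “theta” bands. A minimal sketch, assuming a single-channel signal sampled at 256 Hz:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Total power of `signal` within the [low, high] Hz band via FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum()

def engagement_index(signal, fs=256):
    """Classic beta / (alpha + theta) engagement ratio."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / (alpha + theta)

# Synthetic example: a dominant 20 Hz (beta) component scores higher
# than a dominant 10 Hz (alpha) component.
fs = 256
t = np.arange(fs * 4) / fs  # 4 seconds of samples
engaged = np.sin(2 * np.pi * 20 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)
bored = 0.3 * np.sin(2 * np.pi * 20 * t) + np.sin(2 * np.pi * 10 * t)
print(engagement_index(engaged, fs) > engagement_index(bored, fs))  # True
```

A real pipeline would use many electrodes, artifact rejection, and per-subject baselines, but the band-power ratio above is the core idea behind most “engagement” metrics.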
Galvanic Skin Response (GSR) – Strange as it sounds, you can actually detect emotional arousal and stress by measuring changes in the electrical conductance of the skin. This input is detected via a clip that attaches to the subject’s fingers.
ECG, EMG – We’re not going to delve into what these two acronyms stand for (electrocardiography and electromyography, for the curious), but they refer to additional “biosignals” that can be detected and used to analyze how people react to our commercial.
Now if you take all of the data above, which has been synced in real time, and analyze it, you can learn a wealth of information about how one person reacted to the commercial. We can then take the combined data from all 30 of our subjects and compare, even choosing to slice it by factors such as age, ethnicity, and sex. Suppose we discover that a small subset of people found something depicted for a few seconds in the commercial to be distasteful; with a slight modification, we could make the commercial appeal to a broader audience. That’s just one example of the many use cases for the iMotions Biometric Research Platform.
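The “sync and slice” step above can be sketched in a few lines. This is a hypothetical illustration, not the iMotions data model: each signal stream is a list of (timestamp, value) pairs on its own clock, we align them by taking the most recent sample at or before each shared tick, and then average a metric across a demographic slice.

```python
from bisect import bisect_right
from statistics import mean

def sample_at(stream, t):
    """Most recent value in `stream` (sorted (timestamp, value) pairs)
    at or before time t, or None if none exists yet."""
    times = [ts for ts, _ in stream]
    i = bisect_right(times, t) - 1
    return stream[i][1] if i >= 0 else None

def align(streams, ticks):
    """Align named streams onto a shared clock of tick times."""
    return [{name: sample_at(s, t) for name, s in streams.items()}
            for t in ticks]

# Hypothetical per-subject data: GSR and an EEG engagement score,
# each sampled on its own clock (timestamps in seconds).
subject = {
    "gsr": [(0.0, 2.0), (0.9, 2.3), (2.1, 2.1)],
    "eeg_engagement": [(0.0, 0.4), (1.0, 0.7), (2.0, 0.6)],
}
rows = align(subject, ticks=[0.0, 1.0, 2.0])
print(rows[1])  # {'gsr': 2.3, 'eeg_engagement': 0.7}

# Slicing a per-subject metric by a demographic group (made-up numbers).
subjects = [
    {"age_group": "18-24", "mean_engagement": 0.71},
    {"age_group": "18-24", "mean_engagement": 0.65},
    {"age_group": "25-34", "mean_engagement": 0.52},
]
by_age = {}
for s in subjects:
    by_age.setdefault(s["age_group"], []).append(s["mean_engagement"])
print({g: round(mean(v), 2) for g, v in by_age.items()})
```

Once every signal sits on the same clock, finding the “distasteful few seconds” is just a matter of locating the ticks where a slice of the audience shows a coordinated dip across channels.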
Here at Nanalyze, we complement our tech investments with a portfolio of 30 dividend growth stocks that pay us increasing income every year. Find out which ones in the Quantigence report, freely available to Nanalyze subscribers.