
Is Contactless Emotion Recognition the Next Privacy Invasion?



In September, the media boomed with news about MIT researchers who built a machine that reads your emotions without contact. It uses radio waves to recognize your feelings and detect one of four emotional states: sadness, anger, pleasure, and joy. It measures your breathing patterns and heart rate without touching your body, simply by bouncing waves off subjects. Prediction accuracy was 87% for a single person and 72% across different people.
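To make the idea concrete, here is a minimal, purely illustrative sketch of the last step such a system might take: once heart rate and breathing rate have been extracted from the reflected radio signal, a sample can be assigned to one of the four emotional states. The feature values, centroids, and nearest-centroid approach below are assumptions for illustration, not the actual MIT method.

```python
import math

# The four emotional states detected by the MIT machine.
EMOTIONS = ["sadness", "anger", "pleasure", "joy"]

# Toy centroids: (mean heart rate in bpm, breathing rate in breaths/min).
# These numbers are invented for illustration only.
CENTROIDS = {
    "sadness":  (62.0, 11.0),
    "anger":    (95.0, 20.0),
    "pleasure": (72.0, 14.0),
    "joy":      (85.0, 17.0),
}

def classify(heart_rate: float, breathing_rate: float) -> str:
    """Assign a (heart rate, breathing rate) sample to the nearest centroid."""
    def distance_to(emotion: str) -> float:
        hr, br = CENTROIDS[emotion]
        return math.hypot(heart_rate - hr, breathing_rate - br)
    return min(EMOTIONS, key=distance_to)

# A slow heart rate and shallow breathing land closest to "sadness".
print(classify(60.5, 11.5))
```

In the real system, the hard part is upstream: isolating individual heartbeats from the reflected signal, which is why a poker face does not fool it.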

Will this machine revolutionize the Artificial Intelligence industry? And is contactless emotion recognition, which does not require making people aware that their feelings are being measured, the beginning of a huge privacy invasion?

What Are the Tools Used?

Over the past few years, giants such as Coca-Cola and Unilever have already used contactless emotion analytics tools to adjust their marketing campaigns. For example, Emotient is a startup using cloud-based technology to detect emotions by analyzing facial expressions. Affectiva also uses facial expressions, both to prevent shoplifting and to figure out what a shopper thinks of merchandise. FacioMetrics provides SDKs for incorporating face tracking, pose and gaze tracking, and expression analysis into apps. Vokaturi and EmoVoice already recognize emotional states from the acoustic properties of speech.

“MIT researchers did not revolutionize the Artificial Intelligence industry with their machine in the sense of emotion recognition; dozens of tools already exist. However, they offered a brand new approach: applying radio waves to make the recognition contactless. You may hide your emotions behind a poker face, but controlling your breathing and heart rate is much harder,” says Dr. Anton Popov, Senior Researcher at Ciklum and Associate Professor at National Technical University of Ukraine 'Kyiv Polytechnic Institute'.

Who Needs Our Emotions?

Basically, the answer is any field where user response is essential. In entertainment, for example, it could reveal whether users perceive a new interface with the feelings we intended. For medical purposes, an app could detect that a user feels depressed or frustrated and recommend seeking psychological help. Another great example is the retail industry: a supermarket could deploy contactless emotion recognition tools at the entrance, much like today's security frames, detect that we feel down, and send a push notification offering discounted ice cream. Technology that reveals your feelings has also been suggested to spot struggling students in a classroom or to help autistic people interact better with others.

Do you already feel like the industry truly cares about your user experience? Or does it rather feel like a privacy invasion when you don't want anyone to read your feelings?

Privacy Invasion or Sincere Care?

What some may consider sincere care for the user, others may see as a violation of privacy. Indeed, what if I don't want anyone to read my emotions? How can I refuse?

“Emotion recognition privacy may be regulated by license agreements in apps, where the user gives his or her prior consent to information processing. However, that won't work for offline brands,” adds Oleh Bodilovsky, a scientist at National Technical University of Ukraine 'Kyiv Polytechnic Institute' who is writing his Ph.D. thesis on contactless measurement of vital parameters with a camera. At the moment, it is difficult to say how companies might obtain consent for contactless emotion recognition of large crowds on the streets, or of hundreds of people entering a shopping mall.

So far, emotions are not listed as confidential information in EU or US data protection regulations. However, we may expect stricter opt-in rules as the EU General Data Protection Regulation becomes law in May 2018. Should emotions be treated as biometric information, or rather as health data? This could still become a substantial privacy issue if it is not regulated by law.

artificial intelligence ,machine learning ,security ,algorithms ,big data

Opinions expressed by DZone contributors are their own.
