Do We Need the World's First Emotional Processing Unit? [Audio]


Emoshape has created what they call the first emotional processing unit (EPU) dedicated to understanding and putting a value on human emotions.


When software became more media-heavy, computers added graphical processing units (GPUs) and audio processing units (APUs and sound cards) to take the pressure off the central processing unit. Now, we demand software that shapes itself around us, changing based on our preferences, tendencies, and demands. As software morphs into more preemptive forms, such as smart assistants and bots providing us with what we need before we even know we need it, will we need computing resources dedicated to assessing human emotions?

Patrick Levy-Rosenthal of Emoshape thinks so. His company has created what it calls the first "emotional processing unit" (EPU), dedicated to understanding and putting a value on human emotions. Emoshape took an interesting approach, giving the artificial intelligence a pain-and-reward mechanism to discourage it from taking actions that upset humans. If the chip senses that it has caused a negative emotion in a human, it feels a degree of (emotional) pain; if it senses that it has caused a positive emotion, it feels a degree of pleasure.
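Emoshape's actual EPU interface isn't public, but the pain/reward loop described above can be sketched in a few lines. Everything here — the function names, the valence scale, the learning rate — is a hypothetical illustration, not the real API:

```python
# Hypothetical sketch of the EPU's pain/pleasure feedback described above.
# The valence scale, names, and update rule are all assumptions for illustration.

def emotion_feedback(detected_valence: float) -> float:
    """Map a detected human emotion (-1.0 = very negative .. +1.0 = very positive)
    to an internal pain/pleasure signal for the agent.

    Negative valence becomes pain (a negative signal), discouraging the
    action that caused it; positive valence becomes pleasure.
    """
    if not -1.0 <= detected_valence <= 1.0:
        raise ValueError("valence must be in [-1.0, 1.0]")
    return detected_valence  # identity mapping: pain mirrors the human's state


class Agent:
    """Toy agent that learns which actions earn pleasure rather than pain."""

    def __init__(self):
        self.action_values = {}  # running value estimate per action name

    def update(self, action: str, detected_valence: float, lr: float = 0.1):
        """Nudge the action's value estimate toward the pain/pleasure signal."""
        signal = emotion_feedback(detected_valence)
        current = self.action_values.get(action, 0.0)
        self.action_values[action] = current + lr * (signal - current)
```

Note that the developer-controlled interpretation the article goes on to mention lives in `emotion_feedback`: negating its return value would invert the whole value system.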

It’s fascinating that we can reduce so many things to levels of value, even at the cost of subtlety, and when you think in these terms, you start to realize that algorithm development can happen from just about any input. Games, eBook readers, and robots that react to human emotions are a fascinating potential first step toward creating a "good" AI. But AI development gets especially interesting with Emoshape because, even though the chip has these built-in inputs and outputs, the interpretation is up to the developer. As far as I can tell, nothing apart from market forces stops a developer from implementing an inverted set of emotional values and setting loose a psychopathic robot.

Emoshape also decided to create a chip rather than a cloud service because, theoretically, it’s more secure. To guard against vulnerabilities in the chip (and we all know by now that those happen), the chips "phone home" at regular intervals; if a chip doesn’t check in, or the incoming signal doesn’t look right, it shuts down. What happens if the vulnerability sits in that very part of the chip is unclear to me, as is what happens in the meantime, but at least Emoshape is thinking about the issue.
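The phone-home behavior amounts to a heartbeat plus a response check. Emoshape's real protocol isn't public, so the sketch below is purely illustrative: it assumes a pre-shared key, HMAC-signed responses, and a made-up heartbeat interval:

```python
# Hypothetical sketch of the "phone home" kill-switch described above:
# if the chip misses a check-in, or the server's response fails verification
# ("doesn't look right"), it shuts itself down. All names and parameters
# are assumptions; Emoshape's actual protocol is not public.
import hashlib
import hmac

SHARED_KEY = b"demo-key"       # assumed pre-provisioned device secret
HEARTBEAT_INTERVAL = 60.0      # assumed seconds between check-ins


def sign(message: bytes) -> str:
    """Server-side signature over a challenge, using the shared key."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()


class Watchdog:
    """On-chip watchdog: stays alive only while valid check-ins keep arriving."""

    def __init__(self, now: float):
        self.last_ok = now
        self.running = True

    def check_in(self, now: float, challenge: bytes, server_signature: str):
        """Verify the server's signed response; shut down on a bad signature."""
        if hmac.compare_digest(sign(challenge), server_signature):
            self.last_ok = now
        else:
            self.running = False  # incoming signal "doesn't look right"

    def tick(self, now: float):
        """Shut down if too long has passed without a valid check-in."""
        if now - self.last_ok > 2 * HEARTBEAT_INTERVAL:
            self.running = False
```

The open question the article raises maps directly onto this sketch: if the bug is in `Watchdog` itself, the kill-switch guards everything except its own failure.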

It’s a fascinating series of products, and while I am torn emotionally (reduce that to a series of values, Emoshape!) over whether this is useful, pointless, scary, or amazing, it’s certainly one to keep an eye on.

For more, listen to my interview with Patrick Levy-Rosenthal below.


Opinions expressed by DZone contributors are their own.
