Tracking Jobseekers’ Emotions Is a New AI Technology
Emotion recognition technology is still in its early stages, but some AI developers claim it can change the way the recruitment sector operates.
Facial recognition technology is appearing in more and more areas of daily life: it lets us pay for meals, unlock our smartphones, and even helps law enforcement track us down if we’ve committed a crime. The technology is set to go further still as algorithms begin to recognize not just our identities but also our emotional states. Emotion recognition technology, as it’s being dubbed, is still in its early stages, but some AI developers claim it can change the way the recruitment sector operates.
Eyes on the Prize
These algorithms can reportedly analyze how keen, disinterested, or trustworthy an applicant is, making it easier for employers to separate the wheat from the chaff when searching for the perfect candidate. Major companies such as Unilever are already starting to incorporate the technology into their recruitment processes. One such developer is Human, founded in London in 2016, which uses facial analysis algorithms to scan job applications submitted in video format. The company asserts that it can identify candidates’ emotional responses to the content of an interview, particularly during the question-and-answer portion. Human then sends its analysis back to the recruitment department for each interview question, scoring the data against traits like honesty or passion. If a recruiter has a particular quality in mind for a given position, Human can send them the highest-scoring applicants in that area.
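To make the ranking step concrete, here is a minimal sketch of how per-question trait scores could be aggregated to surface the highest-scoring applicants for a desired quality. Everything here is invented for illustration: the applicant names, the score values, and the simple averaging model are all assumptions, not Human’s actual (proprietary) scoring method.

```python
# Hypothetical sketch: ranking video-interview candidates by averaged
# per-question trait scores. All names and numbers below are invented.
from statistics import mean

# Each candidate gets one score (0.0-1.0) per interview question,
# for each trait the recruiter cares about.
candidates = {
    "applicant_a": {"honesty": [0.8, 0.7, 0.9], "passion": [0.6, 0.5, 0.7]},
    "applicant_b": {"honesty": [0.6, 0.6, 0.5], "passion": [0.9, 0.8, 0.9]},
    "applicant_c": {"honesty": [0.7, 0.9, 0.8], "passion": [0.4, 0.6, 0.5]},
}

def top_candidates(scores, trait, n=2):
    """Return the n applicants with the highest average score for a trait."""
    ranked = sorted(
        scores,
        key=lambda name: mean(scores[name][trait]),
        reverse=True,  # highest average first
    )
    return ranked[:n]

# A recruiter who prizes passion would be sent these applicants first.
print(top_candidates(candidates, "passion"))
```

A real system would of course derive the scores from facial-analysis models rather than hand-entered numbers, but the recruiter-facing output is essentially this kind of ranked shortlist.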
A Work in Progress
While companies are of course still free to screen applicants through conventional means, such methods require hours of dedicated manpower, which may not be particularly efficient, especially for a heavily subscribed role. Emotion recognition technology has the potential to make it much easier to shortlist the most promising candidates, and even to pick up on strong applicants who might previously have been overlooked. Faces can lie, after all, and on a purely personal level, the average recruiter inevitably brings certain biases into an interview, whether based on attraction, ethnicity, or gender. With emotion recognition technology, the only possible bias lies in the coding of the algorithm, so it could, on the whole, provide a fairer and more objective way to assess a candidate’s suitability.
Too Good to Be True?
This sort of technology raises several difficult questions. One immediately apparent problem concerns the legal custody of any recorded data: will applicants retain the rights to their own image, or will they sign them away to prospective employers? Quite apart from the ethical issues surrounding the technology, there is a big question mark over how reliable and accurate the findings of such software could ever really be. People are all different, after all, and an individual’s personal mannerisms, or how expressive their face happens to be, may not be a meaningful indicator of their competence for a particular position.
Safeguards and Oversight
Any data collected by emotion recognition technology would also require specialized training for anyone tasked with interpreting it, so that the most useful and accurate conclusions can be drawn from what the software presents. Another open question is whether candidates who are aware of the technology will be able to trick it, much as they might try to manipulate the impression a recruiter takes away from an interview. Conversely, someone who knows their behavior is being analyzed might quite innocently become overly self-conscious and clam up, giving an unfair picture of their character.
The business potential of this technology doesn’t end with the interview process. It could also be deployed to help current employees hone their techniques for delivering effective presentations, or to improve worker well-being by scanning staff for signs of depression or fatigue. That said, it raises the slightly sinister prospect of the technology being used to arbitrarily penalize workers for a perceived lack of engagement, further eroding privacy and dehumanizing the workplace. In any case, these will be matters for the companies employing the technology and the people seeking to work for them. Whatever worst-case scenario we can imagine for emotion recognition, it won’t be enough to deter its proponents from developing these systems further.
Opinions expressed by DZone contributors are their own.