Company uses facial analysis technology to analyse candidates in job interviews

Facial analysis technology is being used in job interviews by British companies to help identify the best candidates.

Consumer goods company Unilever is among the first to adopt the technology, which analyses candidates as they answer a set of identical interview questions, filmed on their own mobile phone or laptop.

The algorithm is trained to assess candidates' language, tone and facial expressions in the videos and select the strongest applicants.

Facial expressions assessed by the algorithms include brow furrowing, brow raising, eye widening or closing, lip tightening, chin raising and smiling, which are important in sales or other public-facing jobs.

It is trained by deep learning on about 25,000 pieces of facial and linguistic information compiled from previous interviews of people who went on to perform well in the job.
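The article describes training on past interview data labelled by later job performance. As a loose illustration of that idea (not Hirevue's actual method, which is proprietary deep learning), the sketch below uses a nearest-centroid classifier over invented numeric features:

```python
# Hypothetical sketch: the article says the model learns from ~25,000
# facial and linguistic data points from past interviews, labelled by
# whether the interviewee went on to perform well. A minimal stand-in
# is a nearest-centroid classifier. All feature names and numbers here
# are invented for illustration.

def centroid(vectors):
    """Element-wise mean of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: list of (features, performed_well) pairs."""
    good = [f for f, label in examples if label]
    poor = [f for f, label in examples if not label]
    return centroid(good), centroid(poor)

def predict(model, features):
    """True if the candidate's features sit closer to the 'good' centroid."""
    good_c, poor_c = model
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return dist(features, good_c) < dist(features, poor_c)

# Toy features: [smile_rate, speech_rate_wpm / 100, active_word_ratio]
training = [
    ([0.8, 1.5, 0.7], True),
    ([0.7, 1.4, 0.6], True),
    ([0.2, 0.6, 0.2], False),
    ([0.3, 0.7, 0.3], False),
]
model = train(training)
print(predict(model, [0.75, 1.45, 0.65]))
```

A real system would replace the hand-picked features with learned representations and the centroid rule with a deep network, but the labelling scheme — past hires marked good or poor — is the same.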

US company Hirevue, which developed the interview technology, claims it enables hiring firms to interview more candidates in the initial stage rather than relying on CVs.

The company claims that this provides a more reliable indicator of future performance free of human bias.

Academics have questioned that claim, however. “It is going to favour people who are good at doing interviews on video and any data set will have biases in it which will rule out people who actually would have been great at the job,” Anna Cox, professor of human-computer interaction at UCL, told the Telegraph.

Hirevue, which is backed by multi-billion pound private equity firm Carlyle Group, says its technology has already been used for 100,000 interviews in the UK.

Worldwide it claims to deliver one million interviews and more than 150,000 pre-hire assessments every 90 days.

Loren Larsen, Hirevue’s chief technology officer, told The Daily Telegraph that 80 to 90 per cent of the predictive assessment was based on the algorithms’ analysis of candidates’ use of language and verbal skills.

“There are 350-ish features that we look at in language: do you use passive or active words? Do you talk about ‘I’ or ‘We.’ What is the word choice or sentence length? In doctors, you might expect a good one to use more technical language,” he said.
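Two of the language features Larsen mentions — "I" versus "we" and sentence length — are simple enough to sketch from a transcript. The definitions below are assumptions for illustration, not Hirevue's actual feature set:

```python
import re

# Hypothetical sketch of two of the "350-ish" language features the
# article mentions: the balance of first-person singular vs plural
# pronouns, and average sentence length. These definitions are
# assumptions, not Hirevue's implementation.

def language_features(transcript):
    words = re.findall(r"[A-Za-z']+", transcript.lower())
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    i_count = words.count("i")
    we_count = words.count("we")
    return {
        # 1.0 = all "I", 0.0 = all "we"; 0 if neither pronoun appears
        "i_vs_we": i_count / max(i_count + we_count, 1),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "word_count": len(words),
    }

feats = language_features("We shipped the project early. I wrote the tests.")
print(feats)  # {'i_vs_we': 0.5, 'avg_sentence_length': 4.5, 'word_count': 9}
```

Word choice, passive voice and domain vocabulary (such as a doctor's technical language) would each be further counters of the same kind, layered into a feature vector of several hundred entries.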

“Then we look at the tone of voice. If someone speaks really slowly, you are probably not going to stay on the phone to buy something from them. If someone speaks at 400 words a minute, people are not going to understand them. Empathy is a piece of that.”

Update: This article was updated on 30 September