Automated video interviews (AVIs) that use artificial intelligence (AI) to assess job applicants’ interview responses are becoming increasingly popular screening tools. They are generally used alongside resume submission or as the next stage in the application process.

“Using AI algorithms to evaluate interviewees helps human resources personnel winnow down the applicant pool prior to having any direct communication with the applicants,” said Louis Hickman, assistant professor in the Department of Psychology. “This reduces the actual time it takes to hire a new employee and saves organizations money.”

Another plus, he said, is that AVIs give more applicants the chance to express their skills, qualifications, and passion for the position than the traditional process, which relies on resumes.

“AVIs potentially represent an objective, consistent method for evaluating candidates, providing a promising tool for organizations aiming for a fairer hiring process,” said Hickman, who led a study that examined the psychometric properties — reliability, validity, and generalizability — of automated video interview personality assessments. The research, published in the Journal of Applied Psychology, was recently awarded the Jeanneret Award for Excellence in the Study of Individual or Group Assessment by the Society for Industrial and Organizational Psychology.

The award is given to the authors of a work published in 2022 judged to have the highest potential to further the understanding of individual or group assessment, especially when such assessment supports the creation of a diverse workforce.

Hickman’s study provides validation data to inform research and practice in the emerging area of automated video interviews and illustrates the potential value of AVIs as an alternative to self-reported personality assessments, which are susceptible to socially desirable responding and faking in high-stakes situations such as personnel selection.

In conducting their research, Hickman and co-authors — Nigel Bosch of the University of Illinois Urbana-Champaign, Vincent Ng of the University of Houston, Rachel Saef of Northern Illinois University, and Louis Tay and Sang Eun Woo of Purdue University — collected four samples of mock video interviews as well as self-reports and interviewer observations of the interviewees’ “Big Five” personality traits: conscientiousness, openness, agreeableness, emotional stability, and extraversion.

Hickman said his team focused on personality constructs because personality predicts job performance across jobs and is commonly assessed by current vendors of automated video interviews. The team proposed a model for understanding automated video interviews and assessing the construct validity of AVI scores that can also be used to evaluate AVIs for measuring knowledge, skills, abilities, and other characteristics beyond personality.

“While we cannot directly see someone's personality traits, these traits manifest in subtle behaviors,” Hickman said.

The algorithms modeled for the study analyzed three types of cues, illustrated in the sketch after this list:

  • Verbal cues, or what the interviewees say, such as positive and negative language
  • Paraverbal cues, or how they say it, such as loudness or speech rate
  • Nonverbal cues, or how they act during the interview, which includes facial expressions such as smiles and frowns
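
To make those three cue types concrete, here is a minimal, hypothetical sketch of the kinds of verbal and paraverbal features such a model might compute from a transcribed response. The word lists, inputs, and function names below are illustrative assumptions rather than the study’s actual pipeline, and nonverbal cues such as smiles would require a separate video-analysis step that is not shown.

```python
# Minimal, hypothetical sketch of verbal and paraverbal feature extraction.
# Not the study's pipeline: the lexicons, inputs, and names are illustrative assumptions.

ASSENT_WORDS = {"ok", "okay", "yes", "yeah"}            # illustrative assent lexicon
POSITIVE_WORDS = {"help", "helping", "enjoy", "great"}  # illustrative positive-language lexicon


def verbal_features(transcript: str) -> dict:
    """Verbal cues: what the interviewee says."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    n = max(len(words), 1)
    return {
        "mean_word_length": sum(len(w) for w in words) / n,
        "assent_rate": sum(w in ASSENT_WORDS for w in words) / n,
        "positive_rate": sum(w in POSITIVE_WORDS for w in words) / n,
    }


def paraverbal_features(transcript: str, audio_rms: float, duration_sec: float) -> dict:
    """Paraverbal cues: how it is said (loudness proxy, speech rate)."""
    n_words = len(transcript.split())
    return {
        "loudness_rms": audio_rms,  # assumed to be precomputed from the audio track
        "speech_rate_wpm": 60.0 * n_words / max(duration_sec, 1e-6),
    }


if __name__ == "__main__":
    answer = "Yes, I really enjoy projects where I can help my whole team."
    features = {
        **verbal_features(answer),
        **paraverbal_features(answer, audio_rms=0.12, duration_sec=6.5),
    }
    print(features)
```

Features like these would then serve as inputs to models trained to reproduce interviewer ratings of the Big Five traits.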

Here are some of the study’s key findings:

  • The models trained on interviewer ratings relied more heavily on verbal behavior than on paraverbal and nonverbal behavior. Interviewees were judged as more extraverted for speaking louder, speaking faster, and smiling more. AVIs also judged interviewees as more conscientious for using longer words and fewer assent words such as "OK" or "yes" as well as more agreeable for talking about helping people.
  • AVI conscientiousness ratings were positively correlated with high school GPA, SAT, and ACT scores.
  • Overall, personality assessments had better validity when trained on interviewer observations rather than self-report assessments, especially for more visible traits such as extraversion and conscientiousness.

“From our study, we determined that part of the reason automatically scored interviews can be valid across multiple interview questions is that interviewers tend to evaluate interviewee personality using the same behaviors in different interviews,” Hickman said. “Thus, automating the process can standardize interviews so that all applicants are evaluated equally, regardless of whether the interviewer is tired, in a bad mood, etc., and regardless of the applicant’s irrelevant characteristics like whether they are wearing glasses, their gender, race, or attractiveness.”

However, Hickman cautions that the model they proposed may not be appropriate for every role.

“HR managers need to ensure that personality traits are relevant criteria for any given job based on job analysis,” he said. “And future research is needed to investigate whether automated video interview personality assessments result in any adverse impact on underrepresented demographic groups to ensure the legality and ethicality of using them.

“As organizations become more reliant on AVIs, they must ensure the tools used are not only reliable and valid but also fair, unbiased, and transparent,” he said.

Hickman has continued this line of research by investigating methods for improving the fairness of AVIs and by applying similar approaches to automatically score workplace roleplay simulations and to measure personal qualities from college admissions essays.
