6/5/22 - The (possibly dystopian) rise of the automated video interview

Companies are embracing automated video interviews to filter through floods of job applicants. But interviews with a computer screen raise big ethical questions and might scare off candidates.
By Anna Kramer
https://www.protocol.com/workplace/automated-video-interviews-hirevue-modernhire 

Applying for a job these days is starting to feel a lot like online dating. Job-seekers send their resumes into portal after portal, and a silent abyss waits on the other side.

That abyss is silent for a reason, and it has little to do with the still-tight job market or the quality of your particular resume. On the other side of the portal, hiring managers watch hundreds and even thousands of resumes pile up: an infinite mountain of digital profiles, most of them from completely unqualified applicants. Going through them all would be a virtually fruitless task.

Enter the Tinders of corporate America. These are the services that made it so easy for anyone to apply for a job on the internet. But just as in online dating, once the entire world is available for a match, you need some kind of filter to decide who to review first.

Most large companies use software to sort through resumes and cover letters, identifying likely candidates based on keywords, professed qualifications or even just where they went to college. But these services have taken their product a step further. Now, when some companies (ranging from major financial institutions like J.P. Morgan to food-prep and retail chains) invite someone for an interview, they have no intention of showing up for that interview themselves.
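(To make that filtering step concrete, here is a minimal sketch of what keyword-based resume screening can look like. The keywords, weights and cutoff are hypothetical illustrations for this article, not any vendor's actual rules.)

    # Minimal sketch of keyword-based resume screening.
    # Keywords, weights and the cutoff are hypothetical, not any vendor's rules.
    KEYWORDS = {"python": 2.0, "sql": 1.5, "project management": 1.0}
    CUTOFF = 2.5

    def score_resume(text: str) -> float:
        """Sum the weights of every keyword found in the resume text."""
        text = text.lower()
        return sum(w for kw, w in KEYWORDS.items() if kw in text)

    def shortlist(resumes: dict[str, str]) -> list[str]:
        """Return names of candidates at or above the cutoff, best scores first."""
        scores = {name: score_resume(text) for name, text in resumes.items()}
        return sorted((n for n, s in scores.items() if s >= CUTOFF),
                      key=scores.get, reverse=True)

    if __name__ == "__main__":
        print(shortlist({
            "A": "Python and SQL developer with project management experience",
            "B": "Retail associate",
        }))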

Instead, these corporate Tinders give people an automated video interview, guiding the candidate through a conversation with their computer screen. The applicant, instructed to emote as naturally as they would with an actual person, stares at the webcam distortion of their face, tries to explain why they want the job and then once more sends the information back into the abyss, often without being able to review their video first. The software then produces a report, and likely a ranking, that will be used to determine whether they get an interview with an actual person.

Automated resume and cover letter screening just can't keep up in a world where remote work is increasingly common and remote job applications are easier than ever. For hiring departments, automated video interview software makes whittling down the initial hiring pool infinitely easier. As an added bonus, the companies that make this software sell themselves as scientific and less biased than the flawed humans who run actual HR departments. The market is so lucrative that there are nearly endless options offering similar services — among them HireVue, Modern Hire, Spark Hire, myInterview, Humanly.io, Willo and Curious Thing. College graduates applying for entry-level jobs in tech, banking and even consulting are almost always funneled through these systems. In March 2021, HireVue announced that its platform had hosted more than 20 million video interviews since its inception.

But easy, frictionless processes like these always have a catch. Most companies like to talk about hiring like they’re finding the right fit specifically for their workplace. By relying on automated video interviews, they willingly introduce a third party — another company with its own goals, preferences and biases — between themselves and their new hires. Someone or something else is making the initial decision that could make all the difference.

That pesky AI problem
All of these companies use AI buzzwords to sell their services and advertise their tools. Modern Hire calls its service an “AI-Powered Automated Interview Creator”; at HireVue, the words “science-backed” appear frequently in marketing materials, and a HireVue spokesperson told Protocol that its “assessments are designed by psychologists with evidence-based approaches.” Companies deploy machine learning in different ways; HireVue and Modern Hire use AI tools primarily to transcribe the interviews and then to evaluate and rank the interview text.
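(As a rough illustration of that transcribe-then-rank flow, here is a minimal sketch. The stub transcriber and the competency word list are stand-ins invented for this illustration, not either company's actual models.)

    # Minimal sketch of a transcribe-then-rank pipeline: audio in, transcript
    # out, and only the transcript text gets scored. Every component here is
    # a hypothetical stand-in, not HireVue's or Modern Hire's actual models.

    def transcribe(audio: bytes) -> str:
        """Stand-in for a speech-to-text model; a real pipeline would call an
        ASR system here. This stub just decodes bytes for demonstration."""
        return audio.decode("utf-8")

    def score_transcript(transcript: str) -> float:
        """Toy scorer: fraction of words drawn from a hypothetical competency
        vocabulary. A real system would use a trained text model."""
        competencies = {"teamwork", "deadline", "customer", "ownership"}
        words = [w.strip(".,") for w in transcript.lower().split()]
        return sum(w in competencies for w in words) / max(len(words), 1)

    def rank_candidates(interviews: dict[str, bytes]) -> list[tuple[str, float]]:
        """Transcribe each interview and rank candidates by text score alone."""
        ranked = [(name, score_transcript(transcribe(audio)))
                  for name, audio in interviews.items()]
        return sorted(ranked, key=lambda pair: pair[1], reverse=True)

    if __name__ == "__main__":
        print(rank_candidates({
            "A": b"I hit every deadline and put the customer first.",
            "B": b"I like movies.",
        }))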

Although these companies claim to reduce bias in hiring, the researchers and advocates who study AI bias are their most frequent critics. They argue that most machine-learning tools aren't properly audited or regulated and commonly recreate or amplify existing biases, so incorporating AI into the hiring process is a knowing choice to take on that risk.

The FTC has warned companies against using algorithms that could be unfair or create adverse outcomes, according to Sara Geoghegan, a law fellow at the Electronic Privacy Information Center. In 2019, EPIC filed a complaint with the FTC alleging that HireVue was engaging in unfair and deceptive practices that violated AI standards by using facial recognition AI tools in its video-interview analysis.

Then, in 2021, HireVue removed the facial recognition tools from its system. “HireVue research, conducted early this year, concluded that for the significant majority of jobs and industries, visual analysis has far less correlation to job performance than other elements of our algorithmic assessment,” the company wrote about its decision. “We made the decision to not use any visual analysis in our pre-hire algorithms going forward. We recommend and hope that this decision becomes an industry standard.”

Federal and state lawmakers have also started to propose legislation that would restrict how these algorithms are used and require independent audits. New York City recently passed a bill that would require “bias audits” of algorithms used in hiring, and Washington, D.C.'s proposed Stop Discrimination by Algorithms Act of 2021 would set a strict list of requirements for companies that want to use algorithms in employment settings like automated video interviews.

“We only score by the way the words people say that are transcribed, not the way they sound or the way they look. That is a hard line that we draw and have always drawn; my mentality and our mentality as a company is that we should only be scoring information that candidates consciously provide to us,” said Eric Sydell, the executive vice president of Innovation at Modern Hire. “There are organizations that use that information. I think it’s wrong. I only give you express permission to use my responses; that’s the right way that we need to proceed.”

For the systems’ critics, it’s difficult to actually prove why someone has been filtered out of the system. “What’s particularly tricky about this — it’s really hard to find people who have experienced an adverse outcome because of these systems, because you don’t know. If I do a little 90-second or 60-second video of myself, and I say, ‘Hi, I’m a lawyer and I do tech stuff,’ I won’t know if I don’t get a job if it’s because I wasn’t qualified or if it’s because a system made a call in a matter of seconds, and now I’m subject to that system,” Geoghegan said.

The companies that make these systems argue that hiring is already such a flawed and biased process that taking the human interviewer out of the screening stage actually makes it fairer. When people conduct unstructured interviews, they almost always hire the people they like, not necessarily the ones best qualified for the job. One striking example: a University of Texas study found that after its medical school had to accept students it had initially rejected on the basis of interviews, the rejected students performed just as well in school as the ones accepted the first time around.

“The hiring industry and the hiring process itself has long been broken,” Sydell said. “This is a challenge that algorithms and modern science are suited to help solve, and help make scientific sense of it — which pieces about a candidate are predictive about your success on the job.”

“We are humans; the way our brains process information is very biased. We are always looking for people who are similar to ourselves; we weed out other people who might be different,” he said.

Problem whack-a-mole
Companies implement these systems because they have commercial and practical hiring needs they must meet. “It’s very difficult for them to go through this mass of applicants. They are indispensable, they couldn’t cope without them,” said Zahira Jaser, a professor at the University of Sussex Business School. “Though I am quite critical, I also don’t see a way out of it. I think this is going to become a bigger and bigger phenomenon.”

Jaser studies how people experience automated video interviews and how the interviews affect hiring, not the AI itself. Her research has found that most people who undergo these video interviews don't understand how the system works or what they're getting themselves into, and she urges employers to adopt a “glass-box” approach that provides as much transparency as possible about how interviews will be processed and screened. At the very least, candidates need to understand that software, not a person, will be analyzing the text of what they say to a webcam. She also recommends that employers build simple systems showing candidates what successful interviews look like and why, and that they give rejected applicants feedback on why they were turned down and how they can improve.

Without some of these changes, companies could run afoul of laws like the Americans with Disabilities Act. In May, federal regulators released guidance explaining how the use of algorithms can violate the ADA. One of the key recommendations? Applicants need to understand the system and have straightforward ways to request alternative interview methods if they have a disability that could interfere with how the algorithm assesses their interview.

Smaller firms also need to consider whether the video interview might turn away potential candidates who find the system offensive, and to develop easy alternative interview methods. One applicant to a major media firm told Protocol that he immediately withdrew his application when the firm asked him to complete a Modern Hire interview. “It's just the lack of transparency, and the data, and the laziness as well. It wouldn't be that hard to just ask for a 20-minute chat. The person I actually want to talk to is the hiring manager,” he said.

“Why do they feel their time is more valuable? And this was for a mid-relatively high-up position; I can maybe understand it for graduates where you are receiving thousands of applications, maybe it’s a good tool to filter out from literally thousands. But even that is questionable in my opinion.”

Jaser sees that same sentiment from the people she has interviewed in her research.

“The technology doesn’t care about the human. So effectively it’s very exploitative of the human,” she said. “They are extracting what’s of interest to the employer in a very narrow way, forgetting almost all of humanity. It’s a very narrow way of judging. There is no relationship built.”

Anna Kramer
Anna Kramer is a reporter at Protocol (Twitter: @anna_c_kramer), where she writes about labor and workplace issues. Prior to joining the team, she covered tech and small business for the San Francisco Chronicle and privacy for Bloomberg Law. She is a recent graduate of Brown University, where she studied International Relations and Arabic and wrote her senior thesis about surveillance tools and technological development in the Middle East.