Adding Insult to Injury: AI and the Graduate Job Market

Image Credit: Ciphr.com via Wikimedia Commons

AI recruitment is adding automated insult to a job market that has turned brutal. In 2025, employers reported receiving an average of 140 applications for every graduate vacancy, the highest level since records began in 1991 and a dramatic rise from 38 applications per role two decades ago. Yet only 4.3% of applicants make it to an interview, and many of those interviews are now conducted by AI.

“No one performs the same way talking to a robot as they do talking to a person,” said Elizabeth Casi, 23, a recent master’s graduate of King's College London, who has already completed several AI-led interviews while job hunting.

AI now screens most applications, often before a human even sees them. Industry data suggests that 75% of CVs are filtered out by Applicant Tracking Systems. For candidates who survive that first cut, the next hurdle is a one-way video interview.

During this stage, applicants are shown a set of pre-determined questions and given a limited time, usually 30 seconds to two minutes, to prepare each answer. They then record their responses on camera. AI analyses these videos using multiple layers of assessment: it checks for key phrases and relevant topics, evaluates tone of voice, measures speaking pace and even examines facial expressions. The system then generates a score for each candidate, ranking them against the requirements of the role. Only those who meet or exceed the AI’s threshold are passed along to a human recruiter, if they are lucky.
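To make that process concrete, the sketch below shows the kind of scoring logic such a system might use. It is purely illustrative: vendor systems are proprietary, and every feature name, weight and threshold here is an invented assumption rather than any real product's method.

```python
# Hypothetical sketch of an AI interview scoring step.
# All feature names, weights and the threshold are illustrative assumptions,
# not the behaviour of any specific recruitment product.
from dataclasses import dataclass


@dataclass
class InterviewFeatures:
    keyword_hits: int        # role-relevant phrases detected in the answer
    keywords_expected: int   # phrases the role profile looks for
    tone_score: float        # 0-1, e.g. output of a speech-tone model
    words_per_minute: float  # speaking pace
    expression_score: float  # 0-1, e.g. output of facial-expression analysis


def score_candidate(f: InterviewFeatures) -> float:
    """Combine features into a single 0-1 score (weights are made up)."""
    keyword_ratio = f.keyword_hits / max(f.keywords_expected, 1)
    # Penalise pace that drifts far from an assumed "typical" 130 wpm.
    pace_score = max(0.0, 1.0 - abs(f.words_per_minute - 130) / 130)
    return (0.4 * keyword_ratio
            + 0.25 * f.tone_score
            + 0.2 * pace_score
            + 0.15 * f.expression_score)


THRESHOLD = 0.65  # only candidates at or above this would reach a human

candidate = InterviewFeatures(keyword_hits=6, keywords_expected=8,
                              tone_score=0.7, words_per_minute=150,
                              expression_score=0.6)
score = score_candidate(candidate)
print(f"score={score:.2f}, passed_to_human={score >= THRESHOLD}")
```

Even in this toy version, the trade-off candidates describe is visible: whoever sets the weights and the threshold decides who a human recruiter ever sees.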

While AI speeds up recruitment, it often leaves candidates feeling like just another number. Elizabeth said the AI interview process gave her the impression that companies “didn’t really care about the candidates”, leaving her with a negative view of them. Her experience reflects a wider concern among job-seekers: the gradual dehumanisation of recruitment is fuelling disillusionment in an already difficult job market.

Recruiters themselves are not blind to these concerns. One recruiter at a large finance company that uses AI for interviews and other screening had mixed feelings about the technology. “It has made parts of recruitment easier, especially when you’re receiving hundreds of applications a day. We’re very aware of concerns around dehumanisation, which is why we still try to screen candidates ourselves wherever possible. But even so, I’m not sure I’d personally want to be interviewed by AI,” the recruiter said.

Senior recruiters across industries echo this uncertainty, arguing that human judgment remains essential, particularly for roles where communication and creativity are key. 

A senior recruiter for a media company said automated job interviews did not serve their purposes. “A lot of what we’re looking for can’t really be measured by a system,” they explained. “It’s about context, communication, and how someone thinks, which is why we rely on human judgment. You can have someone who’s the perfect fit for the role but doesn’t sell themselves well on paper. AI might miss them entirely because it’s focused on keywords, not context.”

The debate over AI in recruitment has also caught the attention of policymakers. The UK government recently released guidance on responsible AI in recruitment, stressing that AI should support, not replace, human judgment. Whether that balance can be maintained in an already punishing job market, however, remains unclear.