Universities Warn of ‘Future of Fraud’ as Deepfake Applicants Caught

Image Courtesy: Tero Vesalainen/iStock

UK universities have flagged multiple incidents of deepfaking in their admissions interviews for the January 2025 intake. This comes as several higher education institutions have adopted software platforms such as Enroly, which save admissions teams time by automating interviews with international students.

Enroly reported no deepfake cases during the previous September intake. The January 2025 intake, however, revealed a "small but growing" trend. As Enroly's Head of Services, Phoebe O'Donnell, put it: "Some of the things we've uncovered in student interviews are straight out of a sci-fi film – lip-syncing, impersonation, and even deepfake technology."

Of the 20,000 interviews completed for the January 2025 intake, 0.1% involved "third-party support", 0.6% involved "lip-syncing", and 0.15% involved "deepfake attempts" – roughly 20, 120, and 30 interviews respectively. O'Donnell made light of the concerning figures: "Welcome to the future of fraud, folks."

But how does a deepfake work? Deepfakes rely on artificial neural networks – computer systems that learn to recognize patterns in data – to mimic a person's face and movements from existing videos or photographs. One way to spot a deepfake is through unnatural movements, such as irregular blinking or missing facial tics, but as the technology rapidly evolves, detection is becoming increasingly difficult.
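For readers curious how the blink heuristic mentioned above might be automated, here is a minimal illustrative sketch – not Enroly's system – that counts blinks in an interview video using the eye aspect ratio. It assumes the OpenCV and dlib Python libraries and dlib's publicly available 68-point facial landmark model; the file name and threshold value are assumptions chosen for illustration.

```python
# Illustrative blink-detection sketch using the eye aspect ratio (EAR).
# An unnaturally low blink count in an interview video can be one (weak)
# signal that the footage is synthetic.
from math import dist

import cv2
import dlib

EAR_THRESHOLD = 0.21  # below this value the eye is treated as closed (assumed)
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(points):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|) for the six eye landmarks."""
    p1, p2, p3, p4, p5, p6 = points
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(video_path):
    """Count blinks by watching the EAR dip below, then rise above, the threshold."""
    capture = cv2.VideoCapture(video_path)
    blinks, eye_closed = 0, False
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for face in detector(gray):
            landmarks = predictor(gray, face)
            # dlib landmark indices 36-41 and 42-47 outline the left and right eyes
            left = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(36, 42)]
            right = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(42, 48)]
            ear = (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0
            if ear < EAR_THRESHOLD:
                eye_closed = True
            elif eye_closed:  # eye reopened after being closed: count one blink
                blinks += 1
                eye_closed = False
    capture.release()
    return blinks
```

On its own, an unusually low blink count is only a weak indicator; production detection tools combine many such cues, which is partly why deepfakes remain hard to catch as the technology improves.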

A UK Home Office spokesperson said that anyone who attempts to "cheat or use deception" may face "a ban from applying for UK visas for 10 years", and that the department intends to take "tough action" against companies and agents who attempt to "exploit and defraud international students".

O'Donnell warns: "Fraudulent practices will keep evolving, but with the right combination of technology and expertise, universities can stay one step ahead." The threat is by no means limited to university interviews: deepfakes can be used to impersonate celebrities and politicians, influence people's decisions, and spread misinformation.