Preventing ‘deepfake’ threat to English test security

Generative AI is being used in attempted attacks on digital English tests, cybersecurity professionals have revealed to university stakeholders.
September 22 2023

Speaking at a Duolingo event in central London, senior security engineers explained that they are ahead of the curve in ensuring test integrity, but that continued vigilance and honest discussion are needed in the battle against fraud.

The proliferation of ‘deepfake’ videos online – so called because deep learning is used to manipulate content so that it looks or sounds like someone else – has been fuelling a new wave of extortion and disinformation scams.

The technology was originally used in the adult entertainment industry to superimpose the faces of celebrities on to actors. The same techniques can be used to make an online English test taker appear to be the genuine applicant, when in fact they are an impersonator being digitally masked to avoid detection.

Face-swap apps have become widely available but the sophisticated software needed to convincingly render a live stream is luckily still beyond the budget and expertise of cheating rings.

Nonetheless, Basim Baig, head of cybersecurity for the Duolingo English Test, explained why the company has appointed a lead engineer in deepfake technology as it proactively tackles security threats.

“Deepfakes are a machine learning driven attack, but we have access to the same, if not better, technology as the bad guys,” explained Baig.

“We are building out our [tech] muscles to understand how deflective technology works. We then implement the latest techniques being developed by the broader cybersecurity community on deepfake detection for things like preventing propaganda, election fraud or extortion.

“Real time deepfakes are still different from offline deepfakes. If you think of Hollywood, an artist might spend days getting the rendering just right. Real time deepfakes are a much, much harder thing. You need much more expensive equipment.”

“The technical barrier to entry is high”

High-end graphics processing units from companies like NVIDIA are entering the open market and are powerful tools for people with the technical skills to use them.

“It’s something that’s on the rise,” said Baig, “but it’s minuscule right now.”

“I think the technical barrier to entry is high. That’s the main reason people are not doing it right. To achieve [live deepfakes] you need a single picture rendered in real time, while at the same time it doesn’t look like your CPU is on fire. [For these reasons] not being detected by our systems is an impossible challenge.”

The use of both AI and human proctors by testing companies is helping keep tests secure and spot the tell-tale signs of external interference.

Kimberely Snyder, a senior operations manager at Duolingo, explained some of the clues to spotting deepfakes: “We look for behavioural signals and discrepancies in the video like eye movement or people’s teeth. Teeth are a huge giveaway.”

Other digital metrics are also monitored, the details of which must remain undisclosed to prevent fraudsters from attempting to bypass them.

Baig explained that a major benefit of the Duolingo test over competitors is that it is conducted asynchronously: the test is recorded on a secure desktop application and assessed by several people later, which means video frames can be slowed down and analysed.

“It’s because it was built from scratch to be a digital, remote-first test rather than simply digitising analogue processes,” continued Baig. “By combining machine learning with a highly trained proctoring team we can deliver greater levels of security than ever before.

“Our test is administered from end to end using in-house technology with no third-party involvement, and anonymous, randomly assigned proctors, meaning the chain is secure.”
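That frame-by-frame review lends itself to simple aggregation logic. The sketch below is purely illustrative and not Duolingo’s actual system: the signal names, scoring and thresholds are hypothetical, and it only shows how per-frame cues such as eye movement or mouth rendering might be scored offline on a recorded session and then used to flag it for human proctor review.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class FrameSignal:
    """Hypothetical per-frame cues a model or reviewer might score (all 0-1)."""
    blink_plausibility: float   # how natural the eye movement looks
    mouth_consistency: float    # how consistently teeth/mouth are rendered
    boundary_artifacts: float   # 1.0 = strong blending artifacts at the face edge


def frame_score(f: FrameSignal) -> float:
    """Combine cues into one suspicion score: 0 = looks clean, 1 = looks suspect."""
    return mean([1 - f.blink_plausibility, 1 - f.mouth_consistency, f.boundary_artifacts])


def flag_session(frames: list[FrameSignal], threshold: float = 0.6, min_hits: int = 5) -> bool:
    """Flag a recorded session for human review once enough frames look suspect.

    Because the test is recorded rather than judged live, every frame can be
    revisited and scored offline before the session-level decision is made.
    """
    hits = sum(1 for f in frames if frame_score(f) >= threshold)
    return hits >= min_hits


# Example: a mostly clean recording with a short run of suspect frames.
session = [FrameSignal(0.9, 0.95, 0.05)] * 200 + [FrameSignal(0.3, 0.2, 0.8)] * 8
print(flag_session(session))  # True -> route the recording to a human proctor
```

In practice a real pipeline would use learned detectors rather than hand-picked thresholds, but the asynchronous design is what makes this kind of whole-session, after-the-fact analysis possible.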

Digital testing companies do not share the same systems for proctoring and security, with many having been developed in swift response to the pandemic and the switch to online testing and learning.

Earlier this year Pearson reported widespread violations involving students who had taken the PTE Academic Online Test.

The company has not openly explained why the tests were compromised, which has meant legitimate students have had their results rescinded or rejected by universities that cannot be sure of the test’s integrity.

Baig explained that the education industry is attracting many more coordinated extortion scams because the legality of test fraud varies from country to country.

“When you’re helping a student cheat on an exam, you’re not technically breaking the law [depending on the country], you’re just breaking the terms of service of a specific multinational company.

“Whereas if you’re doing wire fraud, it is an international crime. The reason people attempt to defraud any type of test is because there’s money to be made for them.”

The Duolingo English Test launched in 2016, but there has since been a proliferation of online tests from a wide range of providers including IELTS, TOEFL, Pearson, Kaplan, Language Cert, Password and Oxford International Education Group as the digitisation of the international higher education sector increases.

Digital tests have widened access for students who do not have local test centres and often have to travel to other countries at great expense. This will be an important factor for universities committed to attracting a diverse student base long term.

The UK government is due to review its own policy on secure English language tests accepted for visa purposes in the near future.
