An alert from the FBI
Following a similar warning from Europol, a note from the FBI has alarmed recruiters, as well as some candidates. On June 28, the US federal agency reported a rise in complaints about the use of deepfakes in remote job interviews. Technology knows no borders, and there is no reason France should be spared. Since this spoofing technique first appeared, many public figures have fallen victim to it. Now, deepfakes can affect anyone.
The portmanteau refers to a digital forgery technique built on artificial intelligence. It makes it possible to create ultra-realistic images or voices of real people by synthesizing them digitally. The resulting videos can make anyone appear to say anything, and fool almost everyone.
While some deepfakes are merely entertaining, others are designed to harm the people they impersonate. Among celebrity deepfakes, those of Barack Obama, Volodymyr Zelensky and Mark Zuckerberg were all meant to damage their real-life counterparts as they circulated on social networks. According to the publisher Sensity, the number of deepfake videos has doubled every six months since 2019, when it counted 15,000.
Deepfakes in job interviews: how do they work?
For the people behind these deepfakes, video interviews are a godsend. So are recruiters in a hurry to close a hire, who skip identity checks for fear of losing the candidate…
The main victims of these impostors are companies that have fully digitized their hiring process, interviews included. To fool employers, criminals assume a false identity before showing up to videoconference interviews. They combine deepfake technology with personal identifying information stolen from real professionals in the field they are applying for.
The more present you are on the Web (especially through videos, which make their task easier), the greater the risk that these impostors will steal your identity. It is easier for them to build a simulation from a video of you than from static photos. Likewise, it is easier to clone your voice if audio recordings of it already exist. (See the account of Patrick Hillman last August, an example of the damage deepfakes can do in business.)
Unsurprisingly, no company has so far publicly admitted to being the victim of interview impostors. But the community of IT recruiters is on alert.
“I would be able to recognize a fake candidate,” says Hakim*, a consultant who recruits data scientists, profiles that are highly coveted and very scarce at the moment. “The lips are sometimes out of sync with the sound, there are glitches in the image, artifacts around the neck, or the person never blinks, for example.” However, the Europol and FBI alerts are categorical: by the time complaints are filed, the fraudsters have already struck without being detected.
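One of the cues Hakim cites, the absence of blinking, can even be checked programmatically. The sketch below (purely illustrative; the landmark coordinates, the 0.2 threshold, and the function names are assumptions, not any tool mentioned in the article) computes the eye aspect ratio, a standard blink-detection metric, for each video frame and counts blinks. A face that never blinks over a long interview is suspect.

```python
import math

def eye_aspect_ratio(eye):
    """Eye aspect ratio (EAR) from six (x, y) landmarks ordered
    around one eye. A low EAR means the eye is closed; an EAR that
    never dips across a whole interview suggests the face never
    blinks, one telltale sign of some deepfakes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    # Vertical eye openings over the horizontal eye width.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2):
    """Count closed-then-open transitions in a per-frame EAR series."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < threshold and not closed:
            closed = True
        elif ear >= threshold and closed:
            blinks += 1
            closed = False
    return blinks

# Hypothetical per-frame EAR values with two dips below the threshold.
series = [0.30, 0.31, 0.12, 0.10, 0.29, 0.30, 0.15, 0.28]
print(count_blinks(series))  # 2
```

In a real pipeline, the landmarks would come from a face-tracking library frame by frame; the point here is only that the blink heuristic recruiters describe reduces to simple geometry.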
Fortunately, Big Tech companies (the so-called GAFAM) are starting to offer automatic deepfake-detection software to guard against these toxic technologies.
Why go to so much trouble to be recruited?
Impersonating someone in a job interview lets these criminals land jobs and infiltrate companies in order to rob them.
According to the FBI, would-be impersonators most often target “technical positions that the market is struggling to fill, especially in software development and database administration”.
Once recruited, they get their hands on sensitive data, customer databases, industrial secrets, or any other strategic assets likely to harm the duped company.
Examples of deepfakes
The most famous (2018): how to make Barack Obama say anything
* The first name has been changed at the request of the recruiter, who prefers to remain anonymous.