(Photo: Dylan Ferreira/Unsplash)
The tech industry has been rife with recruiting issues in recent years, from post-interview ghosting to bait-and-switch tactics affecting both sides of a job offer. But now the FBI is warning tech companies to look out for an unexpected challenge: deepfake interviewees.
Bad actors are impersonating other people via deepfakes to weasel their way into remote work positions, according to the agency’s latest public service announcement. The wrongdoer starts by gathering enough of their target’s personal information to convincingly apply to jobs as that person. Then they acquire a few high-quality photos of the person, either through theft or a bit of casual online sleuthing. When interview time rolls around, the bad actor uses the photos (and sometimes voice spoofing) to create and deploy a deepfake that often passes for the target on a video call.
The FBI says job candidate impersonations often involve IT and programming roles, as well as any role that would “include access to customer [personally identifiable information], financial data, corporate IT databases and/or proprietary information.” Such access could be used to steal money from a company directly, as well as to undermine the stock market, release competing products or services, or sell massive amounts of private data. While it’s less likely that wrongdoers would actually want to work in their wrongfully won role long-term, there’s also a chance they simply want to earn US wages from outside the US, or to enjoy the perks of a role they otherwise couldn’t obtain. Some observers even wonder whether the impersonations could be part of a larger operation that threatens national security.
Right now it’s unclear whether job candidate impersonations are ever caught mid-interview. While some deepfakes are impressively realistic, they’re usually one-directional; rarely do they hold up in the conversational back-and-forth that a job interview demands. Ideally, even the untrained eye would notice something “off” about a deepfake interviewee. But there’s also something to be said for the occasional frazzled recruiter who, desperate to fill a role or ten, might not catch an unsettling visual lag, or might chalk it up to a poor internet connection. In this way, technical prowess and a bit of luck could combine to create the “perfect” criminal opportunity.
While the FBI hasn’t offered specific strategies for wary recruiters, it does vaguely warn of uncoordinated audio and visuals. “In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the PSA reads. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”
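The PSA doesn’t prescribe any tooling for spotting that kind of mismatch, but as a rough illustration, here is a minimal sketch of one heuristic a hiring team’s engineers might try on a recorded interview: correlate a mouth-opening signal pulled from face landmarks with the loudness envelope of the audio track. The library choices (OpenCV, MediaPipe, librosa), the landmark indices, the helper names, and the frame rate are illustrative assumptions, not anything the FBI recommends, and a low score is only a hint that warrants a closer look, not proof of a deepfake.

```python
# Sketch of a lip-sync mismatch check, assuming OpenCV, MediaPipe, and librosa.
# Not the FBI's method; purely an illustration of the idea in the PSA.
import cv2
import librosa
import numpy as np
import mediapipe as mp

UPPER_LIP, LOWER_LIP = 13, 14  # MediaPipe FaceMesh inner-lip landmark indices

def mouth_openness(video_path: str) -> np.ndarray:
    """Per-frame vertical lip gap (normalized coordinates); NaN if no face found."""
    mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False)
    cap = cv2.VideoCapture(video_path)
    gaps = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            gaps.append(abs(lm[UPPER_LIP].y - lm[LOWER_LIP].y))
        else:
            gaps.append(np.nan)
    cap.release()
    return np.asarray(gaps)

def sync_score(video_path: str, audio_path: str, fps: float = 30.0) -> float:
    """Correlation between mouth movement and speech energy.
    Values near zero (or negative) suggest the lips and audio may not match."""
    gaps = mouth_openness(video_path)
    audio, sr = librosa.load(audio_path, sr=None)
    # Resample the audio RMS envelope to roughly one value per video frame.
    rms = librosa.feature.rms(y=audio, hop_length=int(sr / fps))[0][: len(gaps)]
    gaps = gaps[: len(rms)]
    mask = ~np.isnan(gaps)
    return float(np.corrcoef(gaps[mask], rms[mask])[0, 1])
```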
Source: www.extremetech.com