Job Interview Scam: FBI Warns ‘Deepfaking It Until You Make It’ Could Let Criminals Access Sensitive Data
The FBI says a growing number of scammers are using deepfake technology to apply for remote tech jobs. In a June 28 announcement via its Internet Crime Complaint Center (IC3), the FBI said more companies have been reporting applicants who use altered video, images or recordings to impersonate someone else during the hiring process.
The complaints were typically linked to remote or work-from-home positions in information technology, computer programming, database and other software-related fields. The FBI added that some of these positions involve access to individuals’ personally identifiable information (PII), financial data, corporate IT databases and/or proprietary information.
“The use of the technology to harass or harm private individuals who do not command public attention and cannot command resources necessary to refute falsehoods should be concerning,” the Department of Homeland Security warned in a 2019 report about deepfake technology, as reported by Business Insider.
The agency said some IC3 complaints report the use of voice spoofing during online interviews. In these interviews, the candidate’s lip movements don’t match the audio of the person speaking, and in some cases, visible actions such as coughing or sneezing are not aligned with what is heard.
The trend has accelerated since the pandemic. Business Insider noted that in a November 2020 LinkedIn post, recruiter Nick Shah wrote that some candidates hire outside help to assist them during interviews. In May, recruiters revealed that North Korean fraudsters were posing as Americans while interviewing for jobs at crypto and Web3 startups.
The FBI did not state how many incidents had been reported.