Research indicates that scammers are leveraging artificial intelligence to change their appearance and create fake profiles to apply for remote jobs.
At almost every stage of the job application process, these scammers can use AI to disguise their real identities. They can fabricate resumes, create professional headshots, establish websites, and even build LinkedIn profiles. Combined, these tools let them present themselves as ideal candidates for open positions.
Once they gain access, these con artists can retrieve company secrets or introduce malware.
While identity theft has existed for a long time, AI is allowing scammers to scale their tactics, leading to an increasing problem. Research firm Gartner predicts that by 2028, around one in four job applicants could be fraudulent.
Identifying Fakes
Dawid Moczadlo, co-founder of the cybersecurity firm Vidoc Security, shared a video on LinkedIn of his interview with a candidate who appeared to be using an AI-generated likeness; the clip quickly went viral. He said he was stunned when he realized what was happening.
“I felt somewhat violated, given our expertise in security,” Moczadlo stated.
When he suspected the candidate was using an AI filter, he posed a simple question: “Can you put your hand in front of your face?”
When the candidate refused, Moczadlo quickly ended the interview. He said the AI software the scammer was using didn’t appear very sophisticated, so blocking the face with a hand would likely have disrupted the deepfake filter.
“Sometimes, it takes a hacker to catch a hacker,” Moczadlo remarked.
This was the second time the company had interviewed someone who turned out to be using an AI-generated appearance, and Moczadlo said the experience changed how Vidoc hires. The company now flies prospective hires out for one-day, in-person interviews, covering travel expenses and paying them for a full day’s work, reasoning that the added cost is worth the peace of mind.
Patterns of Deceit
These incidents are not isolated. The Justice Department has uncovered multiple networks of North Koreans who used stolen or fabricated identities, often enhanced with AI, to secure remote IT jobs at U.S. companies and funnel their earnings back to their country.
Estimates suggest these operations generate hundreds of millions of dollars annually, mostly directed to the North Korean Ministry of Defense and its nuclear missile initiative.
Moczadlo said researchers told him the fake applicants who approached Vidoc followed patterns similar to those of these North Korean networks, though the company’s case is still under investigation.
“We consider ourselves fortunate to be security experts,” Moczadlo said, “but for companies with regular hiring managers or startup founders, it can be exceedingly difficult to recognize something like this.”
This prompted Vidoc’s co-founders to create a guide to assist HR professionals in identifying potentially fraudulent candidates.
If you’re wondering whether you might encounter this yourself, the CBS News Confirmed team has compiled some tips to help verify that the person you’re communicating with is real:
1. Examine their LinkedIn profile closely: A profile may look legitimate at first glance, so check when it was created by selecting “More” and then “About this profile.” You can also confirm that the person has connections at the companies where they claim to have worked.
2. Ask localized questions: If someone says they grew up in a particular city or country, ask about details only a local would know, such as their favorite cafes or restaurants.
3. In-person meetings are ideal: Ultimately, especially as AI technology evolves, meeting someone face-to-face remains the most reliable way to confirm their identity.