Scammers are using artificial intelligence to create phony profiles and disguise their appearance in order to apply for remote jobs, according to researchers.
With AI, scammers can already conceal their real identities at practically every stage of the job application process. They can generate fake websites, LinkedIn accounts, professional headshots, and resumes. Put together, these tools can produce a candidate who looks like the perfect fit for an open position.
These scammers can install spyware or steal company secrets once they’re inside.
While identity theft is not new, the problem is getting worse as criminals use AI to scale their operations. Gartner, a research and consulting firm, predicts that by 2028, one in four job applicants will be fraudulent.
How to recognize a fake
Dawid Moczadlo, co-founder of cybersecurity company Vidoc Security, posted a clip on LinkedIn of an interview with what appeared to be an AI-generated job seeker, and the video went viral. He admitted he was taken aback when he realized what was happening.
"We are the security experts, so I felt a little violated," Moczadlo said.
When Moczadlo suspected the person was using an AI filter, he says he asked a simple question: "Can you take your hand and put it in front of your face?"
When the candidate refused, Moczadlo immediately ended the interview. He explained that the scammer's software did not appear sophisticated, so covering the face with a hand would most likely "break" the deepfake face filter.
“Sometimes it takes a hacker to find a hacker,” Moczadlo explained.
This was the second time the company had interviewed someone who turned out to be AI-generated, and Moczadlo said the experience dramatically changed its hiring process. Prospective employees are now flown in for a day-long, in-person interview; Vidoc covers travel expenses and pays for a full day of work, believing the added cost is worth the peace of mind.
A pattern of deception
These incidents are not isolated. The Justice Department has uncovered multiple networks of North Koreans who used false identities, often created with the help of AI, to land remote IT jobs in the United States and funnel their American wages back to their home country.
According to the Justice Department, these schemes bring in hundreds of millions of dollars a year, with a large portion of the proceeds going straight to the North Korean nuclear missile program and Ministry of Defense.
Vidoc's case is still under investigation, but Moczadlo said researchers told him the fake applicants who approached the company followed a pattern similar to several of these North Korean networks.
Their security expertise makes them unusually fortunate, Moczadlo continued; for ordinary businesses, where hiring managers or startup founders run the interviews, spotting something like this is very difficult.
The response motivated Vidoc's co-founders to create a guide to help HR professionals across industries identify applicants who might be fake.
If you're worried this could happen to you, here are some basic best practices to make sure the person you're talking to is real:
1. Look closer at their LinkedIn profile: It may appear authentic at first glance, but you can check when it was created by clicking the "More" button and selecting "About this profile." You can also verify whether they have connections at the companies where they claim to have worked.
2. Ask about local culture: If someone says they grew up in a particular country or city, ask about details only a local would know, such as their favorite restaurants.
3. Meet face to face: Given how advanced AI technology has become, meeting someone in person is ultimately the only reliable way to confirm they are who they claim to be.