Don't Be Fooled by a Handsome Face on the Phone: It Turns Out to Be a New Scam Method
Jakarta, CNBC Indonesia - Digital scam methods are becoming increasingly sophisticated. Cybercriminals are now using artificial intelligence (AI) and recruiting “face models” for deepfake-based operations, making them harder to detect.

The phenomenon has come to light through a proliferation of job vacancies for “AI models” in Southeast Asia, particularly in Cambodia, which is known as one of the world’s largest centres for online scam operations.

“In the past year up to now, they have also been recruiting people to become AI models,” said Hieu Minh Ngo, a cybercrime investigator at the Vietnamese non-profit Chong Lua Dao, as quoted by Wired (25/3/2026). “They provide software to swap faces using AI and carry out love scams,” he added.

Ngo identified around two dozen Telegram channels posting vacancies for AI models in the region. Humanity Research Consultancy has also tracked job applicants in cities known as scam hubs. The applicants, mostly women in their early 20s, are recruited to make video calls using face-manipulation technology to convince victims. Rather than working for legitimate companies, they end up running scams that build emotional relationships with victims before draining their money, usually through crypto investments or romance fraud.

In practice, these AI models make hundreds of video calls each day. With the help of specialised software, their faces are altered in real time to resemble more attractive and convincing fake identities.

These operations are not only sophisticated but also organised on an industrial scale: several locations reportedly have dedicated “AI rooms” for conducting mass scam calls.

Some workers are recruited with promises of high salaries of up to thousands of dollars per month. In reality, they must work extremely long hours and face the risk of violence and abuse. In some cases, workers’ passports are confiscated to prevent them from escaping.