Abstract: Artificial intelligence is transforming scam operations across Southeast Asia, enabling criminal syndicates to operate faster, scale wider and evade crackdowns more easily. While authorities step up enforcement, Interpol warns that AI-driven scams are becoming more sophisticated, global and harder to detect, posing growing risks to victims and law enforcement alike.

Criminal syndicates across Southeast Asia are increasingly turning to low-cost artificial intelligence tools to keep scam operations running at scale, even as governments intensify crackdowns. Senior officials at Interpol say AI is allowing scam centres to reach more victims, move faster and adapt more easily, making enforcement efforts far more challenging.
In the past, many scams were relatively easy to spot. Poorly written job ads, awkward messaging and obvious red flags often gave them away. Romance and investment scams relied heavily on scripted conversations that lacked realism. Today, that has changed. With the help of large language models and other AI tools, scammers are producing messages, profiles and advertisements that look and sound convincingly authentic.
According to Interpol's Cybercrime Directorate head Neal Jetton, the use of AI has dramatically improved efficiency inside scam centres. Tools such as voice cloning and AI-generated images allow criminals to create realistic online personas almost instantly. With minimal cost and effort, operators can change tactics, target new victims and even relocate operations when pressure builds in one area.
This shift is happening at a time when countries like Cambodia are under growing international pressure to act. Recent arrests and deportations of alleged scam kingpins, along with executions linked to scam centres in Myanmar, show that authorities are stepping up enforcement. But Interpol officials believe scams are unlikely to disappear. Instead, they are evolving.
Rather than shutting down entirely, scam networks are adapting their business models. AI makes it easier to scale operations quickly, reducing the risk of losses when centres are exposed or shut down. Criminals are increasingly willing to take chances, knowing they can rebuild elsewhere with fewer people and more automation.
Interpol analysts note that one of the biggest changes is in recruitment-style advertisements. Previously, fake job ads often contained obvious mistakes. Now, with the right prompts, scammers can generate professional-looking ads within seconds. This not only helps them lure victims but also makes it harder for the public to tell what is legitimate.
Experts warn that AI is reshaping the entire scam ecosystem. It affects not just the scammers, but also the victims and even trafficked workers. Deepfake technology has enabled voice and video impersonation, where scammers pretend to be family members or trusted contacts to demand money. Language barriers are no longer an issue, as AI can generate fluent messages in almost any language.
There is no precise data on the full economic damage caused by scam operations linked to Southeast Asia, partly because many victims do not report their losses. However, estimates suggest the global cost runs into tens of billions of dollars annually, and continues to rise.
The technology also has implications for where scam centres operate. While Southeast Asia remains a hotspot, authorities are seeing similar operations emerge in the Americas, Africa and the Middle East. Some show links to Asian syndicates, while others appear to be run by local or regional criminal groups, signalling a global spread of the same tactics.
For now, AI has not reduced human trafficking linked to scam operations. But experts believe that may change. With fewer people needed on the ground and more tasks automated, future scam centres could rely on smaller teams supported by AI systems. Recruiters, money handlers and technical specialists will still be required, but the number of low-level workers may gradually decline.
Law enforcement agencies are exploring how AI can also be used to detect and disrupt scams, but the race is uneven. As technology becomes more accessible, criminals are often quicker to adopt new tools. For authorities, keeping pace will require constant adaptation, stronger cooperation and greater public awareness.
