A San Jose widow lost nearly $1 million to a crypto romance scam that drained her retirement accounts and put her home at risk. She only discovered the fraud after asking ChatGPT whether an investment demand made sense, highlighting both the sophistication of pig butchering operations and an unexpected use case for AI chatbots.
Margaret Loke met a man calling himself “Ed” on Facebook last May. The relationship moved quickly to WhatsApp, where he claimed to be a wealthy businessman and sent affectionate messages daily.
“He was really nice to me, greeted me every morning,” Loke told ABC7 News. “He sends me every day the message ‘good morning.’ He says he likes me.”
After weeks of building trust through daily check-ins, Ed steered conversations toward crypto investing. Loke had no trading experience, but he guided her through wiring funds into an online account that he controlled, showing her app screenshots displaying massive profits that appeared within seconds.

Loke started with a $15,000 transfer. The fabricated gains convinced her to send more. She eventually wired over $490,000 from her IRA, then took out a $300,000 second mortgage on her home. Altogether, she sent close to $1 million to accounts controlled by the scammers.
The setup follows the classic pig butchering playbook: scammers build relationships over weeks or months before steering victims into fake investment platforms. The name comes from the practice of “fattening up” victims with fake profits before draining their accounts completely.
When Loke’s supposed crypto account suddenly “froze,” Ed demanded an additional $1 million to release the funds. That request finally triggered doubt. Panicked and confused about why she’d need to pay another million dollars to access money that was supposedly already hers, Loke turned to ChatGPT for advice.
“ChatGPT told me: No, this is a scam, you’d better go to the police station,” she told ABC7. The AI identified the setup as matching known scam patterns, prompting her to confront Ed and contact authorities.
Investigators later confirmed she had been routing money to a bank in Malaysia, where scammers withdrew the funds. “Why am I so stupid? I let him scam me!” Loke said. “I was really, really depressed.”
ChatGPT as Scam Detector
Loke’s case represents an emerging pattern of people using ChatGPT to verify suspicious situations. Last week, an IT professional in Delhi said he used the AI to “vibe code” a website that revealed a scammer’s location and photo. The trend suggests victims are turning to AI when they can’t trust the people they’ve been talking to for months.
OpenAI didn’t respond to requests for comment about ChatGPT’s role in identifying scams, but the use case makes sense. Large language models trained on vast datasets can recognize patterns across thousands of documented fraud schemes. When Loke described her situation (a frozen account and a demand for additional payment to “unlock” funds), the AI matched it to pig butchering tactics almost immediately.
The irony is that scammers themselves increasingly use AI to craft convincing messages, generate fake profile photos, and maintain multiple simultaneous conversations with victims.
ChatGPT and similar tools now serve both sides of the fraud equation: helping scammers operate at scale while occasionally exposing their schemes when victims ask the right questions.
The Pig Butchering Industry
Pig butchering scams operate primarily from compounds in Southeast Asia, where organized crime groups force workers into modern slavery conditions while running industrial-scale fraud operations. These aren’t individual scammers working alone but entire call centers dedicated to romance and investment fraud.
Meta said in August that it had removed over 6.8 million WhatsApp accounts linked to scam operations in the first half of the year, according to the company’s statement.
The U.S. Treasury also sanctioned 19 entities across Burma and Cambodia in September that it says scammed Americans out of hundreds of millions. “Southeast Asia’s cyber scam industry not only threatens the well-being and financial security of Americans, but also subjects thousands of people to modern slavery,” said John K. Hurley, Under Secretary of the Treasury for Terrorism and Financial Intelligence.
According to the FBI’s Internet Crime Complaint Center, Americans lost $9.3 billion to crypto-related scams in 2024 across all age groups. Seniors age 60 and over lost approximately $2.8 billion specifically to crypto fraud, with pig butchering schemes representing a substantial portion of those losses.

The Federal Trade Commission and Securities and Exchange Commission warn that unsolicited crypto “coaching” beginning inside an online relationship is a hallmark of romance-based investment fraud. Scammers often invest weeks or months building emotional bonds before introducing investment opportunities.
Loke’s case followed that pattern perfectly. Daily good morning messages. Expressions of affection. Gradual building of trust. Only after establishing an emotional connection did Ed begin discussing crypto investing. By that point, Loke trusted him enough to hand over retirement savings she’d spent decades accumulating.
The fake profits shown through app screenshots seal the deal. Victims see their initial $15,000 “grow” to $30,000 or $50,000 within days. That apparent success justifies larger deposits. Why wouldn’t you invest more when you’re already making money? The numbers on the screen look real, and the person you’re falling for is guiding you toward wealth.
Loke now faces losing her home due to the $300,000 second mortgage she can’t repay. She also owes substantial taxes on the $490,000 IRA withdrawal, which the IRS treats as taxable income regardless of what happened to the money afterward. Early withdrawal penalties add another layer of financial damage.
The emotional toll compounds the financial devastation. Loke believed she’d found companionship after losing her husband. Instead, she discovered that months of daily conversations and affectionate messages were scripted manipulation designed to extract maximum cash before disappearing. The betrayal hits harder than the money loss for many victims.
The ChatGPT Moment
Would she have eventually realized the truth without ChatGPT? Probably, but possibly after sending even more money or taking out additional loans. The AI intervention came at a critical decision point: whether to comply with one more demand or stop the bleeding.
For other potential victims, Loke’s story offers a lesson: if you’re considering a major financial decision based on advice from someone you’ve only met online, describing the situation to ChatGPT costs nothing and might save everything. The AI won’t always be right, but it can identify patterns that emotional investment makes hard to see.
ChatGPT couldn’t prevent that damage, but it stopped the scammers from taking even more. In pig butchering cases, that’s often the best outcome victims can hope for: realizing the truth before everything is gone.