I never thought I would become a statistic. But in 2025, that’s exactly what happened to me — and to dozens of others across the Bay Area.
San Francisco’s dating scene was already chaotic, but what we didn’t realize was that we weren’t just competing with other singles — we were competing with machines and professional criminals. The person I believed was real, the one sending thoughtful messages and polished photos, wasn’t a partner. It was a calculated fraud operation powered by artificial intelligence and cryptocurrency schemes.
Across 14 Northern California counties, people like me collectively lost $43.34 million to romance scams in 2025 — more than double the $21.52 million lost the year before, according to the FBI San Francisco Division. In San Francisco alone, reported losses exploded from $734,479 to $6,341,570. Reported victims in the city jumped from 34 to 61. I am one of those numbers now.
I learned the hard way that scammers aren’t just using fake selfies anymore. They are deploying AI-generated photos, scripted conversations, even video clips that look terrifyingly real. As AI literacy educator Jeremy Carrasco warned, this technology is no longer something only older people fall for — it is advanced enough to fool anyone. Tech-savvy twenty-somethings. Retirees. Widowers. Professionals. Me.
Take Rajni Goswami. She lost $300,000 after matching with what she thought was a real person on OkCupid. That “person” was nothing more than a carefully engineered illusion. Then there’s the Brentwood widower who lost more than $1 million after a simple wrong-number text spiraled into a crypto investment trap. That’s how these criminals operate — they mix romance with grief, affection with opportunity, and then they drain everything.
In my case, the scammer behind the fake persona who called himself “Daniel” didn’t rush. He played the long game. He built trust. He flirted. He listened. And then, like clockwork, he pivoted to cryptocurrency investments that promised security and fast returns. It was methodical. It was manipulative. And it was designed to destroy me financially.
These criminals are running what investigators call “pig butchering” scams. They “fatten” you emotionally before slaughtering your bank account. They weaponize loneliness. They weaponize trust. And now, with AI, they can do it at scale.
Law enforcement is now urging people to slow down. Verify that a match is who they claim to be, in person, before any money changes hands. Never send cryptocurrency or wire transfers to someone you haven’t confirmed is real. Keep records. File a report with the FBI’s Internet Crime Complaint Center (IC3). And if money has already gone out, contact your bank immediately. These are steps I wish I had taken sooner.
If I could go back, I would reverse-image search every photo. I would demand live video calls. I would treat any sudden “can’t-miss” investment opportunity as the blazing red flag it truly is. Instead, I learned after the damage was done.
The surge is happening because AI tools are cheap, accessible, and brutally effective. And too many victims stay silent out of embarrassment — something these criminals count on. Shame is part of their strategy.
But I refuse to stay quiet. This is not just online dating drama. It is organized financial exploitation disguised as love. And until people understand that these “relationships” can be entirely synthetic — powered by AI and designed to rob you — the numbers will keep climbing.
I trusted. They engineered that trust. And then they took everything.
