
A California family endured a terrifying ordeal when scammers using artificial intelligence fooled them into believing their son had been in a serious accident.

Amy Trapp was working a normal day at a Mill Valley school near San Francisco when she received a call from an unknown number. She picked up to the sound of a voice she believed was her son's, according to a report from the San Francisco Chronicle.

"It was my son’s voice on the phone crying, telling me, ‘Mom, mom, I’ve been in a car accident,'" Trapp told the outlet.

The mother said she instantly panicked, picturing her son, who was away at college near California's Central Coast, lying underneath a car or on the side of the road in a pool of his own blood. Then another voice came on the line, identified himself as a police officer and told Trapp that her son, Will, had injured a pregnant woman in the crash and been taken to jail.



An artificial intelligence hacker behind a computer. (Fox News)

Trapp said she bought the story because the voice on the other end of the line was unmistakably her son's. She also put her trust in another man who claimed to be Will's public defender and asked her to withdraw more than $15,000 to cover Will's bail.

Hours later, Trapp's husband called the police directly, and the couple found out the ordeal was actually a scam. The men had duped the worried mother by using sophisticated AI technology to replicate Will's voice. Will had been studying in his living room the entire time, unaware of the drama unfolding with his parents.

"Scams like this are growing every day as deepfake technology improves and becomes easier to access," Christopher Alexander, chief analytics officer at Pioneer Development Group, told Fox News Digital. "Scams like this succeed because immense stress is placed on the victims in the hopes that imperfections in the deepfake go undetected." 

While phone scams have been around for years, AI scams threaten to become a growing problem, making it harder for victims to detect that they are being duped.


"Scam calls have been a consistent thorn in the side of Americans for years, and no one expects to be the person falling for it," Aiden Buzzetti, president of the Bull Moose Project, told Fox News Digital. "The proliferation of AI voice replication is going to make even the most tech-savvy user susceptible to deception, especially when human instinct comes into play. 

"The criminals behind these robocalls fabricate high-stakes situations to keep parents and grandparents short-sighted, emotionally aggravated and prevent them from thinking clearly. We need to keep encouraging the federal government and state agencies to crack down on robocalls by prosecuting services that allow for foreign spam callers."


"We are witnessing an alarming explosion in AI-powered voice scams, which now pose a growing threat to families. Criminals and scammers are capitalizing on advancements in generative AI to impersonate individuals with frightening accuracy," Jake Denton, a research associate at the Heritage Foundation’s Tech Policy Center, told Fox News Digital.

"These scams tend to prey on unsuspecting people's emotions, often mimicking the voice of a family member pleading urgently for help. To stay safe from these scams, families should take proactive measures now to protect themselves."

In Trapp's case, the worried mother recalled that she had "zero doubt" that she was talking to her son, which led to a rush of adrenaline that clouded her judgment when the supposed police officer came on the line.

Minutes later, a man identifying himself as a public defender named David Bell called. He told Trapp he had negotiated her son's bail down from $50,000 to $15,000 and asked if she could get the money quickly. Trapp agreed.


Logging into Facebook on a mobile phone. (Fox News)

"He said, ‘Don’t tell [the bank] why you’re getting the money because you don’t want to tarnish your son’s reputation,’" Trapp said, noting that in her state of mind she "would have done anything he said."

Trapp then got in her car to pick up her husband, who recalled being convinced that his wife had spoken to their son. The couple went to the bank and withdrew $15,500.

"It all comes down to that voice being recognized by his own mother, who he speaks to several times a week," the father, Andy Trapp, said. "I never, ever, thought I would ever fall for anything like that." 


Once the couple was home, the supposed public defender told them a courier would be sent to pick up the money from their house, which finally set off alarm bells.

"That sounded totally wrong," Andy Trapp said.

That was the moment the situation became too much for Amy Trapp, who said she sank to her knees in front of the family's house.

"Where is my son? Where is my son?" she screamed.

Andy Trapp then called the local police station where the accident supposedly occurred and was told there was no record of it. Only then did the couple call their son at college and learn he was completely unaware of the situation.

"Yo, what’s up?" Will calmly answered, according to the Trapps.

According to the report, scam call attempts such as the one used against the Trapps are illegal but hard to investigate or prosecute, often because the scams originate overseas.

Samuel Mangold-Lenett, a staff editor at The Federalist, believes more could be done to protect potential victims.

"Phishing scams remain a serious issue. But now that AI enables bad actors to use sophisticated voice cloning technology, everyone with a cellphone is a potential target," Mangold-Lenett told Fox News Digital. "More needs to be done to hold the people who run these scams accountable. Laws already on the books can likely be applied to situations like these, but perhaps more specific regulations ought to be implemented." 


Phil Siegel, the founder of the Center for Advanced Preparedness and Threat Response Simulation (CAPTRS), told Fox News Digital that it is unknown how often AI is used in such crimes, though he warned such incidents are likely to increase in the coming years.

"Many seniors and older people will not take the time to understand AI because they won’t use it, which will make them ripe targets for scams such as this," Siegel said. "Whether it’s being in jail, in a car accident, needing home repair money or many other scams, this will happen more often in the next few years."


AI scams often target the elderly, experts say. (Cyberguy.com)


Siegel said people can defend themselves by being cautious, especially when it comes to transactions with money.

"The lesson is to always call the person back before doing anything," Siegel said. "Never give someone you don’t know or haven’t hired any cash, wire money, Venmo money or cryptocoins. And, of course, if it is anything to do with law enforcement, check with them first."

Will Trapp told the San Francisco Chronicle he has no idea how the scammers were able to copy his voice, noting that his social media accounts are private, though he sometimes sings and makes music that gets posted online.

"It’s hard to imagine how that could be used because it’s not like my speaking voice," he said. "It’s really scary."