ESCAMBIA COUNTY, Fla. — The threat of cybercriminals using artificial intelligence to scam you and your family out of thousands of dollars is increasing, according to the Federal Bureau of Investigation.
The FBI says bad actors use voice cloning of a family member, co-worker or other trusted person to make it appear like they are in danger.
The FBI’s Internet Crime Complaint Center reports that, as of 2022, AI-based scams and attacks had cost Americans approximately $10 billion.
A cybersecurity expert says scammers are playing on your emotions, hoping you won’t have time to think rationally and will just send money.
Dr. Eman El-Sheikh, associate vice president of the Center for Cybersecurity at the University of West Florida, told WEAR News how to avoid becoming a victim.
“The techniques have become so good, and there is so much data available,” says El-Sheikh, “that it is very difficult to detect whether it is a real person’s voice or a synthetic one.”
A scammer used a synthetic replica of Jennifer Destefano’s daughter Briana’s voice in 2023.
Destefano detailed the traumatic situation on the Senate floor last year, saying she received a call from an unknown number. The voice on the other end of the line was Briana – or so she thought.
“She was sobbing and crying, saying, ‘Mommy?’ At first, I didn’t pay attention to it and casually asked her, ‘What happened?’” Destefano said.
“Briana continued, ‘Mom, I made a mistake,’ while crying and sobbing continuously,” she said. “Without thinking twice, I asked her again, ‘Okay, what happened?’ Suddenly, a man’s voice barked at her: ‘Lie down, put your head back.’”
Destefano demanded to know what was going on. Briana answered her again.
“‘Mom, these bad men have got me. Help me. Help me,’ she begged and pleaded as the phone was taken away from her,” Destefano says.
She says a man then took over the call and threatened that if she called anyone, including the police, he would harm her daughter.
“‘I’m going to fill her stomach with drugs, I’m going to do whatever I want with her,’” the man told Destefano.
As Destefano panicked, someone nearby called the police while the scammer demanded $50,000 in cash.
That’s when she was informed that 911 dispatchers were well aware of this kind of scam, in which bad actors use AI to clone the voices of loved ones.
“It wasn’t just her voice,” Destefano says. “It was her screams. It was her sobs.”
Briana was found safe and sound. It was a voice cloning scam of the kind the FBI warned about in May of this year.
According to the FBI, “malicious actors are increasingly using AI-powered voice and video cloning techniques to impersonate trusted individuals, such as family members, co-workers or business partners.”
“Sometimes these tools are useful, like reproducing audio for entertainment or for a movie, but unfortunately bad actors also have access to them and can use them to exploit people’s emotions and scam them out of money,” says El-Sheikh.
El-Sheikh says scammers can easily get their hands on your voice, whether from a previous call, an audio recording, public recordings, or even social media.
“If you look at social media feeds, people are talking about their pets or wishing someone a ‘happy birthday’ in an audio or video message,” says El-Sheikh. “All of that can be fed into an AI model and used to synthesize someone’s voice.”
El-Sheikh says if you receive one of these calls, hang up and contact your loved one directly.
“Always be wary of someone who calls and asks for money,” says El-Sheikh. “Even when they’re playing on your emotions, that’s a red flag.”
This holiday season, it might be a good idea to sit down and talk about this with your loved ones.
El-Sheikh recommends agreeing on a code word or phrase you can use to verify that you’re on the phone with the right person, if you ever find yourself in one of these situations.