The Growing Concern of AI Voice Replication Scams: How to Protect Yourself

June 12, 2023

When a man called demanding a ransom, she was shaken by the sound of her daughter's voice and sobs. But the intonations had been reproduced to perfection by artificial intelligence (AI) in an attempted scam, a worry now emerging alongside this booming technology.


The biggest danger of AI, experts say, is its ability to all but erase the line between fact and fiction, handing criminals cheap and effective tools.

New telephone scams using AI voice replication tools, which are readily available online, are worrying US authorities.

“Help me, Mom, I beg you, help me,” Jennifer DeStefano, a mother living in Arizona, heard on the phone.

She was “100%” convinced the distressed voice belonged to her 15-year-old daughter, who was away on a ski trip.

“It was totally her voice, the way she would have cried,” the mother told a local TV station in April.

“I never doubted for a second that it was her.”

The scammer who then spoke, calling from an unknown number, demanded a million dollars.

The ordeal ended quickly once Jennifer DeStefano managed to reach her daughter. It is now the subject of an investigation and has highlighted the potential misuse of AI by cybercriminals.

Convincing deepfakes

“AI voice replication, now almost indistinguishable from human speech, allows malicious actors such as scammers to extract information and money from victims more effectively,” Wasim Khaled, CEO of Blackbird.AI, told AFP.

Many applications, a number of them free and available online, use artificial intelligence to replicate a person’s real voice from a short recording.

Such a recording can easily be extracted from content posted online.

“With a short audio sample, an AI voice clone can be used to leave voice messages and audio clips. It can even be used as a live voice changer during calls,” explains Wasim Khaled.

“Scammers use a variety of accents and genders, or even imitate the way your relatives speak,” he adds. The technology, he says, “makes it possible to create convincing deepfakes.”

According to a survey of 7,000 people in nine countries, including the United States, one in four respondents had either been the target of an AI voice-cloning scam attempt or knew someone who had.

And 70% of respondents said they weren’t sure they could tell the difference between a real voice and a cloned one, according to the survey, released last month by McAfee Labs.

US authorities have recently warned of the growing popularity of the “grandparent scam”.

“You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble: he had a car accident and landed in jail. But you can help by sending money,” the US consumer protection agency, the Federal Trade Commission (FTC), said in a warning.

In the comments under this FTC warning, several seniors said they had been misled in this way.

Vulnerability

One grandfather who fell victim was so convinced that he set about collecting the money, even considering remortgaging his house, before the ruse was uncovered.

The ease with which voices can now be artificially replicated means that “almost everyone online is vulnerable,” Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

“These scams are gaining ground,” he says.

Earlier this year, the startup ElevenLabs acknowledged that its AI voice replication tool could be misused, after users posted a deepfake of actress Emma Watson reading Adolf Hitler’s “Mein Kampf.”

“We are rapidly approaching the point where we can no longer trust content on the internet, and where we will have to use new technologies to make sure that the person we think we are talking to (on the phone) is really the person we are talking to,” concludes Gal Tal-Hochberg, an executive at tech investment firm Team8.
