Audio deepfakes become a tool for scamming family and friends | TC Detective

2023-05-16 18:03:00

Over time, cybercriminals adapt to emerging technologies to create more personalized and effective scams, making them more successful at stealing data and money from victims.

One of the tools exploited in recent times is the so-called deepfake. This Artificial Intelligence technique has been used to simulate people's voices in order to ask for money and deceive family and friends. The TudoCelular Detective explains the situation, and how to protect yourself, below.

Recent cases

One recent case involved a 73-year-old Canadian named Ruth Card. In March of this year, a criminal impersonated her grandson using an AI-generated voice. The caller said he was under arrest and needed bail money.

The elderly woman believed him and rushed with her husband, Greg Grace, 75, to withdraw a large sum from the bank: 3,000 Canadian dollars, to be exact. When the couple tried to withdraw a further amount at another financial institution, the manager warned them that it might be a scam, having seen a similar case with another account holder.

Scams in Brazil too

But this practice is not exclusive to other countries. The scam has also been applied on Brazilian soil. The Instagram profile of the site SOS Almanac reported a case in which a man received a call with the voice of his son, asking for BRL 600 supposedly to pay a bill.

However, when he transferred the amount, the money went to an account under a different name, which struck the man as odd.

What is a deepfake?



Deepfake is the name given to a technology that uses Artificial Intelligence with deep learning techniques to copy a person's voice or appearance from just a few seconds of source material.

The result can be rendered as video, image or audio, all with ultra-realistic quality. This content can put words into the voice of a person who never uttered them, making it possible to deceive third parties.
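To give a sense of how little effort this now takes, here is a minimal sketch of voice cloning from a short reference clip. It assumes the open-source Coqui TTS package and its XTTS v2 voice-cloning model, neither of which is named in this article, and the file names are hypothetical; it is illustrative only, not a recommendation.

```python
# Minimal voice-cloning sketch. Assumes the open-source Coqui TTS
# package ("pip install TTS") and its XTTS v2 model; file names are
# hypothetical and the example is illustrative only.
from TTS.api import TTS

# Download and load a multilingual voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A reference clip of only a few seconds is enough to imitate a speaker.
tts.tts_to_file(
    text="Hello, this is a demonstration of a cloned voice.",
    speaker_wav="reference_clip.wav",  # short sample of the target voice
    language="en",
    file_path="cloned_voice.wav",
)
```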

While the practice has become increasingly simple thanks to AI software, detecting the fraud is often complex. This is because the voice used is not the cybercriminal's own, and tracing the calls poses additional challenges.
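One partial countermeasure is to compare a suspicious recording against a known genuine sample of the speaker using speaker embeddings. The sketch below assumes the open-source resemblyzer package, which this article does not mention, plus hypothetical file names and an arbitrary threshold; even a high score does not prove authenticity, since a good clone can score close to the real voice.

```python
# Speaker-similarity sketch. Assumes the open-source resemblyzer
# package ("pip install resemblyzer"); file names and the threshold
# are hypothetical, for illustration only.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# Embed a known genuine recording and the suspicious message.
known = encoder.embed_utterance(preprocess_wav("known_voice.wav"))
suspect = encoder.embed_utterance(preprocess_wav("suspicious_message.wav"))

# Embeddings are L2-normalized, so a dot product gives cosine similarity.
similarity = float(np.dot(known, suspect))
print(f"Speaker similarity: {similarity:.2f}")

if similarity < 0.75:  # arbitrary cutoff for illustration
    print("Voices do not match well; treat the message with suspicion.")
```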

It didn't start now

Despite gaining momentum in 2023, this deceitful practice is not new among cybercriminals. In mid-2020, a company employee, who declined to be identified at the time, reportedly received an audio message from the company's CEO.

In the voice message, the recording asked for "immediate assistance to finalize an urgent business". The worker became suspicious and contacted the security consulting firm Nisos, which found the audio to be synthetic.

But this is not just a voice problem. Deepfakes can also be made using the victim's image, mainly to defeat biometric authentication at banks.

According to a study carried out by iProov and released last week, this method has become one of the main attacks used to gain unauthorized access to banking applications. The study also estimates that fraud costs Latin America 20% of its online banking revenue, around R$60 billion annually, with Brazil in first place.

How to protect yourself?



To prevent someone from impersonating a loved one and deceiving you, there are some tips for verifying the origin of a voice message. One is to make a video call to the person, preferably covering your own camera so you are not seen before confirming who is on the other side.

Another option is to agree on a "keyword" with your family or close friends. If there is any suspicion about who is on the other end, simply ask the person to say the agreed word to check whether the voice message is genuine.

Have you come across a scam like this? How did you identify the fraud, or what did you do in that situation? Tell us in the space below.
