Fraudsters are now using artificial intelligence to imitate the voices of your loved ones

The Washington Post article [in English] gives the example of Canadian grandparents who received a call from fraudsters posing as their grandson.

  • The AI-mimicked voice asked them to quickly send $3,000 in bail money to get him out of jail.

According to the experts quoted in the article:

  • Software can easily imitate a person’s voice from simple audio extracts.
  • Such extracts of many people’s voices are readily available on social networks like TikTok, Facebook, Instagram and YouTube, or in podcasts.
