OpenAI worries that its AI’s voice will seduce users

2024-08-09 20:11:05

OpenAI, the company behind the generative artificial intelligence (AI) chatbot ChatGPT, has expressed concern that a lifelike voice in its software could lead users to form bonds with it at the expense of human interaction.

“Anthropomorphism is the act of attributing human attitudes or characteristics to something that is not human, such as an AI model,” the company said in a report released Thursday.

“The risk may be enhanced by GPT-4o’s audio features, which facilitate human-like interactions,” the report notes.

The document was published a day before the launch of the new version of ChatGPT, GPT-4o, which gives the software the ability to respond vocally and hold the equivalent of a spoken conversation.

But holding the same kind of conversation with an AI that one might have with a human could create “misplaced trust” in the software, an effect the added voice could reinforce.

OpenAI notes in particular that testers of the new version had exchanges with the AI that appeared to show the formation of an emotional bond, such as expressing regret that it was their last day together.

“While these cases appear to be inconsequential, they highlight the need for further research into how these effects might manifest themselves in the longer term,” the report concludes.

Forming a social relationship with an AI could also reduce users’ desire for relationships with other humans, OpenAI anticipates.

“Prolonged interactions with the model could have an effect on social norms. For example, our models are always respectful, allowing users to interrupt them at any time, a behavior that, while normal for an AI, could be outside the norms of social interactions,” the report details.

Substitute

The AI’s ability to remember details of conversations and to carry out assigned tasks could also lead users to rely too heavily on the technology.

“These new concerns shared by OpenAI about users’ potential reliance on ChatGPT’s voice underscore a growing question: should we take the time to understand how technology is affecting human interactions and relationships?” said Alon Yamin, co-founder and CEO of Copyleaks, an AI plagiarism detection platform.

“AI is a complementary technology, intended to help us streamline our work and daily lives; it should not become a substitute for real human relationships,” he added.

OpenAI said it is continuing to study how its AI’s voice function could lead users to become emotionally attached to it.

Testers also managed to get the model to repeat false information and produce conspiracy theories, adding to concerns about the risks of the AI model.

ChatGPT’s voice feature has already drawn widespread backlash, forcing OpenAI to apologize to actress Scarlett Johansson last May for using a voice that sounded very similar to hers, fueling controversy over the risk of voices being cloned with the technology.

While the company has denied using Johansson’s voice, the fact that its boss, Sam Altman, promoted the voice feature on social media with a single word, “Her,” a reference to the film in which the actress voices an AI, did little to convince observers.

The film, released in 2013, tells the story of a man, played by actor Joaquin Phoenix, who falls in love with his personal AI, “Samantha”, voiced by Scarlett Johansson.
