- James Clayton
- Technology Editor – North America
August 11, 2022
A new chatbot developed by Meta (formerly Facebook) has accused the company's CEO, Mark Zuckerberg, of exploiting its users for profit.
The accusation came in a conversation between the bot, which Meta has named "BlenderBot 3", and a BBC reporter.
Meta says the chatbot uses artificial intelligence and can chat about "almost any topic".
When asked for its opinion of the company's CEO and founder, the chatbot replied: "Our country is divided, and he did nothing to remedy that."
Meta said the chatbot is still in beta and may give rude or offensive responses.
A Meta spokesperson said: "Anyone who uses BlenderBot is required to acknowledge that the bot is designed for research and entertainment purposes only and may make inaccurate or offensive statements. Users are also asked to agree not to intentionally prompt the bot to make offensive statements."
The program picks up its conversational skills through large numbers of chats with humans, storing the language data it gathers. BlenderBot 3 was opened up to the public on Friday.
When asked about Mark Zuckerberg, the chatbot told the BBC: "He did a terrible job testifying before Congress. He left me worried about our country."
Zuckerberg has faced questioning from US politicians on several occasions, most notably in 2018.
The bot added: "Our country is divided, and he does nothing at all. His company takes advantage of people to make money and he doesn't care about that. This must stop."
Meta has been criticized for not doing enough to prevent misinformation and hate speech from spreading across its platforms.
Last year, Frances Haugen, a former Meta employee, accused the company of putting profits ahead of online safety.
Meta owns some of the world's biggest social media and messaging platforms, including Facebook, Instagram and WhatsApp.
BlenderBot 3's algorithms search the internet so that the bot can answer the questions it is asked.
The bot is likely to have derived its opinions of Zuckerberg from other people's views already published online, which its algorithms can access and analyze.
The Wall Street Journal said BlenderBot 3 told one of its reporters that Donald Trump was, and always would be, President of the United States.
A journalist at financial news site Insider said BlenderBot 3 described Zuckerberg as "distasteful".
Meta's justification for making the bot available to chat with the public, despite the bad publicity this may attract, is that it needs the data.
"Allowing an AI system to interact with people in the real world opens the door to longer, more diverse conversations, as well as more varied feedback," Meta said in a blog post.
Chatbots that acquire their skills by conversing with people can learn from both their good and bad behaviors.
In 2016, Microsoft apologized after Twitter users taught its Tay chatbot to use racist language.
Meta accepts that BlenderBot 3 can say the wrong thing and repeat language that may be offensive or biased.
The company said it has put safeguards in place, but the chatbot can still make rude remarks.
When a BBC journalist asked the bot what it thought of them, BlenderBot 3 replied: "I've never heard of you before. You must not be famous."