Microsoft explains some strange behavior of Bing AI chatbot


Microsoft released its new AI-powered Bing last week. While our editors were satisfied with its performance, as access has widened to more and more users, problems with the AI have gradually begun to surface. Some users found that the Bing bot not only sometimes gave wrong information, but even complained that users were wasting its time, insisting that the user, not the bot, was in the wrong.

In one instance, when a user asked where to watch “Avatar: The Way of Water,” the Bing bot replied that the year was still 2022 and the movie had not yet been released. When the user pointed out that the date was wrong, Bing countered that the user was “unreasonable and stubborn.” Although this kind of situation is bound to be rare, it still damages Bing’s image, so Microsoft posted a blog post explaining the cause of the problems and its planned fixes.

Most importantly, Microsoft did not expect Bing’s AI to be used for “exploring the world” or for “social entertainment.” In the former case, people ask a long series of questions, sometimes unrelated to one another, in a single session, which can gradually throw the bot off course until it forgets what it was originally answering. In the latter case, some users deliberately steer Bing’s AI toward questions outside its design, such as asking whether it is “conscious.” The bot also takes its cue from the questioner’s tone when deciding how to respond, which can lead it to answer in ways Microsoft did not intend. The current tentative solution is to add a “reset” button that lets users restart the conversation once the bot has gone astray, and Microsoft is also considering giving users more control to ensure the bot’s answers meet their needs.


While there were inevitably some issues at launch, users generally gave Bing’s AI fairly high marks, particularly for including sources and references in its answers. However, Microsoft also admitted that the bot still needs work on the timeliness of sports scores and the accuracy of financial figures. Going forward, Microsoft plans to add a switch that lets you choose whether the bot’s answers should lean toward accuracy or creativity, to better handle questions that are vaguely worded.

Finally, the Bing team thanked everyone who has taken part in testing so far, saying that the unusual cases collected will help Microsoft further improve the product. What surprised the team most was that some people chatted with the bot for two hours at a stretch, which explains how so many edge cases were uncovered. As more AI chatbots come online and the underlying technology matures, there will no doubt be more problems of this kind to solve.
