Microsoft says don’t chat too long with Bing or you’ll derail its AI chatbot

Microsoft’s new ChatGPT-powered Bing has gone haywire several times in the week since its launch, and now the tech giant has explained why.

In a blog post titled “Learning from our first week”, Microsoft admits that “in long chat sessions of 15 or more questions”, its new Bing search engine can “become repetitive or be prompted/provoked to give answers that are not necessarily helpful or in keeping with the tone we have designed”.