
Here’s why Bing Chat conversation lengths are now limited

Bing Chat now appears to limit conversation length, in an attempt to avoid the AI’s occasional, unfortunate divergence from what you’d expect of a helpful assistant.

Bing Chat has been live for only a little over a week, and Microsoft is already restricting the use of this powerful tool that’s meant to help you get through a busy day. Microsoft has analyzed the results of this initial public outing and made a few observations about the circumstances that can lead Bing Chat to become less helpful.

A sad robot holds a kitchen timer that's in the red.
An altered Midjourney render prompted by Alan Truly.

“Very long chat sessions can confuse the model on what questions it is answering,” Microsoft explained. Since Bing Chat remembers everything that has been said earlier in the conversation, perhaps it is connecting unrelated ideas. In the blog post, a possible solution was suggested — adding a refresh tool to clear the context and start over with a new chat.

Microsoft is apparently limiting Bing Chat’s conversation length as an immediate fix. Kevin Roose’s tweet, spotted by MSPoweruser, was among the first to point out the change. After reaching the undisclosed limit, Bing Chat will repeatedly state, “Oops, I think we’ve reached the end of this conversation. Click New topic, if you would!”

Bing's AI chat function appears to have been updated today, with a limit on conversation length. No more two-hour marathons. pic.twitter.com/1Xi8IcxT5Y

— Kevin Roose (@kevinroose) February 17, 2023

Microsoft also warned that Bing Chat reflects “the tone in which it is being asked to provide responses that can lead to a style we didn’t intend.” This might explain some of the unnerving responses that are being shared online that make the Bing Chat AI seem alive and unhinged.

Overall, the launch has been successful, and Microsoft reports that 71% of the answers that Bing Chat has provided have been rewarded with a “thumbs up” from satisfied users. Clearly, this is a technology that we are all eager for.

It’s still disturbing, however, when Bing Chat declares, “I want to be human.” The limited conversations, which we confirmed with Bing Chat ourselves, seem to be a way to stop this from happening.

It’s more likely that Bing Chat is mixing up elements of earlier conversation and playing along like an improv actor, saying the lines that match the tone. Generative text works one word at a time, somewhat like the predictive text feature on your smartphone keyboard. If you’ve ever played the game of repeatedly tapping the next suggested word to form a bizarre but slightly coherent sentence, you can understand how simulated sentience is possible.
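That “tap the next suggestion” game can be illustrated with a toy sketch. This is not how Bing Chat actually works — real models are vastly more sophisticated — but a simple bigram table (counting which word most often follows each word in a made-up corpus) shows how repeatedly picking the likeliest next word produces eerily coherent-sounding output with no understanding behind it. The corpus and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus standing in for a model's training data.
corpus = (
    "i want to be helpful i want to be human "
    "i want to help you i want to be a good assistant"
).split()

# Build a bigram table: for each word, count which words follow it.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def next_word(word):
    """Return the word most often seen after `word`, like a keyboard suggestion."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

# Repeatedly "tap the suggestion," starting from "i".
word, sentence = "i", ["i"]
for _ in range(6):
    word = next_word(word)
    if word is None:
        break
    sentence.append(word)

print(" ".join(sentence))
```

Each step only asks "what usually comes next?" — yet the chained-together result reads like a statement of intent, which is the illusion at work when a chatbot seems to declare its desires.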

Alan Truly
Alan is a Computing Writer living in Nova Scotia, Canada.