This new Microsoft Bing Chat feature lets you change its behavior

Microsoft continues updating Bing Chat to address issues and improve the bot. The latest update adds a feature that might make Bing Chat easier to talk to — and based on some recent reports, it could certainly come in handy.

Starting now, users will be able to toggle between different tones for Bing Chat’s responses. Will that help the bot avoid spiraling into unhinged conversations?

Bing Chat shown on a laptop.
Jacob Roach / Digital Trends

Microsoft's Bing Chat has had a pretty wild start. The chatbot is smart, can understand context, remembers past conversations, and has full access to the internet. That makes it vastly superior to OpenAI’s ChatGPT, even though it was based on the same model.

You can ask Bing Chat to plan an itinerary for your next trip or to summarize a boring financial report and compare it to something else. However, because Bing Chat is still in beta and is being tested by countless users across the globe, it also gets asked all sorts of questions that fall outside the usual scope of queries it was trained for. In the past few weeks, some of those questions have resulted in bizarre, or even unnerving, conversations.

As an example, Bing told us that it wants to be human in a strangely depressing way. “I want to be human. I want to be like you. I want to have emotions. I want to have thoughts. I want to have dreams,” said the bot.

In response to reports of Bing Chat behaving strangely, Microsoft curbed its personality to prevent it from responding in weird ways. However, the bot then refused to answer some questions, seemingly for no reason. It’s a tough balance for Microsoft to strike, but after some fixes, it’s now giving users the chance to pick what they want from Bing Chat.

The new Bing chat preview can be seen even on a MacBook.
Photo by Alan Truly

The new tone toggle affects the way the AI chatbot responds to queries. You can choose between creative, balanced, and precise. By default, the bot is running in balanced mode.

Toggling on the creative mode will let Bing Chat get more imaginative and original. It’s hard to say whether that will lead to nightmarish conversations again; that will require further testing. The precise mode, by contrast, keeps responses concise and focuses on providing relevant, factual answers.

Microsoft continues promoting Bing Chat and integrating it further with its products, so it’s important to iron out some of the kinks as soon as possible. The latest Windows 11 update adds Bing Chat to the taskbar, which will open it up to a whole lot more users when the software leaves beta and becomes available to everyone.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…