
Edge Copilot finally delivers on Microsoft’s Bing Chat promises

Microsoft is finally making the version of Bing Chat we heard about in February a reality. The latest version of Microsoft Edge (111.0.1661.41) includes the Bing Copilot sidebar, which allows you to chat, generate AI content, and get AI-powered insights into the topics you're browsing.

This is the form of Bing Chat Microsoft originally pitched. Since its launch, the chat portion of Bing Chat has been available through a waitlist that, according to Microsoft, has amassed millions of sign-ups. However, Microsoft also talked about Bing Copilot, which would live in the Edge sidebar and open up the possibility of generating emails, blog posts, and more, as well as provide context for whatever web page you were on.

Image: The Compose feature in Microsoft Bing Chat.

The Compose tab is where you can generate emails, blog posts, and even lists of ideas. It offers several tones ranging from professional to funny, as well as three lengths and four formats. Enter your prompt, select your parameters, and generate your draft. The length setting doesn't seem to matter much, though; even on the Short setting, the AI spit out five or more paragraphs for most prompts I entered.

Once you’re done, you can regenerate the response to your prompt or copy it from the window. There’s also an Add to site button below the text window, but I wasn’t able to find a use for it.

The Insights tab is contextual, pulling insights from and around the website you’re currently on. For example, on the Digital Trends home page, it pulled some basic information about the website, where users are located, and how most visitors land on the website. It’s not clear where these analytics are coming from, though, outside of a “data from: bing.com” disclaimer at the bottom of the Insights window.

It’s not perfect. With our review of Hogwarts Legacy pulled up, the Q&A section of Copilot includes the question “Is Hogwarts Legacy a curse,” which is a reverse-engineered question based on a bit of stylized copy in the review. For most websites, this seems to be what Edge Copilot is doing: it pulls some questions and key points from the current page while also gathering context from around the internet.

Image: The Insights tab in Edge Copilot.

In my brief time with Copilot, the best use I found for it was shopping. When you land on a product page, Copilot can easily gather reviews, news articles, comparisons, and alternatives. It’s the same information you’d find in a normal search, but you don’t have to run a separate search.

Of course, the Chat tab is still present as well. Microsoft’s Bing Chat has been reined in after some unhinged responses, but the floodgates are slowly opening back up. You can use Bing Chat in Edge Copilot, including its new response tone options.

Microsoft is continuing to build on its AI strategy after a multibillion-dollar investment in OpenAI, the research group behind ChatGPT, earlier this year. The company is scheduled to give a presentation on the future of AI in the workplace on March 16, where we expect to hear more about Bing Chat in Office apps, as well as the possibility of GPT-4 providing AI-generated videos.

Jacob Roach
Senior Staff Writer, Computing