Microsoft’s Bing in the Classroom launches, filters out ads and adult content


Microsoft is attempting to cater to school districts across the country by launching Bing in the Classroom, a version of the search engine that Redmond says is free of advertisements and blocks adult content, according to an official blog post. Along with putting the kibosh on ads and questionable content, Microsoft says Bing in the Classroom also disables ad targeting for searches performed on a school’s network.

“At Bing we think advertising done well is an additive and essential part of the search experience,” says Bing exec Derrick Connell. “But sometimes the best innovation is knowing when to take something away.”

Bing in the Classroom is available to “eligible” public and private U.S. school districts serving kindergarten through 12th grade. The service was first piloted in five of the nation’s biggest school districts. Microsoft claims that Bing in the Classroom has already reached 4.5 million students nationwide and has been used for more than 35 million search queries.

The service was previously dubbed Bing for Schools, but Microsoft scrapped that name in favor of Bing in the Classroom because the company feels it better “aligns with other Microsoft education programs,” such as Windows in the Classroom and Skype in the Classroom.

Microsoft has also launched a rewards program in which students earn credits for their school by using Bing. Once the 30,000-credit threshold is reached, a Microsoft Surface tablet is sent to the school the user chooses to support. You can learn more about the rewards program here.

From sign-up to completion, Microsoft says setting up Bing in the Classroom takes only a few days.

Konrad Krawczyk
Former Digital Trends Contributor
Konrad covers desktops, laptops, tablets, sports tech and subjects in between for Digital Trends. Prior to joining DT, he…