
A.I. and Google News: The push to broaden perspectives


There is no editorial team at Google News. There is no building filled with hundreds of moderators monitoring the thousands of stories hitting the web every second, making sure the full story is presented. Instead, Google relies on artificial intelligence algorithms, along with partnerships with fact-checking organizations, to surface headlines from credible, authoritative sources.

“Humans are generating the content,” Trystan Upstill, Google News engineering and product lead, told Digital Trends. “We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence. In a way, the A.I. is controlling this fire hose of human stuff going on.”

“We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence.”

A.I. is a big part of the redesigned Google News app, which was recently announced at the annual Google I/O developer conference in Mountain View, California. The algorithms filter or demote stories after detecting the spread of misinformation, and they also understand terms and fragments of text coming through the news cycle, aligning them with fact checks from partner organizations.

But one of the A.I.’s main tasks is to provide a full picture of major, nuanced stories through a feature called “Full Coverage.” It’s a small button on each story that leads you to related articles from a variety of publications, including ones you do not follow or may not like. The main section of Google News shows content tailored to you, but “Full Coverage” ignores your likes and dislikes entirely.

That includes modules for fact checks, frequently asked questions, a timeline of events, international coverage, and even tweets from primary sources. Everyone reading “Full Coverage” sees the same information, which Upstill said is crucial.

“The core premise we have is that in order to have a productive conversation about something, everyone basically needs to be able to see the same thing,” he said.

While the breadth of data the algorithms pull is impressive, it’s entirely on the user to click on the small “Full Coverage” button to read more perspectives on the topic at hand. It’s why the button features Google’s red, green, blue, and yellow colors — it stands out from a page that’s mostly black and white.

“Fundamentally, we’re trying to build tools that are easy, that people can use to develop their understanding,” Upstill said. “A part of the challenge for people to break out of their bubbles and echo chambers is that it’s just hard; it’s hard work, and we set out to make that easy.”

Pulling together a variety of sources has been part of Google News since its inception. The desktop service launched shortly after the September 11 attacks in 2001, when people were scrambling to find as much information as they could about the tragedy.

“It came to the table with this idea that in terms of understanding a story, you shouldn’t read a single article,” Upstill said. “You should read a set of articles around that story to really position what you’re reading. That is a key message that resonates with people even today, in this age of people having increasingly polarized views.”

“You should read a set of articles around that story to really position what you’re reading.”

Google has been criticized for helping people stay in their bubbles. Search results are personalized based on location and previous searches, and people end up seeing what they want to see rather than the full picture. Upstill said Google isn’t in the business of censorship, and “in Search, if you come in and say ‘give me the fake news publication’ or type ‘fakenews.com,’” it will show up. But with Google News, Upstill said you shouldn’t find disreputable sources.

The new Google News app is currently rolling out on both Android and iOS, and the desktop redesign will go live early next week. Both will share the same features, though the desktop version will use a different layout.

Julian Chokkattu
Former Digital Trends Contributor
Julian is the mobile and wearables editor at Digital Trends, covering smartphones, fitness trackers, smartwatches, and more…