
A.I. and Google News: The push to broaden perspectives

Sundar Pichai stands in front of a Google logo at Google I/O 2021.


There is no editorial team at Google News. There is no building filled with hundreds of moderators monitoring the thousands of stories hitting the web every second, making sure the full story is presented. Instead, Google uses artificial intelligence algorithms, along with partnerships with fact-checking organizations, to surface headlines from credible, authoritative sources.


“Humans are generating the content,” Trystan Upstill, Google News engineering and product lead, told Digital Trends. “We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence. In a way, the A.I. is controlling this fire hose of human stuff going on.”

“We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence.”

A.I. is a big part of the redesigned Google News app, which was recently announced at the annual Google I/O developer conference in Mountain View, California. The algorithms filter or demote stories after detecting the spread of misinformation, and they also understand terms and fragments of text coming through the news cycle, aligning them with fact checks from partner organizations.

But one of the A.I.’s main tasks is to provide a full picture of major, nuanced stories through a feature called “Full Coverage.” It’s a small button you can press on stories, which will lead you to similar articles from a variety of publications — including ones you do not follow or may not like. The main section of Google News shows content tailored to you, but “Full Coverage” does not respect your likes and dislikes — everyone sees the same information pulled together by the A.I.

That includes modules for fact checks, frequently asked questions, a timeline of events, international coverage, and even tweets from primary sources. Everyone reading “Full Coverage” sees the same information, which Upstill said is crucial.

“The core premise we have is that in order to have a productive conversation about something, everyone basically needs to be able to see the same thing,” he said.

While the breadth of data the algorithms pull is impressive, it’s entirely on the user to click on the small “Full Coverage” button to read more perspectives on the topic at hand. It’s why the button features Google’s red, green, blue, and yellow colors — it stands out from a page that’s mostly black and white.

“Fundamentally, we’re trying to build tools that are easy, that people can use to develop their understanding,” Upstill said. “A part of the challenge for people to break out of their bubbles and echo chambers is that it’s just hard; it’s hard work, and we set out to make that easy.”

Pulling together a variety of sources has been part of Google News since its inception. The desktop service began shortly after the 9/11 attacks in 2001, when people were scrambling to find as much information as they could about the tragic event.

“It came to the table with this idea that in terms of understanding a story, you shouldn’t read a single article,” Upstill said. “You should read a set of articles around that story to really position what you’re reading. That is a key message that resonates with people even today, in this age of people having increasingly polarized views.”

“You should read a set of articles around that story to really position what you’re reading.”

Google has been criticized for helping people stay in their bubbles. Search results are personalized based on location and previous searches, and people end up seeing what they want to see rather than the full picture. Upstill said Google isn’t in the business of censorship, and “in Search, if you come in and say ‘give me the fake news publication’ or type ‘fakenews.com,’” it will show up. But with Google News, Upstill said you shouldn’t find disreputable sources.

The new Google News app is currently rolling out on both Android and iOS, and the desktop redesign will go live early next week. Both will share the same features, but the desktop version will have a different format.

Julian Chokkattu
Former Digital Trends Contributor
Julian is the mobile and wearables editor at Digital Trends, covering smartphones, fitness trackers, smartwatches, and more…
HuggingSnap app serves Apple’s best AI tool, with a convenient twist
HuggingSnap recognizing contents on a table.

Machine learning platform Hugging Face has released an iOS app that will make sense of the world around you as seen by your iPhone’s camera. Just point it at a scene, or snap a picture, and it will deploy an AI to describe it, identify objects, perform translation, or pull out text-based details.
Named HuggingSnap, the app takes a multimodal approach to understanding the scene around you, and it’s now available for free on the App Store. It is powered by SmolVLM2, an open AI model that can handle text, images, and video as input formats.
The overarching goal of the app is to let people learn about the objects and scenery around them, including plant and animal recognition. The idea is not too different from Visual Intelligence on iPhones, but HuggingSnap has a crucial leg-up over its Apple rival.

It doesn’t require internet to work
SmolVLM2 running on an iPhone
All it needs is an iPhone running iOS 18, and you’re good to go. The UI of HuggingSnap is not too different from what you get with Visual Intelligence, but there’s a fundamental difference here.
Apple relies on ChatGPT for Visual Intelligence to work. That’s because Siri is currently not capable of acting like a generative AI tool, such as ChatGPT or Google’s Gemini, both of which have their own knowledge bank. Instead, it offloads all such user requests and queries to ChatGPT.
That requires an internet connection, since ChatGPT can’t work offline. HuggingSnap, on the other hand, works just fine without a connection. Moreover, the offline approach means no user data ever leaves your phone, which is always a welcome change from a privacy perspective.

Gemini is replacing Google Assistant. How will the shift affect you?
Google Assistant and Gemini apps on an Android phone.

The writing has been on the wall for a while, but the shift away from Google Assistant is now official. Google has announced that it will shift users to Gemini as the default AI assistant on their devices in the coming months. Once that happens, they will no longer be able to access the Google Assistant.
At the moment, you can switch to Google Assistant as the default option on your Android phone, even on newer phones that come with Gemini running out of the box. In addition to phones, Google will be giving a similar treatment to smartwatches, Android Auto, tablets, smart home devices, TVs, and audio gear.
“We're also bringing a new experience, powered by Gemini, to home devices like speakers, displays, and TVs,” says Google, without sharing a specific time frame for the transition. What happens to Google Assistant following the transition? Well, it will be removed from devices and will no longer be available to download from app stores.

Talking about apps, Gemini can already interact with a wide range of Google’s own as well as a few third-party apps. Users can ask it to perform chores across different products, without ever having to open those apps. In addition to in-house apps such as Docs, Drive, and Gmail, the Gemini assistant can also perform tasks in third-party apps such as WhatsApp and Spotify, alongside a bunch of Samsung apps.

Android is prepping notification summaries. Let’s hope it’s better than iOS
Android notification summaries concept.

So far, Google has done an admirable job of putting generative AI tools on Android smartphones. Earlier today, it announced further refinements to how users interact with the Gemini AI assistant and extended a few freebies, too. Now, Google seems to be chasing an AI tool that has worked poorly on iPhones.
The folks over at AndroidAuthority took a peek at the code of Android 16’s latest beta update and found mention of “notification summaries.” To enable this feature, users will have to flip a dedicated toggle under the Notifications dashboard of the Settings app.

A thoughtful approach for Android
Thankfully, users will be able to exclude apps whose notifications they don’t want summarized. An analysis of the strings suggests that the feature will only summarize notifications with conversational content, such as messages, and not other app alerts.
This is a thoughtful strategy, and it will likely avoid the mess caused by summarized notifications in the Apple Intelligence bundle. Notification summaries are a useful way to catch up on the chatter in a buzzy group, like friends or workplace chats.
