
Google supercharges Search with new features

Search is Google's biggest product, and at I/O, the company is pushing the service to new levels. Google previously introduced Multisearch, which lets you, for example, take a photo of an object and build a search query around it. Now, the company has announced that Multisearch will gain a "near me" variable, and that a Scene Exploration feature will roll out later this year.

Multisearch's "near me" variable lets you search with a photo of a plate of food, learn what the dish is called, and find somewhere nearby to eat it. It's like Shazam, but for Search queries.

Image: Google Multisearch "near me" feature.

Basically, you can search with a photo and a question at the same time. The feature works with almost any image: snap a photo of a dish, add "near me," and Search will surface restaurants that serve it. To find a match, it scans photos contributed to Google Maps. The feature rolls out in English later this year, with other languages to follow.


Another Search feature on the way is Scene Exploration, which lets you pan your camera across a wider scene and instantly glean insights about multiple objects at once. You can scan an entire shelf with your camera, and Search will display helpful info overlaid on the objects.

Image: Google Multisearch Scene Exploration.

The feature uses computer vision to connect the multiple frames that make up the scene and identify all the objects within it, while Google's Knowledge Graph surfaces the most helpful results. Google cited the example of scanning a shelf of chocolates to see which bars are nut-free. The result is a sort of AR-style overlay across the entire scene in front of you.

Prakhar Khanna
Prakhar Khanna is an independent consumer tech journalist. He contributes to Digital Trends' Mobile section with features and…