
You can now augment Google Lens photo searches with text

Google is looking to improve its search results by combining the power of photos with additional text for context. The new experience, called multisearch, will be available on phones and tablets as part of Google Lens inside the Google app.

Google says the feature combines visual and text searches to deliver the best results possible, even when you can’t describe exactly what you’re trying to find.

Five screenshots that show how to search in Google using multiple elements.

“At Google, we’re always dreaming up new ways to help you uncover the information you’re looking for — no matter how tricky it might be to express what you need,” Google explained of the new multisearch feature. “That’s why today, we’re introducing an entirely new way to search: Using text and images at the same time. With multisearch in Lens, you can go beyond the search box and ask questions about what you see.”

A practical example of where multisearch will be useful is online shopping. Fashionistas may like a particular style of dress but not know what that style is called. And rather than shopping from a catalog that only offers that dress in one specific color, you can snap a picture of the dress and add a search term like green or orange. Google will even suggest similar alternatives in the colors you want.


In this sense, multisearch extends the Google Lens experience: Lens not only identifies what you see, but the additional search text, like the color green, makes your search more meaningful.

“All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways,” Google said of the technology powering multisearch. “We’re also exploring ways in which this feature might be enhanced by MUM — our latest AI model in Search — to improve results for all the questions you could imagine asking.”

To begin your multisearch experience, you’ll need the Google app, which is available as a free download on iOS and Android devices. After you download the app, launch it, tap the Lens icon, which resembles a camera, and snap a picture or upload one from your camera roll to begin your search. Next, swipe up and tap the plus (+) icon to add text to your search.

Some ways to use the new multisearch tool include snapping a picture of your dining set and adding the term “coffee table” to your search to find matching tables online, or capturing an image of your rosemary plant and adding the term “care instructions” to find out how to plant and care for rosemary, Google said.

Multisearch is available now as a beta experience within the Google app. Be sure to keep your Google app updated for the best results.

Chuong Nguyen