
Google Translate gets conversational with new update

Google has moved a step closer to making us feel like we are really living in the future. Yesterday, the Internet giant announced an update for its Google Translate app that brings live-speech translation to its Android app. Google has tagged the new feature with the decidedly un-futuristic name of “Conversation Mode.”

Real-time translation is activated with the press of a button, after which a phrase is spoken into the device. So if you’re trying to find out what time your bus leaves for Madrid, you simply say “what time does the bus leave” into your device. The sentence is then translated in real time and read back in Spanish so that a bus driver or passerby can hear and respond. A response given in Spanish can then be translated back into English, effectively enabling a two-way conversation to take place without either party having to depart from their native tongue.


Google warns that Conversation Mode is still in its early stages and currently only supports translation between English and Spanish. “Because this technology is still in alpha, factors like regional accents, background noise or rapid speech may make it difficult to understand what you’re saying,” said Awaneesh Verma, product manager at Google, in a blog post. “Even with these caveats, we’re excited about the future promise of this technology to be able to help people connect across languages.”

In addition to the live-translation feature, Google has also updated Translate’s interface. “Today, we’re refreshing Translate for Android with several updates to make the app easier to interact with,” Verma said. “Among other improvements, we’ve created better dropdown boxes to help select the languages you want to translate from and into, an improved input box, and cleaner icons and layout.”

See below for a video of an early demonstration of Conversation Mode in action.

Aemon Malone
Former Digital Trends Contributor
Google Gemini set to close gap on ChatGPT with rumored new feature

The Gemini app offers a whole bunch of useful things, but it's lacking one: video analysis based on uploads from your PC or phone. That might be about to change, though, as an APK teardown reveals that Google is working on a video upload feature. This could soon help Gemini analyze and summarize videos uploaded directly by users; it would also help it rival ChatGPT, which already offers such a feature.

Android Authority went on a deep dive into the APK source code of the Google app beta and came up with some interesting findings. Given that this was found in the official Google app, there's a good chance it'll eventually make it into Gemini, but just to be extra safe, read the following with a little bit of skepticism.

Read more
Google removed a useful but little-known Play Store feature

The most recent update to the Google Play Store app has quietly removed a useful app-sharing feature that you probably didn't know existed. The feature first came onto the scene in 2021 and allowed Android users to use the "Quick Share" option to send apps to others.

With the latest Play Store update (version 45.2.19-31), the feature is officially kaput. If you never used it or knew about it, don't feel bad. App-sharing wasn't widely advertised, and even users who did know rarely used it.

Read more
I saw the new Gemini and Project Astra, here’s why it’s the future

We’re quickly entering the realm of AI that is genuinely useful, and key to this is Project Astra, Google’s new universal AI agent designed to help with everyday tasks. Oppo, Honor, Motorola, and Tecno have all developed new ways for AI to assist you in your daily life, but central to the next generation of artificial intelligence is Astra’s multimodal approach.

The premise is simple: point your phone camera at something and have a live conversation with Google Gemini, where you can ask it questions and have it provide suggestions based on what it is seeing.

Read more