
Future versions of Google Now will work offline and learn your preferences

Google Now has so far only worked when you’re online. But new documents published by Google reveal that Google Research is working on an update that would move voice commands, dictation, and the rest of the speech recognition system onto the user’s device rather than keeping it in the cloud. Essentially, Google Now will soon be able to work without an Internet connection.

Google often publishes documentation related to upcoming projects, and these new documents hint at its plans to seriously boost the capabilities of your phone’s personal assistant.


To train the new system, the team used 2,000 hours of recorded and anonymized Google voice search traffic, totaling as many as 100 million requests. The team also mixed in background noise taken from YouTube to bring the training data closer to real-world listening conditions.
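The documents don’t spell out how the noise was mixed in, but overlaying background audio onto clean speech at a chosen signal-to-noise ratio is the standard approach to this kind of augmentation. Here’s a minimal sketch in Python, assuming both clips are mono NumPy float arrays; the function name and the 10dB default are illustrative, not Google’s:

```python
import numpy as np

def mix_background_noise(speech: np.ndarray, noise: np.ndarray,
                         snr_db: float = 10.0) -> np.ndarray:
    """Overlay background noise onto a speech clip at a target
    signal-to-noise ratio (in dB)."""
    # Loop or trim the noise so it covers the whole utterance.
    reps = int(np.ceil(len(speech) / len(noise)))
    noise = np.tile(noise, reps)[:len(speech)]

    # Scale the noise so the speech/noise power ratio matches snr_db.
    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2) + 1e-12
    scale = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    return speech + scale * noise
```

Lower SNR values bury the speech deeper in noise, so varying the ratio across the training set teaches the recognizer to cope with everything from a quiet room to a busy street.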

Using a variety of computational models, the team came up with a voice system that not only runs entirely on the user’s smartphone, but runs seven times faster than the current version of Google Now on a Nexus 5. The new version does all this while taking up only 20.3MB of storage, a negligible amount when most phones have at least 8GB or 16GB of onboard storage.

As is the case with a lot of recent Google research, the system is based on machine learning, in which the software learns as it goes, getting acquainted with the user and the user’s preferences.

The system is similar to the previous version of Google Now in that it can handle proper names and other information specific to your device; your set of contacts, for example, will be different from another user’s. When asked to do something like send an email while offline, the system will transcribe the command and store it, then execute it once the user is back online.
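The article doesn’t describe how those commands are held for later, but the behavior amounts to a store-and-forward queue: transcribe locally, persist the command, and replay it when connectivity returns. A rough Python sketch of that pattern, with every name here hypothetical rather than Google’s actual API:

```python
import json
from pathlib import Path

QUEUE_FILE = Path("pending_commands.json")  # hypothetical local store

def queue_command(command: dict) -> None:
    """Persist a transcribed command so it survives until we're online."""
    pending = json.loads(QUEUE_FILE.read_text()) if QUEUE_FILE.exists() else []
    pending.append(command)
    QUEUE_FILE.write_text(json.dumps(pending))

def flush_queue(send) -> None:
    """Replay queued commands once connectivity returns.
    `send` is whatever callable actually performs the action."""
    if not QUEUE_FILE.exists():
        return
    remaining = []
    for cmd in json.loads(QUEUE_FILE.read_text()):
        try:
            send(cmd)              # e.g. actually send the email
        except ConnectionError:
            remaining.append(cmd)  # still offline; keep for next attempt
    QUEUE_FILE.write_text(json.dumps(remaining))
```

A real assistant would use a durable on-device database and a connectivity listener rather than a JSON file, but the shape of the logic is the same.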

It will certainly be interesting to see what the future of Google Now holds. A lack of offline functionality is one of the main frustrations most people have with the current version of the assistant, and offering at least some ability to work offline would be very helpful.

Christian de Looper