
Google Brain brings ‘zoom and enhance’ method one step closer to reality

The concept of enhancing a pixelated image isn’t new — “zoom and enhance” is responsible for dozens of criminals being put behind bars in shows like Criminal Minds, but that kind of technology has so far evaded the real world. Well, the boffins over at Google Brain have come up with what may be the next best thing.

The new technology essentially uses a pair of neural networks, which are fed an 8 x 8-pixel image and then create an approximation of what they think the original image looked like. The results? Well, they aren’t perfect, but they are pretty close.


To be clear, the neural networks don’t magically recover detail from the original image; rather, they use machine learning to figure out what they think the original could have looked like. So, using the example of a face, the generated image may not look exactly like the real person, but instead like a fictional face that represents the computer’s best guess. In other words, law enforcement won’t be pulling a suspect’s face out of a blurry reflection in a photo of a number plate just yet, but the technology could help police make a pretty good guess at what a suspect looks like.

As mentioned, two neural networks are involved in the process. The first is called a “conditioning network,” and it basically maps out the pixels of the 8 x 8-pixel image into a similar looking but higher resolution image. That image serves as the rough skeleton for the second neural network, or the “prior network,” which takes the image and adds more details by using other, already existing images that have similar pixel maps. The two networks then combine their images into one final image, which is pretty impressive.
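
For readers curious about how those pieces fit together, here is a rough sketch in PyTorch. It is not Google’s actual implementation: the 32 x 32 target size, the 256 intensity bins, the layer sizes, and the simplified prior (the real system uses a PixelCNN-style prior with masked convolutions) are all illustrative assumptions.

```python
# Minimal sketch of the two-network idea described above (not Google's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_BINS = 256  # each output pixel channel is predicted as one of 256 values


class ConditioningNetwork(nn.Module):
    """Maps the 8x8 input to per-pixel logits at the 32x32 target resolution."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(64, 64, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.Conv2d(64, 3 * NUM_BINS, 1),  # logits for every channel of every pixel
        )

    def forward(self, low_res):            # low_res: (B, 3, 8, 8)
        return self.net(low_res)           # (B, 3*NUM_BINS, 32, 32)


class PriorNetwork(nn.Module):
    """Stand-in for the prior: predicts logits for each pixel from the
    high-resolution image generated so far (masked convolutions omitted)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * NUM_BINS, 1),
        )

    def forward(self, high_res_so_far):    # (B, 3, 32, 32)
        return self.net(high_res_so_far)   # (B, 3*NUM_BINS, 32, 32)


def combined_distribution(low_res, high_res_so_far, cond_net, prior_net):
    """Adds the two networks' logits, so the per-pixel distribution reflects
    both the 8x8 evidence and the learned prior over similar images."""
    logits = cond_net(low_res) + prior_net(high_res_so_far)
    b, _, h, w = logits.shape
    logits = logits.view(b, 3, NUM_BINS, h, w)
    return F.softmax(logits, dim=2)         # probability over 256 values per channel


if __name__ == "__main__":
    cond_net, prior_net = ConditioningNetwork(), PriorNetwork()
    low_res = torch.rand(1, 3, 8, 8)        # the 8x8 input photo
    canvas = torch.zeros(1, 3, 32, 32)      # pixels generated so far
    probs = combined_distribution(low_res, canvas, cond_net, prior_net)
    print(probs.shape)                      # torch.Size([1, 3, 256, 32, 32])
```

In the real system, the prior generates the high-resolution image one pixel at a time, sampling each value from the combined distribution before moving on to the next, which is how the two networks end up producing a single final image.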

It is likely we will see more image-processing technology like this in the future. Artificial intelligence is getting pretty good at generating images, and Google and Twitter have both put a lot of research into image enhancement. At this rate, maybe crime-show tech will one day become reality.

Christian de Looper