
Like an A.I. acid trip, this neural net rebuilds reality with flowers and fire


When computers get creative, the results are frequently fascinating, as a new project by artist and machine learning Ph.D. student Memo Akten admirably demonstrates. A bit like Google’s Deep Dream image generator, Akten’s work applies artificial neural networks to create some unusual visual effects. His “Learning To See” project uses image-recognition neural nets to interpret what they see on a live video feed. The twist? He trained each of his neural networks exclusively on still images of water, sky, flowers, or fire, so that regardless of what image they actually see, they interpret it as waves crashing, fires roaring, or flowers growing.


“In some ways, this was a response to the binary polarization that we see politically in the U.K., in the United States, and in Turkey, which is where I’m from,” Akten told Digital Trends. “The idea is that all of us are only capable of seeing the world through the lens of what we’ve seen before. We’re incapable of seeing it through other people’s eyes because we’re so colored by what we know. In the case of this piece of work, the neural network has been trained only on certain images — such as waves or fire or flowers. As a result, everything it sees it can only make sense of based on its own experience.”


It’s an intriguing piece of work, both conceptually and technologically. Particularly impressive from a tech point of view is how fluid the movements look, given that Akten says the neural networks were trained exclusively on still images. By analyzing still images alone, the A.I. has nonetheless built up a fairly accurate idea of how fire burns or water moves.
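Akten has not published the code for this piece, but the core idea — a model that can only render its input using material it has already seen — can be illustrated with a deliberately crude stand-in. The sketch below (hypothetical, not Akten's method, which uses deep neural networks) substitutes every input value with the nearest value from a small "memory" of training data, so any image is forcibly rebuilt from the model's prior experience:

```python
import numpy as np

def reconstruct_through_lens(image, memory_values):
    """Rebuild `image` using only values the model has 'seen' before.

    `memory_values` stands in for a training set drawn from a single
    domain (e.g. only fire images). Each input pixel is replaced by
    its nearest remembered value, so the output can only express
    what the model already knows -- a toy analogue of a network
    trained exclusively on one class of imagery.
    """
    memory = np.asarray(memory_values, dtype=float).ravel()
    flat = np.asarray(image, dtype=float).ravel()
    # Distance from every input pixel to every remembered value,
    # then pick the closest match for each pixel.
    idx = np.abs(flat[:, None] - memory[None, :]).argmin(axis=1)
    return memory[idx].reshape(np.shape(image))
```

Feed it a grayscale frame and a "memory" sampled only from fire imagery, and everything in the frame comes back painted in fire tones, regardless of what was actually there.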

“With any emerging technology, artists will always think about how they can apply it to their own domain, whether that’s painting, dance, performance, or whatever else,” Akten continued. “Right now these machine learning technologies are still a bit complex and inaccessible for a lot of people. But there’s a lot of work being done to make these tools into things which can be used by everyone.”

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…
A.I. translation tool sheds light on the secret language of mice

Breaking the communication code

Ever wanted to know what animals are saying? Neuroscientists at the University of Delaware have taken a big leap forward in decoding the sounds made by one particular animal in a way that takes us a whole lot closer than anyone has gotten so far. The animal in question? The humble mouse.

Read more
Deep learning A.I. can imitate the distortion effects of iconic guitar gods

Music making is increasingly digitized here in 2020, but some analog audio effects remain very difficult to reproduce digitally. One of those effects is the kind of screeching guitar distortion favored by rock gods everywhere. Until now, these effects, which involve guitar amplifiers, have been next to impossible to re-create digitally.

That’s now changed thanks to the work of researchers in the department of signal processing and acoustics at Finland’s Aalto University. Using deep learning artificial intelligence (A.I.), they have created a neural network for guitar distortion modeling that, for the first time, can fool blind-test listeners into thinking it’s the genuine article. Think of it like a Turing Test, cranked all the way up to a Spinal Tap-style 11.
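To see why a deep network is needed, it helps to look at the traditional digital shortcut it improves on. The usual approach is static waveshaping, such as the tanh clipper sketched below (a generic textbook technique, not the Aalto researchers' model): it squashes loud samples toward a ceiling, but it is memoryless, so it cannot capture the dynamic, time-dependent behavior of a real tube amplifier that their neural network learns.

```python
import numpy as np

def tanh_distortion(signal, gain=8.0):
    """Classic memoryless waveshaping distortion.

    Multiplying by `gain` pushes the waveform into the saturating
    region of tanh, clipping peaks softly toward +/-1. Every sample
    is processed independently of its neighbors, which is exactly
    the limitation that learned amp models address.
    """
    return np.tanh(gain * np.asarray(signal, dtype=float))
```

Quiet passages pass through nearly unchanged while loud peaks are flattened, which produces the familiar (but rather sterile) digital "fuzz" that blind-test listeners can usually tell apart from a real amp.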

Read more
A.I. upscaling makes this film from 1896 look like it was shot in dazzling 4K

[4k, 60 fps] Arrival of a Train at La Ciotat (The Lumière Brothers, 1896)

Watching old documentary footage gives us a glimpse of life in an earlier era. The effect isn’t totally convincing, though. Whether it’s incorrect projection speeds, decrepit film stock, or just the distinctly non-4K quality of old-fashioned cameras, watching this footage today still feels like, well, you’re watching old footage.

Read more