The video, titled NYC Flow, was shot at 240 frames per second, producing slow-motion footage that already had a dreamy feel. From there, Krivoruchko ran it through the algorithm, which transformed it into a living painting.
The algorithm itself comes from Manuel Ruder and his team at the University of Freiburg in Germany, who described it in a published paper. It copies a style from an input source, such as a painting, and applies it to a video. The technology is, in fact, very similar to what's employed by Prisma, except that the app doesn't handle video yet.
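At the core of this family of style-transfer methods (which Ruder's work extends from still images to video) is a "style loss" that compares the Gram matrices of feature maps: correlations between channels capture texture and brushstroke statistics while discarding spatial layout. The sketch below is a minimal illustration of that idea, not Ruder's implementation; the random arrays stand in for the CNN activations a real system would use.

```python
import numpy as np

def gram_matrix(features):
    """Channel-by-channel correlations of a feature map.

    features: array of shape (channels, height, width).
    The Gram matrix encodes which channels fire together,
    i.e. the texture ("style"), independent of where.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(features_a, features_b):
    """Mean squared difference between the two Gram matrices."""
    diff = gram_matrix(features_a) - gram_matrix(features_b)
    return float(np.mean(diff ** 2))

# Toy demo: random "activations" in place of real CNN features.
rng = np.random.default_rng(0)
a = rng.standard_normal((8, 16, 16))
b = rng.standard_normal((8, 16, 16))
print(style_loss(a, a))      # identical features give zero loss
print(style_loss(a, b) > 0)  # differing textures give positive loss
```

A full style-transfer system would optimize the output image (or video frame) to minimize this loss against the style source while also matching the content of the input frame; the video variant adds a temporal term to keep consecutive frames consistent.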
Should you be interested in trying out this technique, head over to Ruder's GitHub repository, where the open-source code can be downloaded. Take heed, however: you will need a powerful computer running Ubuntu with a hefty graphics card. Four gigabytes of video memory are required just to output a video at a resolution of 450 x 350 pixels.
The algorithm was built to automate rotoscoping, replacing the time-intensive task of having human artists hand-paint every frame of a film to turn it into an animation. Rotoscoping is an old technique, with perhaps its most notable use being in the film A Scanner Darkly.
NYC Flow is part of Krivoruchko’s Deep Slow Flow project on Instagram, which explores the idea of using neural network code in filmmaking.