The project uses Google Soli, the search giant’s purpose-built interaction sensor that uses radar for motion tracking of the human hand. While Google is using it for proper grownup applications like object recognition and next-gen wearables, [Design I/O]’s project offers something a bit more musical.
To pull this off, the project also relies on Wekinator, a free machine learning tool used to train and recognize the violin gestures, as well as openFrameworks, an open-source C++ toolkit used to communicate with the Soli sensor and play the sweet, sweet music. Once all of these pieces are put together, the result is a device capable of working out whether or not a violin gesture is being made and playing back a violin sample accordingly.
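The glue between these pieces is OSC (Open Sound Control) messaging: by default, Wekinator listens for feature vectors at the OSC address `/wek/inputs` on UDP port 6448 and sends its trained model's output to `/wek/outputs`. A rough, dependency-free sketch of that flow follows — in Python rather than the project's actual C++/openFrameworks code, with invented feature names and a hypothetical sample-triggering function, so treat it as an illustration of the idea rather than the project's implementation:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *features: float) -> bytes:
    """Encode an OSC message whose arguments are 32-bit big-endian floats."""
    packet = osc_pad(address.encode())
    packet += osc_pad(("," + "f" * len(features)).encode())  # type-tag string
    for value in features:
        packet += struct.pack(">f", value)
    return packet

# Hypothetical per-frame Soli feature values (e.g. range, velocity, energy):
packet = osc_message("/wek/inputs", 0.42, -0.10, 0.87)
# This packet could now be sent over UDP to ("localhost", 6448),
# where a Wekinator classifier is listening.

def on_wekinator_output(class_label: int) -> str:
    """Hypothetical handler: map the trained model's class to an action."""
    return "play violin sample" if class_label == 1 else "stay silent"
```

In the real project, openFrameworks' `ofxOsc` addon would handle the message encoding and UDP transport, and the handler would trigger actual audio playback instead of returning a string.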
“This is, I believe, the first time someone has made the small violin gesture actually play music,” [Design I/O] partner Theodore Watson tells Digital Trends. “When we first saw the Soli demo last year, the gestures used in the demo reminded us so much of the smallest violin joke that we thought we would try it out if we ever got hold of a dev kit.”
Watson says he would describe the tiny violin as more of a “speed project” than the somewhat more serious interactive projects he normally works on. But he says that he and his [Design I/O] collaborators “thought it would be a fun and accessible way for people to understand what types of things are possible with this incredibly innovative technology.”
It’s hard to disagree with that logic — particularly when the end result is as cool as this!