The project uses Google Soli, the search giant’s purpose-built interaction sensor, which uses radar to track the motion of the human hand. While Google is aiming it at proper grownup applications like object recognition and next-gen wearables, [Design I/O]’s project offers something a bit more musical.
To do this, it also relies on Wekinator, a free machine learning tool used to train and detect the violin gestures, as well as the open-source C++ toolkit openFrameworks, which handles communication with the Soli and plays the sweet, sweet music. Put all of these pieces together and the result is a device that can work out whether a violin gesture is being made, and play back a violin sample accordingly.
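The pipeline described above — radar features in, a trained classifier in the middle, a violin sample out — can be sketched in a few lines. This is purely illustrative: every name below is hypothetical, and the real project uses openFrameworks (C++) to read Soli features and Wekinator (communicating over OSC) as the classifier, rather than the toy rule used here.

```python
# Illustrative sketch of the gesture -> sound pipeline. All names are
# hypothetical; the actual project sends Soli features to a trained
# Wekinator model via OSC instead of the hard-coded rule below.

from dataclasses import dataclass

@dataclass
class SoliFrame:
    """Hypothetical per-frame radar features from the Soli sensor."""
    energy: float    # overall motion energy near the sensor
    velocity: float  # dominant radial velocity of the hand

def classify(frame: SoliFrame) -> str:
    """Stand-in for the trained Wekinator classifier.

    Toy rule: noticeable energy plus fast back-and-forth finger motion
    is treated as the tiny-violin gesture.
    """
    if frame.energy > 0.2 and abs(frame.velocity) > 0.5:
        return "violin"
    return "none"

def on_frame(frame: SoliFrame, play_sample) -> None:
    """Trigger the violin sample only while the gesture is detected."""
    if classify(frame) == "violin":
        play_sample("tiny_violin.wav")

# Simulate two incoming frames: one violin-like gesture, one still hand.
played = []
on_frame(SoliFrame(energy=0.6, velocity=0.9), played.append)
on_frame(SoliFrame(energy=0.05, velocity=0.0), played.append)
print(played)
```

In the real setup, the interesting part is that the classifier is trained by example in Wekinator rather than hand-written, so the detection rule above would be replaced by whatever mapping the training session produced.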
“This is, I believe, the first time someone has made the small violin gesture actually play music,” [Design I/O] partner Theodore Watson tells Digital Trends. “When we first saw the Soli demo last year, the gestures used in the demo reminded us so much of the smallest violin joke that we thought we would try it out if we ever got hold of a dev kit.”
Watson says he would describe the tiny violin as more of a “speed project” than the somewhat more serious interactive projects he normally works on. But he says that he and his [Design I/O] collaborators, “thought it would be a fun and accessible way for people to understand what types of things are possible with this incredibly innovative technology.”
It’s hard to disagree with that logic — particularly when the end result is as cool as this!