Developed by a team of computer science students at the University of Washington, WiSee is a new technology that uses the Wi-Fi signals within a home to detect human movement. Conceptually similar to Microsoft’s Xbox Kinect, it lets someone raise or lower the volume of music playing in another room, or switch songs, simply by making specific gestures in the air.
Hypothetically, it could be tied into anything that has to do with home automation. Forgot to lock the Wi-Fi connected front door deadbolt? Need to turn off all the lights in the home before going to sleep? Want to turn on the coffeemaker when you wake up in the morning? Just wave your hands in the air and WiSee figures it out.
To translate gestures using a home’s wireless network, the research team created a receiver that constantly measures variations in the frequency of the home’s Wi-Fi signals. The receiver isn’t limited to the home’s router; it also measures signal variation between other devices in the home, such as laptops, smartphones, tablets and other Internet-connected tech.
When someone moves within the home, the receiver watches how the signals change. This pattern of changes is called a “Doppler frequency shift.” While the shifts are very small, they were noticeable enough for the research team to develop an algorithm that detects and interprets them. Specifically, the WiSee system can detect nine different full-body gestures, each of which can be linked to a specific home automation function. In 900 test runs across a two-bedroom apartment and an office space, the team achieved 94 percent accuracy.
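The team hasn’t published its algorithm in this kind of detail, but the core idea can be sketched: motion toward the receiver raises the observed frequency of reflected signals, motion away lowers it, and the sequence of those shifts forms a gesture’s signature. The sketch below is a simplified illustration of that idea in Python; the gesture names, templates and threshold are hypothetical, not WiSee’s actual gesture set.

```python
# Illustrative sketch only; WiSee's real detection pipeline is more complex.
# A moving body Doppler-shifts reflected Wi-Fi signals: motion toward the
# receiver raises the observed frequency, motion away lowers it. Reducing
# noisy frequency readings to a sequence of shift directions gives a crude
# gesture signature.

def doppler_signs(freq_shifts_hz, threshold=0.5):
    """Reduce noisy frequency-shift samples to a direction sequence."""
    signs = []
    for delta in freq_shifts_hz:
        if delta > threshold:
            s = "+"          # motion toward the receiver
        elif delta < -threshold:
            s = "-"          # motion away from the receiver
        else:
            continue         # below the noise floor; ignore
        if not signs or signs[-1] != s:
            signs.append(s)  # collapse consecutive repeats into one segment
    return "".join(signs)

# Hypothetical gesture templates (not WiSee's actual gestures).
GESTURES = {
    "+-": "push",      # toward the receiver, then back
    "-+": "pull",      # away from the receiver, then back
    "+-+-": "wave",    # repeated back-and-forth motion
}

def classify(freq_shifts_hz):
    """Map a stream of frequency shifts to a named gesture, if any."""
    return GESTURES.get(doppler_signs(freq_shifts_hz), "unknown")
```

For example, a push would appear as a brief positive shift followed by a negative one (`classify([2.0, 1.0, -1.0, -2.0])` yields `"push"`), while readings that never cross the noise threshold classify as `"unknown"`.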
Speaking about the WiSee system, team member Qifan Pu said, “This is the first whole-home gesture recognition system that works without either requiring instrumentation of the user with sensors or deploying cameras in every room. By analyzing the variations of these signals over time, we can enable full-body gestures that go beyond simple hand motions.”
The receiver has been built with multiple antennas so it can interpret up to five different users at the same time. In addition, the receiver can’t be activated without first performing a specific gesture sequence. This acts as a form of password protection and prevents someone from activating the system by accident. After the unlocking gesture is performed, the receiver homes in on the general location of the user and looks for gestures to interpret.
The team is continuing to test the WiSee system in multiple environments and will likely tweak it to improve performance over time. The team will present the technology at MobiCom 2013 in Miami, Florida, at the end of September. Members of the project team have worked on similar tracking projects in partnership with Microsoft Research, but those projects focused on tracking movement using radiation from electrical wires and sound waves. WiSee is the first project to allow gesture tracking without any hardware located in the same room as the user.