Want to switch off the living room lights from bed, change channels while washing dishes, or turn the heat up from the couch? A team at the University of Washington has rigged a standard Wi-Fi home network to detect your movements anywhere in the home and convert them into commands to control connected devices.
Gesture recognition is the latest fad in games and tech, but even the newest systems require depth-sensing cameras or other special hardware. Microsoft’s new Kinect, for instance, uses a photon-timing method called “time of flight” sensing that, until the Kinect was announced, was limited to high-tech laboratories. And the Kinect isn’t small, either.
UW computer science students, led by assistant professor Shyam Gollakota, looked at the gesture-detection puzzle another way — specifically, how people affect the environment they’re already in.
Our bodies distort the Wi-Fi signals we use to beam information to and from our laptops and phones. By watching those signals very closely, the team could determine not just what room you’re in, but where you’re standing and how you’re moving your body. They call the system WiSee.
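The core idea can be illustrated with a toy sketch. This is not the WiSee system itself (which analyzes fine-grained Doppler shifts, not signal strength), but a minimal, hypothetical example of the same principle: a moving body makes a wireless signal fluctuate, and watching those fluctuations reveals motion. The function name, threshold, and sample values are all illustrative assumptions.

```python
# Toy illustration (not the WiSee method): flag motion when recent
# Wi-Fi signal-strength (RSSI) readings vary more than a threshold.
from statistics import pvariance

def motion_detected(rssi_samples, threshold=2.0):
    """Return True if RSSI readings (in dBm) fluctuate enough to
    suggest a body is moving through the signal path.

    threshold is an illustrative value, not a calibrated one."""
    return pvariance(rssi_samples) > threshold

still = [-40.1, -40.0, -40.2, -40.1, -40.0]   # idle room: nearly flat
moving = [-40.0, -43.5, -38.2, -44.1, -39.0]  # person walking by: noisy
```

A real system needs far more than a variance check, but the sketch captures why no extra sensors are required: the router's own signal doubles as the sensor.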
“By analyzing the variations of these signals over time, we can enable full-body gestures that go beyond simple hand motions,” said Qifan Pu, a visiting student on the UW team, in a video outlining the work.
That’s no easy task: the Doppler shift that our moving bodies impart to the frequency of Wi-Fi signals is minuscule, making reliable measurement with consumer-grade hardware difficult. But the WiSee team worked it out.
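Just how minuscule is that shift? Standard physics gives the answer: a reflector moving at speed v shifts a reflected signal of frequency f by roughly 2vf/c. The quick calculation below plugs in a 2.4 GHz Wi-Fi carrier and a leisurely 0.5 m/s hand motion; the specific speed is an illustrative assumption, not a figure from the team.

```python
# Doppler shift of a Wi-Fi signal reflected off a moving body:
# delta_f ≈ 2 * v * f / c  (factor of 2 because the wave travels
# to the body and back, shifting on both legs).

C = 3e8          # speed of light, m/s
F_WIFI = 2.4e9   # 2.4 GHz Wi-Fi carrier frequency, Hz
v = 0.5          # assumed gesture speed, m/s

delta_f = 2 * v * F_WIFI / C
print(delta_f)   # 8.0 Hz
```

An 8 Hz wobble on a 2.4 billion Hz carrier is a change of a few parts per billion, which is why detecting it with ordinary hardware is such a feat.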