Come See AllSee, UW’s New Low-Power Gesture Control System

The chorus of “Hotel California” sings out from the mobile phone in Bryce Kellogg’s pocket. He waves his hand to turn down the volume. He flicks his fingers a couple of times and “Stairway to Heaven” plays. The phone never leaves his pocket.

It looks like a parlor trick, but the underlying technology, developed by a team of University of Washington electrical engineers and computer scientists, has potential to enable natural gesture control of the broadest range of electronic devices, even those with no batteries.

The technology—a prototype of which was attached to Kellogg’s off-the-shelf Android phone—is called AllSee. Kellogg and fellow doctoral student Vamsi Talla refined it over the course of the last year with Shyam Gollakota, a UW assistant professor of computer science and engineering. They’re demonstrating it this week at the Symposium on Networked Systems Design and Implementation, an international conference being held in Seattle. They’ve filed for a provisional patent with help from the UW Center for Commercialization, and they’ve already fielded multiple inquiries from phone manufacturers.

In the UW Networks and Wireless Lab, Kellogg, 24, and Talla, 26, show me how AllSee works. They’ve set up a prototype receiver; an RFID reader, which provides a radio signal for demonstration purposes; an oscilloscope that shows how the signal appears to the receiver; and a computer that displays the receiver’s interpretation of the gestures being performed.

Talla, left, and Kellogg.

Kellogg holds his hand near the receiver. The scope shows small peaks and troughs in the signal’s amplitude, indicating the power of the radio signal being reflected off his hand. He moves his hand closer and the peaks and troughs on the scope grow bigger. An image of a “push” gesture pops up on the screen.

“That’s how we tell the difference, for example, between a push gesture and a pull gesture,” he explains. “We go small to big, big to small. Just looking at the power of the reflected signal, we can detect which gesture you’re doing.”
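
To make that concrete, here is a minimal, illustrative sketch in Python of how a rising or falling amplitude envelope could be mapped to push and pull gestures. It assumes the receiver exposes a list of amplitude samples; the thresholds and function names are hypothetical, not AllSee’s actual code.

```python
# Minimal sketch of amplitude-based push/pull classification.
# Assumes the receiver exposes a list of signal-amplitude samples;
# thresholds and window sizes are illustrative, not AllSee's.

def classify_push_pull(amplitudes, threshold=0.1):
    """Label a gesture by comparing reflected-signal power
    at the start and end of the capture window."""
    n = len(amplitudes)
    if n < 2:
        return "none"
    window = max(1, n // 4)
    start = sum(amplitudes[:window]) / window    # early average power
    end = sum(amplitudes[-window:]) / window     # late average power
    if end - start > threshold:
        return "push"   # reflections grow: hand moving closer
    if start - end > threshold:
        return "pull"   # reflections shrink: hand moving away
    return "none"

# A hand approaching the antenna makes the envelope rise, and vice versa.
print(classify_push_pull([0.2, 0.25, 0.4, 0.55, 0.6]))  # -> "push"
print(classify_push_pull([0.6, 0.5, 0.35, 0.25, 0.2]))  # -> "pull"
```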

Though we’re still getting used to touchscreen-based gestures, many devices can already detect hand or body motions. Prominent examples on the market include the Kinect from Microsoft, the Leap Motion controller, and Samsung Air Gesture. The Kinect, a sophisticated controller sold as an accessory for Xbox game consoles and Windows PCs, includes a camera, microphone, and infrared sensors. The Leap Motion connects to a Mac or PC and interprets hand and finger gestures using infrared sensors. Air Gesture, too, relies on infrared proximity and gesture sensors.

Because all three rely on visible or infrared light, they require that gestures be made within sight of the device—for the Samsung technology, gestures can be no more than 3 inches away, according to the company’s how-to guide. AllSee, by contrast, uses reflected ambient radio waves, so it can detect gestures even when an object or material is in the way.

But it’s AllSee’s power efficiency that really makes it stand out. The Kinect, which performs skeletal tracking of multiple players simultaneously along with facial and voice recognition, hooks into a larger, non-mobile device such as a PC or game console, and therefore has far more power and computational resources to work with. The same is true of the Leap Motion device, which connects via USB to a computer and is designed to sit on a desktop. Samsung Air Gesture, a feature of the company’s Galaxy S4 phone, is perhaps the most apt comparison to AllSee.

But, Kellogg notes, “If you leave it on all the time, it just sucks your battery. What we’re trying to do is the same type of gesture recognition, but with passive components.”

AllSee consumes only 28 microwatts while recognizing an average of 15 gestures per minute—orders of magnitude less than comparable systems. That figure includes the low-power controller that runs a simple algorithm to convert eight distinct gestures into instructions for the device, Talla says. (The Samsung technology recognizes only five gestures.) AllSee has no radio, just an antenna to receive ambient television or RFID signals. It even draws on another UW innovation to gather small amounts of power from the signals themselves.
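
Those figures allow a quick back-of-the-envelope calculation. Using only the numbers above, a short Python snippet puts AllSee’s energy budget at roughly 112 microjoules per gesture:

```python
# Back-of-the-envelope energy budget, using only the reported figures:
# 28 microwatts sustained, 15 gestures per minute on average.
power_w = 28e-6           # total draw, in watts
gestures_per_s = 15 / 60  # average gesture rate, per second

energy_per_gesture = power_w / gestures_per_s  # joules per gesture
print(f"{energy_per_gesture * 1e6:.0f} microjoules per gesture")  # 112
```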

The technology, which costs less than $1, presents the possibility of outfitting virtually any device—regardless of whether it has batteries—with gesture controls. “You don’t have to worry about power if you have this kind of technology for gesture recognition,” Talla says.

Kellogg and Talla came together after working on earlier projects with Gollakota, who heads the UW Networks and Wireless Lab. Computer science professor Ed Lazowska describes Gollakota as “an idea factory.” He and his students have won acclaim for recent research projects including WiSee, which measures Doppler shifts in wireless network signals as a medium for gesture controls, and Ambient Backscatter, which harvests small amounts of power from radio signals for communication devices and is being applied in AllSee. The latest project is “TongueSee: Non-Invasive Tongue Machine Interface.” It is meant to help people with conditions such as tetraplegia and amyotrophic lateral sclerosis (Lou Gehrig’s Disease) control devices using the tongue, Gollakota says.

Mobile phones are obvious candidates for the AllSee technology, but the team envisions broader applications.

An AllSee prototype.

“Gestures are the most natural way of interacting with objects,” Talla says.

As more objects are equipped with sensors, smarts, and communications capabilities—the foretold technology upheaval broadly referred to as the Internet of Things—controlling them could be a barrier, Talla says. Take Bluetooth-equipped activity tracking products like Fitbit or Jawbone Up, which are essentially small, power-constrained wearable sensors. “Right now, the only way to interact with them is use your phone and send a message,” Talla says. But imagine waving at the start of your run to instruct your fitness monitor to start measuring. “That’s a best-case fit for AllSee devices,” he says.

In a video demonstrating AllSee (the same one in which Kellogg shows off the music controls), Talla summons a small robot that’s behind him with a simple “come here” wave. That raises a question: As we get to a more fully realized Internet of Things, where lots of objects are controlled this way, can the technology distinguish an individual’s unique intent—that Talla was summoning that particular robot, and not trying to turn up the heat on a smart thermostat or signaling a blaring smoke alarm to turn itself off?

“Our technology looks at specific signal changes, so it’s actually very directional,” Talla says. Gestures aimed straight at the robot, in other words, would be received by it more strongly. But he acknowledges that there will need to be improvements, such as increasing the number of gestures the system recognizes and improving specificity with triangulation.

Security is another issue. You don’t want someone to be able to change your music just by waving at your pocket.
