“We propose to use the vocabulary of hand motions for [device] interactions, creating a generic input device that eliminates the need to interact with a smartphone,” said Project Lead Ivan Poupyrev in a talk at Google I/O here. “It doesn’t have to be a virtual touch pad...your hand can become a variety of controls -- a virtual dial, a slider,” he said.
The team has developed two Soli chips: a 9 mm² device using pulse radar and an 11 mm² chip using continuous-wave radar. Both run at 60 GHz with sub-millimeter accuracy. Soli’s radar illuminates a hand with a broad beam, then treats the received signal as a complex superposition of reflections that captures the hand’s motion.
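To make the superposition idea concrete, here is a toy model (not Soli's actual signal processing; the scatterer ranges and amplitudes are invented for illustration) showing how the received signal can be treated as a complex sum of echoes from points on the hand, and why a 60 GHz carrier with a ~5 mm wavelength gives sub-millimeter phase sensitivity:

```python
import cmath
import math

C = 3e8                    # speed of light, m/s
FREQ = 60e9                # Soli's 60 GHz carrier
WAVELENGTH = C / FREQ      # ~5 mm

def received_signal(scatterer_ranges_m, amplitudes):
    """Model the received signal as a complex superposition of echoes
    from point scatterers on the hand. Each echo's phase is set by its
    round-trip path length (2 * range)."""
    total = 0 + 0j
    for r, a in zip(scatterer_ranges_m, amplitudes):
        phase = 4 * math.pi * r / WAVELENGTH  # round trip = 2r
        total += a * cmath.exp(1j * phase)
    return total

# Shifting one scatterer by just 0.5 mm -- a tenth of the wavelength --
# measurably changes the summed signal's phase and magnitude:
s1 = received_signal([0.10, 0.1020], [1.0, 0.8])
s2 = received_signal([0.10, 0.1025], [1.0, 0.8])
print(abs(s1), cmath.phase(s1))
print(abs(s2), cmath.phase(s2))
```

The takeaway is that sub-wavelength motion of any part of the hand rotates the phase of its echo, so the combined signal encodes fine motion even without resolving individual fingers spatially.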
"From the signal representations, we can extract features that directly measure my hand's characteristics and dynamics,” said Jaime Lien, lead research engineer for Soli. “These features are fed into machine learning algorithms, which interpret them into gesture labels, and now the sensor can tell if I’m wiggling fingers or not,” she said.
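The features-to-labels pipeline Lien describes can be sketched with stand-in components. The features (mean energy, a crude frame-to-frame velocity proxy), the centroid values, and the nearest-centroid classifier below are all hypothetical simplifications of Soli's actual machine-learning stage:

```python
import math

def extract_features(magnitudes):
    """Toy features from a sequence of per-frame signal magnitudes:
    mean energy and mean frame-to-frame change (a crude velocity proxy)."""
    energy = sum(m * m for m in magnitudes) / len(magnitudes)
    diffs = [abs(b - a) for a, b in zip(magnitudes, magnitudes[1:])]
    velocity = sum(diffs) / len(diffs)
    return (energy, velocity)

# Hypothetical centroids, as if learned offline for two gesture labels.
CENTROIDS = {
    "finger_wiggle": (0.5, 0.4),
    "still_hand": (0.5, 0.02),
}

def classify(features):
    """Nearest-centroid stand-in for the ML stage: map a feature
    vector to the closest gesture label."""
    return min(CENTROIDS, key=lambda g: math.dist(features, CENTROIDS[g]))

print(classify(extract_features([0.7, 0.3, 0.8, 0.2, 0.9])))   # finger_wiggle
print(classify(extract_features([0.7, 0.7, 0.71, 0.7, 0.69])))  # still_hand
```

A rapidly fluctuating signal lands near the "finger_wiggle" centroid; a nearly constant one lands near "still_hand", mirroring the wiggling-or-not distinction in the quote.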
Software interprets signals from Soli to understand motions on different planes. Thus someone could change the hour on a smartwatch using a traditional watch-winding gesture from 5 inches above the screen, then use the same gesture at 7 inches above the screen to change the minutes. Soli's machine-learning algorithms can run on a variety of chips that support parallel processing, said ATAP Design Lead Carsten Schwesig.
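The watch-winding example above amounts to range-gated dispatch: the same gesture maps to different controls depending on distance from the screen. A minimal sketch, assuming hypothetical gesture labels and a 6-inch threshold chosen only to match the article's 5-inch and 7-inch example:

```python
def dispatch(gesture, range_inches):
    """Hypothetical mapping (not Soli's actual API): the same
    watch-winding gesture adjusts hours near the screen and
    minutes farther away."""
    if gesture != "watch_wind":
        return None
    if range_inches < 6:
        return "set_hours"
    return "set_minutes"

print(dispatch("watch_wind", 5))  # set_hours
print(dispatch("watch_wind", 7))  # set_minutes
```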
“We like this idea of virtual tools,” Schwesig told EE Times. “Similar to the Jacquard Project, if you imagine this virtual touch pad in your hand or a dial, those are controls you can map to anywhere,” he said.
“It’s difficult now to think [beyond] the phone...[but] I would like to see this in devices that are not dependent on a screen that you look at and you touch,” he added.
ATAP is developing a Soli board with the 60 GHz chip.