Google I/O: Project Soli's radar-based gesture control could redefine wearable interfaces

Google's ATAP division has announced Project Soli, a radar-based sensing technology that could completely redefine the way we interact with smartwatches and other wearables.


The Google I/O 2015 developer conference started on May 28 with announcements of updates to the Android operating system, a new Google Photos app, updates to Google Maps and Google Now, and more. While those announcements didn't strike us as mind-blowing and left us wanting more, the Advanced Technology and Projects (ATAP) session on May 29 certainly managed to impress. Google's ATAP division, best known for Project Ara and Project Tango, has announced Project Soli, which could redefine the way we interact with smartwatches and other wearables.

Project Soli is Google ATAP's attempt to do away with external input devices and make the human body the only input device. As devices get smaller, interacting with them through a display becomes cumbersome. Project Soli uses your hand and finger movements to interact with wearables and other devices with small screens.

Gesture control is already a feature on most mobile devices. The most popular examples are drawing a letter on the lock screen to jump into a particular app, or flashing a V sign to auto-fire the camera shutter. Another approach is waving your hand in front of the phone's front-facing camera, which identifies the gesture and acts accordingly. We all know how frustrating it can be to repeat a gesture over and over before it registers.

Using finger gestures to simulate call-to-action functions (Image: ATAP)

What sets Project Soli apart is that it employs radar to detect natural hand and finger motions. According to ATAP, the radar-based sensor is capable of tracking sub-millimetre motions at high speed and accuracy. The Soli radar chip works within the 60GHz radar spectrum and has a refresh rate of 10,000 frames per second.
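
ATAP didn't spell out why a 60GHz radar can resolve sub-millimetre motion, but a back-of-the-envelope calculation makes it plausible: at 60GHz the wavelength is only about 5mm, so even tiny displacements show up as measurable phase shifts in the reflected signal. A quick sketch of the numbers (the factor of 4π below is the standard round-trip phase relation for radar):

```python
# Back-of-the-envelope: why 60 GHz radar can resolve sub-millimetre motion.
# Standard radar relation: a target displacement d shifts the echo phase by
# phi = 4 * pi * d / wavelength (the 4*pi accounts for the round trip).
import math

C = 3e8      # speed of light, m/s
FREQ = 60e9  # Soli's operating frequency, Hz

wavelength = C / FREQ  # ~5 mm
# Displacement that produces just a 1-degree phase shift in the echo:
d_per_degree = (math.radians(1) * wavelength) / (4 * math.pi)

print(f"wavelength: {wavelength * 1e3:.1f} mm")                    # 5.0 mm
print(f"motion per 1 deg of phase: {d_per_degree * 1e6:.1f} um")   # ~6.9 um
```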

It is capable of detecting fine motions such as sliding your thumb along your index finger (to simulate turning a volume knob up or down) or tapping the tips of the index finger and thumb together (to simulate pressing a button). As seen in the video above, with just two fingers you can simulate multiple actions, each of which can be mapped to a particular function.
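
ATAP hasn't published how these mappings are wired up, but conceptually each recognised micro-gesture becomes an input event that drives a "virtual tool". Here is a minimal Python sketch of that idea; the gesture names, event structure and scaling factor are all illustrative, not from any real SDK:

```python
# Hypothetical sketch: turning recognised Soli micro-gestures into UI actions.

class VolumeControl:
    """A continuous control driven by the 'virtual dial' gesture."""
    def __init__(self):
        self.level = 50  # percent

    def on_thumb_slide(self, displacement_mm: float):
        # Sliding the thumb along the index finger acts like a volume knob:
        # map millimetres of slide to a change in level (scaling is invented).
        self.level = max(0, min(100, self.level + displacement_mm * 5))
        print(f"volume -> {self.level:.0f}%")

class ButtonControl:
    """A discrete control driven by the 'virtual button' gesture."""
    def on_finger_tap(self):
        # Tapping thumb and index fingertip together acts like a button press.
        print("button pressed")

def dispatch(event: dict, volume: VolumeControl, button: ButtonControl):
    # Route each recognised gesture event to the matching virtual control.
    if event["gesture"] == "thumb_slide":
        volume.on_thumb_slide(event["displacement_mm"])
    elif event["gesture"] == "finger_tap":
        button.on_finger_tap()

vol, btn = VolumeControl(), ButtonControl()
dispatch({"gesture": "thumb_slide", "displacement_mm": 2.0}, vol, btn)  # volume -> 60%
dispatch({"gesture": "finger_tap"}, vol, btn)                           # button pressed
```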

Project head Ivan Poupyrev demoed controlling a smartwatch without touching its display, making gestures a short distance away from the device. For instance, he simulated rotating the watch crown to change the time; the radar-based sensor picked up the gesture and updated the display in real time.

The radar module which went from being the size of a console to a tiny chip in 10 months (Image: ATAP)

Project Soli aims to work with tiny devices, taking in finger gestures from a distance or even through a surface. ATAP spent over 10 months shrinking the radar implementation from a console-sized box to something smaller than a coin. The radar tracks the position and motion of your fingers, and specific natural finger motions produce distinctive signal patterns.
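
ATAP hasn't detailed its recognition pipeline, but the underlying idea, that distinct finger motions produce distinctive signal patterns, can be sketched. Assuming per-frame estimates of range and radial velocity (both standard radar outputs), a crude rule-based classifier might look like this; all thresholds are invented for illustration:

```python
# Crude, hypothetical sketch of recognising gesture signatures in radar frames.
from dataclasses import dataclass

@dataclass
class RadarFrame:
    range_m: float       # distance to the dominant reflector
    velocity_mps: float  # radial velocity (positive = moving away)

def classify(frames: list[RadarFrame]) -> str:
    velocities = [f.velocity_mps for f in frames]
    # A fingertip tap shows up as a brief, sharp approach-then-retreat spike.
    if max(velocities) > 0.3 and min(velocities) < -0.3:
        return "finger_tap"
    # A thumb slide shows up as sustained, low-magnitude motion in one direction.
    if all(abs(v) < 0.1 for v in velocities) and sum(velocities) != 0:
        return "thumb_slide"
    return "none"

# A short window of frames resembling a tap: fast approach, then fast retreat.
tap_window = [RadarFrame(0.10, -0.5), RadarFrame(0.09, 0.0), RadarFrame(0.10, 0.5)]
print(classify(tap_window))  # finger_tap
```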

Google is releasing a Project Soli API to developers so they can come up with their own applications and their own ways of interpreting the data from finger manipulations. Soli is expected to show up inside devices later this year, though Google hasn't given an exact date.
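
The API itself wasn't public at the time of the announcement, so any concrete code is guesswork. As a rough illustration, an event-driven interface of the kind gesture SDKs typically expose could let developers subscribe to recognised gestures like this (every name below is hypothetical):

```python
# Entirely hypothetical illustration of an event-driven gesture API; Project
# Soli's actual API was not public at announcement time.

class SoliSensor:
    """Stand-in for a Soli device handle; dispatches gesture callbacks."""
    def __init__(self):
        self._handlers = {}

    def on(self, gesture: str, handler):
        # Register a callback for a named gesture.
        self._handlers.setdefault(gesture, []).append(handler)

    def _emit(self, gesture: str, **data):
        # Would be called by the (imaginary) driver on gesture recognition.
        for handler in self._handlers.get(gesture, []):
            handler(**data)

sensor = SoliSensor()
sensor.on("dial_turn", lambda degrees: print(f"crown rotated {degrees} deg"))
sensor.on("button_press", lambda: print("select"))

# Simulate the driver recognising two gestures:
sensor._emit("dial_turn", degrees=15)  # crown rotated 15 deg
sensor._emit("button_press")           # select
```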

From the demos seen in the video above, Project Soli points to a fresh way of looking at user interfaces. There have certainly been attempts in the past, most notably Leap Motion and Microsoft's Kinect, which aim to do away with traditional interfaces such as the keyboard and mouse for PCs and controllers for gaming respectively. Intel's RealSense 3D camera module is also capable of letting you interact with your environment using gestures.

But those implementations are optimised for a larger field of view. At the scale of wearables, gestures need to be detected over a very small area, and Project Soli offers a way to tackle that problem. A consumer product built on it is still some time away, but it is certainly something worth being excited about.

So while the day-one keynote may not have been all that exciting, Google's ATAP session has shown that the Mountain View giant definitely has a few tricks up its sleeve.

