Google Project Soli Is Reshaping Gesture Recognition


Google Project Soli is yet another ambitious venture from the folks behind the self-driving car and the effort to bring Wi-Fi to the world. Currently we have several ways of interacting with our devices, from touchscreens to physically turning, twisting, and pushing buttons. Some devices have proximity sensors that let you page back and forth or scroll up and down with a gesture over the screen. Google Project Soli is reshaping the way gestures are interpreted by the device using something we’ve had for a very long time: radar. Using radar, Google is able to track your fingers and recognize programmed gestures to perform an action like pinching to zoom or turning the volume up and down. Check out the video below to find out a bit more about Google Project Soli.
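To give a rough idea of the concept, here is a minimal sketch of how a recognized gesture might be routed to a device action, like the virtual volume dial Google demonstrates. Everything here is hypothetical for illustration; the gesture names, handlers, and the `VolumeControl` class are our own assumptions and not Soli’s actual API.

```python
# Hypothetical sketch: mapping radar-recognized gestures to device actions.
# Gesture names and classes below are illustrative, not Google's Soli API.

class VolumeControl:
    """A virtual dial: 'turning' an imaginary knob between thumb and
    finger adjusts the volume level (0-100)."""

    def __init__(self, level=50):
        self.level = level

    def turn(self, degrees):
        # Assume 3.6 degrees of rotation per volume step (arbitrary choice),
        # clamped to the 0-100 range.
        self.level = max(0, min(100, self.level + round(degrees / 3.6)))
        return self.level


def dispatch(gesture, control, **params):
    """Route a recognized gesture name to its action handler."""
    handlers = {
        "dial_turn": lambda: control.turn(params.get("degrees", 0)),
    }
    handler = handlers.get(gesture)
    return handler() if handler else None


volume = VolumeControl()
dispatch("dial_turn", volume, degrees=36)  # turn the virtual knob up a bit
```

In a real system the hard part is the recognition step (classifying radar reflections into gestures like "dial_turn"); the dispatch layer shown here is just the final, simple hop from gesture to action.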

This is pretty awesome stuff and could reshape the way we interact with a myriad of devices, from our phones to our homes and cars. We’re excited to see where Google takes this technology and who, if anyone, they will work with to bring it into real-world use. What do you think of Google Project Soli? Let us know in the comments below or on Google+, Facebook, and Twitter.

Last Updated on November 27, 2018.

