Regina Dugan, head of Google’s Advanced Technology and Projects (ATAP) group, introduced a technology that aims to bring the power of gesture control to wearable devices. This is being done as part of Google’s Project Soli.
This means that small IoT devices could be controlled using hand and finger gestures, even without touching them. Project Soli is a gesture radar device; previously, capturing finger motion in free space required large radar dishes. Google ATAP overcame this problem by creating its very own gesture radar system, which the team has been working on since last June.
The Project Soli radar has been scaled down from the size of a computer to a tiny chip, smaller than a coin, that can fit inside today’s smartwatches. The Soli sensor re-imagines the hand itself as the user interface, detecting its motion to drive the device.
The Soli chip tracks even minute changes in the radar field around it and determines the position of individual fingers. TechCrunch writes that an API will give developers access to the translated signal information, letting them work with the various stages of interpreted data.
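To picture what such a layered API might look like, here is a minimal sketch in Python. Since the Soli developer API had not been released at the time of writing, every name here (`RadarFrame`, `GestureRecognizer`, `on_gesture`, `feed`) is an invented illustration of the idea of turning raw signal features into interpreted gesture events, not Google’s actual interface.

```python
# Hypothetical sketch only -- the real Soli API was unreleased when this
# was written, so all class and method names below are invented.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class RadarFrame:
    """One frame of (simplified) raw signal features from the chip."""
    energy: float    # total reflected energy (is a hand in range?)
    velocity: float  # dominant radial velocity of the fingers


class GestureRecognizer:
    """Translates raw frames into higher-level gesture events."""

    def __init__(self) -> None:
        self._listeners: List[Callable[[str], None]] = []

    def on_gesture(self, listener: Callable[[str], None]) -> None:
        """Register a callback for interpreted gesture labels."""
        self._listeners.append(listener)

    def feed(self, frame: RadarFrame) -> None:
        """Toy interpretation stage: threshold raw features into a label."""
        if frame.energy < 0.1:
            return  # no hand in range, emit nothing
        label = "wiggle" if abs(frame.velocity) > 0.5 else "hold-still"
        for listener in self._listeners:
            listener(label)


# Usage: wire up a listener and feed a few simulated frames.
events: List[str] = []
recognizer = GestureRecognizer()
recognizer.on_gesture(events.append)
recognizer.feed(RadarFrame(energy=0.9, velocity=0.8))  # fast finger motion
recognizer.feed(RadarFrame(energy=0.9, velocity=0.0))  # hand held still
recognizer.feed(RadarFrame(energy=0.0, velocity=0.0))  # nothing in range
print(events)  # ['wiggle', 'hold-still']
```

The point of the sketch is the layering: developers would subscribe to whichever stage of data they need, from raw frames up to interpreted gestures.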
Jaime Lien, lead research engineer on Google’s ATAP team, explained at Google I/O how the gesture sensing works. She said, “The sensor can tell if I’m wiggling my fingers, or holding still.”
This tiny radar, the result of Google ATAP’s Project Soli, will enable users to interact with their wearable devices without touching them.
Google confirmed that the hardware and developer APIs will be made available within a few months. This could definitely help improve Android Wear hardware and other smartwatches, with some big launches ahead this summer.