Google is perpetually pushing the envelope in the world of technology. Its new Project Soli brings gesture control to wearables. The project is built on replacing standard interface paradigms of the past, such as the volume knob or the button, with an intuitive vocabulary of hand gestures. The user can thereby interact with a smartphone, wearable, or laptop simply by making certain movements, such as emulating the push of a button or the turn of a volume knob.
Project Soli replaces the physical controls of smartwatches, and potentially other wearables, with hand gestures, using radar to capture the hand movements. Remarkably, the radar has been miniaturized into a chip that can recognize movement, velocity, and distance, and can be programmed to change the input based on distance. The Soli chip works within the 60 GHz radar spectrum at up to 10,000 frames per second. The project's biggest virtue is the compactness of the chip. According to Project Soli founder Ivan Poupyrev, “What makes this project so promising: it’s extremely reliable. There is nothing to break, no moving parts, no lenses, just a piece of sand on a board.”
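To make the idea concrete, here is a minimal sketch of how radar-derived hand features might be mapped to controls, with behavior changing based on distance as described above. The Soli SDK is not public, so every name and threshold here is invented for illustration; this is not Google's actual API.

```python
# Hypothetical sketch only: function name, thresholds, and units are
# assumptions, not part of any real Soli interface.

def interpret(distance_cm, radial_velocity_cm_s):
    """Map radar-derived hand features to a control action.

    distance_cm: hand-to-sensor distance reported by the radar.
    radial_velocity_cm_s: signed speed toward (+) or away from (-) the sensor.
    """
    if distance_cm < 5:
        # Close range: fine-grained control, e.g. a virtual volume knob
        # driven by how fast the hand moves.
        return ("fine_adjust", radial_velocity_cm_s * 0.5)
    elif distance_cm < 20:
        # Mid range: coarse gestures, e.g. a quick movement toward the
        # sensor emulating a button press.
        if radial_velocity_cm_s > 10:
            return ("press", None)
        return ("idle", None)
    else:
        # Out of range: ignore the hand entirely.
        return ("idle", None)
```

The point of the distance check is that the same physical motion can mean different things depending on how near the hand is to the sensor, which is the programmable-by-distance behavior the chip is said to support.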
Soli information will be released to developers in the near future, so stay tuned!
For more information, see the full article here.