Google Soli Radar sees your intentions, and it's less scary and more useful than you think

Radar, a technology nearly a century old, is proving to be far more useful in modern computing and electronics than anyone might have thought.

It has been added to light bulbs to help detect when someone has fallen and can't get up, and it was introduced in Google's Pixel 4 Android phone a few years ago to enable touchless, gesture-based controls.

Now, however, Google's ATAP research group is getting more ambitious, using radio ranging and sensing technology to detect people, their actions, their orientation, their position relative to sensors, and, most importantly, what that movement and interaction mean.

A new mesh

On Tuesday, Google's ATAP group, which is responsible for researching wearable technology and Soli Radar in the Pixel 4, unveiled its latest Soli Radar research project, which focuses on nonverbal interactions with the radar system.

In a video on YouTube, the researchers explain how they use a mesh network of Soli radars, which can be integrated into devices like the Google Nest, to detect people as they move around and near those devices.

"Environmentally and socially aware devices" use Soli and machine radar and deep learning to understand what our actions mean. Everything from a wave of the hand, a turn of the head, the distance from the devices, the speed at which we pass them, and even the orientation of our body in relation to the devices, tells the system something about our intent.

“We are inspired by the way people interact with each other. As human beings, we understand each other intuitively, without saying a single word,” Google's ATAP design lead says in the video.

The clear goal is for computers to understand us as if they were a little more human.

Radar sensors help devices (and the AI behind them) understand the social context around them and then act accordingly.

In the video, the researchers showed screens that noticed when people looked at them and could tell the difference between someone intentionally giving a screen their full attention and someone just glancing at it. In one case, a small white screen briefly flashed a rain graphic and an umbrella to remind someone heading out that they might want to grab an umbrella.

It was somewhat amusing to hear the researchers describe the area around a computer as "your personal space." Even so, the idea is clear: a computer equipped with Soli radar could respond to a person's approach more like another person would, rather than like a mere machine. Instead of waiting for you to tap the keyboard to wake it, it might detect your approach and get started.

"Approach" is actually one of the new computer interaction primitives. The others are "go out", "look" and "pass". The device's actions differ depending on the motion primitive it detects.

"Our technique uses advanced algorithms, including deep learning, to understand the nuances of people's subtle body language," Eiji Hayashi, Google's ATAP human interactive lead, said in the video.

These nuances include which way your head is facing, and they could also mean that the Soli radar system can tell when you tilt your head to signal, "I don't understand."

One of the advantages of using radar rather than optical sensors for intent detection is that radar doesn't "see" anything or collect images. It simply uses radio waves to build a mesh of movement and position; it's up to the software to make sense of it.

Although some radar technologies have already made it into consumer electronics, Google's ATAP Soli radar project is still far from becoming a product. Perhaps that is why a future filled with radar feels like something out of a far-off century.

"These devices are designed to respond to us from the background in a quiet and respectful way," Timi Oyedeji, Google's ATAP interaction designer, said in the video, "You can get these gentle prompts and have interactions that are helpful, but not annoying."