
Google is working on a new tech that can read your body language without using cameras


There is no denying it: automation is the future. Imagine a world where your TV pauses the movie or show you’re watching when it senses that you’ve stood up to fetch a fresh bowl of popcorn, and resumes playing when you return. Or how about a computer that senses you’re stressed out at work and starts playing some mellow, relaxing tunes?

As futuristic as these ideas sound, much of this is already possible. One of the biggest reasons it hasn’t taken off, however, is that these systems rely on cameras to record and analyse user behaviour, which raises a ton of privacy concerns. After all, people are already wary of their computers and smartphones keeping an eye on them.

Google is working on a new system that records and analyses users’ movement and behaviour without using cameras. Instead, the tech uses radar to read your body movements, infer your mood and intentions, and then act accordingly.

The basic idea is that a device uses radar to build an awareness of the space around it, monitors that space for changes, and then issues instructions in line with what the user would want it to do.
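Google has not published an API for this system, but as a rough illustration, the loop might look something like the Python sketch below. The radar_distance_m() function is a made-up stand-in for a real sensor reading, and the "personal space" threshold is invented for the example.

    import random
    import time

    PERSONAL_SPACE_M = 1.5  # invented threshold for the device's "personal space"

    def radar_distance_m():
        # Stand-in for a real radar reading: distance to the nearest
        # person in metres, simulated here with random values.
        return random.uniform(0.2, 5.0)

    present = False
    for _ in range(10):  # a real device would run this loop continuously
        near = radar_distance_m() < PERSONAL_SPACE_M
        if near and not present:
            print("User approached -> resume playback")
        elif not near and present:
            print("User left -> pause playback")
        present = near
        time.sleep(0.5)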

This isn’t the first time Google has played with the idea of spatially aware devices. In 2015, Google unveiled the Soli sensor, which uses radar to pick up precise gestures and movements. The sensor first shipped in the Google Pixel 4, where simple hand gestures handled various inputs, like snoozing alarms, pausing music and taking screenshots. Google has also used the radar sensor in the Nest Hub smart display to study the movement and breathing patterns of a person sleeping next to it.

Studies and experiments around the Soli sensor are now enabling computers to recognize our everyday movements and make new kinds of choices.

The new study focuses on proxemics, the study of how people use the space around them to mediate social interactions. The twist is to treat devices such as computers and mobile phones as having a personal space of their own.

When something changes within that personal space, the radar picks it up and the device responds. For example, a computer could boot up without you needing to press a button, or the system could tell a genuine approach apart from someone merely walking past, as in the sketch below.
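As a hedged illustration only (the function name, thresholds and readings are all invented, not Google's), a device might distinguish "approaching" from "passing by" using successive radar distance readings:

    def classify_intent(distances):
        # distances: successive radar range readings in metres, oldest first.
        # Steadily shrinking distances that end up close suggest an approach;
        # anything else is treated as a pass-by.
        deltas = [b - a for a, b in zip(distances, distances[1:])]
        if all(d < 0 for d in deltas) and distances[-1] < 1.0:
            return "approach"   # e.g. wake the screen
        return "pass-by"        # e.g. stay asleep

    print(classify_intent([3.0, 2.2, 1.4, 0.8]))  # approach
    print(classify_intent([2.0, 1.9, 2.1, 2.4]))  # pass-by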
