
The AI That Can Sense Movement Through Walls


Can AI sense movement through walls? This article looks at how this process works and the many benefits that it can provide.



The prospect of monitoring our health and wellbeing from inside the home is one of the more fascinating developments in health technology. A team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has made a breakthrough that might allow this kind of data to be captured through walls.

Their work, known as RF-Pose, uses AI to let wireless devices monitor and understand people's postures and movements, even when a wall separates them. A neural network analyzes the radio signals that bounce off our bodies, and the software then produces dynamic stick figures that replicate our movement.
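To make the idea concrete, here is a minimal, hypothetical sketch of that kind of model: a small convolutional network that takes a short stack of radio-reflection "heatmap" frames and outputs one confidence map per body keypoint, from which a stick figure can be drawn. The layer sizes, the 30-frame window, and the 14-keypoint skeleton are illustrative assumptions, not details of RF-Pose itself.

```python
# Hypothetical sketch: map a window of RF "heatmap" frames to per-joint
# confidence maps. Shapes and layer choices are assumptions for illustration.
import torch
import torch.nn as nn

NUM_KEYPOINTS = 14   # assumed skeleton size (head, shoulders, elbows, ...)
RF_FRAMES = 30       # assumed number of RF frames per prediction window


class RFPoseNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encode the temporal stack of RF frames into spatial features.
        self.encoder = nn.Sequential(
            nn.Conv2d(RF_FRAMES, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decode back up to one confidence map per keypoint.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, NUM_KEYPOINTS, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, rf_window):
        # rf_window: (batch, RF_FRAMES, height, width) radio reflection heatmaps
        return self.decoder(self.encoder(rf_window))


if __name__ == "__main__":
    model = RFPoseNet()
    fake_rf = torch.randn(1, RF_FRAMES, 64, 64)   # placeholder RF input
    heatmaps = model(fake_rf)                     # (1, 14, 64, 64) keypoint maps
    print(heatmaps.shape)
```

The peak of each output map gives one joint's estimated position; connecting the joints yields the dynamic stick figure described above.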

The team believes that the technology could be used to monitor a range of diseases, such as multiple sclerosis and Parkinson's. It could also be crucial in assisting the elderly to live an independent life for as long as possible.

"We've seen that monitoring patients' walking speed and ability to do basic activities on their own gives healthcare providers a window into their lives that they didn't have before, which could be meaningful for a whole range of diseases," the team explain. "A key advantage of our approach is that patients do not have to wear sensors or remember to charge their devices."

Outside of Healthcare

The team also believes that the technology could have a range of other uses, from video games to search-and-rescue missions.

Traditionally, neural networks are trained on data that has been manually labeled by humans. While that is feasible for images of, say, animals, it is much harder for radio signals, which humans cannot interpret by eye. The team got around this by collecting data from the wireless device and a camera at the same time, gathering thousands of examples of people doing a range of things, from walking to opening doors.

These camera images were then used to extract stick figures, which were shown to the neural network alongside the corresponding radio signals. This allowed the algorithm to learn the association between the radio signals and the stick-figure poses.
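A hedged sketch of that cross-modal training idea might look like the following: a camera-based pose estimator acts as the "teacher," converting synchronized video frames into target keypoint heatmaps, and the radio-side network is optimized to reproduce them from RF input alone. The function names and the mean-squared-error loss are stand-ins, not the actual RF-Pose training pipeline.

```python
# Hypothetical cross-modal training step: camera-derived poses supervise the RF model.
import torch
import torch.nn.functional as F


def train_step(rf_model, camera_pose_estimator, optimizer, rf_window, video_frames):
    """One optimization step pairing an RF window with its synchronized video frames."""
    with torch.no_grad():
        # Labels come "for free" from vision: stick-figure keypoint heatmaps
        # extracted from the camera, so no human has to annotate the radio data.
        target_heatmaps = camera_pose_estimator(video_frames)

    predicted_heatmaps = rf_model(rf_window)      # pose predicted from radio only
    loss = F.mse_loss(predicted_heatmaps, target_heatmaps)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At inference time the camera is no longer needed; only the trained RF model runs, which is what allows the system to work through walls.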

When the system was put to the test, it was able to accurately estimate a person's posture and movements from the radio signals alone, with no camera data.

It's a fascinating project, and the team has released a video showing the system in action.


Topics:
artificial intelligence, machine learning, deep learning, neural networks
