Teaching Robots to See
In recent years, improving the visual processing capabilities of automated devices has been a key undertaking in robotics, shaping how these machines will interact with us in our homes, schools, and workplaces.
I wrote recently about the latest developments in 3D mapping for drones, which aims to help the devices navigate terrain independently of a pilot.
The benefits of having machines capable of seeing, and therefore navigating, their environments are clear, so a recent development from a team at the Dyson Robotics Lab at Imperial College London is very interesting.
The team have developed open source software, called ElasticFusion, that gives robots a better understanding of their environment and their place within it.
The ultimate aim is to allow robots to operate more safely in the home by mapping the environment and identifying elements within it.
This would support the robot in undertaking various household tasks that could enable its operation in places such as care homes. Dyson Robotics researchers say,
The family home is actually quite a complex environment for a robot to map. Houses are filled with breakable objects, family members that constantly move about, and a range of complex appliances that need to be operated safely. Domestic robots will need to negotiate all these challenges with aplomb to become a useful tool for making our lives easier. Elastic Fusion is the first step towards making vision that can help a robot to negotiate the complexities of the home.
How it Works
Data is captured using a depth-sensing camera, and an application then processes that data to produce a real-time 3D map of the room.
The team believe their process allows for an ongoing observation of the environment and a rapid adjustment of the map as things change and move about.
Once depth cameras evolve to function as effectively outdoors as they do indoors, the process will be able to take on a wider range of tasks. The software also has limitations in how it handles people and pets, as it currently struggles with moving objects.
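To give a rough sense of the first step in this kind of dense 3D mapping, the sketch below back-projects a depth image into a 3D point cloud using the standard pinhole camera model. This is an illustration of the general technique, not ElasticFusion's actual code, and the camera intrinsics (fx, fy, cx, cy) are invented example values:

```python
# Illustrative sketch of depth-to-point-cloud back-projection, the first
# step dense SLAM systems build on: each depth pixel (u, v, z) becomes a
# 3D point via the pinhole camera model. Not ElasticFusion's actual code;
# the intrinsics used below are made-up example values.

def depth_to_points(depth, fx, fy, cx, cy):
    """Convert a depth image (rows of depths in metres) to (x, y, z) points."""
    points = []
    for v, row in enumerate(depth):       # v = pixel row index
        for u, z in enumerate(row):       # u = pixel column, z = measured depth
            x = (u - cx) * z / fx         # pinhole model: X = (u - cx) * Z / fx
            y = (v - cy) * z / fy         # pinhole model: Y = (v - cy) * Z / fy
            points.append((x, y, z))
    return points

# A tiny 2x2 "depth image" where every pixel is 1 metre from the camera.
depth = [[1.0, 1.0],
         [1.0, 1.0]]
cloud = depth_to_points(depth, fx=525.0, fy=525.0, cx=0.5, cy=0.5)
print(len(cloud))  # 4 points, one per pixel
```

A full system would repeat this for every incoming frame, estimate how the camera moved between frames, and fuse the resulting point clouds into a single map that updates as objects move.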
As such, it’s still very much a work in progress, but the software is free to download for other researchers to build upon, so it will hopefully prove a strong stepping stone to greater improvements in the area.
Check out the video below for more information about ElasticFusion.
Published at DZone with permission of Adi Gaskell, DZone MVB. See the original article here.