
This Robot Is a Little Batty


Robots have used high-frequency sounds to measure distance, but this new tech reconstructs a 3D model of the environment by listening to the reflections.


It might surprise you to learn that bat echolocation remained a mystery until 1938. In 1794, the Swiss scientist Charles Jurine demonstrated that when he plugged a bat's ears with wax, it navigated very poorly and crashed into things. But ultrasound was unknown back then. With the discovery of ultrasonic sound, it became possible for Robert Galambos and Donald Griffin to detect the bats' vocalizations and combine that with the earlier evidence that bats' hearing had something to do with navigation to develop our current theory of echolocation. Griffin wrote an interesting book in 1958, "Listening in the Dark: The Acoustic Orientation of Bats and Men," which is still available in paperback! But I digress.

Tel Aviv University researchers decided to develop a robot that relies on echolocation to model its surroundings, very much like a bat. And they named their creation... wait for it... Robat. Before you get too excited, this version doesn't fly; it just rolls around. Also, it only maps stationary objects, so it can't aim for tasty moths! But it is a proof of concept, and who knows what next year holds?


Itamar Eliakim of Tel Aviv University commented that Robat demonstrates the "...great potential of using sound for future robotic applications." To be fair, there have been a number of ultrasonic mapping systems, but they work more like radar (or submarine sonar): they send out a narrow beam and measure the returning pulses across a circular scan. That is clearly not how bats use ultrasound. Bats emit reasonably omnidirectional audio pulses, listen to the full complement of reflections, and somehow make sense of the 3D world surrounding them. As part of their study, the team surveyed the many decidedly nonbiological approaches that have been engineered around ultrasonic sonar, but they wanted to embrace the bats' biological approach and explore the problem from a signal-processing point of view. Bats simply listen to the sonic reflections and (somewhat mysteriously) compute a 3D model directly. The researchers accordingly built their Robat platform around a single emitter (representing the bat's mouth) and two microphones (representing the ears).
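
The paper's full signal chain is more involved, but the basic geometry of one emitter and two ears is easy to sketch. Here is a minimal Python example, assuming a single clean reflector, noise-free recordings, and a hypothetical 0.1 m microphone spacing (not Robat's actual dimensions): range falls out of the round-trip time, and bearing out of the interaural time difference.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
MIC_SPACING = 0.1       # meters between the two "ears" -- an assumed geometry

def echo_delay(emitted, received, sample_rate):
    """Round-trip delay of the strongest echo, found by cross-correlation."""
    corr = np.correlate(received, emitted, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(emitted) - 1)
    return lag / sample_rate  # seconds

def range_and_bearing(emitted, left, right, sample_rate):
    """Range from round-trip time; bearing from the interaural time difference."""
    t_left = echo_delay(emitted, left, sample_rate)
    t_right = echo_delay(emitted, right, sample_rate)
    # One-way distance is half the round trip; average the two ears.
    distance = SPEED_OF_SOUND * (t_left + t_right) / 4.0
    # Positive bearing means the reflector is toward the right microphone.
    itd = t_left - t_right
    sin_theta = np.clip(SPEED_OF_SOUND * itd / MIC_SPACING, -1.0, 1.0)
    return distance, np.degrees(np.arcsin(sin_theta))
```

Real echoes overlap, smear, and arrive from every surface at once, which is presumably part of why the team reached for a learned model rather than this kind of closed-form geometry.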

In some ways, their approach to the problem was similar to the methodology used for speech recognition. Think about it this way: Speech recognition accepts an acoustic input and ultimately predicts the dynamic shape of the speaker's vocal tract. Yes, I know that speech recognition algorithms produce a text representation of the utterance. But what is the human perspective? Try this thought experiment: you can listen to a person speaking in a language you do not understand and you can repeat their phrases (totally without understanding) by placing your vocal tract into the same configuration as the speaker's vocal tract. Just by processing the waveform from the foreign speaker, you can almost instantly map the 3D terrain of that speaker's vocal tract. So it seems as though the biological computing power for this sort of signal processing is already in place. Bats have figured out a way to use it for more than just language.

So, feeding the reflected sound into a neural net trained on a number of known, representative 3D environments, the researchers were able to navigate autonomously, in real time, through previously unexplored environments using only the generated ultrasonic pulses and the acoustic reflections picked up by the microphone "ears." All navigation was based on the computed model and dead reckoning of the robot's motion within it.
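
The article above only says a neural net was trained on known environments; the sketch below shows one plausible shape for such a model, assuming the echoes are first converted into stereo spectrograms. The architecture, layer sizes, and class labels are placeholders, not the authors' actual network.

```python
import torch
import torch.nn as nn

class EchoClassifier(nn.Module):
    """Toy stand-in for an echo-mapping network: classify a stereo
    spectrogram of the returning echoes (left ear, right ear) as
    free space vs. obstacle. Illustrative only."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1),  # 2 input channels: the two ears
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, spectrograms):  # shape: (batch, 2, freq_bins, time_frames)
        return self.head(self.features(spectrograms).flatten(1))

# One "glance": a batch of stereo echo spectrograms in, obstacle logits out.
logits = EchoClassifier()(torch.randn(1, 2, 64, 128))
```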

You can experience this style of navigation yourself by walking through an obstacle-filled area with your eyes closed. After every two steps, open your eyes, look around, and close them again before taking the next two steps. You are not navigating by your immediate vision but by an internal 3D model that you frequently update. Your peeks are the equivalent of the bat's chirps.

The team estimated that bats update their 3D model of the environment roughly every half meter, so they chose that incremental distance for their own system. The study states: "Every 0.5 m, the Robat emitted three bat-like wide-band frequency-modulated sound signals while pointing its sensors in three different headings: -60, 0, 60 degrees relative to the direction of movement." The full report can be found here.
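
For a feel of what a wide-band frequency-modulated pulse looks like in code, here is a linear downward sweep in Python. The sweep range, duration, and sample rate below are illustrative placeholders, not Robat's published parameters.

```python
import numpy as np

def fm_chirp(f_start=100_000.0, f_end=20_000.0, duration=0.003, sample_rate=250_000):
    """Linear downward frequency sweep, loosely bat-call shaped.
    All parameter values are placeholders, not Robat's actual settings."""
    t = np.arange(int(duration * sample_rate)) / sample_rate
    k = (f_end - f_start) / duration  # sweep rate, Hz per second
    # Phase is the integral of the instantaneous frequency.
    phase = 2 * np.pi * (f_start * t + 0.5 * k * t**2)
    return np.sin(phase)

# Three emissions per 0.5 m stop, one per heading, following the study's protocol.
for heading_deg in (-60, 0, 60):
    pulse = fm_chirp()
    # ...steer the emitter to heading_deg and play `pulse` through the speaker...
```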

This initial project was about 70% accurate in determining the positions of objects in its model of the garden it roamed. Granted, the technology is nowhere near good enough to target and capture mosquitoes in flight in the dark, but that's no surprise. Biological solutions will far outclass our engineered ones for a long time to come.


I view these existing and totally amazing biological solutions as proofs of concept. We know it can be done, so we shouldn't give up!


Topics:
artificial intelligence, neural network, signal processing, acoustics, echolocation

