Indeed, only recently a team from Harvard developed a surgical robot that takes inspiration from an octopus. Their project highlights the progress being made both in robotics and in how robots are manufactured.
The device is untethered and completely autonomous, requiring no electrical power or external control system to operate. Instead, power comes from the decomposition of hydrogen peroxide, catalysed by platinum. This reaction produces a gas that flows through the machine's channels to drive it.
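The fuel chemistry here is the familiar platinum-catalysed decomposition of hydrogen peroxide (2 H2O2 → 2 H2O + O2). As a rough back-of-the-envelope sketch, assuming ideal-gas behaviour near room temperature, we can estimate how much oxygen gas a given mass of peroxide yields; the numbers below are illustrative, not figures from the Harvard project.

```python
# Rough stoichiometry of the assumed fuel reaction: 2 H2O2 -> 2 H2O + O2.
# Each mole of peroxide yields half a mole of oxygen gas, which is what
# pressurises the robot's internal channels.

MOLAR_MASS_H2O2 = 34.01   # g/mol
MOLAR_VOLUME_GAS = 24.0   # L/mol, ideal gas near room temperature (assumption)

def oxygen_litres(grams_peroxide):
    """Litres of O2 gas released by fully decomposing the given mass of H2O2."""
    moles_h2o2 = grams_peroxide / MOLAR_MASS_H2O2
    return (moles_h2o2 / 2) * MOLAR_VOLUME_GAS

vol = oxygen_litres(1.0)  # roughly 0.35 L of gas from a single gram of fuel
```

Even a gram of fuel expands into a substantial gas volume, which is why a small onboard reservoir can actuate the whole soft body.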
Robotics in the Real World
That is undoubtedly fascinating, but in most real-world settings, robots will probably need to take a slightly different form. So a recent project from Cornell should be of interest: researchers there have developed a way for a soft robot to feel its surroundings in much the same way humans do.
Traditionally, robots grasp and sense through motorized means, which can be bulky and inflexible. The Cornell team believe they’ve developed a better way: they use stretchable optical waveguides as curvature, elongation, and force sensors in a soft robotic hand.
“Most robots today have sensors on the outside of the body that detect things from the surface,” they say. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”
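The sensing principle is that as an elastomeric waveguide bends or stretches, more light escapes before reaching the photodetector, so the measured power loss tracks the deformation. A minimal sketch of that idea, assuming a simple linear calibration (the function names and the 2 dB-per-unit-strain figure are illustrative assumptions, not values from the Cornell work):

```python
import math

def power_loss_db(p_in, p_out):
    """Optical power loss in decibels between the light source and detector."""
    return 10 * math.log10(p_in / p_out)

def estimate_strain(p_in, p_out, db_per_unit_strain):
    """Invert an assumed linear calibration: loss grows with deformation."""
    return power_loss_db(p_in, p_out) / db_per_unit_strain

# Example: 1.0 mW launched, 0.8 mW detected, calibration of 2 dB per unit strain
strain = estimate_strain(1.0, 0.8, 2.0)
```

Because the waveguide runs through the body of the finger, the same reading reflects forces transmitted through the robot's thickness, much as the quote above describes.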
As with so many advances, optical waveguides have only become practical in this role thanks to changes in the wider ecosystem, in this case the development of 3D printing and soft lithography, which make elastomeric sensors easy to fabricate.
The prototype prosthesis performed a number of tasks, including grasping objects and probing for texture and shape. For instance, it was able to gauge the ripeness of tomatoes.
Whilst there are obvious applications as a prosthetic, the team also hope that the device will be used in a wider range of robotic use cases.
Next, they hope to enhance the device’s sensory capabilities, and believe they can do so by 3D printing more complex sensor shapes. They also believe they can incorporate machine learning to help decouple signals as the number of sensors increases.
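The "decoupling" problem arises because each waveguide responds to several deformations at once, so raw readings mix, say, bending and stretching. In the simplest case the mixing is roughly linear and can be inverted directly; a toy sketch with made-up calibration numbers (not the team's actual model, which they suggest would be learned from data):

```python
# Toy decoupling example: two waveguides each respond to both bending and
# stretching, so the raw readings are a mixture of the two. With a known
# (here, assumed) linear mixing matrix, we recover the components by
# solving the 2x2 system. All coefficients are invented for illustration.

def solve_2x2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ [x, y] = [e, f] via Cramer's rule."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

# Assumed mixing: reading1 = 1.0*bend + 0.3*stretch
#                 reading2 = 0.2*bend + 1.1*stretch
r1, r2 = 1.6, 2.4  # observed readings
bend, stretch = solve_2x2(1.0, 0.3, 0.2, 1.1, r1, r2)  # -> 1.0, 2.0
```

With many sensors and nonlinear cross-talk, a hand-derived inverse like this stops being tractable, which is presumably where the machine-learning approach the team mentions would come in.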
It’s a fascinating project and a nice insight into the progress being made. You can see the hand in action via the video below.