On the surface, healthcare might seem a peculiar place for robotics to take off. After all, caring for someone is one of the most human things we do, and yet robots are making impressive progress, both in what they can deliver and in their growing acceptance by patients and medical professionals alike.
From old to young, robots are beginning to support us in our time of need, and with healthcare systems around the world struggling to provide a suitable standard of care at an affordable cost, that is perhaps no surprise.
Make no mistake, however, this is not a case of robots being used to provide cheap and sub-standard care instead of a better and more rounded human nurse or doctor. The technology has progressed at a rapid pace, and modern robots are capable of doing some seriously impressive things.
I met recently with the team behind Qihan’s Sanbot, which the company brands as robotics-as-a-service. The device comes with a range of sensors that let it navigate its surroundings safely, and a number of tools to aid communication.
Perhaps unsurprisingly, healthcare is one of the industries the Qihan team are targeting as they roll out Sanbot across Europe, and what could set the device apart from the crowd is the open API that underpins it.
The team hope that this will allow the functionality of the standard device to be expanded considerably, and there is certainly tremendous potential.
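Qihan has not published details of the API in this piece, but the general idea behind such an open extension model can be sketched generically: third-party developers register handlers for new services, and the robot dispatches incoming requests to them. The class and intent names below are hypothetical illustrations, not Sanbot's actual SDK.

```python
from typing import Callable, Dict

class RobotServices:
    """A generic sketch of an open-API plug-in registry (not Qihan's real SDK)."""

    def __init__(self) -> None:
        # Maps an intent name to the third-party handler that serves it.
        self._handlers: Dict[str, Callable[[dict], str]] = {}

    def register(self, intent: str, handler: Callable[[dict], str]) -> None:
        """A partner plugs a new capability into the robot."""
        self._handlers[intent] = handler

    def handle(self, intent: str, payload: dict) -> str:
        """The robot routes a request to whichever partner registered it."""
        handler = self._handlers.get(intent)
        if handler is None:
            return "Sorry, I can't help with that yet."
        return handler(payload)

robot = RobotServices()
# A hypothetical telehealth partner extends the standard device.
robot.register("telehealth.start",
               lambda p: f"Connecting you to Dr. {p['doctor']}...")

print(robot.handle("telehealth.start", {"doctor": "Smith"}))
# Connecting you to Dr. Smith...
```

The value of such a design is exactly the point made above: the base device ships with fixed capabilities, but every registered handler adds a new role the robot can play.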
I have touched on a number of fascinating developments in healthcare that could augment the Sanbot. Telehealth is an obvious service that could be delivered via the device, but recent developments in video technology offer a much wider range of applications.
For instance, I’ve covered previously a new technology that uses video footage to prove a patient is taking their medication.
The technology sends the patient a reminder and then requests that they use the camera built into their phone to video themselves taking the medicine.
The machine learning then kicks into gear, first attempting to recognize that the person in the video is the patient, and then identifying the pill in the patient’s mouth to prove the medicine has been taken.
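The two-step check described above can be sketched as a simple decision rule. The thresholds, score names, and `verify_dose` function here are illustrative assumptions; a real system would derive the scores from face-recognition and pill-detection models run on the uploaded video.

```python
from dataclasses import dataclass

@dataclass
class VideoAnalysis:
    """Hypothetical model outputs for one medication video."""
    face_match_score: float    # how closely the face matches the enrolled patient
    pill_visible_score: float  # how confidently a pill was seen in the mouth

# Illustrative thresholds only; a production system would tune these.
FACE_THRESHOLD = 0.90
PILL_THRESHOLD = 0.80

def verify_dose(analysis: VideoAnalysis) -> str:
    """Combine identity and pill checks into an adherence verdict."""
    if analysis.face_match_score < FACE_THRESHOLD:
        return "unverified: patient identity not confirmed"
    if analysis.pill_visible_score < PILL_THRESHOLD:
        return "unverified: medication not observed"
    return "verified: dose recorded"

print(verify_dose(VideoAnalysis(0.97, 0.91)))  # verified: dose recorded
print(verify_dose(VideoAnalysis(0.97, 0.40)))  # unverified: medication not observed
```

Note that the identity check runs first: a clearly visible pill proves nothing if the person swallowing it cannot be confirmed as the patient.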
There are even startups, such as Affectiva, that use footage captured from a camera to gauge our health. The company believes that by monitoring our faces via video or still images, it can detect things like heart rate, stress levels, and various other potential health issues.
Then there is the solution from a team at the University of Missouri, who have developed sensors capable of measuring people’s gait and stride length, and therefore predicting their likelihood of falling.
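One common way such gait data feeds a fall-risk prediction is via stride-length variability: an unsteady walker's strides vary more from step to step. The threshold and sample values below are made up for illustration and are not the Missouri team's actual method or calibration.

```python
import statistics

def stride_variability(stride_lengths_m: list) -> float:
    """Coefficient of variation of stride length (stdev / mean).
    Higher variability is one commonly used indicator of fall risk."""
    mean = statistics.mean(stride_lengths_m)
    return statistics.stdev(stride_lengths_m) / mean

# Illustrative threshold only; real systems calibrate against clinical data.
FALL_RISK_CV = 0.05

steady = [1.30, 1.31, 1.29, 1.30, 1.32]    # consistent strides (metres)
unsteady = [1.10, 1.35, 0.95, 1.40, 1.05]  # erratic strides (metres)

print(stride_variability(steady) > FALL_RISK_CV)    # False
print(stride_variability(unsteady) > FALL_RISK_CV)  # True
```

The appeal of a metric like this is that it needs only a stream of stride measurements from a passive sensor, with no active participation from the person being monitored.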
There are even services now that use automated means to test for things like speech disorders and other mental conditions. One recent study, for instance, found that if a robot has an avatar that resembles us, it aids our mental well-being considerably.
The researchers found that indeed there were connections forged between players who noticed similarities in behavior, even if those players weren’t in any way real. The authors believe this could be a crucial finding for the successful rehabilitation of patients with social disorders as the artificial peers can be programmed to have the same traits as the patient.
“It is very challenging to build an avatar that is intelligent enough to synchronise its motion with a human player, but our initial results are very exciting,” the authors say.
With an open API, the Sanbot robot has the potential to fulfill many of these roles, but the device will only be as strong as the network of entrepreneurs and partners that plug into it.
The potential for robots to perform pastoral duties is nonetheless an interesting one. A study published in Nature last year found that we are capable of empathizing with our robot companions.
The research found that we do tend to empathize with robots in a similar way to humans, but we tend to take a bit longer to warm up to them. The authors suggest this is largely down to the challenges in understanding the perspective of the robot.
Having a robot that can tap into and complement the great work being done by others in this field seems a compelling proposition, so it will be interesting to see just how far the Sanbot goes.