How Robotic Surgery Demands Changes to Training
Let's see how robotics and burgeoning technologies like VR are changing the healthcare industry and how we train our surgeons.
The impact of new technology on the future of work has been a common topic of discussion over the past few years. It seems increasingly clear that a new relationship is forming between technology and human employees, and that this relationship will require new skills from the people involved.
Many of the discussions on this topic are projections into the future, and are therefore hypothetical, so it was interesting to read a recent study that explored the real-world impact of robotic surgical tools on the healthcare sector.
The study was exploring how trainee surgeons typically develop their skills and experience, and how that learning process has been impacted by the introduction of robotic surgery tools.
The study examined teaching practices at 13 leading teaching hospitals in the United States over a two-year period. The analysis revealed that the most common method of learning was hands-on participation in cases, with a learning-by-doing approach predominating for student surgeons. This approach worked well in what's known as open surgery.
“Open surgery involved an attending physician (AP), medical resident(s), a scrub technician or ‘‘scrub’’ (an individual responsible for setup, breakdown, and exchange of sterile materials and tissue), a nurse, and inanimate, general purpose tools (e.g., sterile garb, retractors, tables, drapes, scalpels, sutures, cautery devices),” the paper explains.
When robotic surgery tools were used, however, the learning environment fundamentally changed. The role of the trainee was limited significantly, creating a weaker learning environment for them.
“In robotic surgery, all (including the AP) immobilized the patient, inflated his or her belly with CO2, and then attached the robot to the patient via trocars (metal cylinders) inserted in keyhole incisions. The surgeon then sat in a console 15 feet or so away to view and operate inside the patient,” the paper explains.
This environment rendered traditional learning methods ineffective and required a fresh approach to learning to continue to provide the teaching benefits of the more open method of surgery.
The author calls this approach "shadow learning": a set of practices that allowed robotic surgical trainees to attain competence in their craft.
“Successful trainees engaged extensively in three practices: “premature specialization” in robotic surgical technique at the expense of generalist training; “abstract rehearsal” before and during their surgical rotations when concrete, empirically faithful rehearsal was prized; and “undersupervised struggle,” in which they performed robotic surgical work close to the edge of their capacity with little expert supervision—when norms and policy dictated such supervision,” the author explains.
Shadow learning enabled trainees to become proficient exceptionally fast and gave them deep access to risky work, which in turn provided exacting and valuable feedback. The downside is that it also left them incredibly specialized.
The relative scarcity of expert robotic surgeons at the time also meant there were fewer experts available to mentor trainees. This created a significant gap in capability between trainees who were exposed to these experts and those who weren't.
This could potentially be overcome by using virtual reality to allow more students to learn from the experts who are available. It's a field that is attracting a number of entrants. One of the first startups to enter the space was Medical Realities, which developed an augmented reality system to help train surgeons back in 2015.
Researchers from Purdue University and Indiana University School of Medicine followed this up with an augmented reality system called the System for Telementoring with Augmented Reality (STAR), documented in a recent study. It harnesses a range of technologies, including a transparent display and several sensors, to improve communication between mentor and mentee.
The same year saw virtual reality startup EchoPixel launch a system that used VR to create lifelike replicas of organs from 2D images to help surgeons prepare for operations. The company later took the approach a step further by offering surgeons the option of using 3D printing to create a physical incarnation of the organ.
Tech giant Intel has also entered the space, teaming up with Surgical Theater to offer virtual reality services to surgeons via the Precision VR platform.
Room One is another entrant into the space. The company recently demoed a live-streamed surgical procedure in virtual reality, with the aim of enabling students and doctors to watch operations from anywhere in the world. The project featured a consortium of partners, including Ericsson, BT, King's College London (KCL), Room One, and OPTO.
There may be ways around the variance in training opportunities identified in the original study, but it's clear that surgical training will need to adapt to the new technologies available.
Published at DZone with permission of Adi Gaskell, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.