
Robotic Surgery Opens Up

If the open-source approach to building robot surgeons can cut costs and improve performance, patients will increasingly find them at the other end of the scalpel

Each Raven II includes two mechanized arms, a screen that lets the surgeon operate remotely, a camera that provides the visuals for that screen, and software that enables these elements to work together.
 
In the early 2000s the researchers were interested in writing software for da Vinci, but they became frustrated when Intuitive would not provide some of the information they needed to complete their code. “We realized that to be active in the field we needed a research platform that we have full control over,” says Jacob Rosen, associate professor of computer engineering in Santa Cruz’s Baskin School of Engineering and principal investigator on the Raven project.
 
Rosen, then at U.W., and his colleagues received funding from the U.S. Army to build their own robotic surgery system. Five years later, in 2007, the Raven I was born. Rosen redesigned the Raven when he moved to Santa Cruz in 2008 and the following year created a four-armed version he dubbed Raven IV. “I was thinking that if you want to replace the two surgeons in an operating room, you need four arms and two cameras,” he says.
 
By 2010, Rosen had reteamed with Blake Hannaford, director of U.W.’s Biorobotics Laboratory, to improve on the original two-armed Raven I. The collaboration received $850,000 in National Science Foundation funding, which it used to build eight Raven II systems to give to other institutions pursuing robotic surgery research. This time Rosen and Hannaford stuck with two arms because they agreed that a two-armed system would be more affordable for the research teams moving forward.
 
Inevitable comparisons
The Raven and da Vinci are designed to perform essentially the same tasks, but they differ in a few key ways. For starters, the Raven costs about $300,000, a fraction of the price of a da Vinci system, and its arms are anchored in different positions around the operating table. Da Vinci’s arms connect back into a single base, essentially originating from the same spot, a design Intuitive chose to take up less space in the operating room.
 
The Raven’s design attempts to emulate the way a team of surgeons can work from either side of a patient during an operation, Rosen says. “By distributing the Raven’s arms in a similar way to how humans use their arms, you can avoid many of the problems with arms hitting each other,” particularly when four or even more arms are introduced into the limited space above the patient, he adds.
 
Unlike da Vinci, Raven is not approved to operate on humans. The researchers did, however, perform several simulated surgeries on pig cadavers while developing the Raven in 2006. Subsequent Raven experiments were done on individual organs extracted from dead animals, but much of the most recent work has focused on developing software to increase the system’s performance and expand its capabilities. Rosen and his colleagues last year formed the company Applied Dexterity, Inc., to produce the Raven as a robotic surgery research platform for other institutions.
 
Surgeons would be better served by having multiple robotic surgery systems to choose from, says Guy, the Temple University surgeon who has used da Vinci to perform numerous cardiac operations. “In general, I’m a big believer in open source,” he adds, but he raises the caveat that the open-source model is largely unproved when it comes to developing medical technology that requires “little to no tolerance for problems.”
 
Taking flight
In addition to developing the Raven to perform soft-tissue surgeries, Rosen has been considering what is needed to make a robotic system that can perform even more complicated procedures, such as brain surgery. Also on the to-do list is programming the robot to switch tools as needed during surgery without any human help.
 
Fully automated robot surgeons are much farther out on the horizon, if they arrive at all. Rosen points out that once a surgeon cuts into a patient’s skin or internal organs, the tissue changes shape and moves in ways that are difficult to program into a computer. “You would need to constantly use computer vision to extract information about the evolving environment in which the robot is operating,” he says. “This is very challenging to do in robotics. You want to automate the mechanical parts of the surgery, not the decision-making parts, which humans are good at.”
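To give a rough sense of the perception loop Rosen describes, the sketch below is a minimal, purely illustrative example and is not part of the Raven software: it assumes a generic endoscope video file (the name endoscope.mp4 is a placeholder) and uses OpenCV’s Lucas-Kanade optical flow to re-estimate where tissue feature points have moved from one frame to the next, re-detecting points whenever deformation or occlusion loses too many of them.

```python
# Illustrative sketch only -- not the Raven's actual perception code.
# Tracks feature points across endoscope video frames with OpenCV's
# pyramidal Lucas-Kanade optical flow, re-detecting points when lost.
import cv2

VIDEO_PATH = "endoscope.mp4"  # placeholder name; any surgical video would do

cap = cv2.VideoCapture(VIDEO_PATH)
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read video")

prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
# Pick up to 200 corner-like points on the tissue surface to follow.
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok or prev_pts is None:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Estimate where each tracked point moved in the new frame.
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, prev_pts, None)
    good = next_pts[status.ravel() == 1]

    # In a real system these updated coordinates would feed the robot's
    # motion planner; here we just report how many points survived.
    print(f"tracking {len(good)} tissue points")

    # Cut, deformed or occluded tissue makes points disappear, so
    # re-detect features whenever too few remain.
    if len(good) < 50:
        prev_pts = cv2.goodFeaturesToTrack(gray, maxCorners=200,
                                           qualityLevel=0.01, minDistance=7)
    else:
        prev_pts = good.reshape(-1, 1, 2)
    prev_gray = gray

cap.release()
```

Even this toy loop hints at the difficulty: the tracker only reports where points went, and deciding what the instrument should do about it remains exactly the judgment Rosen says should stay with the surgeon.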
