A second scenario on Romano’s monitor showed another direction this technology is being taken. This one, which displayed what looked like three parallel sheets with a featureless rod hovering above them, amounted to a crude simulation of the sort of incision that precedes laparoscopic surgery. Unlike open surgery, which involves cuts large enough for a surgeon’s hands to pass through, laparoscopic surgery is conducted by inserting skinny rods through one or more holes as small as the tip of your pinky finger. One rod is equipped with a video camera—sometimes a twin-lens model with 3-D capability. Others have tools designed for actions like cutting tissue or gripping suture needles. It’s a minimally invasive technique, but it starts with what can be a tricky incision. A surgeon uses what’s called a Veress needle to create the port for all these instruments to pass through.
That’s what the Phantom Omni’s stylus stood in for this time. A certain amount of pressure applied to the top virtual layer pierced it. The stylus, suddenly unopposed by that pressure, lurched forward. The second layer, representing another sort of tissue, had a different level of elasticity. The third layer had still another feel to it.
Endowing such a simulation with the level of textural detail Romano has been modeling could be a big deal for medical training.
“As we do more minimally invasive surgeries, one of the areas that becomes very critical is getting proper access to the abdomen,” says David Lee, chief of the urology division at Penn Presbyterian Hospital and an assistant professor of surgery at the School of Medicine. A surgeon has to puncture the skin, and the fascia underneath, but take care not to go into the next layer of tissue. “Because the bowel is sitting there, and if you injure the bowel, and you don’t see that you’ve injured it, those patients can do really poorly.”
This is a skill that comes with experience, he adds. “But what cost is it to your patients when you’re in your first few cases and you don’t do the right thing? So the more simulation tools that we have, the better—especially at a place like Penn, where we train lots of residents and medical students. To have them work in this no-risk environment and develop these proper feels of how things are supposed to feel, it’s humongous.”
Though the Phantom Omni simulation wasn’t directly modeled on the actual properties of human flesh, Kuchenbecker envisions “capturing the feel of real interactions” via haptic add-ons to the tools surgeons use already. “Then we could build mathematical models later to let a trainee practice that,” she says, “and experience: Okay, this is what it might feel like with a really healthy young person. This is what it might feel like with an obese patient … ”
Lee is an expert in robotic laparoscopy, in which a surgeon doesn’t actually hold onto the rods, but instead sits at a computerized console that basically channels his hand movements to the tiny tools at their tips, inside the patient. Prostate cancer surgery is his specialty.
“The old open radical prostatectomy involves an incision from the belly button down to the pubic bone. Guys did pretty well, but you know, it’s a bloody operation, and guys are pretty sore down there for a few weeks,” he says. “The robot gives you certain advantages. You can see in 3-D … and the robot instruments are also wristed. [With] standard laparoscopy instruments, you can just go in and out, and open and close, but that’s about all you can do.”
What this sort of computer-assisted operation sacrifices is the sense of touch, which has traditionally been integral to the practice of surgery. “When you’re seated at the robot,” Lee explains, “where all of these potential sensations are blunted by going from the tower, through the wires, into the surgeon console and then to your hand controllers, you really don’t have any sense of feel anymore.”
Kuchenbecker is developing a haptic interface that would restore those lost sensations. Her prototype, built around the same surgical system Lee uses (made by a company called Intuitive Surgical), deploys accelerometers and some fancy wiring to transmit vibrations in the rods back to the surgeon’s fingertips.
“What surgeons have become accustomed to in open surgery,” she says, “is when they pull on a suture, they can feel the tension. When they cut tissue, they can feel it’s breaking through. If they cut a suture, they can tell if they cut a suture or they missed it. If they’re cutting tissue they may get a sense of is it healthy or is it diseased, as I’m interacting with it, or as I’m palpating or digging around, trying to look for something. And all of that haptic information is absent when they’re using the robot. They learn to compensate through vision, by what they can see.”
Her current model restores some (but not all) of this tactile feedback with a time delay of 1.6 milliseconds, or about three times faster than a honeybee can flap its wings.
Lee doubts this would make much of a difference for an expert surgeon. (He reckons he’s in the top five in the world in terms of prostate cancer cases done with the robot.) But he believes it would be valuable for surgeons learning how to do robotic laparoscopy. “The first few times you sit down at the robot, you want to reach your hand in there and touch it so you know where you are,” he says. “So surgeons who are experienced at open [radical surgery] and try to switch to the robot, they have a hard time sometimes because they lose that extra feedback.”
The feedback they’d get through Kuchenbecker’s vibration sensor isn’t the same thing as putting your fingers directly on the prostate, he adds, but it could be valuable in a different way. “In the robot surgery setting, because we’re working in a narrow space and you have sometimes three or four instruments, along with a camera, working in [that] space, you have a lot of potential instrument collisions. If you can feel, off-camera, that your instruments are bumping … that’s a place where a less experienced surgeon, if you don’t feel that at all, and start pushing, pushing—all of a sudden you could have this big release,” he says, jerking an imaginary scalpel tool through the air. “Whereas if you feel that right away, you know [that you’ve] got to back up and come in again. So I think there are a lot of benefits in helping a surgeon along that learning curve.”
Kuchenbecker was planning to run a study this summer to measure the effect of this haptic feedback on expert surgeons and trainees. “Maybe this could make it easier to become an expert,” she says. “Or maybe it just makes surgery less stressful, less cognitively intense … I liken it a lot to driving. If you’re driving eight hours a day, if your car was just a little more comfortable, or if your mirrors were just a little better aligned, or if you had better information from the car or a better connection between you and your car, maybe it would make that experience easier.
“Or maybe,” she says, “it can let experts reach a higher level of skill.”
That’s not idle speculation, says Lee, who mentions real-time elastography as an example of where robot-assisted surgery could be headed. “With traditional ultrasound, you just get a picture,” he explains. “But with elastography, it sends certain impulses, and then through mathematical calculations it can tell you how elastic the tissue is. So it could help you feel how elastic the tissue is—or feel hard areas within the prostate, maybe even better than what your fingers can feel.”
Kuchenbecker’s prototype “is the first generation of developing tactile feedback for the robot,” he adds. “But it could turn into a lot of different things where you develop sensors at the tip of your robot instruments that allow you to feel things or see things that you could never do [in] open [surgery]. So you could add all these extra tools and get information pumped to your eyes—and your fingers—as you’re doing the operation that you couldn’t dream of before.”
COVER STORY: Touching the Virtual Frontier By Trey Popp
©2010 The Pennsylvania
Last modified 6/30/10