“You cannot cause effects in the world without physically touching things.”

As professional credos go, that’s a pretty mundane one. But coming from Kuchenbecker it has an unusual subtext. For one thing, she works in a discipline whose sights have long been set on eliminating the need for people to physically interact with things. Roboticists by and large still hew to a Jetsons-style vision of the future. Their promised land is one where machines unload the dishwasher, cars drive themselves, and there’s no need to give soldiers a virtual preview of bullet wounds because androids will be manning the trenches.

“Here in GRASP,” as Kuchenbecker puts it, referring to Penn’s General Robotics, Automation, Sensing and Perception Lab, “there are many folks who work on autonomous robots. How do I make a robot that can do stuff on its own? And I am working on that. But personally, I think that’s a rather far-off goal in the domains that I am interested in.”

What interests her is the realm of touch and movement. If the stereotypical engineering professor is an eggheaded genius who makes Fourier transforms look easy but hopscotch look hard, Kuchenbecker doesn’t fit the type. She played volleyball at Stanford. She takes dance classes in the Pottruck gym. “I pretend to be a graduate student,” she laughs. She doesn’t have to pretend very hard. She’s not much older than a lot of them, and probably fitter than most. Her athletic pursuits also happen to line up nicely with her academic research, which focuses on the intersection of technology and the human body.

“The area that I focus on is robotic technology to help a user do a task,” she continues, “or make an interaction that they’re having with some sort of technology, like a computer, more interesting, more immersive, [to] let them be able to do what they’re trying to do better. And so most—let’s say all—projects in my lab include either a human interaction with something, or touch-based interaction on the robotic side.”

That’s the other thing that makes her statement about physically touching things a little strange. The more she talks about her research, the fuzzier the definition of touching becomes. Not to mention things. Haptic interfaces, as she describes them in the syllabus of a graduate-level class she teaches, “employ specialized robotic hardware and unique computer algorithms to enable users to explore and manipulate simulated and distant environments.”

Haptic technology has a history that goes back a few decades. The controls in modern aircraft, for example, incorporate some sort of tactile feedback; nothing grabs a pilot’s attention like a shaking joystick. When flight controls were mechanically linked to wing flaps and so forth, things like that happened somewhat naturally. When computerization severed that link, engineers turned to haptic interfaces to replace the lost sensory stimuli with simulated equivalents. The idea—in airplanes, cars, and every other field haptics touches—is to improve and enrich the connection between a person and a machine, making its operation as intuitive as possible.

As more and more of our daily activities migrate to the digital domain, haptic technology is entering another phase.

“This is a very hot area, because we live in two worlds,” says Eduardo Glandt GCh’75 Gr’77, dean of the engineering school. “We live in the real, physical world, and we also live online—we live in the virtual world of the Internet and computers. It’s surprising how much our life now is in that other world. People play and study and shop and find friends, and everything happens virtually. Haptics is the interface. It’s the way the two worlds touch.”

Increasingly, it will also be an interface that connects one realm of the physical world with another. One of the environments currently at the center of Kuchenbecker’s research is the inside of the human body. She wants to enable surgeons who slice and stitch using robot-assisted laparoscopic devices to actually feel what’s at the tips of their instruments. Moreover, she wants trainees to be able to experience those sensations without ever entering someone’s abdomen.

“The way that surgeons learn is actually barbaric,” she says. “Don’t tell the surgeons I said that. They would say, maybe, primitive. But it’s scary if you’re the patient, because a trainee watches an expert do a procedure a couple times, and has read about it in a book, and then they try it and someone watches them. And if they mess up they’re chastised, they’re corrected. But it’s a very high-stakes, high-pressure environment to practice in.”

A high-fidelity virtual reproduction of that environment, lifelike down to the textural differences between healthy tissue and tumors, would make for a safer training ground.

“Surgeons watch movies of people doing surgery,” she goes on. “Well, what if you could watch it and also feel what the surgeon was feeling? I think there’s a benefit there. But no one has any idea. They’ve never done it before.”

The technologies being developed in the Haptics Lab, though fragmented and very much in their infancy, are steps toward a first attempt. One that may prove foundational for the field is a project that Kuchenbecker has been working on with a PhD candidate named Joe Romano GEng’10.
 

July|August 2010

COVER STORY: Touching the Virtual Frontier By Trey Popp
©2010 The Pennsylvania Gazette

Katherine Kuchenbecker drags a stylus across a flat-screen monitor. Computer algorithms trigger the attached motors to produce a tactile illusion that she’s scraping over one of the textured materials to her right. Photo by Candace diCarlo

 
