“In the end, the way you change things in the world is by moving.”

That’s another one of Kuchenbecker’s unofficial mottos—and for most people, a banal fact of normal life. But for stroke survivors who develop apraxia, it is the defining impediment to normal life.

Apraxic stroke patients have difficulty planning and carrying out purposeful movements. They can see a cup of water on the table before them; they can think about grabbing it and taking a sip; but something invades the space between desire and action to foil their attempts. Their shoulder might swivel the wrong way. Their elbow may overextend, or scissor shut at the wrong moment.

Practice helps. Patients who manage to repeat such routine motions over and over can sometimes regain the ability to carry them out consistently. But showing them how to do it isn’t enough.

“These patients can’t interpret the visual feedback,” says Kuchenbecker. “It doesn’t help them to be able to see how they’re messing up.” They need to feel their way toward success.

It’s a daunting job for a physical therapist. Teaching someone how to relearn these motor skills involves countless repetitions—and providing too much physical help can undermine the process.

“From the videos we’ve watched of these patients,” Kuchenbecker says, “sometimes the therapists do actually push, and do the motion. But they’re trying to get the patient to do the movement themselves. They’re trying to get them to make the new connections in their brain, to explore and figure out, How can I get my arm to move in that way?”

Some researchers have experimented with planar robots—devices that can guide a patient’s hands along certain trajectories, mechanically pushing them in the right direction when they veer off track. But that leads to another catch.

“It turns out that having the robot help you in this way maybe makes you do a better job of the task right now, but it doesn’t transfer to real life, because the robot is doing it for you,” Kuchenbecker explains. “So we came up with this idea of a sleeve—and eventually, an entire suit—that would know how you’re moving and give you [tactile] guidance.”

As the spring semester wound down, one of her master’s students, Pulkit Kapur GME’10, demonstrated a prototype he had worked on with Kuchenbecker and a pair of clinical researchers at Philadelphia’s Moss Rehabilitation Research Institute. It was a tight-fitting sleeve embedded with sensors whose precise spatial relationships to one another can be monitored in real time by a magnetic tracking device, alongside small eccentric-mass motors (the same things that make your cell phone vibrate) that deliver little high-frequency buzzes to certain parts of the arm.

Plugged into a laptop, the sleeve tracks the arm movements of the person wearing it, translating the sensor data into a moving image of a virtual arm on the screen. Meanwhile, whenever the patient’s arm drifts away from its intended trajectory, one or more pager motors go off, signaling the error the way a therapist might—albeit with a high-frequency vibration instead of a gentle touch of the palm—to prod a self-directed correction.

“It has to be a little more fancy than a Wii remote because we actually need to know where’s my forearm, where’s my upper arm, where’s my torso, what are the joint angles?” Kuchenbecker says. “And then give them some feedback to help guide their motion, to help make the task more interesting and easier to do.”
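Reduced to its essentials, the guidance loop Kuchenbecker describes is simple to sketch. The fragment below is purely illustrative: the class name, the joint labels, and the ten-degree tolerance are assumptions made for the example, not details of the actual Penn and Moss software.

```python
# Illustrative sketch of the sleeve's guidance loop. Every name here
# (Motor, the joint labels, the 10-degree tolerance) is an assumption.

class Motor:
    """Stand-in for one eccentric-mass (pager) motor sewn into the sleeve."""
    def __init__(self, location):
        self.location = location

    def vibrate(self, intensity):
        print(f"buzz {self.location} motor at intensity {intensity:.2f}")

    def stop(self):
        pass  # a real driver would switch the motor off here

ERROR_TOLERANCE_DEG = 10.0  # assumed drift allowed before feedback fires

def guidance_step(current_angles, target_angles, motors):
    """Compare joint angles from the magnetic tracker against the intended
    trajectory and buzz the motor near whichever joint has drifted."""
    for joint, target in target_angles.items():
        error = current_angles[joint] - target
        if abs(error) > ERROR_TOLERANCE_DEG:
            # stronger buzz for larger error, capped at full intensity
            motors[joint].vibrate(min(1.0, abs(error) / 45.0))
        else:
            motors[joint].stop()

# One tick of the loop with made-up sensor readings, in degrees:
motors = {"shoulder": Motor("upper arm"), "elbow": Motor("forearm")}
guidance_step({"shoulder": 45.0, "elbow": 95.0},
              {"shoulder": 30.0, "elbow": 90.0},
              motors)
```

The same joint angles that trigger the buzzes also drive the virtual arm drawn on the laptop screen.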

“The goal is that this could be something that could be in a rehabilitation clinic,” says Kuchenbecker. While the $10,000 price ceiling set by her clinical collaborators might make the sleeve attractive for that setting, “for it to be truly, truly useful, it would be great if it was something a patient could take home with them, which is on the order of, rather than thousands of dollars, hundreds of dollars.”

“I’m personally interested in also testing athletes,” she adds. “For a stroke patient, they’re relearning motions that they used to know. Whereas an athlete or dancer is maybe trying to really push themselves beyond what’s typical.”

That prospect is several steps ahead of current capabilities. Getting there—and achieving the sort of sophistication that might really begin to change the game in robotic surgery—will hinge to some degree on figuring out how to go from buzzing someone’s arm with a pager motor to imparting more naturalistic sensations.

Pumping information about a tissue’s elasticity across the room to a surgeon’s fingers will require more subtlety and nuance than simulating a videogame bullet strike. After all, a gamer dodging virtual cannons and crossbows probably isn’t looking for strict verisimilitude.

The current advantage of things like pager motors is that they’re small, cheap, and easy to program. “But they’re not what I want to use in the long term,” says Kuchenbecker. “So we’ve been starting to develop what I call new tactors—tactile actuators that either make or break contact with your skin, or vibrate but in a more interesting way, a more natural way. Like, let’s record this thump for someone thumping your arm like this,” she says, rapping a fingertip against her forearm, “and play that thump, thump, thump so it’s more natural instead of this very high-frequency, annoying zzzzzz.”
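The contrast she draws, between a fixed-frequency buzz and a recorded, naturalistic thump, comes down to what signal drives the actuator. Here is a rough sketch of the playback side, with a synthetic decaying pulse standing in for a recorded tap; the sample rate, waveform shape, and send_sample hook are all assumptions for illustration.

```python
import math

# Illustrative sketch of vibrotactile playback: instead of running a pager
# motor at one fixed frequency, stream a short recorded waveform to the
# tactor. The decaying pulse below is a synthetic stand-in for a recorded tap.

SAMPLE_RATE = 2000  # Hz, assumed output rate of the tactor driver

def synthetic_thump(duration_s=0.05, freq_hz=80.0, decay=60.0):
    """A short decaying pulse shaped roughly like a fingertip tap."""
    n = int(duration_s * SAMPLE_RATE)
    return [math.exp(-decay * t / SAMPLE_RATE) *
            math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

def play(waveform, send_sample=print):
    """Stream the waveform one sample at a time; send_sample stands in
    for whatever call actually drives the actuator."""
    for sample in waveform:
        send_sample(round(sample, 3))

play(synthetic_thump()[:10])  # the opening samples of one "thump"
```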

This fall, she’s bringing a postdoctoral researcher to Penn who will focus on modular devices that can provide skin-stretch feedback. “So say you’re a transhumeral amputee, and I want you to be able to feel the elbow angle of your prosthesis without looking at it,” Kuchenbecker explains. “I could, like, put that [skin-stretch tactor] right on your upper arm so you could feel the extent that this little tactor is stretching your arm,” which would in turn enable an amputee to intuit the prosthetic limb’s spatial position.
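The skin-stretch idea reduces to a mapping from one joint angle to one tactor displacement. A minimal sketch, with the elbow travel and stretch range as placeholder numbers rather than measured values, might look like this.

```python
# Sketch of the mapping described above: a skin-stretch tactor on the upper
# arm whose displacement tracks the prosthetic elbow angle. The elbow travel
# and stretch range are placeholder numbers, not measured values.

ELBOW_RANGE_DEG = (0.0, 135.0)   # assumed prosthetic elbow travel
STRETCH_RANGE_MM = (0.0, 8.0)    # assumed tactor displacement limits

def elbow_to_stretch(elbow_deg):
    """Linearly map elbow angle to tactor displacement in millimeters."""
    lo, hi = ELBOW_RANGE_DEG
    frac = min(max((elbow_deg - lo) / (hi - lo), 0.0), 1.0)
    s_lo, s_hi = STRETCH_RANGE_MM
    return s_lo + frac * (s_hi - s_lo)

print(elbow_to_stretch(90.0))  # about 5.3 mm of stretch at a 90-degree elbow
```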

The underlying challenge is partly about advancing technology, and partly about understanding how our bodies and brains convert physical stimuli into sensations.

“For haptics,” Kuchenbecker observes, “we work on understanding the capabilities of the human sensing system so that we can try to take advantage of them, exploit them, or build on them.”

Which is just what Saurabh Palan was exploring with his Tactile Gaming Vest. It’s not terribly hard to tap into a computer game for data on what directions the bullets are flying from. The art comes in tricking someone into feeling something that doesn’t quite line up with physical reality. “You need to fool your body or your mind,” as Palan put it. And that’s exactly what he’d done to simulate the searing pain of a bullet entry. It turns out that placing a cold Peltier element right next to a hot one triggers an intense burning sensation without the slightest damage to the skin.

“The human central nervous system and peripheral nervous system evolved interacting with natural stimuli,” Kuchenbecker says. “Your brain is trying to construct the most likely explanation for the feedback it’s feeling … So 1,000 years ago or 2,000 years ago, your body probably would not have experienced a very warm something next to a very cold something. And so there’s this peculiar illusion where you can create a burning sensation because you’re stimulating the nerves in a way that they didn’t typically get stimulated.

“And now we can create all sorts of artificial stimuli that create contradictions, or exploit the underlying method of the sensing system,” she adds. “It’s all about, can we capture the feel of an interaction the same way that you can capture an appearance, and store the parts that are salient … and then can we recreate it, really realistically, for the user to experience later?”
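The trick behind the vest’s simulated bullet burn is what researchers call the thermal grill illusion, and at bottom it is just a spatial pattern of temperature setpoints. A minimal sketch, assuming hypothetical temperatures and deliberately ignoring the vest’s real hardware interface and safety limits, is below.

```python
# Sketch of the thermal-grill pattern behind the simulated bullet burn:
# interleave warm and cold Peltier elements so neighboring patches of skin
# feel hot and cold at once. The setpoints and the bare list of targets are
# assumptions; the vest's actual hardware interface is not described here.

WARM_C, COLD_C = 40.0, 20.0  # assumed setpoints, well away from injury

def thermal_grill_setpoints(n_elements):
    """Alternate warm and cold temperature targets across a row of elements."""
    return [WARM_C if i % 2 == 0 else COLD_C for i in range(n_elements)]

print(thermal_grill_setpoints(6))  # [40.0, 20.0, 40.0, 20.0, 40.0, 20.0]
```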

There is something at once exciting and unsettling about all of this. In the last 150 years, human beings have come to terms with the power of photography to preserve fleeting images for as long as we care to keep them. In the last 50, film and video have intensified that ability. We experience places without having visited them, remember events without having witnessed them. In our era of relentless documentation, intimate memories of wedding dances have a way of being supplanted by DVD versions viewed many times afterward, and children may remember their first home runs and ballet recitals more keenly in highlight-reel format than in subjective recollections of the experience itself. What is in store for us when our physical sensations can be distilled into portable and everlasting formats, to buy, sell, save, and replay whenever we like? It is a question that may be answered sooner than you think. The virtual world is coming ever closer. The day is coming when you will reach out and touch it.
 

COVER STORY: Touching the Virtual Frontier By Trey Popp
©2010 The Pennsylvania Gazette