SEAS students hack gaming device to help visually impaired

Junior SEAS students Eric Berdinis and Jeff Kiske work on Kinecthesia, a belt-worn camera system that gives users feedback through directional vibrations.

Despite amazing advances in computers and cameras, people with serious visual impairments are often aided by the most basic technology imaginable: a cane.

Earlier this year, juniors Eric Berdinis and Jeff Kiske, both computer engineering majors in the School of Engineering and Applied Science, hacked together a high-tech upgrade for the visually impaired out of off-the-shelf video game equipment. Called the Kinecthesia, it’s a belt-worn camera system that gives users feedback about their immediate surroundings through directional vibrations.

Although it’s fresh out of the workshop, the Kinecthesia is already generating buzz: it was selected as one of 10 projects for Google’s Zeitgeist Young Minds conference, which highlights college-aged innovators.     

Berdinis and Kiske started the project as their final assignment in professor Rahul Mangharam’s embedded systems class. Tasked with creating a medical device, the duo began exploring the Microsoft Kinect, a motion-sensing video game controller that uses a camera and an infrared depth sensor to translate a player’s real-life motions into actions on the screen.

“We saw that there wasn’t much in the way of assistive devices that had to do with vision, despite all of these new cameras and things like the Kinect,” says Kiske. “We just thought it looked cool and started playing around with it.”

Recognizing that the Kinect’s ability to translate environmental depth into digital information could power a high-tech successor to the walking cane, the team began figuring out how to integrate the technology into a wearable device. The first step was getting the cameras to talk to the BeagleBoard, a miniature, customizable computer at the heart of the system.

Kinecthesia attached around a model human waist.

“The Kinect wasn’t intended to work with anything but the Xbox, so modifying the code to make it work on this processor was one of the biggest challenges,” says Berdinis.

Though the Kinect is great at determining how far away objects are, another challenge was deciding how to relay that information to the user.

“We didn’t want to overwhelm the user with audio cues or vibration motors all across the waist,” says Berdinis. “Through trial and error, we found that three buzzer zones was the right amount.”

The three buzzers, positioned left, right and center, begin vibrating once an object comes close enough to potentially impede the user, and vibrate more intensely as the object gets closer.
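The core idea described here — splitting each depth frame into left, center, and right zones and scaling vibration with proximity — can be sketched in a few lines. The article doesn’t publish the team’s actual code, so the function name, the distance thresholds, and the use of NumPy below are illustrative assumptions, not the Kinecthesia’s real implementation:

```python
import numpy as np

# Hypothetical thresholds, in millimeters (the Kinect reports depth in mm).
NEAR_LIMIT_MM = 500    # closer than this -> full-strength vibration
FAR_LIMIT_MM = 2000    # farther than this -> no vibration

def zone_intensities(depth_frame):
    """Map a 2-D depth frame to vibration intensities (0.0-1.0)
    for the left, center, and right buzzer zones."""
    # Split the frame into three vertical strips: left, center, right.
    zones = np.array_split(depth_frame, 3, axis=1)
    intensities = []
    for zone in zones:
        valid = zone[zone > 0]          # a reading of 0 means "no data"
        if valid.size == 0:
            intensities.append(0.0)
            continue
        nearest = valid.min()
        # Linearly scale: the nearer the obstacle, the stronger the buzz.
        level = (FAR_LIMIT_MM - nearest) / (FAR_LIMIT_MM - NEAR_LIMIT_MM)
        intensities.append(float(np.clip(level, 0.0, 1.0)))
    return intensities
```

A frame with an obstacle only on the user’s left would then drive just the left buzzer, with the other two zones staying quiet until something enters their range.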

Berdinis and Kiske will continue to work on the Kinecthesia; connections made through the Google conference have enabled them to work with the visually impaired community and further refine their system into what could be a life-changing product.

Originally published on October 20, 2011