Truth or Consequence
A French TV crew has taken over one of the radiology labs at the Hospital of the University of Pennsylvania. They are here to film a simulation, ironically enough, of Dr. Daniel Langleben’s latest experiments on deception.

“I’ll have to ask you to hide somewhere or wait outside,” says the TV reporter to everyone who is not needed inside the snug space for safety or dramatic purposes, including the scientist himself. We creep off to a nearby waiting room, where Langleben, an assistant professor of psychiatry, tells me about a model for recognizing deception that could prove more promising than the polygraph.

When subjects placed inside an fMRI (functional magnetic resonance imaging) scanner were asked to lie about a playing card they held (a standard paradigm known as the “Guilty Knowledge Test”) and offered a $20 reward, two areas of the brain—the anterior cingulate and left prefrontal cortex—became more active. By comparison, no parts of the brain became more active when they were telling the truth. “Deception requires extra effort,” Langleben says. “Truth is the default.”

Skillful liars can cheat a polygraph, which measures only physiological changes such as heart rate and perspiration. But according to Langleben, “Brain activity is much harder to control.”

“I’m going to predict a TV show for you,” says Caplan, who has a keen sense of the entertainment value in over-the-top prognostications. “The show will be called Liar Liar. It’s going to be on the air within seven years. On the show, which will be hosted by Jerry Springer’s son, couples will come on and make allegations of infidelity or embezzlement or stealing against one another, and then a scanning machine will be brought on and the head stuck inside it and the host will render the verdict about whether they’re telling the truth or not.”

It’s quite likely that Langleben would change the channel. In fact, when asked about fMRI’s potential as a lie detector, he’d rather come up with reasons why it’s not ready for prime time. He has a grant from a Defense Department-linked company to look for flaws in the model.

“If you ever want to use it appropriately, you will have to test it in the target population,” he says. “Let’s say it works for hiding a card in normal college students. That does not mean it’s applicable to a population of middle-aged convicts.” Furthermore, he notes there is “a huge difference between average group effects and individual effects. What we’re showing so far is group effects. What needs to be shown is that this technique can be accurate in single subjects within a single session. The average is never applicable to the individual until the range of the effect has been tested.”

In contrast to Langleben, Iowa scientist Lawrence Farwell is promoting a method of “brain fingerprinting” that has been used in two criminal investigations. Using electroencephalograms (EEGs), Farwell claims he can record electric signals from the brain while a subject is exposed to sounds, words, or images on a computer screen to show the existence or absence of specific memories related to a particular crime—and thus determine a person’s guilt or innocence. His company promotes the system for forensic examinations and screening for security leaks and terrorists.

“I view this as very scary,” says Dr. Kenneth Foster, a professor of bioengineering at Penn who focuses on ethical issues surrounding the use of technology [“Science Meets Society,” February 1998] and cowrote with Wolpe and Caplan an article for IEEE Spectrum on developments like Farwell’s. “This needs to be examined carefully before precedents get established for using it.”

“This is an example of a dangerous use of this sort of thing,” Langleben adds, pointing out that Farwell “doesn’t publish” and is “not part of an academic community.” The good thing about the fMRI machine used in Langleben’s own studies is that it’s “in the hands of the medical community,” he adds. “There’s no way this machine would be in the hands of non-MDs.” (MRI monitors changes in the body’s blood oxygen levels through the use of a powerful magnetic field and radio-frequency pulses, while the subject remains still inside a protective chamber. Functional MRI is an enhanced use of existing MRI technology that provides the necessary resolution and speed to visualize brain function.)

The airport sensor described by Britton Chance wouldn’t require a medical doctor to operate, however, since it’s noninvasive and works with infrared light to detect areas of increased blood flow and altered metabolism. Chance predicts the device itself could be ready for security tests in two years, though he concedes that more data is needed to measure what he calls “malevolent intent.”

Chance pulls out a pictorial representation of one of his experiments. It’s a page covered with two rows of ovals that represent voxels, or different areas of the forebrain. Each one is differently colored, with red signifying areas of greatest activation. “This is someone who had learned to solve these problems over [many] weeks, so that now, instead of chaotically searching for the solution, they have trained neural networks in this region to function uniquely here.”

Over the past six summers, minority high-school students have taken part in research at Chance’s lab that has provided “an enormous database” on high-school student cognition. Subjects sit before a computer screen wearing a headband-like device, known as a cognosensor. They must determine whether a nonsensical jumble of letters, such as ardma, can be reordered to create an English word (drama). While they are responding to each problem, their brain activity is monitored by the cognosensor, which is studded with emitters that strike tissue or blood vessels in the brain with near-infrared light and detectors that measure the light reflected back.

Since September 11, Chance’s lab has also looked into emotional disturbances. Among other things, the subjects view pictorial displays of angry, happy, or unkind faces as well as pictures of the World Trade Center collapse.

“Each one has their own particular [response] pattern. Just like fingerprints are different, these—perhaps you could call them ‘brain prints’—are expectedly different,” Chance says. “Different voxels. Different intensities of response.”

Chance has also done experiments on deception with the high-school students and with visiting medical students from Pakistan’s Aga Khan University. “When you lie, there are larger signals over a wide area. It may be that the central conscience, which is pricked by telling a lie, is not localized. Just where the social conscience is located, we haven’t determined and suspect that it may be different for different individuals.”

The results are preliminary. “We haven’t studied corporate executives or poker players or ladies of the night, whose job it may be to professionally prevaricate. So we don’t have access to those. Maybe we should take a trip to Atlantic City.”

Chance didn’t give a specific definition of what he means by malevolent intent—the emotional state he proposes to monitor at airports—simply saying it may or may not include deception. “We’re dealing with a characteristic which may be a unique characteristic of an individual,” he says, adding, “We do know that the signals are bigger when there is emotional stress than when there isn’t.”

Involuntary screening is “ethically problematic,” says Art Caplan. On the other hand, “There is no right to fly. If you’ve bought a ticket to board an aircraft, then you can decide whether or not to go through the rituals that get you on that aircraft. Outside of emergencies, you have a choice about flying and therefore about being screened. In other words, someone who was frightened that their malevolent intent would be detected can turn around and go away.”

Paul Wolpe predicts that the public would never stand for this kind of airport screening and argues that it would be impossible to accurately gauge something like malevolent intent. “We don’t know if it’s nervousness because you’re about to plant a bomb in the plane or nervousness because you’re scared to fly. And if you’re a really well-trained terrorist, you’re not going to exhibit.”

Chance acknowledges the need to separate fright from malevolent intent and to test a larger population, adding that he, too, is “deeply concerned” about the ethics of opening this window on the brain. “We have no idea where new technologies are going to take us. So let’s be conservative” in applying them.

Nevertheless, research continues apace in his busy lab. A different project that has grabbed the attention of the military is the development of a disposable, telemetered cognosensor that would fit on the forehead, under a soldier’s helmet, and allow commanders to remotely monitor soldiers’ fatigue on the battlefield. “We know the tired brain doesn’t work as well as the fresh brain,” Chance says. “Just at what point the brain becomes less efficient is the object of future studies.” He also hopes to use similar technology to monitor people for signs of post-traumatic stress at disaster scenes.

2004 The Pennsylvania Gazette
Last modified 01/19/04

Who’s Minding the Brain?
By Susan Frith
Illustration by Jon Sarkin
