Essay: Going for the soul's jugular

A new brain-imaging device may help to clarify the physical basis of free will.

March 22, 2007 12:26

Years ago, Woody Allen used to joke that he'd been thrown out of college as a freshman for cheating on his metaphysics final. "I looked within the soul of the boy sitting next to me," he confessed. Today, the joke is on us. Cameras follow your car, global-positioning systems track your cell phone, software monitors your Web surfing, X-rays explore your purse, and airport scanners see through your clothes. Now comes the final indignity: machines that can look into your soul.

With the aid of functional magnetic resonance imaging (fMRI), which tracks the activation of specific parts of the brain, neuroscientists have been hard at work on Allen's fantasy. Under controlled conditions, they can tell from a brain scan which of two images you're looking at. They can tell whether you're thinking of a face, an animal or a scene. They can even tell which finger you're about to move. But those feats barely scratch the brain's surface. Any animal can perceive objects and move limbs. To plumb the soul, you need a metaphysician.

John-Dylan Haynes, a researcher at Germany's Bernstein Center for Computational Neuroscience, is leading the way. His mission, according to the center, is to predict thoughts and behavior from fMRI scans. Haynes, a former philosophy student, is going for the soul's jugular. He's trying to clarify the physical basis of free will. "Why do we shape intentions in this way or another way?" he wonders. "Your wishes, your desires, your goals, your plans - that's the core of your identity." The best place to look for that core is in the brain's medial prefrontal cortex, which, he points out, is "especially involved in the initiation of willed movements and their protection against interference."

To get a clear snapshot of free will, Haynes designed an experiment that would isolate it from other mental functions. No objects to interpret; no physical movements to anticipate or execute; no reasoning to perform. Participants were put in an fMRI machine and were told they would soon be shown the word "select," followed a few seconds later by two numbers. Their job was to covertly decide, when they saw the cue, whether to add or subtract the unseen numbers. Then they were to perform the chosen calculation and punch a button corresponding to the correct answer. The snapshot was taken right after the "select" cue, when they had nothing to do but choose addition or subtraction.

Until this experiment, which was reported last month in Current Biology, nobody had ever tried to take a picture of free will. One reason is that fMRI is too crude to distinguish one abstract choice from another. It can only show which parts of the brain are demanding blood oxygen. That's too coarse to distinguish the configuration of cells that signifies addition from the configuration that signifies subtraction. So Haynes used software to help the computer recognize complex patterns in the data. To dissect human thought, the computer had to emulate it.

Each participant took the test more than 250 times, choosing independently in each trial. The computer then looked at a sample of the scans, along with the final answers that revealed what choices were made. It calculated a pattern and used it to predict, from each participant's remaining scans, his or her decisions in the corresponding trials. Haynes checked the predictions - add or subtract - against the answers. The computer got it right 71 percent of the time.

I know what you're thinking: Why would anyone want a machine to read his mind?
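For readers who want to see the shape of that procedure, here is a minimal sketch in Python of a generic train-and-test decoding analysis. It is an illustration only, not Haynes's actual pipeline: the classifier (scikit-learn's LogisticRegression), the feature counts and the synthetic data standing in for real scans are all assumptions made for the example.

```python
# Illustrative sketch only: a generic pattern classifier applied to
# pre-extracted fMRI features, not Haynes's actual analysis.
# Assumes each trial is summarized as a vector of voxel activations
# taken right after the "select" cue, labeled "add" or "subtract"
# according to the choice revealed by the participant's final answer.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: roughly 250 trials per participant, each
# reduced to 200 voxel features; real data would come from the scanner.
n_trials, n_voxels = 256, 200
labels = rng.integers(0, 2, size=n_trials)  # 0 = add, 1 = subtract
signal = np.outer(labels - 0.5, rng.normal(size=n_voxels))  # weak class-dependent pattern
scans = signal + rng.normal(scale=2.0, size=(n_trials, n_voxels))

# Train on a sample of the scans (with their revealed choices),
# then predict the decisions in the remaining trials.
train_X, test_X, train_y, test_y = train_test_split(
    scans, labels, test_size=0.5, random_state=0
)
classifier = LogisticRegression(max_iter=1000).fit(train_X, train_y)
predicted = classifier.predict(test_X)

# Check the predictions - add or subtract - against the answers.
print(f"decoding accuracy: {accuracy_score(test_y, predicted):.0%}")
```

The point is only the procedure the essay describes - fit a pattern on labeled scans, predict the held-out trials, then score the predictions against the answers - not the particular number a toy dataset happens to produce.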
But imagine being paralyzed, unable to walk, type or speak. Imagine a helmet full of electrodes, or a chip implanted in your head, that lets your brain tell your computer which key to press. Those technologies are already here. And why endure the agony of mental hunt-and-peck? Why not design computers that, like a smart secretary, can discern and execute even abstract intentions? That's what Haynes has in mind. You want to open a folder or an e-mail, and your computer does it. Your wish is its command.

But if machines can read your mind when you want them to, they can also read it when you don't. And your will isn't necessarily the one they obey. Already, scans have been used to identify brain signatures of disgust, drug cravings, unconscious racism and suppressed sexual arousal, not to mention psychopathy and the propensity to kill.

Haynes understands the objection to these scans - he calls it "mental privacy" - but he buys only half of it. He doesn't like the idea of companies scanning job applicants for loyalty or scanning customers for reactions to products (an emerging practice known as neuromarketing). But where criminal justice is at stake, as in the case of lie detection, he supports using the technology. Ruling it out, he argues, would "deny the innocent people the ability to prove their innocence" and would "only protect the people who are guilty."

I hear what he's saying. I'd love to have put Khalid Sheik Mohammed through an fMRI before September 11, 2001, instead of waiting six years for his confession. And I wish we'd scanned Mohammed Atta's brain before he boarded that flight out of Boston.

But what Haynes is saying - and exposing - is almost more terrifying than terrorism. The brain is becoming just another accessible body part, searchable for threats and evidence. We can sift through your belongings, pat you down, study your nude form through your clothes, inspect your body cavities and, if necessary, peer into your mind. Using fMRI is just the first stage. Electrodes, infrared spectroscopy and subtler magnetic imaging are next. Scanners will shrink. Image resolution and pattern-recognition software will improve.

But don't count out free will. To make human choice predictable, you first have to constrain it so it's not really free. That's why Haynes confined his participants to arithmetic, gave them only two options and forbade them to change their minds. They could have wrecked his experiment by defying any of those conditions. So could you, if somebody came at you with a scanner or an electrode helmet. To look into your soul and get the right answer, science, too, has to cheat.

Somewhere, Woody Allen is laughing. I can feel it.

The writer covers science and technology for Slate, the online magazine at www.slate.com - The Washington Post
