TOP MENU
A disturbing report from the front lines of the war on cancer.
"Identifying Child Molesters."
Sometimes beauty is just business.
Who really rescued General Motors?
Why do we pay our stars so much money?
Social media can't provide what social change has always required.
Why is it so difficult to develop drugs for cancer?
It was a dazzling feat of wartime espionage.
How much people drink may matter less than how they drink it.
Do entrepreneurs succeed without risk?
Atticus Finch and the limits of Southern liberalism.
Banks, battles, and the psychology of overconfidence.

Is free the future?
The real genius of Steve Jobs.
When underdogs break the rules.

How do we hire when we can't tell who's right for the job?
Can underprivileged outsiders have an advantage?
Why do we equate genius with precocity?
Who says big ideas are rare?
What I.Q. doesn't tell you about race.
Criminal profiling made easy.
Enron, intelligence, and the perils of too much information.
What if you built a machine to predict hit movies?
Comment
What's behind Ireland's economic miracle--and G.M.'s financial crisis?
When it comes to athletic prowess, don't believe your eyes.
What is the difference between choking and panicking? Why are there dozens
Why problems like homelessness may be easier to solve than to manage.
What pit bulls can teach us about profiling.
The social logic of Ivy League admissions.
How Rick Warren built his ministry.
Project Delta aims to create the perfect cookie.
The bad idea behind our failed health-care system.

Is pop culture dumbing us down or smartening us up?

In "Collapse, " Jared Diamond shows how societies destroy
Mammography, air power, and the limits of looking.
Should a charge of plagiarism ruin your life?
The Man in the Gray Flannel Suit put the war behind him. Why can't we?
How to think about prescription drugs.
Employers love personality tests. But what do they really reveal?
Mustard now comes in dozens of varieties. Why has ketchup stayed the same?
Fifty years ago, the mall was born. America would never be the same.
How the S.U.V. ran over automotive safety.
The paradoxes of intelligence reform.
What does 'Saturday Night Live' have in common with German philosophy?
The great Chicago heat wave, and other unnatural disasters.
Can you read people's thoughts just by looking at them?
Are smart people overrated?
Big business and the myth of the lone inventor.
How Nassim Taleb turned the inevitability of disaster into an investment strategy.
Looking for method in the mess.
What Stanley H. Kaplan taught us about the SAT.
The disposable diaper and the meaning of progress.
If you are wondering what to worry about when it comes to biological
How far can airline safety go?
One of the most striking aspects of the automobile industry is the precision
To beat the competition, first you have to beat the drug test.
In the early morning of July 7th, the thirty-year-old publicist Lizzie Grubman
How caffeine created the modern world.
Sumner Redstone and the rules of the corporate memoir.
Millions of people owe their lives to Fred Soper. Why isn't he a hero?
How the fight to make America's highways safer went off course.
Fast food is killing us. Can it be fixed?
Why your bosses want to turn your new office into Greenwich Village.
Out of the Frying Pan, Into the Voting Booth
Ron Popeil and the conquest of the American kitchen.
Why some people choke and others panic.
Every now and again in politics, there is a moment that captures the temper
What do job interviews really tell us?
The T-shirt trade becomes a calling.
What the co-inventor of the Pill didn't know about menstruation can endanger women's health.
Do our first three years of life determine how we'll turn out?
Don't believe the Internet hype: the real E-commerce revolution happened
The President came to Manhattan last Tuesday to speak at the United
How the Information Age could blow away the blockbuster.
What do Wayne Gretzky, Yo-Yo Ma, and a brain surgeon named Charlie Wilson have in common?
Is the Belgian Coca-Cola hysteria the real thing?
"Melrose Place," 1992-1999, R.I.P.
Hair dye and the hidden history of postwar America.
Last week, New York City began confiscating the automobiles of people caught
Is the hectic pace of contemporary life really to blame for A.D.D.? Not so fast.
Sagaponack Homeowners Association vs. Ira Rennert
Science and the Perils of a Parable
We all know that underdogs can win - that's what the David versus
She's a grandmother, she lives in a big house in Chicago, and you've never heard of her. Does she run the world?
Judith Rich Harris and child development.
Are our spin meisters just spinning one another?
It's not just the Indians who fetishize nukes.
If it hadn't gone down, the Titanic might have gone up in smoke.
Can we learn how to lose weight from one of the most obese people in the world?
Seven bodies buried in the Arctic tundra might solve the riddle of the worst flu pandemic in history.
What America's most popular pants tell us about the way guys think.
How wrong is Dr. Susan Love?
Why blacks are like boys and whites are like girls.
Who decides what's cool? Certain kids in certain places.
Why do some people turn into violent criminals?
The American shopper has never been so fickle.
save the life of a coma patient like the Central Park
family's experience, why West
When the means justify the ends.
Who can be blamed for a disaster like the Challenger explosion?


GO TO TOP MENU

  A disturbing report from the front lines of the war on cancer.

  1.

  In the fall of 1963, not long after Vincent T. DeVita, Jr., joined the National Cancer Institute as a clinical associate, he and his wife were invited to a co-worker's party. At the door, one of the institute's most brilliant researchers, Emil Freireich, presented them with overflowing Martinis. The head of the medical branch, Tom Frei, strode across the room with a lab technician flung over his shoulder, legs kicking and her skirt over her head. DeVita, shocked, tried to hide in a corner. But some time later the N.C.I.'s clinical director, Nathaniel Berlin, frantically waved him over. Freireich, six feet four and built like a lineman, had passed out in the bathtub. Berlin needed help moving him. "Together, we pulled him up, threw his arms over our shoulders, and dragged him out through the party,"

DeVita writes, in his memoir, "The Death of Cancer"

(Sarah Crichton Books). "Out front, Freireich's wife, Deanie, sat behind the wheel of their car. We tossed Freireich in the backseat and slammed the door."

  Half a century ago, the N.C.I. was a very different place. It was dingy and underfunded--a fraction of its current size--and home to a raw and unruly medical staff. The orthodoxy of the time was that cancer was a death sentence: the tumor could be treated with surgery or radiation, in order to buy some time, and the patient's inevitable decline could be eased through medicine, and that was it. At the N.C.I., however, an insurgent group led by Frei and Freireich believed that if cancer drugs were used in extremely large doses, and in multiple combinations and repeated cycles, the cancer could be beaten. "I wasn't sure if these scientists were maniacs or geniuses,"

DeVita writes. But, as he worked with Freireich on the N.C.I.'s childhood-leukemia ward--and saw the fruits of the first experiments using combination chemotherapy--he became a convert.

  DeVita decided to try the same strategy on another seemingly hopeless cause, Hodgkin's lymphoma, a cancer that begins as a solid tumor in the lymph nodes and steadily spreads throughout the body. He teamed up with a fellow-associate named Jack Moxley. Over a few beers one night, at Au Pied de Cochon in Georgetown, the two sketched out a protocol, based loosely on what Frei and Freireich were doing with leukemia. Given the ability of cancer cells to adapt and mutate in the face of threats, they figured they needed four drugs, each effective against Hodgkin's in its own way, so that whatever cells survived one wave had a chance of being killed by the next. They also had to be careful how frequently they gave the drugs: doses needed to be high enough to wipe out the cancer cells but not so high that they killed the patient. After several months, they settled on a regimen called MOMP: three eleven-day rounds of nitrogen mustard, Oncovin (a brand of vincristine), methotrexate, and prednisone, interspersed with ten-day recovery cycles.

  "The side effects were almost immediate,"

DeVita writes:

  The sound of vomiting could be heard along the hallway. Night after night, Moxley and I paced outside the rooms of our patients, fearful of what might happen. Over the weeks that followed, they lost weight and grew listless, and their platelet counts sank lower and lower to dangerous levels.

  Then came the surprise. Twelve of the fourteen patients in the initial trial went into remission--and nine stayed there as the months passed. In most cases, the tumors disappeared entirely, something that had never before been seen in the treatment of solid tumors. In the spring of 1965, DeVita went to Philadelphia to present the results to the annual meeting of the American Association for Cancer Research. He stood up before the crowd and ran triumphantly through the data: "'Our patients were, therefore,' I said, savoring the dramatic conclusion, 'in complete remission.'"

  What happened? An illustrious cancer expert named David Karnofsky made a narrow point about the appropriateness of the term "complete remission."

After that, nothing: "There were a few perfunctory questions about the severity of the side effects. But that was it."

History had been made in the world of cancer treatment, and no one seemed to care.

  Vince DeVita served as the head of the National Cancer Institute from 1980 to 1988. He went on to serve as the physician-in-chief of the Memorial Sloan Kettering Cancer Center, in New York, and then ran the Yale Cancer Center, in New Haven. For the past half century, he has been at the forefront of the fight against one of the world's most feared diseases, and in "The Death of Cancer"

he has written an extraordinary chronicle. DeVita's book is nothing like Siddhartha Mukherjee's magisterial "The Emperor of All Maladies."

Mukherjee wrote a social and scientific biography of the disease. DeVita, as befits someone who spent a career at the helm of various medical bureaucracies, has written an institutional history of the war on cancer. His interest is in how the various factions and constituencies involved in that effort work together--and his conclusions are deeply unsettling.

  2.

  When his first go-round as a clinical associate at the N.C.I. was up, DeVita took a post as a resident at Yale. At what was supposed to be a world-class hospital, he discovered that the standard of care for many cancers was woefully backward. Freireich had taught DeVita to treat Pseudomonas meningitis in leukemia patients by injecting an antibiotic directly into the spinal column--even though the drug's label warned against that method of administration. That was the only way, Freireich believed, to get the drug past the blood-brain barrier. At Yale, DeVita writes, "you just didn't do that kind of thing. As a result, I watched leukemic patients die."

Leukemia patients also sometimes came down with lobar pneumonia. Conventional wisdom held that that ought to be treated with antibiotics. But N.C.I. researchers had figured out that the disease was actually a fungal infection, and had to be treated with a different class of drug. "When I saw this condition in patients with leukemia and pointed it out to the chief of infectious diseases at Yale, he didn't believe me--even when the lab tests proved my point,"

DeVita continues. More patients died. Leukemia patients on chemotherapy needed platelets for blood transfusions. But DeVita's superiors at Yale insisted there was no evidence that transfusions made a difference, despite the fact that Freireich had already proved that they did. "Ergo, at Yale,"

DeVita says, "I watched patients bleed to death."

  Later, when DeVita and his fellow N.C.I. researcher George Canellos wanted to test a promising combination-chemotherapy treatment for advanced breast cancer, they had to do their trial overseas, because they couldn't win the coöperation of surgeons at either of the major American cancer centers, Memorial Sloan Kettering or M. D. Anderson. When the cancer researcher Bernard Fisher did a study showing that there was no difference in outcome between radical mastectomies and the far less invasive lumpectomies, he called DeVita in distress. He couldn't get the study published. "Breast surgeons made their living doing radical or total mastectomies, and they did not want to hear that that was no longer necessary,"

DeVita writes. "Fisher had found it difficult to get patients referred to his study, in fact, because of this resistance."

The surgeons at Memorial Sloan Kettering Cancer Center were so stubborn that they went on disfiguring their patients with radical mastectomies for years after Fisher's data had shown the procedure to be unnecessary. "The Death of Cancer"

is an angry book, in which one of the critical figures in twentieth-century oncology unloads a lifetime of frustration with the obduracy and closed-mindedness of his profession. DeVita concludes, "There are incredibly promising therapies out there. If used to their fullest potential for all patients, I believe we could cure an additional 100,000 patients a year."

He is not the first to point out the shortcomings of clinical practice, of course. What sets "The Death of Cancer"

apart is what he proposes to do about it.

  3.

  After DeVita was rebuffed at the American Association for Cancer Research meeting, he and Moxley went back to the drawing board. They needed to do more than push patients into remission.

  Their first step was to alter the combination of drugs in their protocol, replacing methotrexate with a newer compound called procarbazine. Next, they reëxamined the schedule of treatment. Combination chemotherapy is a delicate balancing act. Cancer drugs are typically so toxic that they can be given only in short bursts, so that patients can regain their strength. If the breaks are too long, though, the cancer comes roaring back. In the first trial, they had simply followed the schedule that Freireich used in treating leukemia. Hodgkin's cells, however, were different. They divided more slowly--and, since cancer cells are most vulnerable when they are dividing, that suggested that the Hodgkin's schedule needed to be a lot longer.

  So MOMP became MOPP: two full doses of nitrogen mustard and vincristine on the first and the eighth days, and daily doses of procarbazine and prednisone for fourteen days, followed by two weeks of rest. Since only twenty per cent of Hodgkin's cells would divide during the course of that cycle, the regimen would have to be repeated at least six times. A second trial was launched, and the outcome was unequivocal: the regimen had beaten the disease.

  When the new results were published, in 1970, the response was better, but there was still considerable resistance. A crucial presentation at Memorial Sloan Kettering was met with "tepid"

applause, after which one oncologist after another got up to complain that MOPP didn't work. DeVita was told that his data must be wrong.

  Baffled, he asked one of the hospital's leading oncologists, Barney Clarkson, to explain exactly how he was administering the MOPP protocol. Clarkson answered that he and his colleagues had decided to swap the nitrogen mustard in DeVita's formula for a drug called thiotepa. This was a compound they had developed in-house at Memorial Sloan Kettering and felt partial to. So MOPP was now TOPP. DeVita writes:

  They'd also cut the dose of procarbazine in half, because it made patients nauseous. And they'd reduced the dose of vincristine drastically because of the risk of nerve damage. They'd also added, at a minimum, an extra two weeks between cycles so that patients would have fully recovered from the toxic effects of the prior dose before they got the next. They gave no thought to the fact that the tumor would have been back on its feet by then, too, apparently.

  These alterations had not been tested or formally compared with DeVita's original formula. They were simply what the oncologists at Memorial Sloan Kettering felt made more sense. After an hour, DeVita had had enough:

  "Why in God's name have you done this?"

he asked.
A voice piped up from the audience. "Well, Vince, most of our patients come to us on the subway, and we don't want them to vomit on the way home."

  Here were physicians at one of the world's greatest cancer hospitals denying their patients a potentially life-saving treatment because their way felt better. Stories like this are why DeVita believes that a hundred thousand cancer patients in the United States die needlessly every year. The best innovations are sometimes slow to make their way into everyday medical practice. Hence the sustained push, in recent years, toward standardizing treatments. If doctors aren't following "best practices,"

it seems logical that we should write up a script describing what those best practices are and compel them to follow it.

  But here "The Death of Cancer"

takes an unexpected turn. DeVita doesn't think his experience with the stubborn physicians at Memorial Sloan Kettering or at Yale justifies greater standardization. He is wary of too many scripts and guidelines. What made the extraordinary progress against cancer at the N.C.I. during the nineteen-sixties and seventies possible, in his view, was the absence of rules. A good illustration was Freireich's decision to treat Pseudomonas meningitis by injecting an antibiotic directly into the spinal fluid. DeVita writes:

  The first time Freireich told me to do it, I held up the vial and showed him the label, thinking that he'd possibly missed something. "It says right on there, 'Do not use intrathecally,'"

I said. Freireich glowered at me and pointed a long bony finger in my face. "Do it!"

he barked. I did it, though I was terrified. But it worked every time.

  Clinical progress against a disease as wily and dimly understood as cancer, DeVita argues, happens when doctors have the freedom to try unorthodox things--and he worries that we have lost sight of that fact. By way of example, he tells the story of a friend of his, Lee, who was diagnosed with advanced prostate cancer at the age of sixty. According to the practice guidelines, the best option for Lee was androgen-deprivation therapy, or A.D.T., which slows down the cancer cells by denying them testosterone. That's what Lee's doctor recommended. DeVita understood why: there are strong incentives--like the threat of malpractice suits--for doctors to adhere to treatment protocols. But DeVita judged that Lee's cancer was so aggressive that A.D.T. would buy him only a short reprieve. The guidelines limited Lee's treatment options at a moment when he needed maximum flexibility.

  "Over the years, we've gained more tools for treating cancer, but the old ability to be flexible and adapt has disappeared,"

DeVita writes:

  Guidelines are backwards looking. With cancer, things change too rapidly for doctors to be able to rely on yesterday's guidelines for long. These guidelines need to be updated frequently, and they rarely are, because this takes time and money. . . . Reliance on such standards inhibits doctors from trying something new.

  DeVita's first thought was to get Lee enrolled in a pioneering trial at the Mayo Clinic, where surgeons were removing the prostate along with all surrounding lymph nodes. Fifteen per cent of patients who underwent the procedure survived free of disease. The Mayo doctors wouldn't operate on Lee, however. His cancer was too advanced. So DeVita found someone who would. "I can be very persuasive,"

he writes. Then he managed to get Lee enrolled in an experimental-drug trial for relapsed prostate-cancer patients--only to discover that the study's protocol called for treatment to end after a fixed number of doses. DeVita felt that Lee needed a much longer course. Lee sought an exemption from the rules of the study, which required a judgment from the hospital's institutional review board. The lead investigator declined to take it up. DeVita was devastated, though hardly surprised. The system was built to be inflexible.

  DeVita's struggle to keep his friend alive goes on for years. He finagles his way into one experimental trial after another. He improvises. He works his contacts. Finally, with Lee at the end of the line, DeVita hears of an experimental drug called abiraterone. But he can't get Lee into the trial: the study's protocol forbids it. DeVita tries to find his way around the rules and fails--and he's heartbroken when he learns, after Lee finally succumbs to the disease, that abiraterone is so effective against advanced prostate cancer that the trial is stopped in mid-course and the patients in the control group are switched over to the new drug. "I could have told you a story with a happy ending,"

DeVita writes, speaking of what he is sure was his friend's premature death. "I instead chose to tell you one that could have had a happy ending because it illustrates what has been, for me, a source of perennial frustration: at this date, we are not limited by the science; we are limited by our ability to make good use of the information and treatments we already have."

  4.

  Here we have a paradox. The breakthroughs made at the N.C.I. in the nineteen-sixties and seventies were the product of a freewheeling intellectual climate. But that same freewheeling climate is what made it possible for the stubborn doctors at Memorial Sloan Kettering to concoct their non-cure. The social conditions that birthed a new idea in one place impeded the spread of that same idea in another. People who push for greater innovation in the marketplace often naïvely assume that what is good for the innovator is also, down the line, good for the diffusion of their ideas. And people worried about diffusion often position themselves as the friends of innovation, as if a system that does well at spreading good ideas necessarily makes it easier to come up with good ideas. The implication of "The Death of Cancer"

is, on the contrary, that innovation and diffusion can sometimes conflict.

  Practice guidelines would have made the task of curing Hodgkin's patients with DeVita's regimen a lot easier. But had those guidelines been in place in the mid-sixties, when DeVita was making the rounds on behalf of his new treatment, they would have imposed a tax on other innovators. The obstacles he encountered in trying to save his friend Lee, similarly, were not capricious or arbitrary. They were there to insure that the results of clinical trials were as clear and persuasive as possible. It's just that they had a cost--Lee's death--and in DeVita's mind that cost was too high.

  5.

  The angriest chapter of "The Death of Cancer"

is devoted to the Food and Drug Administration, because DeVita believes that it has fundamentally misunderstood the trade-off between diffusion and innovation. The agency wants all new drugs to be shown to be safe and efficacious, to be as good as or better than existing therapies (or a placebo) in a randomized experiment involving the largest possible number of patients. For example, the F.D.A. might ask that patients getting an experimental treatment have better long-term survival rates than those receiving drug treatments already in use. The F.D.A. is the country's diffusion gatekeeper: its primary goal is to make sure that good drugs get a gold star and bad drugs never make it to market.

  DeVita reminds us, though, that this gatekeeping can hinder progress. A given tumor, for instance, can rarely be stopped with a single drug. Cancer is like a door with three locks, each of which requires a different key. Suppose you came up with a drug that painlessly opened the first of those three locks. That drug would be a breakthrough. But it can't cure anything on its own. So how do you get it through a trial that requires proof of efficacy--especially if you don't yet know what the right keys for the two remaining locks are? Since cancer comes in a dizzying variety of types and subtypes, each with its own molecular profile, we want researchers to be free to experiment with different combinations of keys. Instead, DeVita argues, the F.D.A. has spent the past two decades pushing cancer medicine in the opposite direction. He continues:

  Drugs are now approved not for a specific cancer or for general use in a variety of cancers but for a specific stage of a specific cancer and specifically after and only after patients have had all current treatments, which are listed drug by drug, and the treatments have all failed. Doctors risk F.D.A. censure if they use an approved drug under any other circumstances, and patients are penalized because insurance companies won't pay for treatments not approved by the F.D.A. The vital insight gained by using an approved drug in a different way for a different tumor has been lost.

  There's a second problem with the "efficacy"

requirement. Suppose Drug A, the existing treatment for a certain type of cancer, wipes out all but a billion cells in the typical patient's tumor. Drug B, your alternative, wipes out all but a handful. DeVita points out two curious facts. First, a typical tumor has so many billions of cells that even a drug that leaves a billion cells untouched will look good after an initial treatment cycle. More important, after five years the patients on both Drugs A and B may have identical survival rates. That's because of something called the Norton-Simon effect: smaller populations of cancer cells grow back faster than larger populations. But, in reality, Drugs A and B aren't identical. If you are designing a combination of drugs to cure a cancer, DeVita writes, "the treatment that reduced the population to a few cells is the one you want to go forward with."

How many researchers and companies sit on promising therapies because they don't want to spend several hundred million dollars on a clinical trial, only to fall short of the F.D.A.'s high bar?
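
  A rough numerical sketch (not from DeVita's book) makes the arithmetic concrete. Under a Gompertzian growth model, the kind of kinetics the Norton-Simon work describes, the relative regrowth rate rises as the residual population shrinks, so two drugs that leave wildly different numbers of cells behind can still yield tumors of the same size a few years later. Every parameter below--the plateau size, the growth constant, the residual burdens--is an assumed value chosen only for illustration:

```python
# Toy illustration of the Norton-Simon point: smaller cell populations grow
# back at a faster relative rate, so Drug A and Drug B can look identical on
# five-year survival. All numbers are assumptions for illustration only.
import math

K = 1e12   # assumed plateau ("lethal") tumor burden, in cells
r = 2.0    # assumed Gompertz growth-rate constant, per year

def gompertz(n0, years):
    """Tumor size after `years` of unchecked Gompertzian regrowth from n0 cells."""
    return K * math.exp(math.log(n0 / K) * math.exp(-r * years))

drug_a_residual = 1e9   # Drug A leaves a billion cells behind
drug_b_residual = 1e2   # Drug B leaves only a handful

for years in (1, 2, 5):
    a = gompertz(drug_a_residual, years)
    b = gompertz(drug_b_residual, years)
    print(f"after {years} yr: Drug A -> {a:.2e} cells, Drug B -> {b:.2e} cells")
# By year five both tumors are back near the plateau, so the survival curves
# match--even though Drug B is the far better building block for a curative
# combination.
```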

  DeVita would have the F.D.A. take a step sideways--away from worrying exclusively about standards and safety, and closer to the innovation end of the continuum. In this respect, his position echoes that of Peter Huber, who in his 2013 book, "The Cure in the Code,"

called on the F.D.A. to stop evaluating drugs as cures and start evaluating them as tools--"molecular scalpels, clamps, sutures, or dressings, to be picked off the shelf and used carefully but flexibly down at the molecular level."

  What critics like DeVita want, in other words, is a return to the world of Freireich's N.C.I., where clinicians had the freedom to tinker and improvise, and DeVita's portrait of the way things were gives us a glimpse of what the future may look like. Discretion means more MOPPs. But it also, inevitably, means more TOPPs. Discretion means Freireich, the great genius, growling "Do it."

But surely Barney Clarkson growled "Do it"

as well, when some fresh-faced clinical associate questioned the wisdom of substituting thiotepa for nitrogen mustard. Modern medicine is intent on addressing "practice variation"

--on bringing bad doctors up to the level of the good ones. Going back to the days of the old N.C.I. makes that problem worse, not better. If you think that there are more Freireichs than Barney Clarksons out there, that is a trade worth making. But DeVita does not acknowledge how difficult that change might prove to be.

  When DeVita faced the naysayers at Memorial Sloan Kettering, who worried about their Hodgkin's patients on the subway ride home, he informed them curtly, "If you told those patients that the choice was between being cured and vomiting, or not vomiting and dying, don't you think they might have opted to take a cab?"

This is how diffusion happens in a world without a diffusion gatekeeper. But how many doctors are capable of that kind of hand-to-hand combat? Life on the innovation end of the continuum is volatile, fractious, and personal--less a genteel cocktail party, governed benignly by bureaucratic fiat, than the raucous bender where your boss passes out in a bathtub. When DeVita returned to Memorial Sloan Kettering years later, as the physician-in-chief, the hospital got better. But DeVita didn't last, which will scarcely come as a surprise to anyone who has read his book. "The problem with Vince,"

the hospital's president reportedly said, in announcing his departure, "is that he wants to cure cancer."

  In a 2001 book, "Identifying Child Molesters,"

the psychologist Carla van Dam tells the story of a young Canadian elementary-school teacher she calls Jeffrey Clay. Clay taught physical education. He was well liked by his students, and often he asked boys in his class to stay after school, to do homework and help him with chores. One day, just before winter break, three of the boys made a confession to their parents. Mr. Clay had touched them under their pants.

  The parents went to the principal. He confronted Clay, who denied everything. The principal knew Clay and was convinced by him. In his mind, what it boiled down to, van Dam writes, "is some wild imaginations and the three boys being really close."

  The parents were at a loss. Mr. Clay was beloved. He had started a popular gym club at the school. He was married and was a role model to the boys. He would come to their after-school games. Could he really have abused them? Perhaps he was just overly physical in the way that young men often are. He had a habit, for example, of grabbing boys in the hallway and pulling them toward him, placing his arms over their shoulders and chest. At the gym club, he would pick boys up and turn them upside down, holding them by the legs. Lots of people--especially gym teachers--like to engage in a little horseplay with young boys. It wasn't until the allegations about Clay emerged that it occurred to anyone to wonder whether he might have been trying to look down the boys' shorts.

  "We weren't really prepared to call the police and make it into a police investigation,"

one of the mothers told van Dam. "It was an indiscretion, as far as we were concerned at this point. It was all vague: 'Well, he put his hands down there.' And, 'Well, it was inside the pants, but fingers went to here.' We were all still trying to protect Mr. Clay's reputation, and the possibility this was all blown up out of proportion and there was a mistake."

  The families then learned that there had been a previous complaint by a child against Clay, and they took their case to the school superintendent. He, too, advised caution. "If allegations do not clearly indicate sexual abuse, a gray area exists,"

he wrote to them. "The very act of overt investigation carries with it a charge, a conviction, and a sentence, a situation which is repugnant to fair-minded people."

He was responsible not just to the children but also to the professional integrity of his teachers. What did they have? Just the story of three young boys, and young boys do, after all, have wild imaginations.

  Clay was kept on. Two months later, after prodding from a couple of social workers, the parents asked the police to investigate. One of the mothers recalls an officer interviewing her son: "He was gentle, but to the point, and he wanted to be shown exactly where Mr. Clay had touched him."

The three boys named other boys who they said had been subjected to Mr. Clay's advances. Those boys, however, denied everything. A new, more specific allegation against Clay surfaced. He resigned, and went to see a therapist. But still the prosecutor's office didn't feel that it had enough evidence to press charges. And within the school there were teachers who felt that Clay was innocent. "I was running into my colleagues who were saying, 'Did you know that some rotten parents trumped up these charges against this poor man?'"

one teacher told van Dam. The teacher added, "Not just one person. Many teachers said this."

A psychologist working at the school thought that the community was in the grip of hysteria. The allegations against Clay, he thought, were simply the result of the fact that he was "young and energetic."

Clay threatened to sue. The parents dropped their case.

  Clay was a man repeatedly accused of putting his hands down the pants of young boys. Parents complained. Superiors investigated. And what happened? The school psychologist called him a victim of hysteria.

  When monsters roam free, we assume that people in positions of authority ought to be able to catch them if only they did their jobs. But that might be wishful thinking. A pedophile, van Dam's story of Mr. Clay reminds us, is someone adept not just at preying on children but at confusing, deceiving, and charming the adults responsible for those children--which is something to keep in mind in the case of the scandal at Penn State and the conviction, earlier this year, of the former assistant football coach Jerry Sandusky on child-molestation charges.

  Jerry Sandusky grew up in Washington, Pennsylvania. His father headed the local community recreation center, running sports programs for children. The Sanduskys lived upstairs. "Every door I opened, there was a bat, a basketball, a football somewhere,"

Sandusky has recounted. "There was constant activity everywhere. My folks touched a lot of kids."

Sandusky's son E.J. once described his father as "a frustrated playground director."

Sandusky would organize kickball games in the back yard, and, E.J. said, "Dad would get every single kid involved. We had the largest kickball games in the United States, kickball games with forty kids."

Sandusky and his wife, Dottie, adopted six children, and were foster parents to countless more. "They took in so many foster children that even their closest friends could not keep track of them all,"

Joe Posnanski writes in "Paterno,"

his new biography of Sandusky's boss, the former Penn State head coach Joe Paterno. "Children constantly surrounded Sandusky, so much so that they became part of his persona."

  Sandusky was a hugger and a grabber and a cutup. "He liked practical jokes and messing around, knocking a guy's hat off his head, making prank calls, sneaking up behind people to startle them,"

Posnanski goes on. People at Penn State thought of him as "a knucklehead."

Much of Sandusky's 2000 autobiography, "Touched,"

is devoted to stories of his antics: the time he smeared charcoal over the handset of his chemistry teacher's phone, the time he ran afoul of a lifeguard for horseplay with his children in a public pool. Four and a half pages alone are devoted to water-balloon fights that he orchestrated while in college. "Wherever I went, it seemed like trouble was sure to follow,"

Sandusky writes. He was a kid at heart. "I live a good part of my life in a make-believe world,"

he continues. "I enjoyed pretending as a kid, and I love doing the same as an adult with these kids. Pretending has always been part of me."

There was a time when one of the kids he was mentoring became "cold and unresponsive"

to him. It upset him. He writes:

"You know it's not right to treat people like this,"

I told him. "You should talk to me."

The boy laid into me, screaming from the top of his lungs. "Get out of here! Get out of here!"

His voice echoed into the hallway and staff people came rushing into the room. I looked at him with sincere tears in my eyes. "I can't believe you're doing this to me,"

I said quietly as I walked out of the room.

  In 1977, Sandusky and his wife started a nonprofit called the Second Mile, to help troubled and disadvantaged boys. At its height, the Second Mile had a budget of millions of dollars and programs that reached tens of thousands of children. Three times, Sandusky was offered head-coaching jobs at other universities. Each time, he said no. The kids came first. "We had a young foster child whose name was Christopher staying with us,"

Sandusky writes, of the time he considered whether to accept a job offer from Marshall University:

I spotted Christopher at the bottom of the stairs. He had a ball in his hands, and as he looked at me, he said, "P'ay ball! P'ay ball!"

. . . Christopher threw me the ball, and as I tossed it back, I came to the realization that we wouldn't be able to take him with us. . . . Seeing Christopher at that moment kind of told me all I needed to know.

  We now know what Sandusky was really doing with the Second Mile. He was setting up a pipeline of young troubled boys. Just as important, though, he was establishing his bona fides. Psychologists call this "grooming"

--the process by which child molesters ingratiate themselves into the communities they wish to exploit. "Many molesters confirmed that they would spend anywhere from two to three years getting established in a new community before molesting any children,"

van Dam writes. One pedophile she interviewed would hang out in bars, looking for adults who seemed to be having difficulties at home. He would lend a comforting ear, and then start to help out. As he told van Dam:

I was just a friend doing things a friend would do. Helping them move, going to baseball games with them. What I found myself doing was getting close to the kids, becoming more of a father figure or a mentor, doing things for them that the parents weren't doing because the parents were out getting drunk all the time. And, of course, it made it easy for me to baby-sit. They'd say, "Oh yeah. We can off-load the kids with Jimmy."

  One of the most remarkable and disturbing descriptions of the grooming process comes from a twenty-two-page autobiography (published as a chapter in a book about pedophilia) by a convicted pedophile named Donald Silva. After graduating from medical school, Silva met a family with a nine-year-old named Eric. He first sexually molested Eric on a ski trip that the two of them took together. But that came only a year after he befriended the family, patiently insinuating himself into the good graces of Eric's parents. At one point, Eric's mother ordered an end to the "friendship,"

because she thought Silva's friends had been smoking pot in her son's presence. But Silva had so won over her husband that, he writes, "this beautiful man found it in his heart to forgive me after I assured him that such a thing would not happen again."

Silva describes an unforgettable night that he and Eric spent together after they were "reunited"

:

I had recently broken up with Cathy [his girlfriend] when Evelyn, my future wife, arrived for a visit. In that month, Evelyn met Eric's family, and she and his mother became good friends. Evelyn stayed with me at my parents' house, and we enjoyed an active sex life. Eric slept over one night, and the three of us shared a bed for a while. He was going to pretend to be asleep while Evelyn and I made love, but Evelyn declined with him there and went to sleep elsewhere.

  To recap: A man uses his new girlfriend to befriend the family of the ten-year-old boy he is molesting. He orchestrates a threesome in a bed in his parents' house. He asks the girl to have sex with him with the ten-year-old lying beside them. She says no. She leaves him alone with his victim--and then he persuades her to marry him.

  The pedophile is often imagined as the dishevelled old man baldly offering candy to preschoolers. But the truth is that most of the time we have no clue what we are dealing with. A fellow-teacher at Mr. Clay's school, whose son was one of those who complained of being fondled, went directly to Clay after she heard the allegations. "I didn't do anything to those little boys,"

Clay responded. "I'm innocent. . . . Would you and your husband stand beside me if it goes to court?"

Of course, they said. People didn't believe that Clay was a pedophile because people liked Clay--without realizing that Clay was in the business of being likable.

  Did anyone at Penn State understand what they were dealing with, either? Here was a man who built a sophisticated, multimillion-dollar, fully integrated grooming operation, outsourcing to child-care professionals the task of locating vulnerable children--all the while playing the role of lovable goofball. "If Sandusky did not have such a human side,"

Sports Illustrated's Jack McCallum wrote, in 1999, "there would be a temptation around Happy Valley to canonize him."

A week later, Bill Lyon, of the Philadelphia Inquirer, paid tribute to Sandusky's selflessness. "In more than one motel hallway, whenever you encountered him and offered what sounded like even the vaguest sort of compliment, he would blush and an engaging, lopsided grin of modesty would wrap its way around his face,"

Lyon wrote. "He isn't in this business for recognition. His defense plays out in front of millions. But when he opens the door and invites in another stray, there is no audience. The ennobling measure of the man is that he has chosen the work that is done without public notice."

  In 1990, the Second Mile was awarded one of President George H. W. Bush's Points of Light awards. After the formal ceremonies were over, Sandusky grabbed the microphone and shouted out, "It's about time, George!"

  "I had reverted back to the days of my mischievous youth,"

Sandusky writes, in "Touched."

"I had always professed that someday I would reap the benefits of maturity, but my lifestyle just wouldn't let me. There were so many things I had done in my life--so many of them crazy and outlandish. . . . My time on this earth has always been unique. At the times when I found myself searching for maturity, I usually came up with insanity."

Years later, at Sandusky's criminal trial, a Penn State coach said that he saw Sandusky showering with boys all the time--and thought nothing of it. Crazy Jerry and his horseplay. Who knew what he would get up to next?

  On the afternoon of May 3, 1998, Sandusky called the home of an eleven-year-old boy he had met through the Second Mile and invited him to a Penn State athletic facility. Sandusky picked him up that evening. The two wrestled and worked out on the exercise machines. Sandusky kissed the boy on the top of his head and said, "I love you."

Sandusky then asked the boy if he wanted to take a shower, and the boy agreed. According to the formal investigation of the Sandusky case, conducted by the law firm of the former F.B.I. director Louis Freeh:

While in the shower, Sandusky wrapped his hands around the boy's chest and said, "I'm gonna squeeze your guts out."

The boy then washed his body and hair. Sandusky lifted the boy to "get the soap out of"

the boy's hair, bringing the boy's feet "up pretty high"

near Sandusky's waist. The boy's back was touching Sandusky's chest and his feet touched Sandusky's thigh. The boy felt "weird"

and "uncomfortable"

during his time in the shower.

  This is standard child-molester tradecraft. The successful pedophile does not select his targets arbitrarily. He culls them from a larger pool, testing and probing until he finds the most vulnerable. Clay, for example, first put himself in a place with easy access to children--an elementary school. Then he worked his way through his class. He began by simply asking boys if they wanted to stay after school. "Those who could not do so without parental permission were screened out,"

van Dam writes. Children with vigilant parents are too risky. Those who remained were then caressed on the back, first over the shirt and then, if there was no objection from the child, under the shirt. "The child's response was evaluated by waiting to see what was reported to the parents,"

she goes on. "Parents inquiring about this behavior were told by Mr. Clay that he had simply been checking their child for signs of chicken pox. Those children were not targeted further."

The rest were "selected for more contact,"

gradually moving below the belt and then to the genitals.

  The child molester's key strategy is one of escalation, desensitizing the target with an ever-expanding touch. In interviews and autobiographies, pedophiles describe their escalation techniques like fly fishermen comparing lures. Consider the child molester van Dam calls Cook:

Some of the little tricks that always work with younger boys are things like always sitting in a sofa, or a chair with big, soft arms if possible. I would sit with my legs well out and my feet flat on the floor. My arms would always be in an "open"

position. The younger kids have not developed a "personal space"

yet, and when talking with me, will move in very close. If they are showing me something, particularly on paper, it is easy to hold the object in such a way that the child will move in between my legs or even perch on my knee very early on. If the boy sat on my lap, or very close in, leaning against me, I would put my arm around him loosely. As this became a part of our relationship, I would advance to two arms around him, and hold him closer and tighter. . . . Goodbyes would progress from waves, to brief hugs, to kisses on the cheek, to kisses on the mouth in very short order.

  Sandusky started with wrestling, to make physical touch seem normal. In the shower, the boy initially turned on a showerhead a few feet from Sandusky. Sandusky told him to use the shower next to him. This was a test. The boy complied. Then came the bear hug. The boy's back was touching Sandusky's chest and his feet touched Sandusky's thigh. Sandusky wanted to see how the boy would react. Was this too much too soon? The boy felt "weird"

and "uncomfortable."

Sandusky retreated. The following week, Sandusky showed up at the boy's home, circling back to test the waters once again. How did the boy feel? Had he told his mother? Was he a promising lead, or too risky? As it turned out, the mother had alerted the University Police Department, and a detective, Ronald Schreffler, was hiding in the house. According to the Freeh report:

Schreffler overheard Sandusky say he had gone to the boy's baseball game the night before but found the game had been cancelled. The boy's mother told Sandusky that her son had been acting "different"

since they had been together on May 3, 1998 and asked Sandusky if anything had happened that day. Sandusky replied, "[w]e worked out. Did [the boy] say something happened?"

Sandusky added that the boy had taken a shower, and said "[m]aybe I worked him too hard."

Sandusky also asked the boy's mother if he should leave him alone, and she said that would be best. Sandusky then apologized.

  A few days later, the mother asked Sandusky to come by the house again; the police were once more in the next room. She questioned him more closely about what had happened in the shower. According to the Freeh report:

Sandusky asked to speak with the son and the mother replied that she did not feel that was a good idea as her son was confused and she did not want Sandusky to attend any of the boy's baseball games. Sandusky responded, "I understand. I was wrong. I wish I could get forgiveness. I know I won't get it from you. I wish I were dead."

  Put yourself in the mind of the detective hiding in the house. Schreffler was there to gather evidence of sexual abuse. But there was no evidence of sexual abuse. Sandusky didn't rape the boy in the shower. That was something that might come only after several weeks, if not months. He gave the boy an exploratory bear hug. Now he was back at the boy's home. But he didn't seem like an aggressive predator. He was carefully soliciting the mother's opinion and apologizing, with all his considerable charm. "I wish I were dead,"

he says to the mother. Is that an admission of guilt? Or is Sandusky saying how mortified he is that he--savior of young boys--could possibly have alienated a child and his mother? Sandusky had been caught in the subtle, early maneuvers of victim selection, and what Schreffler witnessed was Sandusky aborting his pursuit of the boy, not pressing forward. Sandusky had looked for vulnerability and hadn't found it.

  The episode was, as the parent said of the first allegations against Mr. Clay, "all vague."

The mother saw her son come home from the gym with his hair wet. He told her that he had showered with Sandusky. He seemed upset, and showered again the following morning. The mother called a psychologist, Alycia Chambers, who had been working with her son, and one of her questions to Chambers was "Am I overreacting?"

She wasn't sure what had happened. Nor, for that matter, was her son. Here is the Freeh report again:

Later that day, Chambers met with the boy who told her about the prior day's events and that he felt "like the luckiest kid in the world"

to get to sit on the sidelines at Penn State football games. The boy said that he did not want to get Sandusky in "trouble"

and that Sandusky must not have meant anything by his actions. The boy did not want anyone to talk to Sandusky because he might not invite him to any more games.

Chambers wrote a report on the case and gave it to the University Police Department and Child and Youth Services. She thought that Sandusky's behavior met the definition of a "likely pedophile's pattern of building trust and gradual introduction of physical touch, within a context of a 'loving,' 'special' relationship."

But Jerry Lauro, the caseworker assigned to the incident by the Department of Public Welfare in Harrisburg, disagreed. He thought that the incident fell into a "gray"

area concerning "boundary issues."

The boy was then evaluated by a counsellor named John Seasock, who concluded, "There seems to be no incident which could be termed as sexual abuse, nor did there appear to be any sequential pattern of logic and behavior which is usually consistent with adults who have difficulty with sexual abuse of children."

Seasock didn't think Sandusky was grooming. Someone, he concluded, should talk to Sandusky about how to "stay out of such gray area situations in the future."

  Of all those involved in the investigation, only one person--the psychologist Alycia Chambers--recognized Sandusky's actions for what they were. Here was someone with the full authority and expertise of psychological training, who identified a prominent man with virtually unlimited access to vulnerable children as a "likely pedophile."

But what more could she do? She had told the police. Patient confidentiality constrained her from going to the media, and her responsibility to her client made her wary of turning him into a public victim. Then, there was the fact that two other trained professionals had seen the same evidence she had, and reached the opposite conclusion. She was in the grip of the same uncertainty that afflicts even the best people when confronted with a child molester. She thought Sandusky was suspicious. No one agreed with her. Maybe she decided that she could be wrong.

  Lauro and Schreffler--the man who had hidden in the other room--met with Sandusky. He told them that he had hugged the boy but that "there was nothing sexual about it."

He admitted to showering with other boys in the past. He said, "Honest to God, nothing happened."

Everyone knew Sandusky, and everyone knew that he was a bit of a saint and a bit of a knucklehead. For all we know, he quoted those lines from his book: "At the times when I found myself searching for maturity, I usually came up with insanity."

Penn State officials had been apprised of the investigation from the beginning. After the meeting between Lauro, Schreffler, and Sandusky, Gary Schultz, Penn State's senior vice-president for business and finance, e-mailed Graham Spanier, the university's president, and Tim Curley, the school's athletic director, and told them that the investigators were dropping the whole matter. Sandusky, Schultz wrote, "was a little emotional and expressed concern as to how this might have adversely affected the child."

  Joe Paterno, Sandusky's boss, was a football obsessive. He played quarterback at Brooklyn Prep and at Brown University, which he attended on a football scholarship. Aside from a short stint in the Army, he never held a job outside of football. He began at Penn State as an assistant coach in 1950 and never left. He talked and thought football, around the clock. "At night,"

Posnanski writes, "he wrote countless notes (all his life, he was a compulsive note-taker) about football ideas he wanted to try, plays he wanted to run, techniques he wanted to teach, improvements he wanted to make, thoughts about leadership that crossed his mind."

Shortly after Paterno arrived in State College, he moved into the basement of a fellow assistant coach, Jim O'Hara. Finally, O'Hara confronted him. "Joe, you've been with us ten years. Get the hell out of here."

Paterno, puzzled, replied, "Have I been here that long?"

  Paterno was strict and uncompromising. "Even as a boy, when he played quarterback on his high-school football team back in Brooklyn, he would lecture his teammates in his high-pitched squeal when one of them unleashed a swear word,"

Posnanski writes. "'Aw gee, come on, guys, keep it clean!' They thought him a prude even then. He had lived a sheltered life--not by accident but by choice. The Paternos never even watched any television except 'The Wonderful World of Disney' on Sunday nights."

  He scripted practices down to the minute. He did not like distractions. "He would scream at us all the time, 'Would you just let me coach my football team,'"

a friend tells Posnanski. "That's all he wanted to do. Every other thing made him crazy."

Once, while hard at work drafting a new defensive scheme, he all but disappeared. "We could have moved out, and he wouldn't have noticed,"

his wife, Sue, said. "He might have noticed when he came out and there was no dinner for him. But he might not even have noticed that. He was in his own world."

  Paterno did not like Sandusky. They argued openly. Paterno found Sandusky's goofiness exasperating, and the trail of kids following him around irritated Paterno no end. He considered firing Sandusky many times. But, according to Posnanski, he realized that he needed Sandusky--that the emotional, bear-hugging, impulsive knucklehead was a necessary counterpart to his own discipline and austerity. Sandusky never accepted any of the job offers that would have taken him away from Penn State, because he could not leave the Second Mile. But he also stayed because of Paterno. What could be better, for his purposes, than a boss with eyes only for the football field, who dismissed him as an exasperating, impulsive knucklehead? Pedophiles cluster in professions that give them access to vulnerable children--teaching, the clergy, medicine. But Sandusky's insight, if you want to call it that, was that the culture of football could be the greatest hiding place of all, a place where excessive physicality is the norm, where horseplay is what often passes for wit, where young men shower together after every game and practice, and where those in charge spend their days and nights dreaming only of new defensive schemes.

  In 1999, Paterno made it plain to Sandusky that he would not be the next head coach of Penn State. Sandusky retired and took an emeritus position. On February 9, 2001, a former Penn State quarterback named Mike McQueary saw Sandusky in the shower with a young boy at a Penn State athletic facility. What exactly McQueary witnessed is still in dispute. That evening, he spoke to a family friend--a local doctor--and told him he had heard "sexual"

sounds. The doctor asked him several times if he had seen any sexual act, and each time McQueary said no. Eleven years later, in his grand-jury testimony and at Sandusky's criminal trial, McQueary's memory grew more explicit: he had seen Sandusky raping the boy, he now said. What is clear, though, is that whatever McQueary saw or heard upset him greatly. He went to Paterno. Paterno called Tim Curley, the Penn State athletic director.

  Posnanski, in one of his final interviews with Paterno, asked him if he had considered calling the police. "To be honest with you, I didn't,"

Paterno said. "This isn't my field. I didn't know what to do. I had not seen anything. Jerry didn't work for me anymore. I didn't have anything to do with him. I tried to look through the Penn State guidelines to see what I was supposed to do. It said I was supposed to call Tim [Curley]. So I called him."

  Curley met with McQueary and Paterno. Then he and Gary Schultz, the university's vice-president for business and finance, went to the Penn State president, Graham Spanier. Here is the Freeh report again:

Spanier said that the men gave him a "heads up" that a member of the Athletic Department staff had reported to Paterno that Sandusky was in an athletic locker room facility showering with one of his Second Mile youth after a workout. Sandusky and the youth, according to Spanier, were "horsing around" or "engaged in horseplay." Spanier said that the staff member "was not sure what he saw because it was around a corner and indirect." . . . Spanier said he asked two questions: (i) "Are you sure that it was described to you as horsing around?" and (ii) "Are you sure that that is all that was reported?" According to Spanier, both Schultz and Curley said "yes" to both questions. Spanier said that the men agreed that they were "uncomfortable" with such a situation, that it was inappropriate, and that they did not want it to happen again.

  Horsing around in the shower? That was Jerry being Jerry. It did not occur to them that the goofy, horseplaying Sandusky they thought they knew was another of Sandusky's deceptions. Those who put all their ingenuity and energy into fooling us usually succeed. That is the lesson of a world-class swindler like Bernard Madoff, and of Donald Silva, in his parents' bed with a ten-year-old boy and the woman he later married--not to mention Jeffrey Clay. Clay, van Dam writes, got his teaching certificate reactivated. He went on to teach the handicapped and take foster children into his home. "Needless to say," she adds, "his expertise, enthusiasm, and exceptional generosity to those who are needy has been very much appreciated by the community in which he now lives."

  Tim Curley and Gary Schultz currently face criminal charges. Graham Spanier was forced out of office last November, a few days after the grand-jury indictment of Sandusky was released. At the same time, someone came to Paterno's house with an envelope. According to Posnanski:

Paterno opened the envelope; inside was a sheet of Penn State stationery with just a name, John Surma, and a phone number. Surma was the CEO of U.S. Steel and the vice chairman of the State Board of Trustees. Paterno picked up the phone and called the number.
"This is Joe Paterno."


"This is John Surma. The board of trustees have terminated you effective immediately."

Paterno hung up the phone before he could hear anything else.
A minute later, Sue called the number. "After sixty-one years,"

she said, her voice cracking, "he deserved better."

And then she hung up.

  Paterno died two months later.

  The real genius of Steve Jobs.

  Not long after Steve Jobs got married, in 1991, he moved with his wife to a nineteen-thirties, Cotswolds-style house in old Palo Alto. Jobs always found it difficult to furnish the places where he lived. His previous house had only a mattress, a table, and chairs. He needed things to be perfect, and it took time to figure out what perfect was. This time, he had a wife and family in tow, but it made little difference. "We spoke about furniture in theory for eight years," his wife, Laurene Powell, tells Walter Isaacson, in "Steve Jobs," Isaacson's enthralling new biography of the Apple founder. "We spent a lot of time asking ourselves, 'What is the purpose of a sofa?'"

  It was the choice of a washing machine, however, that proved most vexing. European washing machines, Jobs discovered, used less detergent and less water than their American counterparts, and were easier on the clothes. But they took twice as long to complete a washing cycle. What should the family do? As Jobs explained, "We spent some time in our family talking about what's the trade-off we want to make. We ended up talking a lot about design, but also about the values of our family. Did we care most about getting our wash done in an hour versus an hour and a half? Or did we care most about our clothes feeling really soft and lasting longer? Did we care about using a quarter of the water? We spent about two weeks talking about this every night at the dinner table."

  Steve Jobs, Isaacson's biography makes clear, was a complicated and exhausting man. "There are parts of his life and personality that are extremely messy, and that's the truth," Powell tells Isaacson. "You shouldn't whitewash it." Isaacson, to his credit, does not. He talks to everyone in Jobs's career, meticulously recording conversations and encounters dating back twenty and thirty years. Jobs, we learn, was a bully. "He had the uncanny capacity to know exactly what your weak point is, know what will make you feel small, to make you cringe," a friend of his tells Isaacson. Jobs gets his girlfriend pregnant, and then denies that the child is his. He parks in handicapped spaces. He screams at subordinates. He cries like a small child when he does not get his way. He gets stopped for driving a hundred miles an hour, honks angrily at the officer for taking too long to write up the ticket, and then resumes his journey at a hundred miles an hour. He sits in a restaurant and sends his food back three times. He arrives at his hotel suite in New York for press interviews and decides, at 10 P.M., that the piano needs to be repositioned, the strawberries are inadequate, and the flowers are all wrong: he wanted calla lilies. (When his public-relations assistant returns, at midnight, with the right flowers, he tells her that her suit is "disgusting.") "Machines and robots were painted and repainted as he compulsively revised his color scheme," Isaacson writes, of the factory Jobs built, after founding NeXT, in the late nineteen-eighties. "The walls were museum white, as they had been at the Macintosh factory, and there were $20,000 black leather chairs and a custom-made staircase. . . . He insisted that the machinery on the 165-foot assembly line be configured to move the circuit boards from right to left as they got built, so that the process would look better to visitors who watched from the viewing gallery."

  Isaacson begins with Jobs's humble origins in Silicon Valley, the early triumph at Apple, and the humiliating ouster from the firm he created. He then charts the even greater triumphs at Pixar and at a resurgent Apple, when Jobs returns, in the late nineteen-nineties, and our natural expectation is that Jobs will emerge wiser and gentler from his tumultuous journey. He never does. In the hospital at the end of his life, he runs through sixty-seven nurses before he finds three he likes. "At one point, the pulmonologist tried to put a mask over his face when he was deeply sedated," Isaacson writes:

Jobs ripped it off and mumbled that he hated the design and refused to wear it. Though barely able to speak, he ordered them to bring five different options for the mask and he would pick a design he liked. . . . He also hated the oxygen monitor they put on his finger. He told them it was ugly and too complex.

  One of the great puzzles of the industrial revolution is why it began in England. Why not France, or Germany? Many reasons have been offered. Britain had plentiful supplies of coal, for instance. It had a good patent system in place. It had relatively high labor costs, which encouraged the search for labor-saving innovations. In an article published earlier this year, however, the economists Ralf Meisenzahl and Joel Mokyr focus on a different explanation: the role of Britain's human-capital advantage--in particular, on a group they call "tweakers." They believe that Britain dominated the industrial revolution because it had a far larger population of skilled engineers and artisans than its competitors: resourceful and creative men who took the signature inventions of the industrial age and tweaked them--refined and perfected them, and made them work.

  In 1779, Samuel Crompton, a retiring genius from Lancashire, invented the spinning mule, which made possible the mechanization of cotton manufacture. Yet England's real advantage was that it had Henry Stones, of Horwich, who added metal rollers to the mule; and James Hargreaves, of Tottington, who figured out how to smooth the acceleration and deceleration of the spinning wheel; and William Kelly, of Glasgow, who worked out how to add water power to the draw stroke; and John Kennedy, of Manchester, who adapted the wheel to turn out fine counts; and, finally, Richard Roberts, also of Manchester, a master of precision machine tooling--and the tweaker's tweaker. He created the "automatic" spinning mule: an exacting, high-speed, reliable rethinking of Crompton's original creation. Such men, the economists argue, provided the "micro inventions necessary to make macro inventions highly productive and remunerative."

  Was Steve Jobs a Samuel Crompton or was he a Richard Roberts? In the eulogies that followed Jobs's death, last month, he was repeatedly referred to as a large-scale visionary and inventor. But Isaacson's biography suggests that he was much more of a tweaker. He borrowed the characteristic features of the Macintosh--the mouse and the icons on the screen--from the engineers at Xerox PARC, after his famous visit there, in 1979. The first portable digital music players came out in 1996. Apple introduced the iPod, in 2001, because Jobs looked at the existing music players on the market and concluded that they "truly sucked." Smart phones started coming out in the nineteen-nineties. Jobs introduced the iPhone in 2007, more than a decade later, because, Isaacson writes, "he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to." The idea for the iPad came from an engineer at Microsoft, who was married to a friend of the Jobs family, and who invited Jobs to his fiftieth-birthday party. As Jobs tells Isaacson:

This guy badgered me about how Microsoft was going to completely change the world with this tablet PC software and eliminate all notebook computers, and Apple ought to license his Microsoft software. But he was doing the device all wrong. It had a stylus. As soon as you have a stylus, you're dead. This dinner was like the tenth time he talked to me about it, and I was so sick of it that I came home and said, "Fuck this, let's show him what a tablet can really be."

  Even within Apple, Jobs was known for taking credit for others' ideas. Jonathan Ive, the designer behind the iMac, the iPod, and the iPhone, tells Isaacson, "He will go through a process of looking at my ideas and say, 'That's no good. That's not very good. I like that one.' And later I will be sitting in the audience and he will be talking about it as if it was his idea."

  Jobs's sensibility was editorial, not inventive. His gift lay in taking what was in front of him--the tablet with stylus--and ruthlessly refining it. After looking at the first commercials for the iPad, he tracked down the copywriter, James Vincent, and told him, "Your commercials suck."

"Well, what do you want?" Vincent shot back. "You've not been able to tell me what you want."

"I don't know," Jobs said. "You have to bring me something new. Nothing you've shown me is even close."

Vincent argued back and suddenly Jobs went ballistic. "He just started screaming at me," Vincent recalled. Vincent could be volatile himself, and the volleys escalated. When Vincent shouted, "You've got to tell me what you want," Jobs shot back, "You've got to show me some stuff, and I'll know it when I see it."

  I'll know it when I see it. That was Jobs's credo, and until he saw it his perfectionism kept him on edge. He looked at the title bars--the headers that run across the top of windows and documents--that his team of software developers had designed for the original Macintosh and decided he didn't like them. He forced the developers to do another version, and then another, about twenty iterations in all, insisting on one tiny tweak after another, and when the developers protested that they had better things to do he shouted, "Can you imagine looking at that every day? It's not just a little thing. It's something we have to do right."

  The famous Apple "Think Different" campaign came from Jobs's advertising team at TBWA\Chiat\Day. But it was Jobs who agonized over the slogan until it was right:

They debated the grammatical issue: If "different" was supposed to modify the verb "think," it should be an adverb, as in "think differently." But Jobs insisted that he wanted "different" to be used as a noun, as in "think victory" or "think beauty." Also, it echoed colloquial use, as in "think big." Jobs later explained, "We discussed whether it was correct before we ran it. It's grammatical, if you think about what we're trying to say. It's not think the same, it's think different. Think a little different, think a lot different, think different. 'Think differently' wouldn't hit the meaning for me."

  The point of Meisenzahl and Mokyr's argument is that this sort of tweaking is essential to progress. James Watt invented the modern steam engine, doubling the efficiency of the engines that had come before. But when the tweakers took over, the efficiency of the steam engine swiftly quadrupled. Samuel Crompton was responsible for what Meisenzahl and Mokyr call "arguably the most productive invention" of the industrial revolution. But the key moment, in the history of the mule, came a few years later, when there was a strike of cotton workers. The mill owners were looking for a way to replace the workers with unskilled labor, and needed an automatic mule, which did not need to be controlled by the spinner. Who solved the problem? Not Crompton, an unambitious man who regretted only that public interest would not leave him to his seclusion, so that he might "earn undisturbed the fruits of his ingenuity and perseverance." It was the tweaker's tweaker, Richard Roberts, who saved the day, producing a prototype, in 1825, and then an even better solution in 1830. Before long, the number of spindles on a typical mule jumped from four hundred to a thousand. The visionary starts with a clean sheet of paper, and re-imagines the world. The tweaker inherits things as they are, and has to push and pull them toward some more nearly perfect solution. That is not a lesser task.

  Jobs's friend Larry Ellison, the founder of Oracle, had a private jet, and he designed its interior with a great deal of care. One day, Jobs decided that he wanted a private jet, too. He studied what Ellison had done. Then he set about to reproduce his friend's design in its entirety--the same jet, the same reconfiguration, the same doors between the cabins. Actually, not in its entirety. Ellison's jet "had a door between cabins with an open button and a close button," Isaacson writes. "Jobs insisted that his have a single button that toggled. He didn't like the polished stainless steel of the buttons, so he had them replaced with brushed metal ones." Having hired Ellison's designer, "pretty soon he was driving her crazy." Of course he was. The great accomplishment of Jobs's life is how effectively he put his idiosyncrasies--his petulance, his narcissism, and his rudeness--in the service of perfection. "I look at his airplane and mine," Ellison says, "and everything he changed was better."

  The angriest Isaacson ever saw Steve Jobs was when the wave of Android phones appeared, running the operating system developed by Google. Jobs saw the Android handsets, with their touchscreens and their icons, as a copy of the iPhone. He decided to sue. As he tells Isaacson:

Our lawsuit is saying, "Google, you fucking ripped off the iPhone, wholesale ripped us off." Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple's $40 billion in the bank, to right this wrong. I'm going to destroy Android, because it's a stolen product. I'm willing to go to thermonuclear war on this. They are scared to death, because they know they are guilty. Outside of Search, Google's products--Android, Google Docs--are shit.

  In the nineteen-eighties, Jobs reacted the same way when Microsoft came out with Windows. It used the same graphical user interface--icons and mouse--as the Macintosh. Jobs was outraged and summoned Gates from Seattle to Apple's Silicon Valley headquarters. "They met in Jobs's conference room, where Gates found himself surrounded by ten Apple employees who were eager to watch their boss assail him," Isaacson writes. "Jobs didn't disappoint his troops. 'You're ripping us off!' he shouted. 'I trusted you, and now you're stealing from us!'"

  Gates looked back at Jobs calmly. Everyone knew where the windows and the icons came from. "Well, Steve," Gates responded. "I think there's more than one way of looking at it. I think it's more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it."

  Jobs was someone who took other people's ideas and changed them. But he did not like it when the same thing was done to him. In his mind, what he did was special. Jobs persuaded the head of Pepsi-Cola, John Sculley, to join Apple as C.E.O., in 1983, by asking him, "Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?" When Jobs approached Isaacson to write his biography, Isaacson first thought ("half jokingly") that Jobs had noticed that his two previous books were on Benjamin Franklin and Albert Einstein, and that he "saw himself as the natural successor in that sequence." The architecture of Apple software was always closed. Jobs did not want the iPhone and the iPod and the iPad to be opened up and fiddled with, because in his eyes they were perfect. The greatest tweaker of his generation did not care to be tweaked.

  Perhaps this is why Bill Gates--of all Jobs's contemporaries--gave him fits. Gates resisted the romance of perfectionism. Time and again, Isaacson asks Jobs about Gates, and Jobs cannot resist the gratuitous dig. "Bill is basically unimaginative," Jobs tells Isaacson, "and has never invented anything, which I think is why he's more comfortable now in philanthropy than technology. He just shamelessly ripped off other people's ideas."

  After close to six hundred pages, the reader will recognize this as vintage Jobs: equal parts insightful, vicious, and delusional. It's true that Gates is now more interested in trying to eradicate malaria than in overseeing the next iteration of Word. But this is not evidence of a lack of imagination. Philanthropy on the scale that Gates practices it represents imagination at its grandest. In contrast, Jobs's vision, brilliant and perfect as it was, was narrow. He was a tweaker to the last, endlessly refining the same territory he had claimed as a young man.

  As his life wound down, and cancer claimed his body, his great passion was designing Apple's new, three-million-square-foot headquarters, in Cupertino. Jobs threw himself into the details. "Over and over he would come up with new concepts, sometimes entirely new shapes, and make them restart and provide more alternatives," Isaacson writes. He was obsessed with glass, expanding on what he learned from the big panes in the Apple retail stores. "There would not be a straight piece of glass in the building," Isaacson writes. "All would be curved and seamlessly joined. . . . The planned center courtyard was eight hundred feet across (more than three typical city blocks, or almost the length of three football fields), and he showed it to me with overlays indicating how it could surround St. Peter's Square in Rome." The architects wanted the windows to open. Jobs said no. He "had never liked the idea of people being able to open things. 'That would just allow people to screw things up.'"

  Xerox PARC, Apple, and the truth about innovation.

  1.

  In late 1979, a twenty-four-year-old entrepreneur paid a visit to a research center in Silicon Valley called Xerox PARC. He was the co-founder of a small computer startup down the road, in Cupertino. His name was Steve Jobs.

  Xerox PARC was the innovation arm of the Xerox Corporation. It was, and remains, on Coyote Hill Road, in Palo Alto, nestled in the foothills on the edge of town, in a long, low concrete building, with enormous terraces looking out over the jewels of Silicon Valley. To the northwest was Stanford University's Hoover Tower. To the north was Hewlett-Packard's sprawling campus. All around were scores of the other chip designers, software firms, venture capitalists, and hardware-makers. A visitor to PARC, taking in that view, could easily imagine that it was the computer world's castle, lording over the valley below--and, at the time, this wasn't far from the truth. In 1970, Xerox had assembled the world's greatest computer engineers and programmers, and for the next ten years they had an unparalleled run of innovation and invention. If you were obsessed with the future in the seventies, you were obsessed with Xerox PARC--which was why the young Steve Jobs had driven to Coyote Hill Road.

  Apple was already one of the hottest tech firms in the country. Everyone in the Valley wanted a piece of it. So Jobs proposed a deal: he would allow Xerox to buy a hundred thousand shares of his company for a million dollars--its highly anticipated I.P.O. was just a year away--if PARC would "open its kimono." A lot of haggling ensued. Jobs was the fox, after all, and PARC was the henhouse. What would he be allowed to see? What wouldn't he be allowed to see? Some at PARC thought that the whole idea was lunacy, but, in the end, Xerox went ahead with it. One PARC scientist recalls Jobs as "rambunctious"--a fresh-cheeked, caffeinated version of today's austere digital emperor. He was given a couple of tours, and he ended up standing in front of a Xerox Alto, PARC's prized personal computer.

  An engineer named Larry Tesler conducted the demonstration. He moved the cursor across the screen with the aid of a "mouse." Directing a conventional computer, in those days, meant typing in a command on the keyboard. Tesler just clicked on one of the icons on the screen. He opened and closed "windows," deftly moving from one task to another. He wrote on an elegant word-processing program, and exchanged e-mails with other people at PARC, on the world's first Ethernet network. Jobs had come with one of his software engineers, Bill Atkinson, and Atkinson moved in as close as he could, his nose almost touching the screen. "Jobs was pacing around the room, acting up the whole time," Tesler recalled. "He was very excited. Then, when he began seeing the things I could do onscreen, he watched for about a minute and started jumping around the room, shouting, 'Why aren't you doing anything with this? This is the greatest thing. This is revolutionary!'"

  Xerox began selling a successor to the Alto in 1981. It was slow and underpowered--and Xerox ultimately withdrew from personal computers altogether. Jobs, meanwhile, raced back to Apple, and demanded that the team working on the company's next generation of personal computers change course. He wanted menus on the screen. He wanted windows. He wanted a mouse. The result was the Macintosh, perhaps the most famous product in the history of Silicon Valley.

  "If Xerox had known what it had and had taken advantage of its real opportunities," Jobs said, years later, "it could have been as big as I.B.M. plus Microsoft plus Xerox combined--and the largest high-technology company in the world."

  This is the legend of Xerox PARC. Jobs is the Biblical Jacob and Xerox is Esau, squandering his birthright for a pittance. In the past thirty years, the legend has been vindicated by history. Xerox, once the darling of the American high-technology community, slipped from its former dominance. Apple is now ascendant, and the demonstration in that room in Palo Alto has come to symbolize the vision and ruthlessness that separate true innovators from also-rans. As with all legends, however, the truth is a bit more complicated.

  2.

  After Jobs returned from PARC, he met with a man named Dean Hovey, who was one of the founders of the industrial-design firm that would become known as IDEO. "Jobs went to Xerox PARC on a Wednesday or a Thursday, and I saw him on the Friday afternoon," Hovey recalled. "I had a series of ideas that I wanted to bounce off him, and I barely got two words out of my mouth when he said, 'No, no, no, you've got to do a mouse.' I was, like, 'What's a mouse?' I didn't have a clue. So he explains it, and he says, 'You know, [the Xerox mouse] is a mouse that cost three hundred dollars to build and it breaks within two weeks. Here's your design spec: Our mouse needs to be manufacturable for less than fifteen bucks. It needs to not fail for a couple of years, and I want to be able to use it on Formica and my bluejeans.' From that meeting, I went to Walgreens, which is still there, at the corner of Grant and El Camino in Mountain View, and I wandered around and bought all the underarm deodorants that I could find, because they had that ball in them. I bought a butter dish. That was the beginnings of the mouse."

  I spoke with Hovey in a ramshackle building in downtown Palo Alto, where his firm had started out. He had asked the current tenant if he could borrow his old office for the morning, just for the fun of telling the story of the Apple mouse in the place where it was invented. The room was the size of someone's bedroom. It looked as if it had last been painted in the Coolidge Administration. Hovey, who is lean and healthy in a Northern California yoga-and-yogurt sort of way, sat uncomfortably at a rickety desk in a corner of the room. "Our first machine shop was literally out on the roof," he said, pointing out the window to a little narrow strip of rooftop, covered in green outdoor carpeting. "We didn't tell the planning commission. We went and got that clear corrugated stuff and put it across the top for a roof. We got out through the window."

  He had brought a big plastic bag full of the artifacts of that moment: diagrams scribbled on lined paper, dozens of differently sized plastic mouse shells, a spool of guitar wire, a tiny set of wheels from a toy train set, and the metal lid from a jar of Ralph's preserves. He turned the lid over. It was filled with a waxlike substance, the middle of which had a round indentation, in the shape of a small ball. "It's epoxy casting resin," he said. "You pour it, and then I put Vaseline on a smooth steel ball, and set it in the resin, and it hardens around it." He tucked the steel ball underneath the lid and rolled it around the tabletop. "It's a kind of mouse."

  The hard part was that the roller ball needed to be connected to the housing of the mouse, so that it didn't fall out, and so that it could transmit information about its movements to the cursor on the screen. But if the friction created by those connections was greater than the friction between the tabletop and the roller ball, the mouse would skip. And the more the mouse was used the more dust it would pick up off the tabletop, and the more it would skip. The Xerox PARC mouse was an elaborate affair, with an array of ball bearings supporting the roller ball. But there was too much friction on the top of the ball, and it couldn't deal with dust and grime.

  At first, Hovey set to work with various arrangements of ball bearings, but nothing quite worked. "This was the 'aha' moment," Hovey said, placing his fingers loosely around the sides of the ball, so that they barely touched its surface. "So the ball's sitting here. And it rolls. I attribute that not to the table but to the oldness of the building. The floor's not level. So I started playing with it, and that's when I realized: I want it to roll. I don't want it to be supported by all kinds of ball bearings. I want to just barely touch it."

  The trick was to connect the ball to the rest of the mouse at the two points where there was the least friction--right where his fingertips had been, dead center on either side of the ball. "If it's right at midpoint, there's no force causing it to rotate. So it rolls."

  Hovey estimated their consulting fee at thirty-five dollars an hour; the whole project cost perhaps a hundred thousand dollars. "I originally pitched Apple on doing this mostly for royalties, as opposed to a consulting job," he recalled. "I said, 'I'm thinking fifty cents apiece,' because I was thinking that they'd sell fifty thousand, maybe a hundred thousand of them." He burst out laughing, because of how far off his estimates ended up being. "Steve's pretty savvy. He said no. Maybe if I'd asked for a nickel, I would have been fine."

  3.

  Here is the first complicating fact about the Jobs visit. In the legend of Xerox PARC, Jobs stole the personal computer from Xerox. But the striking thing about Jobs's instructions to Hovey is that he didn't want to reproduce what he saw at PARC. "You know, there were disputes around the number of buttons--three buttons, two buttons, one-button mouse," Hovey went on. "The mouse at Xerox had three buttons. But we came around to the fact that learning to mouse is a feat in and of itself, and to make it as simple as possible, with just one button, was pretty important."

  So was what Jobs took from Xerox the idea of the mouse? Not quite, because Xerox never owned the idea of the mouse. The PARC researchers got it from the computer scientist Douglas Engelbart, at Stanford Research Institute, fifteen minutes away on the other side of the university campus. Engelbart dreamed up the idea of moving the cursor around the screen with a stand-alone mechanical "animal" back in the mid-nineteen-sixties. His mouse was a bulky, rectangular affair, with what looked like steel roller-skate wheels. If you lined up Engelbart's mouse, Xerox's mouse, and Apple's mouse, you would not see the serial reproduction of an object. You would see the evolution of a concept.

  The same is true of the graphical user interface that so captured Jobs's imagination. Xerox PARC's innovation had been to replace the traditional computer command line with onscreen icons. But when you clicked on an icon you got a pop-up menu: this was the intermediary between the user's intention and the computer's response. Jobs's software team took the graphical interface a giant step further. It emphasized "direct manipulation." If you wanted to make a window bigger, you just pulled on its corner and made it bigger; if you wanted to move a window across the screen, you just grabbed it and moved it. The Apple designers also invented the menu bar, the pull-down menu, and the trash can--all features that radically simplified the original Xerox PARC idea.

  The difference between direct and indirect manipulation--between three buttons and one button, three hundred dollars and fifteen dollars, and a roller ball supported by ball bearings and a free-rolling ball--is not trivial. It is the difference between something intended for experts, which is what Xerox PARC had in mind, and something that's appropriate for a mass audience, which is what Apple had in mind. PARC was building a personal computer. Apple wanted to build a popular computer.

  In a recent study, "The Culture of Military Innovation," the military scholar Dima Adamsky makes a similar argument about the so-called Revolution in Military Affairs. R.M.A. refers to the way armies have transformed themselves with the tools of the digital age--such as precision-guided missiles, surveillance drones, and real-time command, control, and communications technologies--and Adamsky begins with the simple observation that it is impossible to determine who invented R.M.A. The first people to imagine how digital technology would transform warfare were a cadre of senior military intellectuals in the Soviet Union, during the nineteen-seventies. The first country to come up with these high-tech systems was the United States. And the first country to use them was Israel, in its 1982 clash with the Syrian Air Force in Lebanon's Bekaa Valley, a battle commonly referred to as "the Bekaa Valley turkey shoot." Israel coördinated all the major innovations of R.M.A. in a manner so devastating that it destroyed nineteen surface-to-air batteries and eighty-seven Syrian aircraft while losing only a handful of its own planes.

  That's three revolutions, not one, and Adamsky's point is that each of these strands is necessarily distinct, drawing on separate skills and circumstances. The Soviets had a strong, centralized military bureaucracy, with a long tradition of theoretical analysis. It made sense that they were the first to understand the military implications of new information systems. But they didn't do anything with it, because centralized military bureaucracies with strong intellectual traditions aren't very good at connecting word and deed.

  The United States, by contrast, has a decentralized, bottom-up entrepreneurial culture, which has historically had a strong orientation toward technological solutions. The military's close ties to the country's high-tech community made it unsurprising that the U.S. would be the first to invent precision-guidance and next-generation command-and-control communications. But those assets also meant that Soviet-style systemic analysis wasn't going to be a priority. As for the Israelis, their military culture grew out of a background of resource constraint and constant threat. In response, they became brilliantly improvisational and creative. But, as Adamsky points out, a military built around urgent, short-term "fire extinguishing" is not going to be distinguished by reflective theory. No one stole the revolution. Each party viewed the problem from a different perspective, and carved off a different piece of the puzzle.

  In the history of the mouse, Engelbart was the Soviet Union. He was the visionary, who saw the mouse before anyone else did. But visionaries are limited by their visions. "Engelbart's self-defined mission was not to produce a product, or even a prototype; it was an open-ended search for knowledge," Michael Hiltzik writes, in "Dealers of Lightning" (1999), his wonderful history of Xerox PARC. "Consequently, no project in his lab ever seemed to come to an end." Xerox PARC was the United States: it was a place where things got made. "Xerox created this perfect environment," recalled Bob Metcalfe, who worked there through much of the nineteen-seventies, before leaving to found the networking company 3Com. "There wasn't any hierarchy. We built out our own tools. When we needed to publish papers, we built a printer. When we needed to edit the papers, we built a computer. When we needed to connect computers, we figured out how to connect them. We had big budgets. Unlike many of our brethren, we didn't have to teach. We could just research. It was heaven."

  But heaven is not a good place to commercialize a product. "We built a computer and it was a beautiful thing," Metcalfe went on. "We developed our computer language, our own display, our own language. It was a gold-plated product. But it cost sixteen thousand dollars, and it needed to cost three thousand dollars." For an actual product, you need threat and constraint--and the improvisation and creativity necessary to turn a gold-plated three-hundred-dollar mouse into something that works on Formica and costs fifteen dollars. Apple was Israel.

  Xerox couldn't have been I.B.M. and Microsoft combined, in other words. "You can be one of the most successful makers of enterprise technology products the world has ever known, but that doesn't mean your instincts will carry over to the consumer market," the tech writer Harry McCracken recently wrote. "They're really different, and few companies have ever been successful in both." He was talking about the decision by the networking giant Cisco Systems, this spring, to shut down its Flip camera business, at a cost of many hundreds of millions of dollars. But he could just as easily have been talking about the Xerox of forty years ago, which was one of the most successful makers of enterprise technology the world has ever known. The fair question is whether Xerox, through its research arm in Palo Alto, found a better way to be Xerox--and the answer is that it did, although that story doesn't get told nearly as often.

  4.

  One of the people at Xerox PARC when Steve Jobs visited was an optical engineer named Gary Starkweather. He is a solid and irrepressibly cheerful man, with large, practical hands and the engineer's gift of pretending that what is impossibly difficult is actually pretty easy, once you shave off a bit here, and remember some of your high-school calculus, and realize that the thing that you thought should go in left to right should actually go in right to left. Once, before the palatial Coyote Hill Road building was constructed, a group that Starkweather had to be connected to was moved to another building, across the Foothill Expressway, half a mile away. There was no way to run a cable under the highway. So Starkweather fired a laser through the air between the two buildings, an improvised communications system that meant that, if you were driving down the Foothill Expressway on a foggy night and happened to look up, you might see a mysterious red beam streaking across the sky. When a motorist drove into the median ditch, "we had to turn it down," Starkweather recalled, with a mischievous smile.

  Lasers were Starkweather's specialty. He started at Xerox's East Coast research facility in Webster, New York, outside Rochester. Xerox built machines that scanned a printed page of type using a photographic lens, and then printed a duplicate. Starkweather's idea was to skip the first step--to run a document from a computer directly into a photocopier, by means of a laser, and turn the Xerox machine into a printer. It was a radical idea. The printer, since Gutenberg, had been limited to the function of re-creation: if you wanted to print a specific image or letter, you had to have a physical character or mark corresponding to that image or letter. What Starkweather wanted to do was take the array of bits and bytes, ones and zeros that constitute digital images, and transfer them straight into the guts of a copier. That meant, at least in theory, that he could print anything.

  "One morning, I woke up and I thought, Why don't we just print something out directly?" Starkweather said. "But when I flew that past my boss he thought it was the most brain-dead idea he had ever heard. He basically told me to find something else to do. The feeling was that lasers were too expensive. They didn't work that well. Nobody wants to do this, computers aren't powerful enough. And I guess, in my naïveté, I kept thinking, He's just not right--there's something about this I really like. It got to be a frustrating situation. He and I came to loggerheads over the thing, about late 1969, early 1970. I was running my experiments in the back room behind a black curtain. I played with them when I could. He threatened to lay off my people if I didn't stop. I was having to make a decision: do I abandon this, or do I try and go up the ladder with it?"

  Then Starkweather heard that Xerox was opening a research center in Palo Alto, three thousand miles away from its New York headquarters. He went to a senior vice-president of Xerox, threatening to leave for I.B.M. if he didn't get a transfer. In January of 1971, his wish was granted, and, within ten months, he had a prototype up and running.

  Starkweather is retired now, and lives in a gated community just north of Orlando, Florida. When we spoke, he was sitting at a picnic table, inside a screened-in porch in his back yard. Behind him, golfers whirred by in carts. He was wearing white chinos and a shiny black short-sleeved shirt, decorated with fluorescent images of vintage hot rods. He had brought out two large plastic bins filled with the artifacts of his research, and he spread the contents on the table: a metal octagonal disk, sketches on lab paper, a black plastic laser housing that served as the innards for one of his printers.

  "There was still a tremendous amount of opposition from the Webster group, who saw no future in computer printing," he went on. "They said, 'I.B.M. is doing that. Why do we need to do that?' and so forth. Also, there were two or three competing projects, which I guess I have the luxury of calling ridiculous. One group had fifty people and another had twenty. I had two." Starkweather picked up a picture of one of his in-house competitors, something called an "optical carriage printer." It was the size of one of those modular Italian kitchen units that you see advertised in fancy design magazines. "It was an unbelievable device," he said, with a rueful chuckle. "It had a ten-inch drum, which turned at five thousand r.p.m., like a super washing machine. It had characters printed on its surface. I think they only ever sold ten of them. The problem was that it was spinning so fast that the drum would blow out and the characters would fly off. And there was only this one lady in Troy, New York, who knew how to put the characters on so that they would stay.

  "So we finally decided to have what I called a fly-off. There was a full page of text--where some of them were non-serif characters, Helvetica, stuff like that--and then a page of graph paper with grid lines, and pages with pictures and some other complex stuff--and everybody had to print all six pages. Well, once we decided on those six pages, I knew I'd won, because I knew there wasn't anything I couldn't print. Are you kidding? If you can translate it into bits, I can print it. Some of these other machines had to go through hoops just to print a curve. A week after the fly-off, they folded those other projects. I was the only game in town." The project turned into the Xerox 9700, the first high-speed, cut-paper laser printer in the world.

  5.

  In one sense, the Starkweather story is of a piece with the Steve Jobs visit. It is an example of the imaginative poverty of Xerox management. Starkweather had to hide his laser behind a curtain. He had to fight for his transfer to PARC. He had to endure the indignity of the fly-off, and even then Xerox management remained skeptical. The founder of PARC, Jack Goldman, had to bring in a team from Rochester for a personal demonstration. After that, Starkweather and Goldman had an idea for getting the laser printer to market quickly: graft a laser onto a Xerox copier called the 7000. The 7000 was an older model, and Xerox had lots of 7000s sitting around that had just come off lease. Goldman even had a customer ready: the Lawrence Livermore laboratory was prepared to buy a whole slate of the machines. Xerox said no. Then Starkweather wanted to make what he called a photo-typesetter, which produced camera-ready copy right on your desk. Xerox said no. "I wanted to work on higher-performance scanners," Starkweather continued. "In other words, what if we print something other than documents? For example, I made a high-resolution scanner and you could print on glass plates." He rummaged in one of the boxes on the picnic table and came out with a sheet of glass, roughly six inches square, on which a photograph of a child's face appeared. The same idea, he said, could have been used to make "masks" for the semiconductor industry--the densely patterned screens used to etch the designs on computer chips. "No one would ever follow through, because Xerox said, 'Now you're in Intel's market, what are you doing that for?' They just could not seem to see that they were in the information business. This"--he lifted up the plate with the little girl's face on it-- "is a copy. It's just not a copy of an office document." But he got nowhere. "Xerox had been infested by a bunch of spreadsheet experts who thought you could decide every product based on metrics. Unfortunately, creativity wasn't on a metric."

  A few days after that afternoon in his back yard, however, Starkweather e-mailed an addendum to his discussion of his experiences at PARC. "Despite all the hassles and risks that happened in getting the laser printer going, in retrospect the journey was that much more exciting," he wrote. "Often difficulties are just opportunities in disguise." Perhaps he felt that he had painted too negative a picture of his time at Xerox, or suffered a pang of guilt about what it must have been like to be one of those Xerox executives on the other side of the table. The truth is that Starkweather was a difficult employee. It went hand in hand with what made him such an extraordinary innovator. When his boss told him to quit working on lasers, he continued in secret. He was disruptive and stubborn and independent-minded--and he had a thousand ideas, and sorting out the good ideas from the bad wasn't always easy. Should Xerox have put out a special order of laser printers for Lawrence Livermore, based on the old 7000 copier? In "Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer" (1988)--a book dedicated to the idea that Xerox was run by the blind--Douglas Smith and Robert Alexander admit that the proposal was hopelessly impractical: "The scanty Livermore proposal could not justify the investment required to start a laser printing business. . . . How and where would Xerox manufacture the laser printers? Who would sell and service them? Who would buy them and why?" Starkweather, and his compatriots at Xerox PARC, weren't the source of disciplined strategic insights. They were wild geysers of creative energy.

  The psychologist Dean Simonton argues that this fecundity is often at the heart of what distinguishes the truly gifted. The difference between Bach and his forgotten peers isn't necessarily that he had a better ratio of hits to misses. The difference is that the mediocre might have a dozen ideas, while Bach, in his lifetime, created more than a thousand full-fledged musical compositions. A genius is a genius, Simonton maintains, because he can put together such a staggering number of insights, ideas, theories, random observations, and unexpected connections that he almost inevitably ends up with something great. "Quality," Simonton writes, is "a probabilistic function of quantity."

  Simonton's point is that there is nothing neat and efficient about creativity. "The more successes there are," he says, "the more failures there are as well"--meaning that the person who had far more ideas than the rest of us will have far more bad ideas than the rest of us, too. This is why managing the creative process is so difficult. The making of the classic Rolling Stones album "Exile on Main Street" was an ordeal, Keith Richards writes in his new memoir, because the band had too many ideas. It had to fight from under an avalanche of mediocrity: "Head in the Toilet Blues," "Leather Jackets," "Windmill," "I Was Just a Country Boy," "Bent Green Needles," "Labour Pains," and "Pommes de Terre"--the last of which Richards explains with the apologetic, "Well, we were in France at the time."

  At one point, Richards quotes a friend, Jim Dickinson, remembering the origins of the song "Brown Sugar":

  I watched Mick write the lyrics. . . . He wrote it down as fast as he could move his hand. I'd never seen anything like it. He had one of those yellow legal pads, and he'd write a verse a page, just write a verse and then turn the page, and when he had three pages filled, they started to cut it. It was amazing.

  Richards goes on to marvel, "It's unbelievable how prolific he was." Then he writes, "Sometimes you'd wonder how to turn the fucking tap off. The odd times he would come out with so many lyrics, you're crowding the airwaves, boy." Richards clearly saw himself as the creative steward of the Rolling Stones (only in a rock-and-roll band, by the way, can someone like Keith Richards perceive himself as the responsible one), and he came to understand that one of the hardest and most crucial parts of his job was to "turn the fucking tap off," to rein in Mick Jagger's incredible creative energy.

  The more Starkweather talked, the more apparent it became that his entire career had been a version of this problem. Someone was always trying to turn his tap off. But someone had to turn his tap off: the interests of the innovator aren't perfectly aligned with the interests of the corporation. Starkweather saw ideas on their own merits. Xerox was a multinational corporation, with shareholders, a huge sales force, and a vast corporate customer base, and it needed to consider every new idea within the context of what it already had.

  Xerox's managers didn't always make the right decisions when they said no to Starkweather. But he got to PARC, didn't he? And Xerox, to its great credit, had a PARC--a place where, a continent away from the top managers, an engineer could sit and dream, and get every purchase order approved, and fire a laser across the Foothill Expressway if he was so inclined. Yes, he had to pit his laser printer against lesser ideas in the contest. But he won the contest. And, the instant he did, Xerox cancelled the competing projects and gave him the green light.

  "I flew out there and gave a presentation to them on what I was looking at," Starkweather said of his first visit to PARC. "They really liked it, because at the time they were building a personal computer, and they were beside themselves figuring out how they were going to get whatever was on the screen onto a sheet of paper. And when I showed them how I was going to put prints on a sheet of paper it was a marriage made in heaven." The reason Xerox invented the laser printer, in other words, is that it invented the personal computer. Without the big idea, it would never have seen the value of the small idea. If you consider innovation to be efficient and ideas precious, that is a tragedy: you give the crown jewels away to Steve Jobs, and all you're left with is a printer. But in the real, messy world of creativity, giving away the thing you don't really understand for the thing that you do is an inevitable tradeoff.

  "When you have a bunch of smart people with a broad enough charter, you will always get something good out of it," Nathan Myhrvold, formerly a senior executive at Microsoft, argues. "It's one of the best investments you could possibly make--but only if you chose to value it in terms of successes. If you chose to evaluate it in terms of how many times you failed, or times you could have succeeded and didn't, then you are bound to be unhappy. Innovation is an unruly thing. There will be some ideas that don't get caught in your cup. But that's not what the game is about. The game is what you catch, not what you spill."

  In the nineteen-nineties, Myhrvold created a research laboratory at Microsoft modelled in part on what Xerox had done in Palo Alto in the nineteen-seventies, because he considered PARC a triumph, not a failure. "Xerox did research outside their business model, and when you do that you should not be surprised that you have a hard time dealing with it--any more than if some bright guy at Pfizer wrote a word processor. Good luck to Pfizer getting into the word-processing business. Meanwhile, the thing that they invented that was similar to their own business--a really big machine that spit paper--they made a lot of money on it." And so they did. Gary Starkweather's laser printer made billions for Xerox. It paid for every other single project at Xerox PARC, many times over.

  6.

  In 1988, Starkweather got a call from the head of one of Xerox's competitors, trying to lure him away. It was someone whom he had met years ago. "The decision was painful," he said. "I was a year from being a twenty-five-year veteran of the company. I mean, I'd done enough for Xerox that unless I burned the building down they would never fire me. But that wasn't the issue. It's about having ideas that are constantly squashed. So I said, 'Enough of this,' and I left."

  He had a good many years at his new company, he said. It was an extraordinarily creative place. He was part of decision-making at the highest level. "Every employee from technician to manager was hot for the new, exciting stuff," he went on. "So, as far as buzz and daily environment, it was far and away the most fun I've ever had." But it wasn't perfect. "I remember I called in the head marketing guy and I said, 'I want you to give me all the information you can come up with on when people buy one of our products--what software do they buy, what business are they in--so I can see the model of how people are using the machines.' He looked at me and said, 'I have no idea about that.' " Where was the rigor? Then Starkweather had a scheme for hooking up a high-resolution display to one of his new company's computers. "I got it running and brought it into management and said, 'Why don't we show this at the tech expo in San Francisco? You'll be able to rule the world.' They said, 'I don't know. We don't have room for it.' It was that sort of thing. It was like me saying I've discovered a gold mine and you saying we can't afford a shovel."

  He shrugged a little wearily. It was ever thus. The innovator says go. The company says stop--and maybe the only lesson of the legend of Xerox PARC is that what happened there happens, in one way or another, everywhere. By the way, the man who hired Gary Starkweather away to the company that couldn't afford a shovel? His name was Steve Jobs.

  Sometimes beauty is just business.

  1.

  Helena Rubinstein was born in 1872 in Krakow's Jewish ghetto, the eldest of eight daughters of a kerosene dealer. By her late teens, she had abandoned Poland for Australia, where she began cooking up vats of face cream. She called it Valaze, and claimed that it was the creation of an eminent European skin specialist named Dr. Lykuski and had been "compounded from rare herbs which only grow in the Carpathian mountains." She rented a storefront in downtown Melbourne, and peddled her concoction at a staggering markup.

  In just over a decade, she had become a millionaire. She expanded to London, then to Paris, then to New York--and from there to almost every other major city in the world. She added one product after another, until Helena Rubinstein Inc. comprised sixty-two creams; seventy-eight powders; forty-six perfumes, colognes, and eaux de toilette; sixty-nine lotions; and a hundred and fifteen lipsticks, plus soaps, rouges, and eyeshadows. In December of 1928, she sold her business to Lehman Brothers for the equivalent of eighty-four million dollars in today's money--and, when Lehman's mismanagement and the Depression brought the stock price down from sixty dollars to three dollars, she bought her firm back for a pittance and took it to even greater success. She was four feet ten and spoke an odd combination of Polish, Yiddish, French, and English. She insisted on being referred to as Madame. At the time of her death, in 1965, she was one of the richest women in the world.

  The biographer Ruth Brandon spends the first part of "Ugly Beauty" (Harper; $26.99) describing Rubinstein's rise, and the picture she paints of her subject is extraordinary. Rubinstein bought art by the truckload; a critic once said that she had "unimportant paintings by every important painter of the nineteenth and twentieth centuries." In just one room in her Park Avenue triplex, she had seven Renoirs hung above a fireplace. Her legendary collection of jewels was kept in a filing cabinet, sorted alphabetically: "A" for amethysts, "B" for beryls, "D" for diamonds. "Rubinstein's New York living room, like everything else about her, was tasteless but full of gusto," Brandon writes. "It sported an acid-green carpet designed by Miró, twenty Victorian carved chairs covered in purple and magenta velvets, Chinese pearl-inlaid coffee tables, gold Turkish floor lamps, life-sized Easter Island sculptures, six-foot-tall blue opaline vases, African masks around the fireplace, and paintings covering every inch of wall space." She once invited Edith Sitwell over for lunch and, upon hearing that Sitwell's ancestors had burned Joan of Arc at the stake, exclaimed, "Somebody had to do it!" In the nineteen-fifties, she took as a companion a young man half a century her junior, wooing him on a date that began with an enormous lunch ("I need to keep up my energy!") and a showing of "Ben-Hur" ("Most interesting! I'm glad the Jewish boy won!"). From then on, Rubinstein took the young man everywhere, even to a state dinner with the Israeli Prime Minister David Ben-Gurion, who asked her, "Who's your goy?" Rubinstein replied, "That's Patrick! And . . . and, yes, he is my goy."

  2.

  In the second part of "Ugly Beauty," Brandon tells a parallel story, about one of Rubinstein's contemporaries, a man named Eugène Schueller. He was born nine years after Rubinstein, in Paris. His parents ran a small pâtisserie on the Rue du Cherche-Midi in Montparnasse. He was an only child--his four brothers died in infancy--and his parents sacrificed to send him to a private school, subsidizing his tuition in cakes. After a successful academic career, he ended up teaching chemistry at the Sorbonne. But the leisurely pace of academic life bored him. "He would climb in and out through the window before and after hours, sometimes starting work at six a.m., sometimes staying on late into the evening--hours his colleagues inexplicably preferred to spend with their friends and family," Brandon writes. One day, a hairdresser approached him, looking for an improvement over the unreliable dyes then in use. Schueller quit his job, and converted his apartment into a laboratory. By 1907, he had perfected his formula, and began selling it to local hairdressers. In 1909, he recorded his first profit. By the nineteen-thirties, he was one of the wealthiest industrialists in France, and had moved his headquarters to a stately building on the Rue Royale. He would rise at 4 A.M., attend to company business for two hours, take an hour's walk, and then be driven, by Rolls-Royce, to each of his various chemical plants, ending his day at midnight. He called his company L'Oréal.

  Brandon's aim in relating the histories of these two pioneers of the beauty business is to tease out their many connections and parallels--to explore what the development of cosmetics at L'Oréal and at Helena Rubinstein tells us about the social constructions of beauty. The juxtaposition of Rubinstein and Schueller, though, is most interesting as a kind of natural experiment in entrepreneurial style. After all, here were two people, born into the same class and era and charged with the same passion: making cosmetics respectable. Yet they could scarcely have been more different. Rubinstein was selling an illusion--the promise of eternal youth. What Schueller sold was real. "In the beauty industry, whose claims routinely bore little if any relation to reality, his product was unique in that both he and his customers knew it would always do precisely what the package promised," Brandon writes. "L'Oréal worked: it would dye your hair any color you wished--and safely. . . . The foundation of her business was folk wisdom; Schueller's business rested on science."

  Brandon calls Rubinstein's career "chaotic, a progression of brilliantly executed extempore sallies." She was a yeller and a worrier. She lurched from crisis to crisis. She peopled her sprawling empire with every relative she could get her hands on. "The essence of Madame was that business and emotion were not separable," Brandon says, and she goes on:

She ran on adrenaline: her chaotic, compulsive letters to [her friend] Rosa Hollay, in which the worry of the moment was scribbled down whenever it might occur on whatever scrap of paper lay to hand, reveal the constant, jumbled panic beneath her assured exterior: "I haven't paid any bills the last three weeks, let me know again what must and should be paid now. I am frightfully short of money, it seems worse and worse. . . . I often don't know if I am on my feet or my head."

  Schueller, by contrast, was the picture of reason and calculation. If he worried, he left no trace of it. He methodically applied the same principles and scientific techniques to one business after another, until he had expanded into soap and paint and photographic film and plastics. He hired professional managers and left behind a company that today is a colossus. Rubinstein was the nineteenth-century entrepreneur; her style was personal and idiosyncratic. Schueller was the modern entrepreneur. The business builders we venerate today, who bring technical innovation and discipline to primitive marketplaces, are cut in his image. Schueller is Steve Jobs. He is Mark Zuckerberg. And there the story of Schueller and Rubinstein would end, were it not for the small matter of the Second World War.

  3.

  One of the leaders of the far right in France, in the tumultuous years leading up to the Second World War, was Eugène Deloncle, whom Brandon describes as a "clever and charismatic naval engineer whose hypnotic personal charm nullified his somewhat absurd appearance--short, plump, invariably bowler-hatted." Deloncle ran what was essentially a terrorist group called La Cagoule, which conducted political assassinations, fired on a socialist demonstration, and set off two bombs near the Arc de Triomphe. He had a personal hit man, named Jean Filliol, who at one point tried to kill the French Prime Minister. After the German invasion of France in 1940, Deloncle formed a political party called the Mouvement Social Révolutionnaire, or M.S.R., which in Occupied France was one of the loudest voices favoring collaboration with the Nazis. Deloncle's men marched through Paris in jackboots and tunics, cataloguing Jewish property for expropriation. In October of 1941, the M.S.R. blew up seven Paris synagogues with explosives provided by the Gestapo. Deloncle was a gangster, a thug, and a vicious anti-Semite, and Eugène Schueller was one of his biggest backers. Schueller wrote for Deloncle's newspaper. He gave him money. He even set the M.S.R. up in an office next to his own at L'Oréal's headquarters.

  Brandon argues that Schueller's affection for the M.S.R. wasn't primarily ideological. He wasn't pro-German. He wasn't a Nazi. There is no evidence that he was particularly anti-Semitic. Many of Deloncle's followers were essentially right-wing royalists, pining for the long-lost Catholic monarchy. Schueller was the kid from the Rue du Cherche-Midi. He believed in a meritocracy.

  Schueller's behavior stemmed from pragmatism. He was a businessman, and collaborating with the Germans was to him the correct business decision. "The war years were very profitable for those who could keep manufacturing--anything that could be made could be sold, the occupiers would pay any price for luxuries, and there was a flourishing black market in scarce necessities," Brandon writes. "But only collaboration ensured access to raw materials."

  The minute Schueller sensed the tide of the war turning, he coolly changed course. In late 1941, he began to cut his ties to the M.S.R. and Deloncle. By the end of 1942, with America now in the war and Hitler overextended in Russia, Schueller began cozying up to the Resistance. He let a L'Oréal van be used for a covert mail drop. He donated seven hundred thousand francs to the Maquis, and two million francs to de Gaulle, in exile in England. He started working with a group that eventually helped two hundred people escape the Nazis. After the war was over, Schueller--along with a number of other French industrialists--was charged with collaboration. A contemporary, the automaker Louis Renault, ended his life in disgrace, and his business was taken over by the government. Not Schueller. The Resistance legend Pierre de Bénouville stood up and vouched for him. In the postwar years, the good word of a war hero was everything. Schueller was acquitted.

  The details of just how Schueller managed to wriggle out of legal trouble are sordid. Bénouville turns out to have barely known Schueller. He was, it seems, doing a favor for three friends: François Dalle, who ran L'Oréal after Schueller's death; André Bettencourt, who married Schueller's daughter and became one of the richest men in the world; and François Mitterrand, who worked for L'Oréal in the last days of the war and ended up as the President of France. But there is no denying the cynical brilliance of Schueller's strategy. As they would say on Wall Street, he hedged the war to perfection. When the market turned in 1942, he shorted Germany and went long on France. The awkward fact about Schueller's behavior is that this ability to deal with unexpected obstacles is what we normally celebrate in entrepreneurs. The entrepreneur is someone obsessed with his creation, who applies the full force of his intellect to protect and sustain it. That's why Alexander Graham Bell didn't give up on the telephone, and why Hewlett and Packard kept plugging away in their garage in Silicon Valley. We like this about them. Here is Brandon describing what happened to Schueller when he first embarked on his hair-dye project. Having quit his job, and with just eight hundred francs to his name, he surrendered to the compulsion of invention:

The two-room apartment on rue d'Alger cost 400 francs a year, which since he had also to eat and buy materials gave him a little less than two years. The dining room became his office, the bedroom his lab. He lived alone, cooked for himself, and slept in a little camp bed until it was crowded out by laboratory equipment. . . . "When I think back to those days, I can't imagine how I got through them," he reflected forty years later.

  So why should we be surprised that Schueller would cross a moral line in the service of that same obsession? Schueller's daughter, Liliane Bettencourt, later tried to excuse her father's wartime behavior by saying that he was a "pathological optimist who hadn't the first idea about politics, and who always managed to be in the wrong place." Brandon is skeptical of that explanation. But it is not entirely wrong. The kind of people who retreat to their two-room apartments or cluttered garages and emerge, two years later, with a better mousetrap are pathological optimists, and seldom have the first idea about politics. Schueller wasn't for France; he wasn't for Germany. Schueller was for Schueller. An engineer who worked at L'Oréal said it best: "I think M. Schueller is too much of an opportunist to risk engaging himself absolutely in favor of anyone."

  4.

  One of the classic stories in the entrepreneurial canon involves the founder of Ikea, Ingvar Kamprad. In the early days of the company, other Swedish furniture manufacturers had been boycotting Kamprad, protesting what they considered his predatory pricing. His business was in a crisis: he could fill only a fraction of his orders. So Kamprad went to Poland, where manufacturing costs were half those in Sweden. There he struck a series of deals that eventually established Ikea as Europe's premier low-cost furniture company, and vaulted it from obscurity into one of the biggest retailers in the world. Here is the entrepreneur at work: brilliantly resolving an obstacle to his own advantage. But the official Ikea history hardly considers the implications of when Kamprad made that trip to Poland. It was 1961. The Berlin Wall was about to go up. The Cold War was at its peak. Poland, like the other Soviet-bloc countries, was in the grips of a repressive regime. "Their visit lasted a week and can still be tracked almost step by step in the documents the Polish secret police drew up," the journalist Bertil Torekull breezily writes, in the corporate biography he crafted with Ingvar Kamprad. And how did Kamprad set up shop? "At first we did a bit of advance smuggling," Kamprad recounts to Torekull. "Illegally, we took tools such as files, spare parts for machines, and even carbon paper for ancient typewriters. . . . We bought nose and mouth protectors when we saw the dreadful environment, and we took a whole lot of secondhand machines from a firm in Jönköping [in Sweden] and installed them in Poland instead." Because the Ikea history was written in 1998--long after the fall of Communism, when Poland had become a healthy democracy and the unpleasantness of the Soviet bloc had begun to recede into history--Kamprad's trip to Poland has been treated as a kind of heroic pilgrimage. But what Kamprad did in 1961--cozy up to a police state, break the law--is not radically different from what Schueller did in 1940. Kamprad didn't get too worked up about the moral consequences of collaborating with the Soviet bloc because he wasn't interested in moral consequences. He was an entrepreneur trying to save his business. He was too much of an opportunist to risk engaging himself absolutely in favor of anyone. Kamprad was for Kamprad.

  Compare the Holocaust hero Oskar Schindler. Schindler was an entrepreneur as well. He came to Krakow, at the outset of the war, and realized that through the Nazis' Aryanization program he could pick up a fully functioning Jewish-owned enamelware factory for next to nothing--essentially for promising to keep the factory's former owners employed. He landed a lucrative war-supply contract. At Emalia, as the factory was known, he began to produce munitions, which gave his factory and his Jewish workforce an "essential to the war effort" designation. In the first five years of the war, he made a huge amount of money. But when the Germans decided to shut down Schindler's operation in Krakow--and ship his workers to the gas chambers--Schindler did an about-face. He persuaded the Germans to let him move his employees and machinery to Brünnlitz, in Czechoslovakia. Here is the business professor Ray Jones, in his article "The Economic Puzzle of Oskar Schindler":

Schindler used the money he had made [in Krakow] . . . to pay bribes to acquire permission for the factory, to convert the factory into an armaments factory and subcamp, to transport his workers to the factory, to pay the SS for the prisoners' labor, to purchase food for them on the black market, to acquire additional laborers, and to pay the necessary bribes to keep the Brunnlitz factory open. By the end of the war he had literally spent all of the money that he had made at Emalia, his entire personal fortune.

  Schindler is the rare businessman who resolves the ethical conflicts of wartime capitalism in a way that we today find satisfactory. But he does so by violating every precept of good entrepreneurship--by jeopardizing his company and his investment and all his personal wealth for the welfare of his employees. Schindler's moment of moral greatness was his recognition that the Nazi threat demanded more of him than that he be a good businessman. So at Brünnlitz he kept countless people on the payroll who contributed little or nothing. He dragged his feet in getting the factory started, claiming--implausibly--that he was having startup difficulties. He sabotaged his machinery so that the shells he made for the German Army would be useless. He deliberately placed his company in peril. By 1944, Jones concludes, Schindler "had no serious industrial intentions." Virtually every business venture he tried, during the rest of his life, ended in failure, which makes perfect sense. The war had cured him of his entrepreneurial obsession. Schindler was no longer for Schindler.

  5.

  One morning in May of 1964, thieves broke into Helena Rubinstein's Park Avenue apartment. They posed as deliverymen, carrying roses, and tied up her butler at gunpoint. But when they surprised her, in her bedroom, she defied them. The keys to her safe were in her purse, and her purse was buried beneath a mound of papers on her bed. "Madame silently extracted the keys and with characteristic presence of mind dropped them in the one place she could be sure no one would ever look: down her ample bosom," Brandon writes. "By the time the thieves noticed the purse, it contained only some handfuls of paper, a powder compact, five twenty-dollar bills, and a pair of diamond earrings worth around forty thousand dollars. The earrings rolled away as they upended it, and Madame covered them with a Kleenex."

  The thieves tied Rubinstein to a chair with strips of her sheets and fled with the hundred dollars. When she was freed by her butler, she told him to put the roses in the icebox, in case they had company that day. Brandon says that she calculated that the thieves, "after paying $40 for the roses . . . had made just $60 profit on their morning."

  Rubinstein was ninety-one at the time, and still in full command of her business. She soon set off for Paris, Tangier, and Normandy--and then returned home to New York, where she died, following a stroke. "The Park Avenue triplex was rented, in a move that would surely have appalled her, to Charles Revson of Revlon, an upstart whose name she had always refused to utter, referring to him only as 'the nail man,' " Brandon writes. Her extraordinary collection of art, real estate, haute couture, and jewelry was dispersed, and the business she had spent the better part of a century building was eventually put on the block, passing from one owner to the next until finally, in the nineteen-eighties, it was acquired for a pittance by L'Oréal's U.S. affiliate Cosmair, whose chairman, Jacques Corrèze, was the former chief lieutenant to that Jew-hating Fascist Eugène Deloncle.

  This is the moment at which the two strands of Brandon's story come together, and the natural experiment in competing entrepreneurial styles is resolved. Schueller's side won. The twentieth century triumphed over the nineteenth, and Brandon understandably makes much of the final transaction. Corrèze had called himself a "colonel" in the M.S.R.'s uniformed brigade and had been one of those marching through the streets of Paris in jackboots, itemizing Jewish property for the Nazi expropriators. When he first came to New York for Cosmair, in the nineteen-fifties, he had immediately sought out Rubinstein. He had wanted to buy her company from the moment she died. He participated in secret negotiations with the Arab League, to figure out how to "scrub" the Rubinstein properties of their Jewishness so that L'Oréal would not fall under the Arab boycott. In a television interview in 1991, he was asked, "Do you feel you were a real anti-Semite?" To which he snapped, "I don't know if I was, but I'm about to become one." Brandon has no doubt that L'Oréal's acquisition of Helena Rubinstein Inc. was personal:

Given his past, and his defiant arrogance, it is hard to believe that Helena Rubinstein's Jewishness played no part in Corrèze's absolute determination to acquire her business. He never showed any interest in the very comparable Elizabeth Arden, who was an equally powerful player, who died only a year after Madame, and whose business went downhill in much the same way as Helena Rubinstein's. On the contrary, it seems in character that, having arrived in New York and sized up the situation, he should have decided to resume the old game he had so enjoyed in Paris--Colonel Corrèze redivivus, minus only the high boots and cross-belts. Everything he did points to his enjoyment of this underlying drama, his pleasure doubtless enhanced by the fact that only he was aware of it.

  The other possibility, of course, is that it wasn't personal at all. The uncomfortable lesson of the triumph of Eugène Schueller over Helena Rubinstein is that sometimes it's just business.

  Who really rescued General Motors?

  1.

  In February of 2009, Steven Rattner was selected by the Obama Administration to oversee the federal bailout of General Motors and Chrysler. It was not a popular choice. Rattner was a Wall Street financier with no expertise in the automobile business. The head of the United Auto Workers, the chairman of Ford, and a number of congressmen from Michigan all complained that the White House should have hired someone with an industry background. But, as Rattner makes clear in "Overhaul" (Houghton Mifflin Harcourt; $27), his account of the experience, the critics misunderstood his role. "This was not a managerial job," he writes. "It was a restructuring and private-equity assignment," and private equity was Rattner's forte. He made his living buying troubled and mismanaged companies, turning them around, and then taking them public again--and that's exactly what the Obama Administration wanted him to do in Detroit. "Overhaul" is not a Washington memoir, even though it is set in Washington, and it involves one of the most deeply politicized issues in recent memory. It is a Wall Street memoir, a book about one of the biggest private-equity deals in history. The result is fascinating--although perhaps not entirely in the ways that its author intended.

  In the past twenty-five years, private equity has risen from obscurity to become one of the most powerful forces in the American economy. Private-equity firms collectively make hundreds of billions of dollars in investments every year. The industry's most prominent player, K.K.R., was by 2007 the fourth-largest employer in the world. Traditional investors, like Warren Buffett, scout for companies that the market has overlooked or undervalued, and buy stakes in them with an eye to the long term. Private-equity investors are activists. They acquire firms outright. Then they bring in their own specialists to "fix" the company. Typically, a private-equity firm plans to take its acquisition public again in three to five years, and the theory behind the enterprise is that buying, fixing, and reselling companies can be far more profitable than Buffett-style "buy and hold" investing. In one of the deals that put private equity on the map, Forstmann Little acquired Dr Pepper for six hundred and fifty million dollars in 1984, cut costs and spun off the company's less glamorous divisions--such as its textile business and its Canada Dry operation--and then took it public again within three years, at a reported gain to investors of more than eight hundred per cent. An investor like Warren Buffett has to think that he is smarter than the market. Private-equity managers aim higher. They see themselves as smarter than the managers of the companies they are buying. It is not a field for someone with any obvious deficits in self-confidence, and Rattner, a cofounder of the Quadrangle Group, was long considered one of the most intellectually able men on Wall Street.

  "Team Auto," as Rattner refers to the group that he assembled to help supervise the bailout, consisted of about a dozen people, some in their twenties and early thirties. They started work in March of 2009. One of the first major issues was whether to save Chrysler. To settle the question, Rattner tells us, Team Auto gathered in the office of Larry Summers, the President's chief economic adviser. The case against Chrysler was that most of the jobs lost by letting the company fail would eventually be offset by gains made by Ford and General Motors, as those companies picked up Chrysler's old customers. Letting Chrysler fail would make Ford and G.M. stronger. But did the team really want several hundred thousand jobs to disappear--even if the losses were short-term--in the middle of a severe recession? Chrysler's failure would also mean that Michigan's unemployment-insurance fund, for starters, would need to be bailed out. One of Rattner's team members made a counter-argument: "Given the uncertainty in our economy, it was better to invest $6 billion for a meaningful chance that Chrysler would survive than to invest several billion dollars in its funeral." Summers put the matter to a vote. The tally was 4-3 in favor of letting Chrysler die. When the vote came to Rattner, he said that it should live. Summers agreed. Chrysler lived.

  Next up was General Motors. Team Auto's idea was to bypass the traditional bankruptcy procedure, in which the entire company would be restructured through a protracted process of negotiation with creditors. Instead, the company would be divided into two. "Old G.M." would contain the unwanted factories and debts and unused assets--all of which would be wound down and sold over time. The best parts of the automaker would be transferred to "New G.M.," an entity funded and owned by the American taxpayer. The task of carving out the new entity was enormously complex, and involved rewriting countless contracts with unions, suppliers, and creditors. To minimize disruption to the company's operations, Team Auto worked with lightning speed. Rattner would rise at five-thirty, be on the treadmill at the gym by six, and in the office by seven. Lunch was a tuna-fish sandwich at his desk. He wouldn't be back at his rented condo in Foggy Bottom until eight or nine, catching up on the day's e-mails before heading to bed. One of Rattner's team members spent his first month on a friend's couch in Virginia. Another worked around the clock during the week, and then made the five-hour drive every weekend to see his family, in Pittsburgh. None had any time for ceremony. At one point, two members of Team Auto, Brian Osias and Clay Calhoon, called for a sitdown with senior Chrysler executives at eleven on a Saturday morning. "The executives were almost all middle-aged industry veterans," Rattner recounts. "Osias was thirty-two years old and Calhoon was twenty-six, and both looked younger than their years." Calhoon announced to the room, "We're going to sit at this table until we're done." They were there until 2 A.M. on Sunday. On another occasion, the Team Auto member Harry Wilson had a meeting with senior G.M. officials, who arrived with a hundred-and-fifty-page document. Rattner writes, " 'What's this?' Harry asked. 'The agenda,' came back the reply. Harry, almost laughing, said, 'You can't run a meeting with a 150-page agenda!' " He substituted his own. Rattner took the job as Auto Czar in February. He was back home in New York, mission accomplished, by July.

  Rattner has since run into some trouble. Recently, an S.E.C. investigation into a "pay to play" scandal involving the New York state pension fund led to sanctions against Rattner, who has reportedly accepted a two-year ban from the securities business. But there is no question that the auto bailout represents one of the signature accomplishments both of his career and of the Obama Administration. In August, G.M. posted its second quarterly profit in a row, its best result in three years. Chrysler, for its part, is now safely in the hands of Fiat, at least for the time being. Two years ago, when the heads of G.M., Ford, and Chrysler came to the Senate in the hope of gaining relief, no one could have imagined such a favorable outcome. At the time, the Center for Automotive Research estimated that the collapse of the Big Three would result in as many as three million lost jobs. So soon after the Wall Street rescue, there seemed little public or political appetite for another taxpayer bailout. The reaction of Richard Shelby, the ranking Republican on the Senate banking committee, was typical. "I don't believe they've got good management," he said of G.M. "They don't innovate. They are a dinosaur. . . . I don't believe the twenty-five billion dollars they're talking about will make them survive. It's just postponing the inevitable." The reason to bring in a private-equity expert is that he would never be so defeatist. To someone like Rattner, there is nothing wrong with giving a dinosaur money if you think you can fix the dinosaur. One might even say that the private-equity investor prefers the dinosaur, because dinosaurs are cheap, which increases the potential profit at the end. And then the world will look at him with awe and say, "Wow, you turned around a dinosaur"--even if, on closer examination, that wasn't what happened at all.

  2.

  Steven Rattner never took to Rick Wagoner, the C.E.O. of General Motors. The problem started with Wagoner's testimony before the Senate, on the day in November of 2008 when he and his fellow auto C.E.O.s flew their private jets down to Washington to ask for taxpayers' money.

  "I do not agree with those who say we are not doing enough to position G.M. for success," Wagoner said, in his testimony. "What exposes us to failure now is not our product lineup, is not our business plan, is not our employees and their willingness to work hard, is not our long-term strategy. What exposes us to failure now was the global financial crisis, which has severely restricted credit availability and reduced industry sales to the lowest per-capita level since World War II."

  To Rattner, that comment summed up everything that was wrong with G.M. Its leaders were arrogant and out of touch. Their sales forecasts were bizarrely optimistic. Members of Team Auto had "surreal" conversations with the company's C.F.O., Ray Young. Rattner looked in vain for a sense of urgency. In one of G.M.'s endless PowerPoint presentations, he saw a product-price chart that included no comparison data for G.M.'s competitors. "Why would G.M. present the data in such a useless manner?" he wonders. "Whom were they trying to fool?"

  The culprit in all this, Rattner believed, was Wagoner. When the two men met, Rattner was struck by Wagoner's combination of "amiability and remoteness." The previous day, Team Auto had met with the Chrysler C.E.O., Robert Nardelli, and his engaged and direct manner had impressed Rattner. But Wagoner "gave listeners very little to grab onto." Rattner writes:

He made a few opening comments and then turned over the floor to his lieutenants, occasionally interjecting a remark here and there but mostly presiding. While I respected the collegiality this implied, it left nearly everyone with the impression that he held himself aloof. If Rick had taken a more central role it would probably not have affected our assessment of the company, but might have affected our judgment of him.

  Wagoner, in Rattner's judgment, simply didn't have what it would take:

Born and bred as an insider, Wagoner never displayed any fortitude for remaking GM's hidebound corporate culture. He operated as an incrementalist, and a slow-moving one at that. His guiding star appeared to be an unshakable faith that GM was not like any other company; it was General Motors. Whatever happened to other companies couldn't possibly happen to GM.

  Wagoner's testimony at the Senate hearing, to Rattner's mind, had been typical: "He and his team seemed certain that virtually all of their problems could be laid at the feet of some combination of the financial crisis, oil prices, the yen-dollar exchange rate, and the UAW." In fact, Wagoner's corporate team was simply dysfunctional: "A top-down, hierarchical approach" afflicted G.M.'s upper management. Wagoner and his senior executives "involved themselves in decisions that should have been left to executives several layers beneath them."

  This is a perplexing bundle of criticisms. We learn that Wagoner is aloof and excessively collegial--and also a meddler. We learn that Wagoner is perhaps unreasonably partisan toward his own company. We learn that his testimony before Congress rubbed Rattner the wrong way, and we learn that his subordinates gave flawed PowerPoint presentations. What we don't learn is whether Wagoner was any good at the job he was hired to do--that is, run General Motors--which is a critical omission, because by that criterion Wagoner actually comes off very well.

  Wagoner was not a perfect manager, by any means. Unlike Alan Mulally, the C.E.O. at Ford, he failed to build up cash reserves in anticipation of the economic downturn, which might have kept his company out of bankruptcy. He can be faulted for riding the S.U.V. wave too long, and for being too slow to develop a credible small-car alternative. But, especially given the mess that Wagoner inherited when he took over, in 2000--and the inherent difficulty of running a company that had to pay pension and medical benefits to half a million retirees--he accomplished a tremendous amount during his eight-year tenure. He cut the workforce from three hundred and ninety thousand to two hundred and seventeen thousand. He built a hugely profitable business in China almost from scratch: a G.M. joint venture is the leading automaker in what is now the world's largest automobile market. In 1995, it took forty-six man-hours to build the typical G.M. car, versus twenty-nine hours for the typical Toyota. Under Wagoner's watch, the productivity gap closed almost entirely.

  Most important, Wagoner--along with his counterparts at Ford and Chrysler--was responsible for a historic agreement with the United Auto Workers. Under that contract, which was concluded in 2007, new hires at G.M. receive between fourteen and seventeen dollars an hour--instead of the twenty-eight to thirty-three dollars an hour that preëxisting employees get--and give up all rights to the traditional retiree benefit package. The 2007 deal also transferred all responsibility for paying for the health care of G.M.'s retirees to a special fund, administered by the U.A.W. It is hard to overstate the importance of that second provision. G.M. has five hundred and seventeen thousand retirees. Between 1993 and 2007, the company paid out a hundred and three billion dollars to those former workers--a burden unimaginable to its foreign competitors. In the 2007 deal, G.M. agreed to make a series of lump-sum payments to the U.A.W. over ten years, worth some thirty-two billion dollars--at which point the company would be free of its outsized retiree health-care burden. It is estimated that, within a few years, G.M.'s labor costs--which were once almost fifty per cent higher than those at the domestic operations of Toyota, Nissan, and Honda--will be lower than its competitors'.

  In the same period, G.M.'s product line was transformed. In 1989, to give one example, Chevrolet's main midsize sedan had something like twice as many reported defects as its competitors at Honda and Toyota, according to the J. D. Power "initial quality" metrics. Those differences no longer exist. The first major new car built on Wagoner's watch--the midsize Chevy Malibu--scores equal to or better than the Honda Accord and Toyota Camry. G.M. earned more than a billion dollars in profits in the last quarter because American consumers have started to buy the cars that Wagoner brought to market--the Buick Regal and LaCrosse, the Envoy, the Cadillac CTS, the Chevy Malibu and Cruze, and others. They represent the most competitive lineup that G.M. has fielded since the nineteen-sixties. (Both the CTS and the Malibu have been named to Car and Driver's annual "10 Best Cars" list.)

  What Wagoner meant in his testimony before the Senate, in other words, was something like this: "At G.M., we are finally producing world-class cars. We have brought our costs, quality, and productivity into line with those of our competitors. We have finally disposed of the crippling burden of our legacy retiree costs. We have expanded into the world's fastest-growing markets more effectively than any other company in the United States. But the effort required to bring about that transformation has left our balance sheet thin--and, at the very moment that we need a couple of years of normal economic activity to refill our coffers, auto sales have fallen off a cliff. Do you mind giving us a hand until things get back to normal?" This is not arrogance. It happens to be something very close to the truth. And, when senators like Richard Shelby seem to have no idea what your company has accomplished in the past decade, forcefully making the case for your own company's merits is probably a sound strategy.

  Rattner was perfectly aware of the strides that G.M. had made. When members of Team Auto toured some of G.M.'s factories, he tells us, they came back marvelling at the "truly collaborative" relationship that existed between management and labor, and at how "consistent and disciplined" the manufacturing process was. One of his first major conclusions, after studying up on the auto industry, was that "U.S. automakers were no longer as pathetically inefficient as people thought." What Rattner cannot seem to see, though, is that his contempt for G.M.'s leadership is contradicted by the evidence of the company's accomplishments. How can Wagoner be a slow-moving incrementalist when, in less than a decade, he took the world's largest company from an uncompetitive monolith to a worthy competitor of Toyota and Honda? "GM's day-to-day workings were solid," Rattner writes at one point. "It was the head that was rotting." But, if the head was rotting, how did the day-to-day operations become solid?

  It is not hard to understand what is going on here. Team Auto was engaged in an act of financial engineering: it used the power of the bankruptcy process to rid G.M. of some of the liabilities that had been holding it back. This was cleverly and swiftly done. It was badly needed. But, at the end of the day, cleaning up a balance sheet is cleaning up a balance sheet. Kristin Dziczek, of the Center for Automotive Research, estimates that the "new" G.M. is roughly eighty-five per cent the product of the work that Wagoner, in concert with the U.A.W., did in his eight years at the company and fifteen per cent the product of Team Auto's efforts. That seems about right: car companies stand or fall, ultimately, on the strength of their product, and teaching a giant company how to build a quality car again is something that can't be done on the private-equity timetable. The problem is that no private-equity manager wants to be thought of as a mere financial engineer. The mythology of the business is that the specialists who swoop in from Wall Street are not economic opportunists, buying, stripping, and selling companies in order to extract millions in fees, but architects of rebirth. Rattner wants us to think of this as his G.M. "As we drafted press statements and fact sheets," he writes, "I would constantly force myself to write that 'GM' had done such and such. Just once I would have liked to write 'we' instead."

  So what did Rattner do with Wagoner? He fired him, of course: "Though I'd met Wagoner only once, to my mind there was no question but that he had to go." (Wait: once?) If this was to be Rattner's G.M., it needed to have Rattner's man at the helm: "After nearly a decade of experience as a private equity manager, I believed in a bedrock principle of that business: put money behind only a bankable team." Bankable does not mean a self-effacing C.E.O., with a heavy PowerPoint deck and a manner that leaves his listeners with nothing to hang on to. Bankable means a star. Rattner called Jack Welch for advice. He consulted with headhunters. He pondered the question during his morning runs on the treadmill until he found his leading man--Ed Whitacre, a former C.E.O. of A.T.&T. And, at this point, a book that began as a case study in twenty-first-century economic realities descends into schoolboy romance.

  "His reputation was for toughness," Rattner says of Whitacre. "I remembered having once read a Business Week story that described him killing rattlesnakes on his Texas ranch (he would pin down the snake with a stick and crush its head with a rock). His flinty image was reinforced by his lean, six-foot-four frame, his full head of gray hair, and his laconic speech. Ed believed that we are born with two ears and one mouth and we should use them in rough proportion."

  The two men have a date in Washington. Rattner chooses the steakhouse Bobby Vans, "on the theory that Ed, being Texan, would want red meat." Whitacre agrees to take the job, even though he has never worked for a manufacturing company in his life. Rattner follows Whitacre on his first trip to G.M. headquarters. "Having had no experience as a corporate executive, I was eager to observe a top-notch one at work," Rattner explains. He trails along as Whitacre meets with G.M. management: "For all his reputation as a tough guy, I was fascinated to see him take the time to get to know the individuals as people. By the end of the day, he could talk knowledgeably about each executive's background, personality, and aspirations." The top company brass gather in the chairman's conference room, and the lanky snake-killer rises to address the group:

The men and women listened intently as Ed explained in his measured Texas drawl that he had no interest in presiding over a second-rate company. He praised the people. He stressed the need to make decisions. . . . Then, looking straight into the eyes of one attendee after another, he said: "I'm used to winning and have no intention of seeing that change at GM." The GM executives, unused to this sort of bluntness, were impressed, and so was I. It was superlative leadership as I had always imagined it.

  Whitacre makes commercials for G.M., with himself as the star. He takes lunch in the food court, mingling with the rank and file. "Hi, I'm Ed. Who're you?" he'll say to some dumbstruck middle manager in the elevator. He walks into one meeting, listens for a while, says, "You are all smart guys, right? You know what to do," then walks out. He flies back to Texas every weekend, just to keep things in perspective. He is the face of the new G.M., the man handpicked to lead one of America's greatest companies through its time of gravest crisis. And then one day last August--just nine months into his term as C.E.O.--Rattner's superlative leader suddenly and mysteriously quits. Does this make Rattner question his own judgment? That's not the private-equity way. "I shared the board's disappointment," he writes briskly, and moves on with his narrative of triumph.

  3.

  Early on in his time in Washington, Rattner realized that Team Auto would have to make "at least one trip" to Detroit in order to

avoid more criticism from the heartland. By early March, we could delay no more. All the same, we were determined not to waste more than a day, and so arranged a packed itinerary that would touch all the right bases. To satisfy the futurists, we would visit GM's Technical center and drive its next generation vehicles. For traditionalists, we would tour an old-line Chrysler assembly plant. And to acknowledge the importance of labor, we would visit with UAW leaders at their headquarters, Solidarity House.

  One would have thought that a man as savvy as Rattner would have made the Detroit visit sound a little less of a burden. The Auto Czar should want to see the industry he is supposedly fixing, shouldn't he? But this is what makes "Overhaul" so unexpectedly fascinating. It is the product of someone so convinced of the value of his contribution, and of the private-equity model, that he feels no need to hide his condescension.

  Team Auto makes sure to rent a car at the airport with G.P.S., "since none of us knew our way around Detroit." At the U.A.W., Rattner looks with pleasure at what he takes to be evidence of his own industry's handiwork: "Far from browbeating us, they gave a thorough presentation that included as many details and figures as investment bankers would have used. (I later learned that my old firm Lazard had helped prepare it.)"

  Then it was on to G.M. and finally to Chrysler. But not for long, because time was short and the real work of saving Detroit, of course, has nothing to do with Detroit. "We walked among the vehicles--sedans and trucks and even a Fiat 500--as the Chrysler people talked about advanced hybrid power trains and new, environmentally friendly diesels," Rattner continues. "But by this point our goal was not to miss our flight back to the mountain of work that awaited us back in Washington."

  Why do we pay our stars so much money?

  1.

  When Marvin Miller took over as the head of the Major League Baseball Players Association, in 1966, he quickly realized that his members did not know what being in a union meant. He would talk to the players about how unfair their contracts were, about how the owners took an outsized portion of the profits, how pitiful their pensions and health-care benefits were, and how much better things would be if they organized themselves. But they weren't listening. The players were young, and many came from small towns far from the centers of organized labor. They thought of themselves as privileged: they got to eat steak for dinner, and be cheered by thousands of fans. Even when Miller brought up something as seemingly straightforward as the baseball-card business, the players were oblivious of their worth. "The Topps bubble-gum company would go into the minor leagues," Miller recalls, "and if the scouts told them someone was going to make it to the majors, they would sign those kids. You know what they would pay them? Five dollars. We're talking about 1966, 1967. Five dollars to sign. And that meant they would own them for five years, and the players got no percentage of sales, the way you would with any other kind of deal like that. If you made it to the majors, you got a hundred and twenty-five dollars per year--for the company's right to take your picture, use it in their advertising, put it on a card, use your name and your record on the back of it. I used to say to the players, 'Why'd you sign?' And they'd look sheepish and say, 'When I was a kid, I used to collect cards, and now they want to put me on one!'"

  One season when Miller was making his annual rounds of the spring-training sites, he decided to put his argument to the players as plainly as he could. He was visiting the San Francisco Giants, in Phoenix, Arizona. "The right fielder for the Giants was Bobby Bonds, a nice man," Miller says. "I knew what Bonds's salary was. And, considering that he was a really prime ballplayer--that he had hit more home runs as a lead-off man than anyone had ever hit in the major leagues, that he was a speedy base runner--the number shocked me. I was always shocked when I looked at salaries in those days. So I said, 'I want to tell you something. Take any one of you--take Bobby Bonds. I'm going to make a prediction.' "

  The prediction was about Bobby Bonds's son. The Giants' owner encouraged his players to bring their families to spring training, so Miller knew Bonds's son well. He was just a little kid, but already it was clear that he was something special. "I said, 'If we can get rid of the system as we now know it, then Bobby Bonds's son, if he makes it to the majors, will make more in one year than Bobby will in his whole career.' And the eyebrows went up."

  Bobby Bonds's son, of course, was Barry Bonds, one of the greatest players of his generation. And Miller was absolutely right: he ended up making more in one year than all the members of his father's San Francisco Giants team made in their entire careers, combined. That was Marvin Miller's revolution--and, nearly half a century later, we are still dealing with its consequences.

  2.

  There was a time, not so long ago, when people at the very top of their profession--the "talent"--did not make a lot of money. In the postwar years, corporate lawyers, Wall Street investment bankers, Fortune 500 executives, all-star professional athletes, and the like made a fraction of what they earn today. In baseball, between the mid-nineteen-forties and the mid-nineteen-sixties, the game's minimum and highest salaries both fell by more than a third, in constant dollars. In 1935, lawyers in the United States made, on average, four times the country's per-capita income. By 1958, that number was 2.4. The president of DuPont, Crawford Greenewalt, testified before Congress in 1955 that he took home half what his predecessor had made thirty years earlier. ("Being an honest man," Greenewalt added wryly, "I think I should say that when I pointed the discrepancy out to him he replied merely that he was easily twice as good as I and hence deserved it.")

  That era was an upside-down version of our own: when society gazed upon captains of industry and commerce, it marvelled at how ordinary their lives were. A Wall Street Journal profile of the C.E.O. of one of the country's "top industrial concerns" in the late nineteen-forties began with a description of "Mr. C" jotting down the cost of a taxi ride in a little black book he used to track his expenses. His after-tax income, Mr. C said, was $36,611, and the year before it had been $21,032. He'd bought two cars the previous year, but had to dip into his savings to afford them. "Mr. C has never lived extravagantly or even elegantly," the article reported. "He's never owned a yacht or a string of race horses. His main relaxation is swimming. 'My idea of fun would be to have a swimming pool behind the house so I could take a dip whenever I felt like it. But to build it I'd have to sell a couple of my Government bonds. I don't like to do that, so I'm getting along without the pool.' " Getting along without the pool?

  In 1959, the Ladies' Home Journal dispatched a writer to the suburban Chicago home of "Mr. O'Rourke," one of the country's "most successful executives." Since the Wall Street Journal's visit with Mr. C, America had undergone extraordinary growth. But Mr. O'Rourke's life was no more extravagant than that of his counterpart of a dozen years earlier. He lived in an ivy-covered Georgian, with ten rooms. Mrs. O'Rourke, "a slim blonde in a tweed suit and loafers," gave the writer a tour. "For our neighborhood this is not a large place," she said. "You can see that we've made do with rugs from our old home and that this room has never seen the services of an interior decorator. We've bought our furniture piece by piece over the years and I've never thrown anything away." Their summer house was a small cottage on a lake. "I'm president of one of the larger companies in the U.S.," Mr. O'Rourke said, "yet chances are I will never become a millionaire."

  The truly rich in the nineteen-fifties and sixties were people who had inherited money--the heirs of the great fortunes of the Gilded Age. Entrepreneurs who sold their own businesses could also become wealthy, because capital-gains taxes were relatively low. But the marketplace chose not to pay salaried professionals and managers a lot of money, and society chose not to let them keep much of what they made. On income above two hundred thousand dollars a year, the marginal tax rate was as high as ninety-one per cent. Formerly exclusive occupations, meanwhile, were opening themselves to new talent, as a result of the expansion of the public university system. Economists of the era were convinced, as one analysis put it, that there was a "connection between economic growth and the advance of democracy on the one hand and the worsening economic status of the intellectual and professional classes on the other." In 1956, Roswell Magill, a partner at Cravath, Swaine & Moore, spoke for a generation of professionals when he wrote that law firms "can no longer honestly assure promising young men that if they become partners they can save money in substantial amounts, build country homes and gardens for themselves like their fathers and grandfathers did, and plan extensive European holidays."

  And then, suddenly, the world changed. Taxes began to fall. The salaries paid to high-level professionals--"talent"--started to rise. Baseball players became multimillionaires. C.E.O.s got private jets. The lawyers at Cravath, Swaine & Moore who once despaired of their economic future began saving money in substantial amounts, building country homes and gardens for themselves like their fathers and grandfathers did, and planning extensive European holidays. In the nineteen-seventies, against all expectations, the salaryman rose from the dead.

  The story of how this transformation happened has been told in many different ways. Economists have pointed to the globalization of the world economy and the rise of what Robert Frank and Philip Cook call the "winner-take-all" economy. Political scientists speak of how the social consensus changed in favor of privilege: taxes came down, and the commitment to economic equality eroded. But there is one more crucial piece to the puzzle. As Roger Martin, the dean of the Rotman School of Management, at the University of Toronto, argued in the Harvard Business Review a few years ago, people who fell into the category of "Talent" came to realize that what they possessed was relatively scarce compared with what the class of owners, "Capital," had at their disposal. People like O'Rourke and Mr. C and Roswell Magill "woke up"--in Martin's phrase--to what they were really worth. And who woke them up? The Marvin Millers of the world.

  3.

  Marvin Miller is in his nineties now. He lives in a modest Manhattan high-rise on the Upper East Side, decorated with Japanese prints. He is slight and wiry: the ballplayers he represented for so long always loomed over him. His head is large and his features are aquiline. He has a wispy mustache--a slightly diminished version of the mustache he was asked to shave, by baseball's traditionalists, upon his election to the union job. He said no to that request, just as he said no to the suggestion, floated by the players, that Richard Nixon serve as his general counsel, and no to virtually every scheme that baseball's owners tried to foist upon him during his time in office. It was never wise to cross Marvin Miller. Bowie Kuhn, the commissioner of baseball, once accused Miller of failing to "reciprocate" his overtures of friendship. Miller responded that Kuhn was not trying to be friends: he was trying to "pick my brains," and "there was scant possibility of reciprocity in that department."

  Miller came to baseball from the United Steelworkers union, where he was the chief economist. He was present during the epic eight-day White House negotiating session that narrowly averted a strike at the height of the Vietnam War, in 1965. He cut his teeth at the State, County and Municipal Workers of America and later served on the National War Labor Board. By temperament and experience, he is that increasingly rare species--the union man. Miller remembers going to Manhattan's Lower East Side one Saturday morning as a child and seeing his father on the picket line for the Retail, Wholesale and Department Store Union. "My father sold ladies' coats on Division Street," Miller recalled, speaking in his apartment one muggy day this summer. "Management had been non-union for years and years and years, the Depression was on, and they were cutting the work year for everybody. The strike lasted a month. One day, my father came home later than usual. I was still up, and he had a document with him that he wanted me to read. It was a settlement. The workers got almost everything they were striking for." Miller was describing a day in his life that happened more than seventy-five years ago. But there are some things that a union man never forgets. "They got a restoration of the work-year cuts," he went on, ticking off the details as if the contract were in front of him. "They got affirmation that the management would obey the new wage law and pay time and a half for overtime and Sunday work--and a whole raft of small things. It was very impressive to me as a kid."

  The baseball union that Miller took over, however, was not a real union. It was a Players Association. Each team elected a representative, and the reps formed a loosely structured committee--headed by a part-time adviser who was selected and paid for by the owners. The owners did as they pleased. They required the players to abide by rules and regulations without even giving them copies of the rules and regulations they were to abide by. Every player signed what was called the Uniform Player's Contract, a document so lopsided in its provisions, and so utterly without regard for the rights of the player, that it reminded Miller of the standard lease that the landlords' association in New York draws up for prospective tenants. A few months after taking the job, Miller was invited to attend a meeting, in Chicago, of Major League Baseball's Executive Council, a group consisting of the commissioner and a handful of owners, who served as the game's governing body. There he discovered that the league had decided to terminate the agreement that had been in place for the previous ten years for funding the players' pension system. The owners had been giving the players sixty per cent of the revenue from broadcasting the World Series. But, with a new, much larger television deal coming up, the owners had decided that sixty per cent would be too much.

  Miller--the veteran of a hundred union negotiations--was stunned as he listened to the owners' "decision." In the world he had just come from, this did not happen. There had been no collective bargaining; the upcoming announcement was presented to him as a settled matter. Years later, in his memoirs, he recalled:

I looked across the room, hoping to find a sign that someone understood how blatantly illegal and offensive this all was. My eyes fell on Bowie, the only practicing lawyer in the room. I looked for a flicker of comprehension in his eyes, an awareness that his clients were about to display publicly their violations of law, demonstrating for all to see that they had engaged in a willful refusal to bargain. . . . Kuhn showed not the slightest sign of comprehension.

  Miller had no staff at that point, and virtually no budget. He was up against a group of owners who were among the wealthiest men in America. In the past, whenever major battles between the owners and the players had been taken to the courts--including to the Supreme Court--the owners had invariably won. Miller's own members barely understood what a union was for--and there he was, at a meeting of baseball's governing committee, being treated like a potted plant.

  Yet when Miller pushed back, the owners capitulated. He ended up winning the television-revenue battle. He rebuilt the players' pension system. He got the owners to agree to collective bargaining--which meant that the players had a seat at the table on every issue affecting the game. He won binding arbitration for salary disputes and other grievances, a victory that he describes as the "difference between dictatorship and democracy"; no longer would players be forced to take whatever they were offered by their team. Then he won free agency, which gave veteran players the right to offer their services to any team they chose.

  Not even Miller thought it would be that easy. At one point, he wanted the owners to use surplus income from the pension fund to pay for increased benefits. The owners drew a line in the sand. Reluctantly, Miller led the players out on strike--the first strike in the history of professional sports. This time, surely, the fight would be long and bloody. It was not. The owners folded after thirteen days. As Leonard Koppett, of the Times, memorably summed up the course of the negotiations:

PLAYERS: We want higher pensions.
OWNERS: We won't give you one damn cent for that.
PLAYERS: You don't have to--the money is already there. Just let us use it.
OWNERS: It would be imprudent.
PLAYERS: We did it before, and anyhow, we won't play unless we can have some of it.
OWNERS: Okay.

  This discovery that Capital was suddenly vulnerable swept across the professional classes in the mid-nineteen-seventies. At exactly the same time that Miller was leading the ballplayers out on strike, for example, a parallel revolution was taking place in the publishing world, as authors and their agents began to rewrite the terms of their relationship with publishers. One of the instigators of that revolution was Mort Janklow, a corporate lawyer who, in 1972, did a favor for his college friend William Safire, and sold Safire's memoir of his years as a speechwriter in the Nixon Administration to William Morrow & Company. Here is how Janklow describes the earliest days of the uprising:

  "So Bill delivers the book on September 1, 1973," Janklow said, in his Park Avenue office, this past summer. "Ten or fifteen days go by, and Larry Hughes, his editor at Morrow, calls me and says, 'This really doesn't work for us. There's no central theme, it seems too episodic--a bunch of the editors read it, and it's really unacceptable. We feel bad about it, because we love Bill. But we're going to return the book to you, and we want you to give back the advance.'"

  Hughes was exercising the terms of what was then the standard publishing contract--the agreement that every author signed when he or she sold a book. Janklow knew nothing about the publishing world when he agreed to help his friend, and remembers looking at that contract for the first time and being aghast. "My first thought was, Jesus, does anybody sign this?" he said. "The analogy I've always made is, the old publishing agreement was to the writer what the New York apartment lease is to a tenant. Because, if you ever read your lease, the only thing that's permanent is the obligation to pay rent. The building breaks down, you pay rent. It's very weighted in favor of the landlord. That was the existing agreement in publishing.

  "The author needed to deliver a book at a certain time at a certain quality of content, which had to be 'acceptable' to the publisher," Janklow went on. "But there were no parameters on what acceptability meant. So all the publisher had to say was 'It's unacceptable,' and he was out of the contract."

  To Janklow, the real reason for Morrow's decision was obvious. It was about what had happened in the interval between when the company bought Safire's book and when the manuscript was handed in: Watergate. Morrow just didn't want to publish a pre-Watergate book in a post-Watergate world.

  Janklow decided to fight. His friend's reputation was on the line. Hughes referred Janklow to the publisher's lawyer, Maurice Greenbaum, of Greenbaum, Wolff & Ernst. "It was considered a very literary, high-level firm," Janklow recalled. "And Maury Greenbaum was the classic aristocratic fourth-generation German Jew, with a pince-nez. So I went to see him, and he said, 'Let me tell you about how publishing works,' and off he went in the most sanctimonious manner. I was a serious corporate lawyer, and he was lecturing me like I was a freshman in law school. He said, 'You're in a standards business. You can't force a publisher to publish a book. If the publisher doesn't want the book, you give the money back and you take back the book. That's the way the business has worked for hundreds of years.' When he was finished, I said, 'Mr. Greenbaum, I'm not trying to force the publisher to publish the book. I'm just trying to force the publisher to pay for it. This acceptability clause was being fraudulently exercised, and I'm going to sue you.' So Greenbaum's jaw clenched, and the veins on his forehead popped, and he said, 'You don't understand. If you start a lawsuit, I will see to it that you never work in this business again.'"

  The case went to arbitration. Janklow had uncovered a William Morrow memo written in the summer of 1973--before Safire handed in his manuscript--saying that because of the Watergate scandal the firm ought to back out of its deal with Safire. Humiliated, Morrow settled, and a jolt of electricity went through the literary world. The likes of Larry Hughes and Maury Greenbaum didn't have all the power after all, and, as one author after another--Judith Krantz, Barbara Taylor Bradford, and Sidney Sheldon, among others--called Janklow asking him to represent them, he began steadily extracting concessions from publishers, revising the acceptability clause and the financial terms so that authors were no longer held hostage to the whims of their publishers. "The publisher would say, 'Send back that contract or there's no deal,' " Janklow went on. "And I would say, 'Fine, there's no deal,' and hang up. They'd call back in an hour: 'Whoa, what do you mean?' The point I was making was that the author was more important than the publisher."

  Janklow and Miller have never met, and they occupy entirely different social universes. Miller is a class warrior. Janklow is a rich corporate lawyer. Miller organized the ballplayers. The only thing Janklow ever organized was his Columbia Law School reunion. But their stories are remarkably similar. The insurgent comes to a previously insular professional world. He studies the prevailing rules of engagement, and is aghast. (For New Yorkers of a certain age, apparently, nothing represents injustice quite like the landlord's contract.) And when he mounts an attack on what everyone else had assumed was the impregnable fortress of Capital, Capital crumbles. Comrade Janklow, meet Comrade Miller.

  4.

  Why did Capital crumble? Maury Greenbaum had no doubt been glowering at upstart agents for years and no one had ever challenged him before. Bobby Bonds was as deserving of a big contract as his son. So what changed to allow Talent's value to be realized?

  The economists Aya Chacar and William Hesterly offer an answer, in a recent issue of the journal Managerial and Decision Economics, by drawing on the work of Alan Page Fiske. Fiske is a U.C.L.A. anthropologist who argues that people use one of four models to guide the way they interact with one another: communal sharing, equality matching, market pricing, and authority ranking. Communal sharing is a group of roommates in a house who are free to read one another's books and wear one another's clothing. Equality matching is a car pool: if I drive your child to school today, you drive my child to school tomorrow. Market pricing is where the terms of exchange are open to negotiation, or subject to the laws of supply and demand. And authority ranking is paternalism: it is a hierarchical system in which "superiors appropriate or pre-empt what they wish," as Fiske writes, and "have pastoral responsibility to provide for inferiors who are in need and to protect them."

  Fiske's point isn't that one of these paradigms is better than the rest. It is that, as human beings, we choose the relational form that's most appropriate to a particular circumstance. Fiske gives the example of a dinner party. You buy the food at the store, paying more for those items which are considered more valuable. That's market pricing. Some of the people who come may have been invited because they invited you to a dinner party in the past: that's equality matching. At the party, everyone is asked to serve himself or herself (communal sharing), but, as the host, you tell your guests where to sit and they do as they are told (authority ranking). Suppose, though, you were to switch the models you were using for your dinner party. If you use equality matching to acquire the food, communal sharing for your invitations, authority ranking for the choice of what to serve, and market pricing for the seating, then you could have the same food, the same guests, and the same venue, but you wouldn't have a dinner party anymore. You'd have a community fund-raiser. The model chosen in any situation has a profound effect on the nature of the interaction, and that, Chacar and Hesterly maintain, is what explains Talent's transformation: across the professional world, relational models suddenly changed.

  When Miller took over the players' union, in 1966, the game was governed by the so-called reserve clause. It was a provision interpreted to mean that the club owned the rights to a player in perpetuity--that is, from the moment a player signed a minor-league contract, he belonged to his team, and no longer had any freedom of choice about where he played. Whenever Miller, in his early years as the union's head, was speaking to players, he would pound away at how wrong that system was. "At every meeting," he said, "I talked about how the reserve clause made them pieces of property. It took away all their dignity as human beings. It left them without any real bargaining power. You can't bargain if the employer can simply say, 'This is your salary and if you don't like it you are barred from organized baseball.'"

  But it wasn't easy to convince the players that the reserve clause was an indignity. Miller remembers running into the Yankees' ace Jim Bouton, and listening to Bouton argue, bafflingly, that the reserve clause somehow preserved the continuity and integrity of the game. "I ran into that kind of thinking over and over," Miller says.

  The attitude of the players was a textbook example of authority ranking. The players didn't see themselves as exploited. Their salaries may not have been as high as they could have been, but they were higher than those of their peers back home. And the owners took care of them in the paternalistic way that superiors, in an authority-ranking system, take care of their subordinates: an owner might slip a player a fifty-dollar bill after a particularly good game, or pick up his medical bills if his child was sick. The players were content with the reserve clause because they had a model in their heads that said their best interests lay in letting the owners call the shots. "The implicit assumption in economics is that agents who find themselves on the short side of a bargaining regime will aggressively seek greater parity," Chacar and Hesterly write. "This reasoning, however, presumes that agents view economic relationships through a market lens"--and until Miller worked his magic the players simply hadn't made that leap.

  Even on Wall Street, authority ranking held sway. In 1956, the head of Goldman Sachs, Sidney Weinberg, took the Ford Motor Company public, in what was at that point far and away the biggest initial public offering in history. As Charles Ellis, in "The Partnership," describes the deal:

When Henry Ford had asked Weinberg at the outset what his fee would be, Weinberg had declined to get specific; he offered to work for a dollar a year until everything was over and then let the family decide what his efforts were really worth. Far more than the actual fee, Weinberg always said he appreciated an affectionate, handwritten letter he received from Ford, which says, along with other flattering things, "Without you, it could not have been accomplished." Weinberg had the letter framed and hung in his office, where he would proudly direct visitors' attention to it, saying: "That's the big payoff as far as I'm concerned." He was speaking more literally than his guests knew. The fee finally paid was estimated at the time to be as high as a million dollars. The actual fee was nowhere near that amount: For two years' work and a dazzling success, the indispensable man was paid only $250,000. Deeply disappointed, Sidney Weinberg never mentioned the amount.

  Let us enumerate all the surreal details of that transaction. A dollar a year? No banker today would so completely defer to a client on the details of his compensation, or suppress his anger at being tossed such a meagre bone, or point triumphantly at a thank-you letter as the real payoff. Weinberg was at that time the leading investment banker on Wall Street. He was so "indispensable" that the parties to the transaction (the Ford I.P.O. was an enormously complex deal involving both the Ford family and the Ford Foundation) fought over who would get to retain him. Weinberg had enormous leverage: a banker in the same position today would have held Ford for ransom. But he chose not to. He didn't create a bidding war between family and foundation; he chose to swallow his pride and profess to be happy with a handwritten note. That's authority ranking: Weinberg's assumption was that he worked for Ford; that if the client gave him two hundred and fifty thousand dollars for two years' work on the biggest deal of all time then he was obliged to accept it.

  5.

  What has changed today is not just that there is an extra zero or two on the end of that fee. What has changed is that the investment banker now perceives the social relations between client and banker as open to negotiation. When Roger Martin, of the University of Toronto, says that the talent revolution represented a change in sensibility, this is what he means. Maurice Greenbaum, sneering over his pince-nez at Janklow, represented the last gasp of the old order. He assumed that the social relations of publishing were settled. Janklow, on the other side of the table, was the new breed. He assumed that they were up in the air.

  In the mid-nineteen-seventies, private-equity managers like Teddy Forstmann and Henry Kravis pioneered the practice of charging the now common sum of "two and twenty" for managing their clients' money--that is, an annual service fee of one and a half or two per cent of assets under management, in addition to a twenty-per-cent share of any profits. Two and twenty was the basis on which the modern high-end money-management world was built: it is what has created billionaires where there were once only millionaires. Why did Forstmann do it? Because, he says now, "I wanted to be a principal and not an agent." He was no longer satisfied with a social order that placed him below his investors. He wanted an order that placed him on the same plane with his investors.

  Why did the Hollywood agent Tom Pollock demand, in 1975, that Twentieth Century Fox grant his client George Lucas full ownership of any potential sequels to "Star Wars"? Because Lucas didn't want to work for the studio. In his apprenticeship with Francis Ford Coppola, he'd seen the frustrations that could lead to. He wanted to work with the studio. "That way, the project could never be buried in studio hell," Pollock said. "The whole deal came from a fear that the studio wouldn't agree to make the movie and wouldn't let it go. What he was looking for was control."

  At that same moment, the world of modelling was transformed, when Lauren Hutton decided that she would no longer do piecework, the way every model had always done, and instead demanded that her biggest client, Revlon, sign her to a proper contract. Here is Hutton in an interview with her fellow-model Paulina Porizkova, in Vogue last year:

PAULINA PORIZKOVA: Can I ask a question? What was modeling like in 1975?
LAUREN: We modeled by the hour before 1974 or 1975. A good working model would have six jobs a day. You'd get a dollar a minute, $60 an hour. . . . So when the Revlon thing came, suddenly it was no longer about $60 an hour. I was getting $25,000 a day, and that was shocking.
PAULINA: How did that happen?
LAUREN: I read an article about a sports guy named Catfish Hunter on the bottom right-hand corner of the New York Times front page one day. It said he was going to get a million-dollar contract. . . . Veruschka had retired; Twiggy had retired; Jean Shrimpton had retired. All the stars were gone. Dick Avedon had no choice but to work with me continually. I yelled over to my boyfriend and asked, "How do you get a contract?" He didn't even take a second to yell back: "Don't tell them. Don't do any makeup ads. Just refuse to do it. Tell all your photographers you want a contract." Avedon got it like that, and after six months, that was it. . . . It took six months to work out a contract that had never been worked out before, and basically all contracts [after that] were based on that.
PAULINA: Lauren, I salute you. I got my house because of you.

  Catfish Hunter, of course, pitched for the New York Yankees. He was the first baseball player Marvin Miller liberated from the tyranny of the reserve clause. The insurgent comes to a previously insular professional world. She studies the prevailing rules of engagement, and when she mounts an attack on what everyone had assumed was the impregnable fortress of Capital, Capital crumbles. Comrade Hutton, meet Comrade Miller.

  6.

  At one of the many meetings that Miller had with each baseball team when he first took over as union boss, a player stood up and hesitantly asked a question: "We know that you have been working for unions for most of your adult life, and we gather from what general managers and club presidents and owners and league presidents and the commissioner's office are telling us that they don't like you. So what we want to know is, can you get along with these people? Or is this going to be a perpetual conflict?"

  Miller tried to answer as truthfully as he could: "I said, 'I think I can get along with most people. But you have to remember that labor relations in this country are adversarial. The interests of the owners and your interests are diametrically opposed on many things, and you can't hold up as a standard whether they like me.' Then I said, 'I'm going to go further. If at any time during my tenure here you find there's a pattern of owners and owners' officials singing my praises, you'd better fire me. I'm not kidding.'"

  Miller knew firsthand, from his days with the United Steelworkers, how important the lessons of class solidarity and confrontation were. The men who worked in the mills had few financial or social resources. They could not threaten to go to the mill across the street, because their own mill could easily replace them. But a roller operator at U.S. Steel, by standing firm with his fellow-steelworkers, saw the $2.31 an hour he made in 1948 rise to $2.56 in 1950, $2.81 in 1952, up again the next year to $2.90, and then to $2.95, and on and on, rising steadily to $3.57, in 1958.

  Miller's goal was to get his ballplayers to think like steelworkers--to persuade members of the professional class to learn from members of the working class. His great insight was that if you brought trade unionism to the world of talent--to a class with great social and economic resources, whose abilities were exceptional and who couldn't easily be replaced--you wouldn't be measuring your success in fractions of a dollar anymore. The class struggle that characterized the conventional world of organized labor would turn into a rout. And so it did: the share of total baseball revenues paid to baseball players in salary went from ten per cent in the pre-Miller years to near fifty per cent by the beginning of the eighties. By 2003, the minimum salary in baseball was fifty times as high as it was when Miller took over, and the average salary was a hundred and thirty-four times as high. In 2005, Barry Bonds, the little boy playing at Miller's feet, was paid twenty-two million dollars by the San Francisco Giants--which is not only more than what his father's teammates collectively received in their lifetimes but more than a good number of baseball's owners ever made in a single year.

  Miller knew that the owners would never sing his praises. How could they, after he had so thoroughly trounced them? Twenty-eight years after his retirement, Miller is still not even a member of baseball's Hall of Fame, despite the fact that no one outside of Babe Ruth and Jackie Robinson has had as great an impact on the modern game.

  "My wife was alive and well one of the last times they voted--three years ago this December," Miller said, shaking his head at the sheer pettiness of it all. "We had a long talk. She said she had come to the conclusion that some things had had their time, and that this is not funny anymore. She said, 'You can do what you want, of course. But I think you would do well to ask them not to nominate you anymore.' I thought about it, and she was right. So I wrote a letter to the group that did the nominating, and I thanked them, but I said we'd all be better off if they didn't do it again."

  He shrugged. He hadn't taken over the Players Association for the money. He lives in a plain vanilla high-rise on Manhattan's Upper East Side. He doesn't summer in the Hamptons. He was a union man--and in his world you measured your success by the size of the bite you took out of Capital. "I guess I have to remember what I said to the players originally," he said, after a moment's reflection. "If the owners praise me, fire me."

  But, if one side so thoroughly dominates another in the marketplace, is it really market pricing anymore? A negotiation in which a man can get paid twenty-two million dollars for hitting a baseball is not really a negotiation. It is a capitulation, and the lingering question left by Miller's revolution is whether the scales ended up being tilted too far in the direction of Talent--whether what Talent did with its newfound power was simply create a new authority ranking, this time with itself at the top. A few years ago, a group of economists looked at more than a hundred Fortune 500 firms, trying to figure out what predicted how much money the C.E.O. made. Compensation, it turned out, was only weakly related to the size and profitability of the company. What really mattered was how much money the members of the compensation committee of the board of directors made in their jobs. Pay is not determined vertically, in other words, according to the characteristics of the organization an executive works for; it is determined horizontally, according to the characteristics of the executive's peers. They decide, among themselves, what the right amount is. This is not a market.

  Chacar and Hesterly observe that in the modern knowledge economy the most successful and profitable industries aren't necessarily those with the most talented employees. That's because Talent can demand so much in salary that there's no money left over for shareholders. Retail banking--the branch down the street where you have your checking account--is a more reliable source of profits, Roger Martin says, than the billion-dollar deal-making of investment banks. What do you do when your branch manager threatens to walk if he doesn't get a big raise? You let him walk. But you can't do that if you're Lazard. "The problem with investment banking is that you need investment bankers," Martin says, "and they become superstars on their own, and, oops, you got a problem--which is that they take all the money."

  When, a few years ago, Robert Nardelli left Home Depot, he was given a severance package worth two hundred and ten million dollars, even though the board of the company was unhappy with his performance. He got that much because he wrote that number into his contract when he was hired, and the reason Home Depot agreed to that provision back then is that Capital has become accustomed to saying yes to Talent, even in cases where Talent does not end up being all that Talented. The problem with the old system of authority ranking was arrogance--the assumption that the world ought to be ordered according to the whims of Capital. The problem with the new order is greed--the assumption that Talent deserves whatever it can extort. As Martin says, the attitude of the professional class has gone from "how much is enough" to "how much can I get."

  A decade before Marvin Miller came to baseball, Stan Musial, one of the greatest players in the history of the game, had his worst season as a professional, hitting seventy-six points below his career average. Musial then went to the general manager of his team and asked for a twenty-per-cent pay cut from his salary of a hundred thousand dollars. Miller would be outraged by that story: even at his original salary, Musial was grossly underpaid. Miller would also point out that Musial's team would not have unilaterally raised his salary by twenty per cent if he'd performed brilliantly that season. In both cases, Miller would have been absolutely right. But it is hard--in an era in which failed executives are rewarded like kings and hedge-fund managers live like the robber barons of the Gilded Age--not to be just a little nostalgic for the explanation that Musial gave for his decision: "There wasn't anything noble about it. I had a lousy year. I didn't deserve the money."



  Social media can't provide what social change has always required.

  1.

  At four-thirty in the afternoon on Monday, February 1, 1960, four college students sat down at the lunch counter at the Woolworth's in downtown Greensboro, North Carolina. They were freshmen at North Carolina A. & T., a black college a mile or so away.

  "I'd like a cup of coffee, please," one of the four, Ezell Blair, said to the waitress.

  "We don't serve Negroes here," she replied.

  The Woolworth's lunch counter was a long L-shaped bar that could seat sixty-six people, with a standup snack bar at one end. The seats were for whites. The snack bar was for blacks. Another employee, a black woman who worked at the steam table, approached the students and tried to warn them away. "You're acting stupid, ignorant!" she said. They didn't move. Around five-thirty, the front doors to the store were locked. The four still didn't move. Finally, they left by a side door. Outside, a small crowd had gathered, including a photographer from the Greensboro Record. "I'll be back tomorrow with A. & T. College," one of the students said.

  By next morning, the protest had grown to twenty-seven men and four women, most from the same dormitory as the original four. The men were dressed in suits and ties. The students had brought their schoolwork, and studied as they sat at the counter. On Wednesday, students from Greensboro's "Negro" secondary school, Dudley High, joined in, and the number of protesters swelled to eighty. By Thursday, the protesters numbered three hundred, including three white women, from the Greensboro campus of the University of North Carolina. By Saturday, the sit-in had reached six hundred. People spilled out onto the street. White teen-agers waved Confederate flags. Someone threw a firecracker. At noon, the A. & T. football team arrived. "Here comes the wrecking crew," one of the white students shouted.

  By the following Monday, sit-ins had spread to Winston-Salem, twenty-five miles away, and Durham, fifty miles away. The day after that, students at Fayetteville State Teachers College and at Johnson C. Smith College, in Charlotte, joined in, followed on Wednesday by students at St. Augustine's College and Shaw University, in Raleigh. On Thursday and Friday, the protest crossed state lines, surfacing in Hampton and Portsmouth, Virginia, in Rock Hill, South Carolina, and in Chattanooga, Tennessee. By the end of the month, there were sit-ins throughout the South, as far west as Texas. "I asked every student I met what the first day of the sitdowns had been like on his campus," the political theorist Michael Walzer wrote in Dissent. "The answer was always the same: 'It was like a fever. Everyone wanted to go.' " Some seventy thousand students eventually took part. Thousands were arrested and untold thousands more radicalized. These events in the early sixties became a civil-rights war that engulfed the South for the rest of the decade--and it happened without e-mail, texting, Facebook, or Twitter.

  2.

  The world, we are told, is in the midst of a revolution. The new tools of social media have reinvented social activism. With Facebook and Twitter and the like, the traditional relationship between political authority and popular will has been upended, making it easier for the powerless to collaborate, coördinate, and give voice to their concerns. When ten thousand protesters took to the streets in Moldova in the spring of 2009 to protest against their country's Communist government, the action was dubbed the Twitter Revolution, because of the means by which the demonstrators had been brought together. A few months after that, when student protests rocked Tehran, the State Department took the unusual step of asking Twitter to suspend scheduled maintenance of its Web site, because the Administration didn't want such a critical organizing tool out of service at the height of the demonstrations. "Without Twitter the people of Iran would not have felt empowered and confident to stand up for freedom and democracy," Mark Pfeifle, a former national-security adviser, later wrote, calling for Twitter to be nominated for the Nobel Peace Prize. Where activists were once defined by their causes, they are now defined by their tools. Facebook warriors go online to push for change. "You are the best hope for us all," James K. Glassman, a former senior State Department official, told a crowd of cyber activists at a recent conference sponsored by Facebook, A. T. & T., Howcast, MTV, and Google. Sites like Facebook, Glassman said, "give the U.S. a significant competitive advantage over terrorists. Some time ago, I said that Al Qaeda was 'eating our lunch on the Internet.' That is no longer the case. Al Qaeda is stuck in Web 1.0. The Internet is now about interactivity and conversation."

  These are strong, and puzzling, claims. Why does it matter who is eating whose lunch on the Internet? Are people who log on to their Facebook page really the best hope for us all? As for Moldova's so-called Twitter Revolution, Evgeny Morozov, a scholar at Stanford who has been the most persistent of digital evangelism's critics, points out that Twitter had scant internal significance in Moldova, a country where very few Twitter accounts exist. Nor does it seem to have been a revolution, not least because the protests--as Anne Applebaum suggested in the Washington Post--may well have been a bit of stagecraft cooked up by the government. (In a country paranoid about Romanian revanchism, the protesters flew a Romanian flag over the Parliament building.) In the Iranian case, meanwhile, the people tweeting about the demonstrations were almost all in the West. "It is time to get Twitter's role in the events in Iran right," Golnaz Esfandiari wrote, this past summer, in Foreign Policy. "Simply put: There was no Twitter Revolution inside Iran." The cadre of prominent bloggers, like Andrew Sullivan, who championed the role of social media in Iran, Esfandiari continued, misunderstood the situation. "Western journalists who couldn't reach--or didn't bother reaching?--people on the ground in Iran simply scrolled through the English-language tweets post with tag #iranelection," she wrote. "Through it all, no one seemed to wonder why people trying to coordinate protests in Iran would be writing in any language other than Farsi."

  Some of this grandiosity is to be expected. Innovators tend to be solipsists. They often want to cram every stray fact and experience into their new model. As the historian Robert Darnton has written, "The marvels of communication technology in the present have produced a false consciousness about the past--even a sense that communication has no history, or had nothing of importance to consider before the days of television and the Internet." But there is something else at work here, in the outsized enthusiasm for social media. Fifty years after one of the most extraordinary episodes of social upheaval in American history, we seem to have forgotten what activism is.

  3.

  Greensboro in the early nineteen-sixties was the kind of place where racial insubordination was routinely met with violence. The four students who first sat down at the lunch counter were terrified. "I suppose if anyone had come up behind me and yelled 'Boo,' I think I would have fallen off my seat," one of them said later. On the first day, the store manager notified the police chief, who immediately sent two officers to the store. On the third day, a gang of white toughs showed up at the lunch counter and stood ostentatiously behind the protesters, ominously muttering epithets such as "burr-head nigger." A local Ku Klux Klan leader made an appearance. On Saturday, as tensions grew, someone called in a bomb threat, and the entire store had to be evacuated.

  The dangers were even clearer in the Mississippi Freedom Summer Project of 1964, another of the sentinel campaigns of the civil-rights movement. The Student Nonviolent Coordinating Committee recruited hundreds of Northern, largely white unpaid volunteers to run Freedom Schools, register black voters, and raise civil-rights awareness in the Deep South. "No one should go anywhere alone, but certainly not in an automobile and certainly not at night," they were instructed. Within days of arriving in Mississippi, three volunteers--Michael Schwerner, James Chaney, and Andrew Goodman--were kidnapped and killed, and, during the rest of the summer, thirty-seven black churches were set on fire and dozens of safe houses were bombed; volunteers were beaten, shot at, arrested, and trailed by pickup trucks full of armed men. A quarter of those in the program dropped out. Activism that challenges the status quo--that attacks deeply rooted problems--is not for the faint of heart.

  What makes people capable of this kind of activism? The Stanford sociologist Doug McAdam compared the Freedom Summer dropouts with the participants who stayed, and discovered that the key difference wasn't, as might be expected, ideological fervor. "All of the applicants--participants and withdrawals alike--emerge as highly committed, articulate supporters of the goals and values of the summer program," he concluded. What mattered more was an applicant's degree of personal connection to the civil-rights movement. All the volunteers were required to provide a list of personal contacts--the people they wanted kept apprised of their activities--and participants were far more likely than dropouts to have close friends who were also going to Mississippi. High-risk activism, McAdam concluded, is a "strong-tie" phenomenon.

  This pattern shows up again and again. One study of the Red Brigades, the Italian terrorist group of the nineteen-seventies, found that seventy per cent of recruits had at least one good friend already in the organization. The same is true of the men who joined the mujahideen in Afghanistan. Even revolutionary actions that look spontaneous, like the demonstrations in East Germany that led to the fall of the Berlin Wall, are, at core, strong-tie phenomena. The opposition movement in East Germany consisted of several hundred groups, each with roughly a dozen members. Each group was in limited contact with the others: at the time, only thirteen per cent of East Germans even had a phone. All they knew was that on Monday nights, outside St. Nicholas Church in downtown Leipzig, people gathered to voice their anger at the state. And the primary determinant of who showed up was "critical friends"--the more friends you had who were critical of the regime the more likely you were to join the protest.

  So one crucial fact about the four freshmen at the Greensboro lunch counter--David Richmond, Franklin McCain, Ezell Blair, and Joseph McNeil--was their relationship with one another. McNeil was a roommate of Blair's in A. & T.'s Scott Hall dormitory. Richmond roomed with McCain one floor up, and Blair, Richmond, and McCain had all gone to Dudley High School. The four would smuggle beer into the dorm and talk late into the night in Blair and McNeil's room. They would all have remembered the murder of Emmett Till in 1955, the Montgomery bus boycott that same year, and the showdown in Little Rock in 1957. It was McNeil who brought up the idea of a sit-in at Woolworth's. They'd discussed it for nearly a month. Then McNeil came into the dorm room and asked the others if they were ready. There was a pause, and McCain said, in a way that works only with people who talk late into the night with one another, "Are you guys chicken or not?" Ezell Blair worked up the courage the next day to ask for a cup of coffee because he was flanked by his roommate and two good friends from high school.

  4.

  The kind of activism associated with social media isn't like this at all. The platforms of social media are built around weak ties. Twitter is a way of following (or being followed by) people you may never have met. Facebook is a tool for efficiently managing your acquaintances, for keeping up with the people you would not otherwise be able to stay in touch with. That's why you can have a thousand "friends" on Facebook, as you never could in real life.

  This is in many ways a wonderful thing. There is strength in weak ties, as the sociologist Mark Granovetter has observed. Our acquaintances--not our friends--are our greatest source of new ideas and information. The Internet lets us exploit the power of these kinds of distant connections with marvellous efficiency. It's terrific at the diffusion of innovation, interdisciplinary collaboration, seamlessly matching up buyers and sellers, and the logistical functions of the dating world. But weak ties seldom lead to high-risk activism.

  In a new book called "The Dragonfly Effect: Quick, Effective, and Powerful Ways to Use Social Media to Drive Social Change," the business consultant Andy Smith and the Stanford Business School professor Jennifer Aaker tell the story of Sameer Bhatia, a young Silicon Valley entrepreneur who came down with acute myelogenous leukemia. It's a perfect illustration of social media's strengths. Bhatia needed a bone-marrow transplant, but he could not find a match among his relatives and friends. The odds were best with a donor of his ethnicity, and there were few South Asians in the national bone-marrow database. So Bhatia's business partner sent out an e-mail explaining Bhatia's plight to more than four hundred of their acquaintances, who forwarded the e-mail to their personal contacts; Facebook pages and YouTube videos were devoted to the Help Sameer campaign. Eventually, nearly twenty-five thousand new people were registered in the bone-marrow database, and Bhatia found a match.

  But how did the campaign get so many people to sign up? By not asking too much of them. That's the only way you can get someone you don't really know to do something on your behalf. You can get thousands of people to sign up for a donor registry, because doing so is pretty easy. You have to send in a cheek swab and--in the highly unlikely event that your bone marrow is a good match for someone in need--spend a few hours at the hospital. Donating bone marrow isn't a trivial matter. But it doesn't involve financial or personal risk; it doesn't mean spending a summer being chased by armed men in pickup trucks. It doesn't require that you confront socially entrenched norms and practices. In fact, it's the kind of commitment that will bring only social acknowledgment and praise.

  The evangelists of social media don't understand this distinction; they seem to believe that a Facebook friend is the same as a real friend and that signing up for a donor registry in Silicon Valley today is activism in the same sense as sitting at a segregated lunch counter in Greensboro in 1960. "Social networks are particularly effective at increasing motivation," Aaker and Smith write. But that's not true. Social networks are effective at increasing participation--by lessening the level of motivation that participation requires. The Facebook page of the Save Darfur Coalition has 1,282,339 members, who have donated an average of nine cents apiece. The next biggest Darfur charity on Facebook has 22,073 members, who have donated an average of thirty-five cents. Help Save Darfur has 2,797 members, who have given, on average, fifteen cents. A spokesperson for the Save Darfur Coalition told Newsweek, "We wouldn't necessarily gauge someone's value to the advocacy movement based on what they've given. This is a powerful mechanism to engage this critical population. They inform their community, attend events, volunteer. It's not something you can measure by looking at a ledger." In other words, Facebook activism succeeds not by motivating people to make a real sacrifice but by motivating them to do the things that people do when they are not motivated enough to make a real sacrifice. We are a long way from the lunch counters of Greensboro.

  5.

  The students who joined the sit-ins across the South during the winter of 1960 described the movement as a "fever." But the civil-rights movement was more like a military campaign than like a contagion. In the late nineteen-fifties, there had been sixteen sit-ins in various cities throughout the South, fifteen of which were formally organized by civil-rights organizations like the N.A.A.C.P. and CORE. Possible locations for activism were scouted. Plans were drawn up. Movement activists held training sessions and retreats for would-be protesters. The Greensboro Four were a product of this groundwork: all were members of the N.A.A.C.P. Youth Council. They had close ties with the head of the local N.A.A.C.P. chapter. They had been briefed on the earlier wave of sit-ins in Durham, and had been part of a series of movement meetings in activist churches. When the sit-in movement spread from Greensboro throughout the South, it did not spread indiscriminately. It spread to those cities which had preëxisting "movement centers"--a core of dedicated and trained activists ready to turn the "fever" into action.

  The civil-rights movement was high-risk activism. It was also, crucially, strategic activism: a challenge to the establishment mounted with precision and discipline. The N.A.A.C.P. was a centralized organization, run from New York according to highly formalized operating procedures. At the Southern Christian Leadership Conference, Martin Luther King, Jr., was the unquestioned authority. At the center of the movement was the black church, which had, as Aldon D. Morris points out in his superb 1984 study, "The Origins of the Civil Rights Movement," a carefully demarcated division of labor, with various standing committees and disciplined groups. "Each group was task-oriented and coordinated its activities through authority," Morris writes. "Individuals were held accountable for their assigned duties, and important conflicts were resolved by the minister, who usually exercised ultimate authority over the congregation."

  This is the second crucial distinction between traditional activism and its online variant: social media are not about this kind of hierarchical organization. Facebook and the like are tools for building networks, which are the opposite, in structure and character, of hierarchies. Unlike hierarchies, with their rules and procedures, networks aren't controlled by a single central authority. Decisions are made through consensus, and the ties that bind people to the group are loose.

  This structure makes networks enormously resilient and adaptable in low-risk situations. Wikipedia is a perfect example. It doesn't have an editor, sitting in New York, who directs and corrects each entry. The effort of putting together each entry is self-organized. If every entry in Wikipedia were to be erased tomorrow, the content would swiftly be restored, because that's what happens when a network of thousands spontaneously devote their time to a task.

  There are many things, though, that networks don't do well. Car companies sensibly use a network to organize their hundreds of suppliers, but not to design their cars. No one believes that the articulation of a coherent design philosophy is best handled by a sprawling, leaderless organizational system. Because networks don't have a centralized leadership structure and clear lines of authority, they have real difficulty reaching consensus and setting goals. They can't think strategically; they are chronically prone to conflict and error. How do you make difficult choices about tactics or strategy or philosophical direction when everyone has an equal say?

  The Palestine Liberation Organization originated as a network, and the international-relations scholars Mette Eilstrup-Sangiovanni and Calvert Jones argue in a recent essay in International Security that this is why it ran into such trouble as it grew: "Structural features typical of networks--the absence of central authority, the unchecked autonomy of rival groups, and the inability to arbitrate quarrels through formal mechanisms--made the P.L.O. excessively vulnerable to outside manipulation and internal strife."

  In Germany in the nineteen-seventies, they go on, "the far more unified and successful left-wing terrorists tended to organize hierarchically, with professional management and clear divisions of labor. They were concentrated geographically in universities, where they could establish central leadership, trust, and camaraderie through regular, face-to-face meetings." They seldom betrayed their comrades in arms during police interrogations. Their counterparts on the right were organized as decentralized networks, and had no such discipline. These groups were regularly infiltrated, and members, once arrested, easily gave up their comrades. Similarly, Al Qaeda was most dangerous when it was a unified hierarchy. Now that it has dissipated into a network, it has proved far less effective.

  The drawbacks of networks scarcely matter if the network isn't interested in systemic change--if it just wants to frighten or humiliate or make a splash--or if it doesn't need to think strategically. But if you're taking on a powerful and organized establishment you have to be a hierarchy. The Montgomery bus boycott required the participation of tens of thousands of people who depended on public transit to get to and from work each day. It lasted a year. In order to persuade those people to stay true to the cause, the boycott's organizers tasked each local black church with maintaining morale, and put together a free alternative private carpool service, with forty-eight dispatchers and forty-two pickup stations. Even the White Citizens Council, King later said, conceded that the carpool system moved with "military precision." By the time King came to Birmingham, for the climactic showdown with Police Commissioner Eugene (Bull) Connor, he had a budget of a million dollars, and a hundred full-time staff members on the ground, divided into operational units. The operation itself was divided into steadily escalating phases, mapped out in advance. Support was maintained through consecutive mass meetings rotating from church to church around the city.

  Boycotts and sit-ins and nonviolent confrontations--which were the weapons of choice for the civil-rights movement--are high-risk strategies. They leave little room for conflict and error. The moment even one protester deviates from the script and responds to provocation, the moral legitimacy of the entire protest is compromised. Enthusiasts for social media would no doubt have us believe that King's task in Birmingham would have been made infinitely easier had he been able to communicate with his followers through Facebook, and contented himself with tweets from a Birmingham jail. But networks are messy: think of the ceaseless pattern of correction and revision, amendment and debate, that characterizes Wikipedia. If Martin Luther King, Jr., had tried to do a wiki-boycott in Montgomery, he would have been steamrollered by the white power structure. And of what use would a digital communication tool be in a town where ninety-eight per cent of the black community could be reached every Sunday morning at church? The things that King needed in Birmingham--discipline and strategy--were things that online social media cannot provide.

  6.

  The bible of the social-media movement is Clay Shirky's "Here Comes Everybody." Shirky, who teaches at New York University, sets out to demonstrate the organizing power of the Internet, and he begins with the story of Evan, who worked on Wall Street, and his friend Ivanna, after she left her smart phone, an expensive Sidekick, on the back seat of a New York City taxicab. The telephone company transferred the data on Ivanna's lost phone to a new phone, whereupon she and Evan discovered that the Sidekick was now in the hands of a teen-ager from Queens, who was using it to take photographs of herself and her friends.

  When Evan e-mailed the teen-ager, Sasha, asking for the phone back, she replied that his "white ass" didn't deserve to have it back. Miffed, he set up a Web page with her picture and a description of what had happened. He forwarded the link to his friends, and they forwarded it to their friends. Someone found the MySpace page of Sasha's boyfriend, and a link to it found its way onto the site. Someone found her address online and took a video of her home while driving by; Evan posted the video on the site. The story was picked up by the news filter Digg. Evan was now up to ten e-mails a minute. He created a bulletin board for his readers to share their stories, but it crashed under the weight of responses. Evan and Ivanna went to the police, but the police filed the report under "lost," rather than "stolen," which essentially closed the case. "By this point millions of readers were watching," Shirky writes, "and dozens of mainstream news outlets had covered the story." Bowing to the pressure, the N.Y.P.D. reclassified the item as "stolen." Sasha was arrested, and Evan got his friend's Sidekick back.

  Shirky's argument is that this is the kind of thing that could never have happened in the pre-Internet age--and he's right. Evan could never have tracked down Sasha. The story of the Sidekick would never have been publicized. An army of people could never have been assembled to wage this fight. The police wouldn't have bowed to the pressure of a lone person who had misplaced something as trivial as a cell phone. The story, to Shirky, illustrates "the ease and speed with which a group can be mobilized for the right kind of cause" in the Internet age.

  Shirky considers this model of activism an upgrade. But it is simply a form of organizing which favors the weak-tie connections that give us access to information over the strong-tie connections that help us persevere in the face of danger. It shifts our energies from organizations that promote strategic and disciplined activity and toward those which promote resilience and adaptability. It makes it easier for activists to express themselves, and harder for that expression to have any impact. The instruments of social media are well suited to making the existing social order more efficient. They are not a natural enemy of the status quo. If you are of the opinion that all the world needs is a little buffing around the edges, this should not trouble you. But if you think that there are still lunch counters out there that need integrating it ought to give you pause.

  Shirky ends the story of the lost Sidekick by asking, portentously, "What happens next?"--no doubt imagining future waves of digital protesters. But he has already answered the question. What happens next is more of the same. A networked, weak-tie world is good at things like helping Wall Streeters get phones back from teen-age girls. Viva la revolución.

  Why is it so difficult to develop drugs for cancer?

  1.

  In the world of cancer research, there is something called a Kaplan-Meier curve, which tracks the health of patients in the trial of an experimental drug. In its simplest version, it consists of two lines. The first follows the patients in the "control arm," the second the patients in the "treatment arm." In most cases, those two lines are virtually identical. That is the sad fact of cancer research: nine times out of ten, there is no difference in survival between those who were given the new drug and those who were not. But every now and again--after millions of dollars have been spent, and tens of thousands of pages of data collected, and patients followed, and toxicological issues examined, and safety issues resolved, and manufacturing processes fine-tuned--the patients in the treatment arm will live longer than the patients in the control arm, and the two lines on the Kaplan-Meier will start to diverge.
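  (The arithmetic behind those two lines is simple enough to sketch. The Kaplan-Meier estimate of survival at any point in time is the running product, over the deaths observed so far, of the fraction of at-risk patients who survived each one. The short Python sketch below is only an illustration, with invented follow-up times; the kaplan_meier helper and the numbers are hypothetical and are not drawn from any trial discussed here.)

from collections import Counter

def kaplan_meier(times, events):
    # times: follow-up in months; events: 1 = death observed, 0 = censored.
    # Survival is the running product of (1 - deaths at t / patients still at risk).
    deaths = Counter(t for t, e in zip(times, events) if e)
    at_risk = len(times)
    survival, curve = 1.0, []
    for t in sorted(set(times)):
        if deaths.get(t, 0):
            survival *= 1 - deaths[t] / at_risk
        curve.append((t, round(survival, 2)))
        at_risk -= sum(1 for x in times if x == t)
    return curve

# Two invented arms: plotted, the "treatment" line sits above the "control" line.
print("control:  ", kaplan_meier([2, 3, 3, 5, 6, 8, 9], [1, 1, 1, 1, 0, 1, 1]))
print("treatment:", kaplan_meier([4, 6, 7, 9, 12, 14, 15], [1, 0, 1, 1, 1, 0, 1]))

  In a typical trial the two curves would track each other almost step for step; a visible, sustained gap like the one in this invented example is the rare exception.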

  Seven years ago, for example, a team from Genentech presented the results of a colorectal-cancer drug trial at the annual meeting of the American Society of Clinical Oncology--a conference attended by virtually every major cancer researcher in the world. The lead Genentech researcher took the audience through one slide after another--click, click, click--laying out the design and scope of the study, until he came to the crucial moment: the Kaplan-Meier. At that point, what he said became irrelevant. The members of the audience saw daylight between the two lines, for a patient population in which that almost never happened, and they leaped to their feet and gave him an ovation. Every drug researcher in the world dreams of standing in front of thousands of people at ASCO and clicking on a Kaplan-Meier like that. "It is why we are in this business," Safi Bahcall says. Once he thought that this dream would come true for him. It was in the late summer of 2006, and is among the greatest moments of his life.

  Bahcall is the C.E.O. of Synta Pharmaceuticals, a small biotechnology company. It occupies a one-story brick nineteen-seventies building outside Boston, just off Route 128, where many of the region's high-tech companies have congregated, and that summer Synta had two compounds in development. One was a cancer drug called elesclomol. The other was an immune modulator called apilimod. Experimental drugs must pass through three phases of testing before they can be considered for government approval. Phase 1 is a small trial to determine at what dose the drug can be taken safely. Phase 2 is a larger trial to figure out if it has therapeutic potential, and Phase 3 is a definitive trial to see if it actually works, usually in comparison with standard treatments. Elesclomol had progressed to Phase 2 for soft-tissue sarcomas and for lung cancer, and had come up short in both cases. A Phase 2 trial for metastatic melanoma--a deadly form of skin cancer--was also under way. But that was a long shot: nothing ever worked well for melanoma. In the previous thirty-five years, there had been something like seventy large-scale Phase 2 trials for metastatic-melanoma drugs, and if you plotted all the results on a single Kaplan-Meier there wouldn't be much more than a razor's edge of difference between any two of the lines.

  That left apilimod. In animal studies and early clinical trials for autoimmune disorders, it seemed promising. But when Synta went to Phase 2 with a trial for psoriasis, the results were underwhelming. "It was ugly," Bahcall says. "We had lung cancer fail, sarcoma next, and then psoriasis. We had one more trial left, which was for Crohn's disease. I remember my biostats guy coming into my office, saying, 'I've got some good news and some bad news. The good news is that apilimod is safe. We have the data. No toxicity. The bad news is that it's not effective.' It was heartbreaking."

  Bahcall is a boyish man in his early forties, with a round face and dark, curly hair. He was sitting at the dining-room table in his sparsely furnished apartment in Manhattan, overlooking the Hudson River. Behind him, a bicycle was leaning against a bare wall, giving the room a post-college feel. Both his parents were astrophysicists, and he, too, was trained as a physicist, before leaving academia for the business world. He grew up in the realm of the abstract and the theoretical--with theorems and calculations and precise measurements. But drug development was different, and when he spoke about the failure of apilimod there was a slight catch in his voice.

  Bahcall started to talk about one of the first patients ever treated with elesclomol: a twenty-four-year-old African-American man. He'd had Kaposi's sarcoma; tumors covered his lower torso. He'd been at Beth Israel Deaconess Medical Center, in Boston, and Bahcall had flown up to see him. On a Monday in January of 2003, Bahcall sat by his bed and they talked. The patient was just out of college. He had an I.V. in his arm. You went to the hospital and you sat next to some kid whose only wish was not to die, and it was impossible not to get emotionally involved. In physics, failure was disappointing. In drug development, failure was heartbreaking. Elesclomol wasn't much help against Kaposi's sarcoma. And now apilimod didn't work for Crohn's. "I mean, we'd done charity work for the Crohn's & Colitis Foundation," Bahcall went on. "I have relatives and friends with Crohn's disease, personal experience with Crohn's disease. We had Crohn's patients come in and talk in meetings and tell their stories. We'd raised money for five years from investors. I felt terrible. Here we were with our lead drug and it had failed. It was the end of the line."

  That summer of 2006, in one painful meeting after another, Synta began to downsize. "It was a Wednesday," Bahcall said. "We were around a table, and we were talking about pruning the budget and how we're going to contain costs, one in a series of tough discussions, and I noticed my chief medical officer, Eric Jacobson, at the end of the table, kind of looking a little unusually perky for one of those kinds of discussions." After the meeting, Bahcall pulled Jacobson over: "Is something up?" Jacobson nodded. Half an hour before the meeting, he'd received some news. It was about the melanoma trial for elesclomol, the study everyone had given up on. "The consultant said she had never seen data this good," Jacobson told him.

  Bahcall called back the management team for a special meeting. He gave the floor to Jacobson. "Eric was, like, 'Well, you know we've got this melanoma trial,' " Bahcall began, "and it took a moment to jog people's memories, because we'd all been so focussed on Crohn's disease and the psoriasis trials. And Eric said, 'Well, we got the results. The drug worked! It was a positive trial!' " One person slammed the table, stood up, and hollered. Others peppered Eric with questions. "Eric said, 'Well, the group analyzing the data is trying to disprove it, and they can't disprove it.' And he said, 'The consultant handed me the data on Wednesday morning, and she said it was boinking good.' And everyone said, 'What?' Because Eric is the sweetest guy, who never swears. A bad word cannot cross his lips. Everyone started yelling, 'What? What? What did she say, Eric? Eric! Eric! Say it! Say it!' "

  Bahcall contacted Synta's board of directors. Two days later, he sent out a company-wide e-mail saying that there would be a meeting that afternoon. At four o'clock, all hundred and thirty employees trooped into the building's lobby. Jacobson stood up. "So the lights go down," Bahcall continued. "Clinical guys, when they present data, tend to do it in a very bottoms-up way: this is the disease population, this is the treatment, and this is the drug, and this is what was randomized, and this is the demographic, and this is the patient pool, and this is who had toenail fungus, and this is who was Jewish. They go on and on and on, and all anyone wants is, Show us the fucking Kaplan-Meier! Finally he said, 'All right, now we can get to the efficacy.' It gets really silent in the room. He clicks the slide. The two lines separate out beautifully--and a gasp goes out, across a hundred and thirty people. Eric starts to continue, and one person goes like this"--Bahcall started clapping slowly--"and then a couple of people joined in, and then soon the whole room is just going like this--clap, clap, clap. There were tears. We all realized that our lives had changed, the lives of patients had changed, the way of treating the disease had changed. In that moment, everyone realized that this little company of a hundred and thirty people had a chance to win. We had a drug that worked, in a disease where nothing worked. That was the single most moving five minutes of all my years at Synta."

  2.

  In the winter of 1955, a young doctor named Emil Freireich arrived at the National Cancer Institute, in Bethesda, Maryland. He had been drafted into the Army, and had been sent to fulfill his military obligation in the public-health service. He went to see Gordon Zubrod, then the clinical director for the N.C.I. and later one of the major figures in cancer research. "I said, 'I'm a hematologist,' " Freireich recalls. "He said, 'I've got a good idea for you. Cure leukemia.' It was a military assignment." From that assignment came the first great breakthrough in the war against cancer.

  Freireich's focus was on the commonest form of childhood leukemia--acute lymphoblastic leukemia (ALL). The diagnosis was a death sentence. "The children would come in bleeding," Freireich says. "They'd have infections. They would be in pain. Median survival was about eight weeks, and everyone was dead within the year." At the time, three drugs were known to be useful against ALL. One was methotrexate, which, the pediatric pathologist Sidney Farber had shown seven years earlier, could push the disease into remission. Corticosteroids and 6-mercaptopurine (6-MP) had since proved useful. But even though methotrexate and 6-MP could kill a lot of cancer cells, they couldn't kill them all, and those that survived would regroup and adjust and multiply and return with a vengeance. "These remissions were all temporary--two or three months," Freireich, who now directs the adult-leukemia research program at the M. D. Anderson Cancer Center, in Houston, says. "The authorities in hematology didn't even want to use them in children. They felt it just prolonged the agony, made them suffer, and gave them side effects. That was the landscape."

  In those years, the medical world had made great strides against tuberculosis, and treating t.b. ran into the same problem as treating cancer: if doctors went after it with one drug, the bacteria eventually developed resistance. Their solution was to use multiple drugs simultaneously that worked in very different ways. Freireich wondered about applying that model to leukemia. Methotrexate worked by disrupting folic-acid uptake, which was crucial in the division of cells; 6-MP shut down the synthesis of purine, which was also critical in cell division. Putting the two together would be like hitting the cancer with a left hook and a right hook. Working with a group that eventually included Tom Frei, of the N.C.I., and James Holland, of the Roswell Park Cancer Institute, in Buffalo, Freireich started treating ALL patients with methotrexate and 6-MP in combination, each at two-thirds its regular dose to keep side effects in check. The remissions grew more frequent. Freireich then added the steroid prednisone, which worked by a mechanism different from that of either 6-MP or methotrexate; he could give it at full dose and not worry about the side effects getting out of control. Now he had a left hook, a right hook, and an uppercut.

  "So things are looking good," Freireich went on. "But still everyone dies. The remissions are short. And then out of the blue came the gift from Heaven"--another drug, derived from periwinkle, that had been discovered by Irving Johnson, a researcher at Eli Lilly. "In order to get two milligrams of drug, it took something like two train-car loads of periwinkle," Freireich said. "It was expensive. But Johnson was persistent." Lilly offered the new drug to Freireich. "Johnson had done work in mice, and he showed me the results. I said, 'Gee whiz, I've got ten kids on the ward dying. I'll give it to them tomorrow.' So I went to Zubrod. He said, 'I don't think it's a good ' But I said, 'These kids are dying. What's the difference?' He said, 'O.K., I'll let you do a few children.' The response rate was fifty-five per cent. The kids jumped out of bed." The drug was called vincristine, and, by itself, it was no wonder drug. Like the others, it worked only for a while. But the good news was that it had a unique mechanism of action--it interfered with cell division by binding to what is called the spindle protein--and its side effects were different from those of the other drugs. "So I sat down at my desk one day and I thought, Gee, if I can give 6-MP and meth at two-thirds dose and prednisone at full dose and vincristine has different limiting toxicities, I bet I can give a full dose of that, too. So I devised a trial where we would give all four in combination." The trial was called VAMP. It was a left hook, a right hook, an uppercut, and a jab, and the hope was that if you hit leukemia with that barrage it would never get up off the canvas.

  The first patient treated under the experimental regimen was a young girl. Freireich started her off with a dose that turned out to be too high, and she almost died. She was put on antibiotics and a respirator. Freireich saw her eight times a day, sitting at her bedside. She pulled through the chemo-induced crisis, only to die later of an infection. But Freireich was learning. He tinkered with his protocol and started again, with patient No. 2. Her name was Janice. She was fifteen, and her recovery was nothing short of miraculous. So was the recovery of the next patient and the next and the next, until nearly every child was in remission, without need of antibiotics or transfusions. In 1965, Frei and Freireich published one of the most famous articles in the history of oncology, "Progress and Perspective in the Chemotherapy of Acute Leukemia," in Advances in Chemotherapy. Almost three decades later, a perfectly healthy Janice graced the cover of the journal Cancer Research.

  What happened with ALL was a formative experience for an entire generation of cancer fighters. VAMP proved that medicine didn't need a magic bullet--a superdrug that could stop all cancer in its tracks. A drug that worked a little bit could be combined with another that worked a little bit and another that worked a little bit, and, as long as all three worked in different ways and had different side effects, the combination could turn out to be spectacular. To be valuable, a cancer drug didn't have to be especially effective on its own; it just had to be novel in the way it acted. And, from the beginning, this was what caused so much excitement about elesclomol.

  3.

  Safi Bahcall's partner in the founding of Synta was a cell biologist at Harvard Medical School named Lan Bo Chen. Chen, who is in his mid-sixties, was born in Taiwan. He is a mischievous man, with short-cropped straight black hair and various quirks--including a willingness to say whatever is on his mind, a skepticism about all things Japanese (the Japanese occupied Taiwan during the war, after all), and a keen interest in the marital prospects of his unattached co-workers. Bahcall, who is Jewish, describes him affectionately as "the best and worst parts of a Jewish father and the best and worst parts of a Jewish mother rolled into one." (Sample e-mail from Chen: "Safi is in Israel. Hope he finds wife.")

  Drug hunters like Chen fall into one of two broad schools. The first school, that of "rational design," believes in starting with the disease and working backward--designing a customized solution based on the characteristics of the problem. Herceptin, one of the most important of the new generation of breast-cancer drugs, is a good example. It was based on genetic detective work showing that about a quarter of all breast cancers were caused by the overproduction of a protein called HER2. HER2 kept causing cells to divide and divide, and scientists set about designing a drug to turn HER2 off. The result is a drug that improved survival in twenty-five per cent of patients with advanced breast cancer. (When Herceptin's Kaplan-Meier was shown at ASCO, there was stunned silence.) But working backward to a solution requires a precise understanding of the problem, and cancer remains so mysterious and complex that in most cases scientists don't have that precise understanding. Or they think they do, and then, after they turn off one mechanism, they discover that the tumor has other deadly tricks in reserve.

  The other approach is to start with a drug candidate and then hunt for diseases that it might attack. This strategy, known as "mass screening," doesn't involve a theory. Instead, it involves a random search for matches between treatments and diseases. This was the school to which Chen belonged. In fact, he felt that the main problem with mass screening was that it wasn't mass enough. There were countless companies outside the drug business--from industrial research labs to photography giants like Kodak and Fujifilm--that had millions of chemicals sitting in their vaults. Yet most of these chemicals had never been tested to see if they had potential as drugs. Chen couldn't understand why. If the goal of drug discovery was novelty, shouldn't the hunt for new drugs go as far and wide as possible?

  "In the early eighties, I looked into how Merck and Pfizer went about drug discovery," Chen recalls. "How many compounds are they using? Are they doing the best they can? And I come up with an incredible number. It turns out that mankind had, at this point, made tens of millions of compounds. But Pfizer was screening only six hundred thousand compounds, and Merck even fewer, about five hundred thousand. How could they screen for drugs and use only five hundred thousand, when mankind has already made so many more?"

  An early financial backer of Chen's was Michael Milken, the junk-bond king of the nineteen-eighties who, after being treated for prostate cancer, became a major cancer philanthropist. "I told Milken my story," Chen said, "and very quickly he said, 'I'm going to give you four million dollars. Do whatever you want.' Right away, Milken thought of Russia. Someone had told him that the Russians had had, for a long time, thousands of chemists in one city making compounds, and none of those compounds had been disclosed." Chen's first purchase was a batch of twenty-two thousand chemicals, gathered from all over Russia and Ukraine. They cost about ten dollars each, and came in tiny glass vials. With his money from Milken, Chen then bought a six-hundred-thousand-dollar state-of-the-art drug-screening machine. It was a big, automated Rube Goldberg contraption that could test ninety-six compounds at a time and do a hundred batches a day. A robotic arm would deposit a few drops of each chemical onto a plate, followed by a clump of cancer cells and a touch of blue dye. The mixture was left to sit for a week, and then reëxamined. If the cells were still alive, they would show as blue. If the chemical killed the cancer cells, the fluid would be clear.

  Chen's laboratory began by testing his compounds against prostate-cancer cells, since that was the disease Milken had. Later, he screened dozens of other cancer cells as well. In the first go-around, his batch of chemicals killed everything in sight. But plenty of compounds, including pesticides and other sorts of industrial poisons, will kill cancer cells. The trouble is that they'll kill healthy cells as well. Chen was looking for something that was selective--that was more likely to kill malignant cells than normal cells. He was also interested in sensitivity--in a chemical's ability to kill at low concentrations. Chen reduced the amount of each chemical on the plate a thousandfold, and tried again. Now just one chemical worked. He tried the same chemical on healthy cells. It left them alone. Chen lowered the dose another thousandfold. It still worked. The compound came from the National Taras Shevchenko University of Kiev. It was an odd little chemical, the laboratory equivalent of a jazz musician's riff. "It was pure chemist's joy," Chen said. "Homemade, random, and clearly made for no particular purpose. It was the only one that worked on everything we tried."

  Mass screening wasn't as elegant or as efficient as rational drug design. But it provided a chance of stumbling across something by accident--something so novel and unexpected that no scientist would have dreamed it up. It provided for serendipity, and the history of drug discovery is full of stories of serendipity. Alexander Fleming was looking for something to fight bacteria, but didn't think the answer would be provided by the mold that grew on a petri dish he accidentally left out on his bench. That's where penicillin came from. Pfizer was looking for a new heart treatment and realized that a drug candidate's unexpected side effect was more useful than its main effect. That's where Viagra came from. "The end of surprise would be the end of science," the historian Robert Friedel wrote in the 2001 essay "Serendipity Is No Accident." "To this extent, the scientist must constantly seek and hope for surprises." When Chen gathered chemical compounds from the farthest corners of the earth and tested them against one cancer-cell line after another, he was engineering surprise.

  What he found was exactly what he'd hoped for when he started his hunt: something he could never have imagined on his own. When cancer cells came into contact with the chemical, they seemed to go into crisis mode: they acted as if they had been attacked with a blowtorch. The Ukrainian chemical, elesclomol, worked by gathering up copper from the bloodstream and bringing it into cells' mitochondria, sparking an electrochemical reaction. That reaction raised levels of the toxic, oxygen-based compounds in the cell called ROS, reactive oxygen species. Normal cells keep ROS in check. Many kinds of cancer cells, though, generate so much ROS that the cell's ability to keep functioning is stretched to the breaking point, and elesclomol cranked ROS up even further, to the point that the cancer cells went up in flames. Researchers had long known that heating up a cancer cell was a good way of killing it, and there had been plenty of interest over the years in achieving that effect with ROS. But the idea of using copper to set off an electrochemical reaction was so weird--and so unlike the way cancer drugs normally worked--that it's not an approach anyone would have tried by design. That was the serendipity. It took a bit of "chemist's joy," constructed for no particular reason by some bench scientists in Kiev, to show the way. Elesclomol was wondrously novel. "I fell in love," Chen said. "I can't explain it. I just did."

  4.

  When Freireich went to Zubrod with his idea for VAMP, Zubrod could easily have said no. Drug protocols are typically tested in advance for safety in animal models. This one wasn't. Freireich freely admits that the whole idea of putting together poisonous drugs in such dosages was "insane," and, of course, the first patient in the trial had nearly been killed by the toxic regimen. If she had died from it, the whole trial could have been derailed.

  The ALL success story provided a hopeful road map for a generation of cancer fighters. But it also came with a warning: those who pursued the unexpected had to live with unexpected consequences. This was not the elegance of rational drug design, where scientists perfect their strategy in the laboratory before moving into the clinic. Working from the treatment to the disease was an exercise in uncertainty and trial and error.

  If you're trying to put together a combination of three or four drugs out of an available pool of dozens, how do you choose which to start with? The number of permutations is vast. And, once you've settled on a combination, how do you administer it? A child gets sick. You treat her. She goes into remission, and then she relapses. VAMP established that the best way to induce remission was to treat the child aggressively when she first showed up with leukemia. But do you treat during the remission as well, or only when the child relapses? And, if you treat during remission, do you treat as aggressively as you did during remission induction, or at a lower level? Do you use the same drugs in induction as you do in remission and as you do in relapse? How do you give the drugs, sequentially or in combination? At what dose? And how frequently--every day, or do you want to give the child's body a few days to recover between bouts of chemo?
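
  Even a rough count makes the problem concrete. The calculation below uses made-up numbers--a pool of twenty candidate drugs, a few dose levels and schedules--purely to show how quickly the possibilities multiply; it does not describe any actual trial program.

```python
# Back-of-the-envelope count of possible regimens (all numbers invented).
from math import comb

pool = 20                 # hypothetical number of candidate drugs
drugs_per_regimen = 4     # a VAMP-style four-drug combination
drug_choices = comb(pool, drugs_per_regimen)   # which four drugs: C(20, 4) = 4,845

dose_levels = 3           # e.g., full, two-thirds, or half dose for each drug
schedules = 4             # e.g., daily, every four days, sequential, concurrent

regimens = drug_choices * dose_levels ** drugs_per_regimen * schedules
print(drug_choices)       # 4,845 ways just to pick the drugs
print(regimens)           # roughly 1.6 million regimens once doses and schedules count
```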

  Oncologists compared daily 6-MP plus daily methotrexate with daily 6-MP plus methotrexate every four days. They compared methotrexate followed by 6-MP, 6-MP followed by methotrexate, and both together. They compared prednisone followed by full doses of 6-MP, methotrexate, and a new drug, cyclophosphamide (CTX), with prednisone followed by half doses of 6-MP, methotrexate, and CTX. It was endless: vincristine plus prednisone and then methotrexate every four days or vincristine plus prednisone and then methotrexate daily? They tried new drugs, and different combinations. They tweaked and refined, and gradually pushed the cure rate from forty per cent to eighty-five per cent. At St. Jude Children's Research Hospital, in Memphis--which became a major center of ALL research--no fewer than sixteen clinical trials, enrolling 3,011 children, have been conducted in the past forty-eight years.

  And this was just childhood leukemia. Beginning in the nineteen-seventies, Lawrence Einhorn, at Indiana University, pushed cure rates for testicular cancer above eighty per cent with a regimen called BEP: three to four rounds of bleomycin, etoposide, and cisplatin. In the nineteen-seventies, Vincent T. DeVita, at the N.C.I., came up with MOPP for advanced Hodgkin's disease: mustargen, oncovin, procarbazine, and prednisone. DeVita went on to develop a combination therapy for breast cancer called CMF--cyclophosphamide, methotrexate, and 5-fluorouracil. Each combination was a variation on the combination that came before it, tailored to its target through a series of iterations. The often asked question "When will we find a cure for cancer?" implies that there is some kind of master code behind the disease waiting to be cracked. But perhaps there isn't a master code. Perhaps there is only what can be uncovered, one step at a time, through trial and error.

  When elesclomol emerged from the laboratory, then, all that was known about it was that it did something novel to cancer cells in the laboratory. Nobody had any idea what its best target was. So Synta gave elesclomol to an oncologist at Beth Israel in Boston, who began randomly testing it out on his patients in combination with paclitaxel, a standard chemotherapy drug. The addition of elesclomol seemed to shrink the tumor of someone with melanoma. A patient whose advanced ovarian cancer had failed multiple rounds of previous treatment had some response. There was dramatic activity against Kaposi's sarcoma. They could have gone on with Phase 1s indefinitely, of course. Chen wanted to combine elesclomol with radiation therapy, and another group at Synta would later lobby hard to study elesclomol's effects on acute myeloid leukemia (AML), the commonest form of adult leukemia. But they had to draw the line somewhere. Phase 2 would be lung cancer, soft-tissue sarcomas, and melanoma.

  Now Synta had its targets. But with this round of testing came an even more difficult question. What's the best way to conduct a test of a drug you barely understand? To complicate matters further, melanoma, the disease that seemed to be the best of the three options, is among the most complicated of all cancers. Sometimes it confines itself to the surface of the skin. Sometimes it invades every organ in the body. Some kinds of melanoma have a mutation involving a gene called BRAF; others don't. Some late-stage melanoma tumors pump out high levels of an enzyme called LDH. Sometimes they pump out only low levels of LDH, and patients with low-LDH tumors lived so much longer that it was as if they had a different disease. Two patients could appear to have identical diagnoses, and then one would be dead in six months and the other would be fine. Tumors sometimes mysteriously disappeared. How did you conduct a drug trial with a disease like this?

  It was entirely possible that elesclomol would work in low-LDH patients and not in high-LDH patients, or in high-LDH patients and not in low-LDH ones. It might work well against the melanoma that confined itself to the skin and not against the kind that invaded the liver and other secondary organs; it might work in the early stages of metastasis and not in the later stages. Then, there was the prior-treatment question. Because of how quickly tumors become resistant to drugs, new treatments sometimes work better on "naïve" patients--those who haven't been treated with other forms of chemotherapy. So elesclomol might work on chemo-naïve patients and not on prior-chemo patients. And, in any of these situations, elesclomol might work better or worse depending on which other drug or drugs it was combined with. There was no end to the possible combinations of patient populations and drugs that Synta could have explored.

  At the same time, Synta had to make sure that whatever trial it ran was as big as possible. With a disease as variable as melanoma, there was always the risk in a small study that what you thought was a positive result was really a matter of spontaneous remissions, and that a negative result was just the bad luck of having patients with an unusually recalcitrant form of the disease. John Kirkwood, a melanoma specialist at the University of Pittsburgh, had done the math: in order to guard against some lucky or unlucky artifact, the treatment arm of a Phase 2 trial should have at least seventy patients.
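
  A standard sample-size calculation gives numbers in the same ballpark as Kirkwood's. The sketch below compares response rates between two arms; the assumed rates and error levels are illustrative choices, not the actual design parameters of any Synta or Kirkwood study.

```python
# Rough sample-size sketch for comparing response rates between two arms
# (illustrative assumptions only; not the Phase 2 or SYMMETRY design).
from math import sqrt

p_control = 0.10   # assumed response rate with standard treatment
p_treated = 0.30   # hoped-for response rate with the new drug
z_alpha = 1.96     # two-sided 5% significance level
z_beta = 0.84      # 80% power

p_bar = (p_control + p_treated) / 2
numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
             + z_beta * sqrt(p_control * (1 - p_control)
                             + p_treated * (1 - p_treated))) ** 2
n_per_arm = numerator / (p_treated - p_control) ** 2
print(round(n_per_arm))   # roughly 60-70 patients per arm under these assumptions
```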

  Synta was faced with a dilemma. Given melanoma's variability, the company would ideally have done half a dozen or more versions of its Phase 2 trial: low-LDH, high-LDH, early-stage, late-stage, prior-chemo, chemo-naïve, multi-drug, single-drug. There was no way, though, that they could afford to do that many trials with seventy patients in each treatment arm. The American biotech industry is made up of lots of companies like Synta, because small start-ups are believed to be more innovative and adventurous than big pharmaceutical houses. But not even big firms can do multiple Phase 2 trials on a single disease--not when trials cost more than a hundred thousand dollars per patient and not when, in the pursuit of serendipity, they are simultaneously testing that same experimental drug on two or three other kinds of cancer. So Synta compromised. The company settled on one melanoma trial: fifty-three patients were given elesclomol plus paclitaxel, and twenty-eight, in the control group, were given paclitaxel alone, representing every sort of LDH level, stage of disease, and prior-treatment status. That's a long way from half a dozen trials of seventy each.

  Synta then went to Phase 3: six hundred and fifty-one chemo-naïve patients, drawn from a hundred and fifty hospitals, in fifteen countries. The trial was dubbed SYMMETRY. It was funded by the pharmaceutical giant GlaxoSmithKline. Glaxo agreed to underwrite the cost of the next round of clinical trials and--should the drug be approved by the Food and Drug Administration--to split the revenues with Synta.

  But was this the perfect trial? Not really. In the Phase 2 trial, elesclomol had been mixed with an organic solvent called Cremophor and then spun around in a sonicator, which is like a mini washing machine. Elesclomol, which is rock-hard in its crystalline form, needed to be completely dissolved if it was going to work as a drug. For SYMMETRY, though, sonicators couldn't be used. "Many countries said that it would be difficult, and some hospitals even said, 'We don't allow sonication in the preparation room,' " Chen explained. "We got all kinds of unbelievable feedback. In the end, we came up with something that, after mixing, you use your hand to shake it." Would hand shaking be a problem? No one knew.

  Then a Synta chemist, Mitsunori Ono, figured out how to make a water-soluble version of elesclomol. When the head of Synta's chemistry team presented the results, he "sang a Japanese drinking song," Chen said, permitting himself a small smile at the eccentricities of the Japanese. "He was very happy." It was a great accomplishment. The water-soluble version could be given in higher doses. Should they stop SYMMETRY and start again with elesclomol 2.0? They couldn't. A new trial would cost many millions of dollars more, and set the whole effort back two or three years. So they went ahead with a drug that didn't dissolve easily, against a difficult target, with an assortment of patients who may or may not have been ideal--and crossed their fingers.

  SYMMETRY began in late 2007. It was a double-blind, randomized trial. No one had any idea who was getting elesclomol and who wasn't, and no one would have any idea how well the patients on elesclomol were doing until the trial data were unblinded. Day-to-day management of the study was shared with a third-party contractor. The trial itself was supervised by an outside group, known as a data-monitoring committee. "We send them all the data in some database format, and they plug that into their software package and then they type in the code and press 'Enter,' " Bahcall said. "And then this line"--he pointed at the Kaplan-Meier in front of him--"will, hopefully, separate into two lines. They will find out in thirty seconds. It's, literally, those guys press a button and for the next five years, ten years, the life of the drug, that's really the only bit of evidence that matters." It was January, 2009, and the last of the six hundred and fifty-one patients were scheduled to be enrolled in the trial in the next few weeks. According to protocol, when the results began to come in, the data-monitoring committee would call Jacobson, and Jacobson would call Bahcall. "ASCO starts May 29th," Bahcall said. "If we get our data by early May, we could present at ASCO this year."

  5.

  In the course of the SYMMETRY trial, Bahcall's dining-room-table talks grew more reflective. He drew Kaplan-Meiers on the back of napkins. He talked about the twists and turns that other biotech companies had encountered on the road to the marketplace. He told wry stories about Lan Bo Chen, the Jewish mother and Jewish father rolled into one--and, over and over, he brought up the name of Judah Folkman. Folkman died in 2008, and he was a legend. He was the father of angiogenesis--a wholly new way of attacking cancer tumors. Avastin, the drug that everyone cheered at ASCO seven years ago, was the result of Folkman's work.

  Folkman's great breakthrough had come while he was working with mouse melanoma cells at the National Naval Medical Center: when the tumors couldn't set up a network of blood vessels to feed themselves, they would stop growing. Folkman realized that the body must have its own system for promoting and halting blood-vessel formation, and that if he could find a substance that prevented vessels from being formed he would have a potentially powerful cancer drug. One of the researchers in Folkman's laboratory, Michael O'Reilly, found what seemed to be a potent inhibitor: angiostatin. O'Reilly then assembled a group of mice with an aggressive lung cancer, and treated half with a saline solution and half with angiostatin. In the book "Dr. Folkman's War" (2001), Robert Cooke describes the climactic moment when the results of the experiment came in:

With a horde of excited researchers jam-packed into a small laboratory room, Folkman euthanized all fifteen mice, then began handing them one by one to O'Reilly to dissect. O'Reilly took the first mouse, made an incision in its chest, and removed the lung. The organ was overwhelmed by cancer. Folkman checked a notebook to see which group the mouse had been in. It was one of those that had gotten only saline. O'Reilly cut into the next mouse and removed its lung. It was perfect. What treatment had it gotten? The notebook revealed it was angiostatin.

  It wasn't Folkman's triumph that Bahcall kept coming back to, however. It was his struggle. Folkman's great insight at the Naval Medical Center occurred in 1960. O'Reilly's breakthrough experiment occurred in 1994. In the intervening years, Folkman's work was dismissed and attacked, and confronted with every obstacle.

  At times, Bahcall tried to convince himself that elesclomol's path might be different. Synta had those exciting Phase 2 results, and the endorsement of the Glaxo deal. "For the results not to be real, you'd have to believe that it was just a statistical fluke that the patients who got drugs are getting better," Bahcall said, in one of those dining-room-table moments. "You'd have to believe that the fact that there were more responses in the treatment group was also a statistical fluke, along with the fact that we've seen these signs of activity in Phase 1, and the fact that the underlying biology strongly says that we have an extremely active anti-cancer agent."

  But then he would remember Folkman. Angiostatin and a companion agent also identified by Folkman's laboratory, endostatin, were licensed by a biotech company called EntreMed. And EntreMed never made a dime off either drug. The two drugs failed to show any clinical effects in both Phase 1 and Phase 2. Avastin was a completely different anti-angiogenesis agent, discovered and developed by another team entirely, and brought to market a decade after O'Reilly's experiment. What's more, Avastin's colorectal-cancer trial--the one that received a standing ovation at ASCO--was the drug's second go-around. A previous Phase 3 trial, for breast cancer, had been a crushing failure. Even Folkman's beautifully elaborated theory about angiogenesis may not fully explain the way Avastin works. In addition to cutting off the flow of blood vessels to the tumor, Avastin seems also to work by repairing some of the blood vessels feeding the tumor, so that the drugs administered in combination with Avastin can get to the tumor more efficiently.

  Bahcall followed the fortunes of other biotech companies the way a teen-age boy follows baseball statistics, and he knew that nothing ever went smoothly. He could list, one by one, all the breakthrough drugs that had failed their first Phase 3 or had failed multiple Phase 2s, or that turned out not to work the way they were supposed to work. In the world of serendipity and of trial and error, failure was a condition of discovery, because, when something was new and worked in ways that no one quite understood, every bit of knowledge had to be learned, one experiment at a time. You ended up with VAMP, which worked, but only after you compared daily 6-MP and daily methotrexate with daily 6-MP and methotrexate every four days, and so on, through a great many iterations, none of which worked very well at all. You had results that looked "boinking good," but only after a trial with a hundred compromises.

  Chen had the same combination of realism and idealism that Bahcall did. He was the in-house skeptic at Synta. He was the one who worried the most about the hand shaking of the drugs in the SYMMETRY trial. He had never been comfortable with the big push behind melanoma. "Everyone at Dana-Farber"--the cancer hospital at Harvard--"told me, 'Don't touch melanoma,' " Chen said. " 'It is so hard. Maybe you save it as the last, after you have already treated and tried everything else.' " The scientists at Synta were getting better and better at understanding just what it was that elesclomol did when it confronted a cancer cell. But he knew that there was always a gap between what could be learned in the laboratory and what happened in the clinic. "We just don't know what happens in vivo," he said. "That's why drug development is still so hard and so expensive, because the human body is such a black box. We are totally shooting in the dark." He shrugged. "You have to have good science, sure. But once you shoot the drug in humans you go home and pray."

  Chen was sitting in the room at Synta where Eric Jacobson had revealed the "boinking good" news about elesclomol's Phase 2 melanoma study. Down the hall was a huge walk-in freezer, filled with thousands of chemicals from the Russian haul. In another room was the Rube Goldberg drug-screening machine, bought with Milken's money. Chen began to talk about elesclomol's earliest days, when he was still scavenging through the libraries of chemical companies for leads and Bahcall was still an ex-physicist looking to start a biotech company. "I could not convince anyone that elesclomol had potential," Chen went on. "Everyone around me tried to stop it, including my research partner, who is a Nobel laureate. He just hated it." At one point, Chen was working with Fujifilm. The people there hated elesclomol. He worked for a while for the Japanese chemical company Shionogi. The Japanese hated it. "But you know who I found who believed in it?" Chen's eyes lit up: "Safi!"

  6.

  Last year, on February 25th, Bahcall and Chen were at a Synta board meeting in midtown Manhattan. It was five-thirty in the afternoon. As the meeting was breaking up, Bahcall got a call on his cell phone. "I have to take this," he said to Chen. He ducked into a nearby conference room, and Chen waited for him, with the company's chairman, Keith Gollust. Fifteen minutes passed, then twenty. "I tell Keith it must be the data-monitoring committee," Chen recalls. "He says, 'No way. Too soon. How could the D.M.C. have any news just yet?' I said, 'It has to be.' So he stays with me and we wait. Another twenty minutes. Finally Safi comes out, and I looked at him and I knew. He didn't have to say anything. It was the color of his face."

  The call had been from Eric Jacobson. He had just come back from Florida, where he had met with the D.M.C. on the SYMMETRY trial. The results of the trial had been unblinded. Jacobson had spent the last several days going over the data, trying to answer every question and double-check every conclusion. "I have some really bad news," he told Bahcall. The trial would have to be halted: more people were dying in the treatment arm than in the control arm. "It took me about a half hour to come out of primary shock," Bahcall said. "I didn't go home. I just grabbed my bag, got into a cab, went straight to LaGuardia, took the next flight to Logan, drove straight to the office. The chief medical officer, the clinical guys, statistical guys, operational team were all there, and we essentially spent the rest of the night, until about one or two in the morning, reviewing the data." It looked as if patients with high-LDH tumors were the problem: elesclomol seemed to fail them completely. It was heartbreaking. Glaxo, Bahcall knew, was certain to pull out of the deal. There would have to be many layoffs.

  The next day, Bahcall called a meeting of the management team. They met in the Synta conference room. "Eric has some news," Bahcall said. Jacobson stood up and began. But before he got very far he had to stop, because he was overcome with emotion, and soon everyone else in the room was, too.

  On December 7, 2009, Synta released the following statement:

Synta Pharmaceuticals Corp. (NASDAQ: SNTA), a biopharmaceutical company focused on discovering, developing, and commercializing small molecule drugs to treat severe medical conditions, today announced the results of a study evaluating the activity of elesclomol against acute myeloid leukemia (AML) cell lines and primary leukemic blast cells from AML patients, presented at the Annual Meeting of the American Society of Hematology (ASH) in New Orleans. . . ."The experiments conducted at the University of Toronto showed elesclomol was highly active against AML cell lines and primary blast cells from AML patients at concentrations substantially lower than those already achieved in cancer patients in clinical trials," said Vojo Vukovic, M.D., Ph.D., Senior Vice President and Chief Medical Officer, Synta. "Of particular interest were the ex vivo studies of primary AML blast cells from patients recently treated at Toronto, where all 10 samples of leukemic cells responded to exposure to elesclomol. These results provide a strong rationale for further exploring the potential of elesclomol in AML, a disease with high medical need and limited options for patients."

  "I will bet anything I have, with anybody, that this will be a drug one day," Chen said. It was January. The early AML results had just come in. Glaxo was a memory. "Now, maybe we are crazy, we are romantic. But this kind of characteristic you have to have if you want to be a drug hunter. You have to be optimistic, you have to have supreme confidence, because the odds are so incredibly against you. I am a scientist. I just hope that I would be so romantic that I become deluded enough to keep hoping."
GO TO TOP MENU

  It was a dazzling feat of wartime espionage. But does it argue for or against spying?

  1.

  On April 30, 1943, a fisherman came across a badly decomposed corpse floating in the water off the coast of Huelva, in southwestern Spain. The body was of an adult male dressed in a trenchcoat, a uniform, and boots, with a black attaché case chained to his waist. His wallet identified him as Major William Martin, of the Royal Marines. The Spanish authorities called in the local British vice-consul, Francis Haselden, and in his presence opened the attaché case, revealing an official-looking military envelope. The Spaniards offered the case and its contents to Haselden. But Haselden declined, requesting that the handover go through formal channels--an odd decision, in retrospect, since, in the days that followed, British authorities in London sent a series of increasingly frantic messages to Spain asking the whereabouts of Major Martin's briefcase.

  It did not take long for word of the downed officer to make its way to German intelligence agents in the region. Spain was a neutral country, but much of its military was pro-German, and the Nazis found an officer in the Spanish general staff who was willing to help. A thin metal rod was inserted into the envelope; the documents were then wound around it and slid out through a gap, without disturbing the envelope's seals. What the officer discovered was astounding. Major Martin was a courier, carrying a personal letter from Lieutenant General Archibald Nye, the vice-chief of the Imperial General Staff, in London, to General Harold Alexander, the senior British officer under Eisenhower in Tunisia. Nye's letter spelled out what Allied intentions were in southern Europe. American and British forces planned to cross the Mediterranean from their positions in North Africa, and launch an attack on German-held Greece and Sardinia. Hitler transferred a Panzer division from France to the Peloponnese, in Greece, and the German military command sent an urgent message to the head of its forces in the region: "The measures to be taken in Sardinia and the Peloponnese have priority over any others."

  The Germans did not realize--until it was too late--that "William Martin" was a fiction. The man they took to be a high-level courier was a mentally ill vagrant who had eaten rat poison; his body had been liberated from a London morgue and dressed up in officer's clothing. The letter was a fake, and the frantic messages between London and Madrid a carefully choreographed act. When a hundred and sixty thousand Allied troops invaded Sicily on July 10, 1943, it became clear that the Germans had fallen victim to one of the most remarkable deceptions in modern military history.

  The story of Major William Martin is the subject of the British journalist Ben Macintyre's brilliant and almost absurdly entertaining "Operation Mincemeat" (Harmony; $25.99). The cast of characters involved in Mincemeat, as the caper was called, was extraordinary, and Macintyre tells their stories with gusto. The ringleader was Ewen Montagu, the son of a wealthy Jewish banker and the brother of Ivor Montagu, a pioneer of table tennis and also, in one of the many strange footnotes to the Mincemeat case, a Soviet spy. Ewen Montagu served on the so-called Twenty Committee of the British intelligence services, and carried a briefcase full of classified documents on his bicycle as he rode to work each morning.

  His partner in the endeavor was a gawky giant named Charles Cholmondeley, who lifted the toes of his size-12 feet when he walked, and, Macintyre writes, "gazed at the world through thick round spectacles, from behind a remarkable moustache fully six inches long and waxed into magnificent points." The two men coördinated with Dudley Clarke, the head of deception for all the Mediterranean, whom Macintyre describes as "unmarried, nocturnal and allergic to children." In 1925, Clarke organized a pageant "depicting imperial artillery down the ages, which involved two elephants, thirty-seven guns and 'fourteen of the biggest Nigerians he could find.' He loved uniforms, disguises and dressing up." In 1941, British authorities had to bail him out of a Spanish jail, dressed in "high heels, lipstick, pearls, and a chic cloche hat, his hands, in long opera gloves, demurely folded in his lap. He was not supposed to even be in Spain, but in Egypt." Macintyre, who has perfect pitch when it comes to matters of British eccentricity, reassures us, "It did his career no long-term damage."

  To fashion the container that would keep the corpse "fresh," before it was dumped off the coast of Spain, Mincemeat's planners turned to Charles Fraser-Smith, whom Ian Fleming is thought to have used as the model for Q in the James Bond novels. Fraser-Smith was the inventor of, among other things, garlic-flavored chocolate intended to render authentic the breath of agents dropping into France and "a compass hidden in a button that unscrewed clockwise, based on the impeccable theory that the 'unswerving logic of the German mind' would never guess that something might unscrew the wrong way." The job of transporting the container to the submarine that would take it to Spain was entrusted to one of England's leading race-car drivers, St. John (Jock) Horsfall, who, Macintyre notes, "was short-sighted and astigmatic but declined to wear spectacles." At one point during the journey, Horsfall nearly drove into a tram stop, and then "failed to see a roundabout until too late and shot over the grass circle in the middle."

  Each stage of the deception had to be worked out in advance. Martin's personal effects needed to be detailed enough to suggest that he was a real person, but not so detailed as to suggest that someone was trying to make him look like a real person. Cholmondeley and Montagu filled Martin's pockets with odds and ends, including angry letters from creditors and a bill from his tailor. "Hour after hour, in the Admiralty basement, they discussed and refined this imaginary person, his likes and dislikes, his habits and hobbies, his talents and weaknesses," Macintyre writes. "In the evening, they repaired to the Gargoyle Club, a glamorous Soho dive of which Montagu was a member, to continue the odd process of creating a man from scratch." Francis Haselden, for his part, had to look as if he desperately wanted the briefcase back. But he couldn't be too diligent, because he had to make sure that the Germans had a look at it first. "Here lay an additional, but crucial, consideration," Macintyre goes on. "The Germans must be made to believe that they had gained access to the documents undetected; they should be made to assume that the British believed the Spaniards had returned the documents unopened and unread. Operation Mincemeat would only work if the Germans could be fooled into believing that the British had been fooled." It was an impossibly complex scheme, dependent on all manner of unknowns and contingencies. What if whoever found the body didn't notify the authorities? What if the authorities disposed of the matter so efficiently that the Germans never caught wind of it? What if the Germans saw through the ruse?

  In mid-May of 1943, when Winston Churchill was in Washington, D.C., for the Trident conference, he received a telegram from the code breakers back home, who had been monitoring German military transmissions: "MINCEMEAT SWALLOWED ROD, LINE AND SINKER." Macintyre's "Operation Mincemeat" is part of a long line of books celebrating the cleverness of Britain's spies during the Second World War. It is equally instructive, though, to think about Mincemeat from the perspective of the spies who found the documents and forwarded them to their superiors. The things that spies do can help win battles that might otherwise have been lost. But they can also help lose battles that might otherwise have been won.

  2.

  In early 1943, long before Major Martin's body washed up onshore, the German military had begun to think hard about Allied intentions in southern Europe. The Allies had won control of North Africa from the Germans, and were clearly intending to cross the Mediterranean. But where would they attack? One school of thought said Sardinia. It was lightly defended and difficult to reinforce. The Allies could mount an invasion of the island relatively quickly. It would be ideal for bombing operations against southern Germany, and Italy's industrial hub in the Po Valley, but it didn't have sufficient harbors or beaches to allow for a large number of ground troops to land. Sicily did. It was also close enough to North Africa to be within striking distance of Allied short-range fighter planes, and a successful invasion of Sicily had the potential to knock the Italians out of the war.

  Mussolini was in the Sicily camp, as was Field Marshal Kesselring, who headed up all German forces in the Mediterranean. In the Italian Commando Supremo, most people picked Sardinia, however, as did a number of senior officers in the German Navy and Air Force. Meanwhile, Hitler and the Oberkommando der Wehrmacht--the German armed-forces High Command--had a third candidate. They thought that the Allies were most likely to strike at Greece and the Balkans, given the Balkans' crucial role in supplying the German war effort with raw materials such as oil, bauxite, and copper. And Greece was far more vulnerable to attack than Italy. As the historians Samuel Mitcham and Friedrich von Stauffenberg have pointed out, "in Greece all Axis reinforcements and supplies would have to be shipped over a single rail line of limited capacity, running for 1,300 kilometers (more than 800 miles) through an area vulnerable to air and partisan attack."

  All these assessments were strategic inferences from an analysis of known facts. But this kind of analysis couldn't point to a specific target. It could only provide a range of probabilities. The intelligence provided by Major Martin's documents was in a different category. It was marvellously specific. It said: Greece and Sardinia. But because that information washed up onshore, as opposed to being derived from the rational analysis of known facts, it was difficult to know whether it was true. As the political scientist Richard Betts has argued, in intelligence analysis there tends to be an inverse relationship between accuracy and significance, and this is the dilemma posed by the Mincemeat case.

  As Macintyre observes, the informational supply chain that carried the Mincemeat documents from Huelva to Berlin was heavily corrupted. The first great enthusiast for the Mincemeat find was the head of German intelligence in Madrid, Major Karl-Erich Kühlenthal. He personally flew the documents to Berlin, along with a report testifying to their significance. But, as Macintyre writes, Kühlenthal was "a one-man espionage disaster area." One of his prized assets was a Spaniard named Juan Pujol García, who was actually a double agent. When British code breakers looked at Kühlenthal's messages to Berlin, they found that he routinely embellished and fictionalized his reports. According to Macintyre, Kühlenthal was "frantically eager to please, ready to pass on anything that might consolidate his position," in part because he had some Jewish ancestry and was desperate not to be posted back to Germany.

  When the documents arrived in Berlin, they were handed over to one of Hitler's top intelligence analysts, a man named Alexis Baron von Roenne. Von Roenne vouched for their veracity as well. But in some respects von Roenne was even less reliable than Kühlenthal. He hated Hitler and seemed to have done everything in his power to sabotage the Nazi war effort. Before D Day, Macintyre writes, "he faithfully passed on every deception ruse fed to him, accepted the existence of every bogus unit regardless of evidence, and inflated forty-four divisions in Britain to an astonishing eighty-nine." It is entirely possible, Macintyre suggests, that von Roenne "did not believe the Mincemeat deception for an instant."

  These are two fine examples of why the proprietary kind of information that spies purvey is so much riskier than the products of rational analysis. Rational inferences can be debated openly and widely. Secrets belong to a small assortment of individuals, and inevitably become hostage to private agendas. Kühlenthal was an advocate of the documents because he needed them to be true; von Roenne was an advocate of the documents because he suspected them to be false. In neither case did the audiences for their assessments have an inkling about their private motivations. As Harold Wilensky wrote in his classic work "Organizational Intelligence" (1967), "The more secrecy, the smaller the intelligent audience, the less systematic the distribution and indexing of research, the greater the anonymity of authorship, and the more intolerant the attitude toward deviant views." Wilensky had the Bay of Pigs debacle in mind when he wrote that. But it could just as easily have applied to any number of instances since, including the private channels of "intelligence" used by members of the Bush Administration to convince themselves that Saddam Hussein had weapons of mass destruction.

  It was the requirement of secrecy that also prevented the Germans from properly investigating the Mincemeat story. They had to make it look as if they had no knowledge of Martin's documents. So their hands were tied. The dated papers in Martin's pockets indicated that he had been in the water for barely five days. Had the Germans seen the body, though, they would have realized that it was far too decomposed to have been in the water for less than a week. And, had they talked to the Spanish coroner who examined Martin, they would have discovered that he had noticed various red flags. The doctor had seen the bodies of many drowned fishermen in his time, and invariably there were fish and crab bites on the ears and other appendages. In this case, there were none. Hair, after being submerged for a week, becomes brittle and dull. Martin's hair was not. Nor did his clothes appear to have been in the water very long. But the Germans couldn't talk to the coroner without blowing their cover. Secrecy stood in the way of accuracy.

  3.

  Suppose that Kühlenthal had not been so eager to please Berlin, and that von Roenne had not loathed Hitler, and suppose that the Germans had properly debriefed the coroner and uncovered all the holes in the Mincemeat story. Would they then have seen through the British deception? Maybe so. Or maybe they would have found the flaws in Mincemeat a little too obvious, and concluded that the British were trying to deceive Germany into thinking that they were trying to deceive Germany into thinking that Greece and Sardinia were the real targets--in order to mask the fact that Greece and Sardinia were the real targets.

  This is the second, and more serious, of the problems that surround the products of espionage. It is not just that secrets themselves are hard to fact-check; it's that their interpretation is inherently ambiguous. Any party to an intelligence transaction is trapped in what the sociologist Erving Goffman called an "expression game." I'm trying to fool you. You realize that I'm trying to fool you, and I--realizing that--try to fool you into thinking that I don't realize that you have realized that I am trying to fool you. Goffman argues that at each turn in the game the parties seek out more and more specific and reliable cues to the other's intentions. But that search for specificity and reliability only makes the problem worse. As Goffman writes in his 1969 book "Strategic Interaction":

The more the observer relies on seeking out foolproof cues, the more vulnerable he should appreciate he has become to the exploitation of his efforts. For, after all, the most reliance-inspiring conduct on the subject's part is exactly the conduct that it would be most advantageous for him to fake if he wanted to hoodwink the observer. The very fact that the observer finds himself looking to a particular bit of evidence as an incorruptible check on what is or might be corrupted is the very reason why he should be suspicious of this evidence; for the best evidence for him is also the best evidence for the subject to tamper with.

  Macintyre argues that one of the reasons the Germans fell so hard for the Mincemeat ruse is that they really had to struggle to gain access to the documents. They tried--and failed--to find a Spanish accomplice when the briefcase was still in Huelva. A week passed, and the Germans grew more and more anxious. The briefcase was transferred to the Spanish Admiralty, in Madrid, where the Germans redoubled their efforts. Their assumption, Macintyre says, was that if Martin was a plant the British would have made their task much easier. But Goffman's argument reminds us that the opposite is equally plausible. Knowing that a struggle would be a sign of authenticity, the Germans could just as easily have expected the British to provide one.

  The absurdity of such expression games has been wittily explored in the spy novels of Robert Littell and, with particular brio, in Peter Ustinov's 1956 play, "Romanoff and Juliet." In the latter, a crafty general is the head of a tiny European country being squabbled over by the United States and the Soviet Union, and is determined to play one off against the other. He tells the U.S. Ambassador that the Soviets have broken the Americans' secret code. "We know they know our code," the Ambassador, Moulsworth, replies, beaming. "We only give them things we want them to know." The general pauses, during which, the play's stage directions say, "he tries to make head or tail of this intelligence." Then he crosses the street to the Russian Embassy, where he tells the Soviet Ambassador, Romanoff, "They know you know their code." Romanoff is unfazed: "We have known for some time that they knew we knew their code. We have acted accordingly--by pretending to be duped." The general returns to the American Embassy and confronts Moulsworth: "They know you know they know you know." Moulsworth (genuinely alarmed): "What? Are you sure?"

  The genius of that parody is the final line, because spymasters have always prided themselves on knowing where they are on the "I-know-they-know-I-know-they-know" regress. Just before the Allied invasion of Sicily, a British officer, Colonel Knox, left a classified cable concerning the invasion plans on the terrace of Shepheard's Hotel, in Cairo--and no one could find it for two days. "Dudley Clarke was confident, however, that if it had fallen into enemy hands through such an obvious and 'gross breach of security' then it would probably be dismissed as a plant, pointing to Sicily as the cover target in accordance with Mincemeat," Macintyre writes. "He concluded that 'Colonel Knox may well have assisted rather than hindered us.' " In the face of a serious security breach, that's what a counter-intelligence officer would say. But, of course, there is no way for him to know how the Germans would choose to interpret that discovery--and no way for the Germans to know how to interpret that discovery, either.

  At one point, the British discovered that a French officer in Algiers was spying for the Germans. They "turned" him, keeping him in place but feeding him a steady diet of false and misleading information. Then, before D Day--when the Allies were desperate to convince Germany that they would be invading the Calais sector in July--they used the French officer to tell the Germans that the real invasion would be in Normandy on June 5th, 6th, or 7th. The British theory was that using someone the Germans strongly suspected was a double agent to tell the truth was preferable to using someone the Germans didn't realize was a double agent to tell a lie. Or perhaps there wasn't any theory at all. Perhaps the spy game has such an inherent opacity that it doesn't really matter what you tell your enemy so long as your enemy is aware that you are trying to tell him something.

  At around the time that Montagu and Cholmondeley were cooking up Operation Mincemeat, the personal valet of the British Ambassador to Turkey approached the German Embassy in Ankara with what he said were photographed copies of his boss's confidential papers. The valet's name was Elyesa Bazna. The Germans called him Cicero, and in this case they performed due diligence. Intelligence that came in over the transom was always considered less trustworthy than the intelligence gathered formally, so Berlin pressed its agents in Ankara for more details. Who was Bazna? What was his background? What was his motivation?

  "Given the extraordinary ease with which seemingly valuable documents were being obtained, however, there was widespread worry that the enemy had mounted some purposeful deception," Richard Wires writes, in "The Cicero Spy Affair: German Access to British Secrets in World War II" (1999). Bazna was, for instance, highly adept with a camera, in a way that suggested professional training or some kind of assistance. Bazna claimed that he didn't use a tripod but simply held each paper under a light with one hand and took the picture with the other. So why were the photographs so clear? Berlin sent a photography expert to investigate. The Germans tried to figure out how much English he knew--which would reveal whether he could read the documents he was photographing or was just being fed them. In the end, many German intelligence officials thought that Cicero was the real thing. But Joachim von Ribbentrop, the Foreign Minister, remained wary--and his doubts and political infighting among the German intelligence agencies meant that little of the intelligence provided by Cicero was ever acted upon.

  Cicero, it turned out, was the real thing. At least, we think he was the real thing. The Americans had a spy in the German Embassy in Turkey who learned that a servant was spying in the British Embassy. She told her bosses, who told the British. Just before his death, Stewart Menzies, the head of the British Secret Intelligence Service during the war, told an interviewer, "Of course, Cicero was under our control," meaning that the minute they learned about Cicero they began feeding him false documents. Menzies, it should be pointed out, was a man who spent much of his professional career deceiving other people, and if you had been the wartime head of M.I.6, giving an interview shortly before your death, you probably would say that Cicero was one of yours. Or perhaps, in interviews given shortly before death, people are finally free to tell the truth. Who knows?

  In the case of Operation Mincemeat, Germany's spies told their superiors that something false was actually true (even though, secretly, some of those spies might have known better), and Germany acted on it. In the case of Cicero, Germany's spies told their superiors that something was true that may indeed have been true, though maybe wasn't, or maybe was true for a while and not true for a while, depending on whether you believe the word of someone two decades after the war was over--and in this case Germany didn't really act on it at all. Looking at that track record, you have to wonder if Germany would have been better off not having any spies at all.

  4.

  The idea for Operation Mincemeat, Macintyre tells us, had its roots in a mystery story written by Basil Thomson, a former head of Scotland Yard's criminal-investigation unit. Thomson was the author of a dozen detective stories, and his 1937 book "The Milliner's Hat Mystery" begins with the body of a dead man carrying a set of documents that turn out to be forged. "The Milliner's Hat Mystery" was read by Ian Fleming, who worked for naval intelligence. Fleming helped create something called the Trout Memo, which contained a series of proposals for deceiving the Germans, including this idea of a dead man carrying forged documents. The memo was passed on to John Masterman, the head of the Twenty Committee--of which Montagu and Cholmondeley were members. Masterman, who also wrote mysteries on the side, starring an Oxford don and a Sherlock Holmes-like figure, loved the idea. Mincemeat, Macintyre writes, "began as fiction, a plot twist in a long-forgotten novel, picked up by another novelist, and approved by a committee presided over by yet another novelist."

  Then, there was the British naval attaché in Madrid, Alan Hillgarth, who stage-managed Mincemeat's reception in Spain. He was a "spy, former gold prospector, and, perhaps inevitably, successful novelist," Macintyre writes. "In his six novels, Alan Hillgarth hankered for a lost age of personal valor, chivalry, and self-reliance." Unaccountably, neither Montagu nor Cholmondeley seems to have written mysteries of his own. But, then again, they had Mincemeat. "As if constructing a character in a novel, Montagu and Cholmondeley . . . set about creating a personality with which to clothe their dead body," Macintyre observes. Martin didn't have to have a fiancée. But, in a good spy thriller, the hero always has a beautiful lover. So they found a stunning young woman, Jean Leslie, to serve as Martin's betrothed, and Montagu flirted with her shamelessly, as if standing in for his fictional creation. They put love letters from her among his personal effects. "Don't please let them send you off into the blue the horrible way they do nowadays," she wrote to her fiancé. "Now that we've found each other out of the whole world, I don't think I could bear it."

  The British spymasters saw themselves as the authors of a mystery story, because it gave them the self-affirming sense that they were in full command of the narratives they were creating. They were not, of course. They were simply lucky that von Roenne and Kühlenthal had private agendas aligned with the Allied cause. The intelligence historian Ralph Bennett writes that one of the central principles of Dudley Clarke (he of the cross-dressing, the elephants, and the fourteen Nigerian giants) was that "deception could only be successful to the extent to which it played on existing hopes and fears." That's why the British chose to convince Hitler that the Allied focus was on Greece and the Balkans--Hitler, they knew, believed that the Allied focus was on Greece and the Balkans. But we are, at this point, reduced to a logical merry-go-round: Mincemeat fed Hitler what he already believed, and was judged by its authors to be a success because Hitler continued to believe what he already believed. How do we know the Germans wouldn't have moved that Panzer division to the Peloponnese anyway? Bennett is more honest: "Even had there been no deception, [the Germans] would have taken precautions in the Balkans." Bennett also points out that what the Germans truly feared, in the summer of 1943, was that the Italians would drop out of the Axis alliance. Soldiers washing up on beaches were of little account next to the broader strategic considerations of the southern Mediterranean. Mincemeat or no Mincemeat, Bennett writes, the Germans "would probably have refused to commit more troops to Sicily in support of the Italian Sixth Army lest they be lost in the aftermath of an Italian defection." Perhaps the real genius of spymasters is found not in the stories they tell their enemies during the war but in the stories they tell in their memoirs once the war is over.

  5.

  It is helpful to compare the British spymasters' attitudes toward deception with that of their postwar American counterpart James Jesus Angleton. Angleton was in London during the nineteen-forties, apprenticing with the same group that masterminded gambits such as Mincemeat. He then returned to Washington and rose to head the C.I.A.'s counter-intelligence division throughout the Cold War.

  Angleton did not write detective stories. His nickname was the Poet. He corresponded with the likes of Ezra Pound, E. E. Cummings, T. S. Eliot, Archibald MacLeish, and William Carlos Williams, and he championed William Empson's "Seven Types of Ambiguity." He co-founded a literary journal at Yale called Furioso. What he brought to spycraft was the intellectual model of the New Criticism, which, as one contributor to Furioso put it, was propelled by "the discovery that it is possible and proper for a poet to mean two differing or even opposing things at the same time." Angleton saw twists and turns where others saw only straight lines. To him, the spy game was not a story that marched to a predetermined conclusion. It was, in a phrase of Eliot's that he loved to use, "a wilderness of mirrors."

  Angleton had a point. The deceptions of the intelligence world are not conventional mystery narratives that unfold at the discretion of the narrator. They are poems, capable of multiple interpretations. Kühlenthal and von Roenne, Mincemeat's audience, contributed as much to the plan's success as Mincemeat's authors. A body that washes up onshore is either the real thing or a plant. The story told by the ambassador's valet is either true or too good to be true. Mincemeat seems extraordinary proof of the cleverness of the British Secret Intelligence Service, until you remember that just a few years later the Secret Intelligence Service was staggered by the discovery that one of its most senior officials, Kim Philby, had been a Soviet spy for years. The deceivers ended up as the deceived.

  But, if you cannot know what is true and what is not, how on earth do you run a spy agency? In the nineteen-sixties, Angleton turned the C.I.A. upside down in search of K.G.B. moles that he was sure were there. As a result of his mole hunt, the agency was paralyzed at the height of the Cold War. American intelligence officers who were entirely innocent were subjected to unfair accusations and scrutiny. By the end, Angleton himself came under suspicion of being a Soviet mole, on the ground that the damage he inflicted on the C.I.A. in the pursuit of his imagined Soviet moles was the sort of damage that a real mole would have sought to inflict on the C.I.A. in the pursuit of Soviet interests.

  "The remedy he had proposed in 1954 was for the CIA to have what would amount to two separate mind-sets," Edward Jay Epstein writes of Angleton, in his 1989 book "Deception." "His counterintelligence staff would provide the alternative view of the picture. Whereas the Soviet division might see a Soviet diplomat as a possible CIA mole, the counterintelligence staff would view him as a possible disinformation agent. What division case officers would tend to look at as valid information, furnished by Soviet sources who risked their lives to cooperate with them, counterintelligence officers tended to question as disinformation, provided by KGB-controlled sources. This was, as Angleton put it, 'a necessary duality.'"

  Translation: the proper function of spies is to remind those who rely on spies that the kinds of thing found out by spies can't be trusted. If this sounds like a lot of trouble, there's a simpler alternative. The next time a briefcase washes up onshore, don't open it.

  How much people drink may matter less than how they drink it.

  1.

  In 1956, Dwight Heath, a graduate student in anthropology at Yale University, was preparing to do field work for his dissertation. He was interested in land reform and social change, and his first choice as a study site was Tibet. But six months before he was to go there he got a letter from the Chinese government rejecting his request for a visa. "I had to find a place where you can master the literature in four months, and that was accessible," Heath says now. "It was a hustle." Bolivia was the next best choice. He and his wife, Anna Cooper Heath, flew to Lima with their baby boy, and then waited for five hours while mechanics put boosters on the plane's engines. "These were planes that the U.S. had dumped after World War II," Heath recalls. "They weren't supposed to go above ten thousand feet. But La Paz, where we were headed, was at twelve thousand feet." As they flew into the Andes, Cooper Heath says, they looked down and saw the remnants of "all the planes where the boosters didn't work."

  From La Paz, they travelled five hundred miles into the interior of eastern Bolivia, to a small frontier town called Montero. It was the part of Bolivia where the Amazon Basin meets the Chaco--vast stretches of jungle and lush prairie. The area was inhabited by the Camba, a mestizo people descended from the indigenous Indian populations and Spanish settlers. The Camba spoke a language that was a mixture of the local Indian languages and seventeenth-century Andalusian Spanish. "It was an empty spot on the map," Heath says. "There was a railroad coming. There was a highway coming. There was a national government . . . coming."

  They lived in a tiny house just outside of town. "There was no pavement, no sidewalks," Cooper Heath recalls. "If there was meat in town, they'd throw out the hide in front, so you'd know where it was, and you would bring banana leaves in your hand, so it was your dish. There were adobe houses with stucco and tile roofs, and the town plaza, with three palm trees. You heard the rumble of oxcarts. The padres had a jeep. Some of the women would serve a big pot of rice and some sauce. That was the restaurant. The guy who did the coffee was German. The year we came to Bolivia, a total of eighty-five foreigners came into the country. It wasn't exactly a hot spot."

  In Montero, the Heaths engaged in old-fashioned ethnography--"vacuuming up everything," Dwight says, "learning everything." They convinced the Camba that they weren't missionaries by openly smoking cigarettes. They took thousands of photographs. They walked around the town and talked to whomever they could, and then Dwight went home and spent the night typing up his notes. They had a Coleman lantern, which became a prized social commodity. Heath taught some of the locals how to build a split-rail fence. They sometimes shared a beer in the evenings with a Bolivian Air Force officer who had been exiled to Montero from La Paz. "He kept on saying, 'Watch me, I will be somebody,' " Dwight says. (His name was René Barrientos; eight years later he became the President of Bolivia, and the Heaths were invited to his inauguration.) After a year and a half, the Heaths packed up their photographs and notes and returned to New Haven. There Dwight Heath sat down to write his dissertation--only to discover that he had nearly missed what was perhaps the most fascinating fact about the community he had been studying.

  Today, the Heaths are in their late seventies. Dwight has neatly combed gray hair and thick tortoiseshell glasses, a reserved New Englander through and through. Anna is more outgoing. They live not far from the Brown University campus, in Providence, in a house filled with hundreds of African statues and sculptures, with books and papers piled high on tables, and they sat, in facing armchairs, and told the story of what happened half a century ago, finishing each other's sentences.

  "It was August or September of 1957," Heath said. "We had just gotten back. She's tanned. I'm tanned. I mean, really tanned, which you didn't see a lot of in New Haven in those days."

  "I'm an architecture nut," Anna said. "And I said I wanted to see the inside of this building near the campus. It was always closed. But Dwight says, 'You never know,' so he walked over and pulls on the door and it opens." Anna looked over at her husband.

  "So we go in," Dwight went on, "and there was a couple of little white-haired guys there. And they said, 'You're tanned. Where have you been?' And I said Bolivia. And one of them said, 'Well, can you tell me how they drink?' " The building was Yale's Center of Alcohol Studies. One of the white-haired men was E. M. Jellinek, perhaps the world's leading expert on alcoholism at the time; the other was Mark Keller, the editor of the well-regarded Quarterly Journal of Studies on Alcohol. Keller stood up and grabbed Heath by the lapels: "I don't know anyone who has ever been to Bolivia. Tell me about it!" He invited Heath to write up his alcohol-related observations for his journal.

  After the Heaths went home that day, Anna said to Dwight, "Do you realize that every weekend we were in Bolivia we went out drinking?" The code he used for alcohol in his notebooks was 30A, and when he went over his notes he found 30A references everywhere. Still, nothing about the alcohol question struck him as particularly noteworthy. People drank every weekend in New Haven, too. His focus was on land reform. But who was he to say no to the Quarterly Journal of Studies on Alcohol? So he sat down and wrote up what he knew. Only after his article, "Drinking Patterns of the Bolivian Camba," was published, in September of 1958, and the queries and reprint requests began flooding in from around the world, did he realize what he had found. "This is so often true in anthropology," Anna said. "It is not anthropologists who recognize the value of what they've done. It's everyone else. The anthropologist is just reporting."

  2.

  The abuse of alcohol has, historically, been thought of as a moral failing. Muslims and Mormons and many kinds of fundamentalist Christians do not drink, because they consider alcohol an invitation to weakness and sin. Around the middle of the last century, alcoholism began to be widely considered a disease: it was recognized that some proportion of the population was genetically susceptible to the effects of drinking. Policymakers, meanwhile, have become increasingly interested in using economic and legal tools to control alcohol-related behavior: that's why the drinking age has been raised from eighteen to twenty-one, why drunk-driving laws have been toughened, and why alcohol is taxed heavily. Today, our approach to the social burden of alcohol is best described as a mixture of all three: we moralize, medicalize, and legalize.

  In the nineteen-fifties, however, the researchers at the Yale Center of Alcohol Studies found something lacking in this emerging approach, and the reason had to do with what they observed right in their own town. New Haven was a city of immigrants--Jewish, Irish, and, most of all, Italian. Recent Italian immigrants made up about a third of the population, and whenever the Yale researchers went into the Italian neighborhoods they found an astonishing thirst for alcohol. The overwhelming majority of Italian-American men in New Haven drank. A group led by the director of the Yale alcohol-treatment clinic, Giorgio Lolli, once interviewed a sixty-one-year-old father of four who consumed more than three thousand calories a day of food and beverages--of which a third was wine. "He usually has an 8-oz. glass of wine immediately following his breakfast every morning," Lolli and his colleagues wrote. "He always takes wine with his noonday lunch--as much as 24 oz." But he didn't display the pathologies that typically accompany that kind of alcohol consumption. The man was successfully employed, and had been drunk only twice in his life. He was, Lolli concluded, "a healthy, happy individual who has made a satisfactory adjustment to life."

  By the late fifties, Lolli's clinic had admitted twelve hundred alcoholics. Plenty of them were Irish. But just forty were Italians (all of whom were second- or third-generation immigrants). New Haven was a natural experiment. Here were two groups who practiced the same religion, who were subject to the same laws and constraints, and who, it seemed reasonable to suppose, should have the same assortment within their community of those genetically predisposed to alcoholism. Yet the heavy-drinking Italians had nothing like the problems that afflicted their Irish counterparts.

  "That drinking must precede alcoholism is obvious," Mark Keller once wrote. "Equally obvious, but not always sufficiently considered, is the fact that drinking is not necessarily followed by alcoholism." This was the puzzle of New Haven, and why Keller demanded of Dwight Heath, that day on the Yale campus, Tell me how the Camba drink. The crucial ingredient, in Keller's eyes, had to be cultural.

  The Heaths had been invited to a party soon after arriving in Montero, and every weekend and holiday thereafter. It was their Coleman lantern. "Whatever the occasion, it didn't matter," Anna recalled. "As long as the party was at night, we were first on the list."

  The parties would have been more aptly described as drinking parties. The host would buy the first bottle and issue the invitations. A dozen or so people would show up on Saturday night, and the party would proceed--often until everyone went back to work on Monday morning. The composition of the group was informal: sometimes people passing by would be invited. But the structure of the party was heavily ritualized. The group would sit in a circle. Someone might play the drums or a guitar. A bottle of rum, from one of the sugar refineries in the area, and a small drinking glass were placed on a table. The host stood, filled the glass with rum, and then walked toward someone in the circle. He stood before the "toastee," nodded, and raised the glass. The toastee smiled and nodded in return. The host then drank half the glass and handed it to the toastee, who would finish it. The toastee eventually stood, refilled the glass, and repeated the ritual with someone else in the circle. When people got too tired or too drunk, they curled up on the ground and passed out, rejoining the party when they awoke. The Camba did not drink alone. They did not drink on work nights. And they drank only within the structure of this elaborate ritual.

  "The alcohol they drank was awful," Anna recalled. "Literally, your eyes poured tears. The first time I had it, I thought, I wonder what will happen if I just vomit in the middle of the floor. Not even the Camba said they liked it. They say it tastes bad. It burns. The next day they are sweating this stuff. You can smell it." But the Heaths gamely persevered. "The anthropology graduate student in the nineteen-fifties felt that he had to adapt," Dwight Heath said. "You don't want to offend anyone, you don't want to decline anything. I gritted my teeth and accepted those drinks."

  "We didn't get drunk that much," Anna went on, "because we didn't get toasted as much as the other folks around. We were strangers. But one night there was this really big party--sixty to eighty people. They'd drink. Then pass out. Then wake up and party for a while. And I found, in their drinking patterns, that I could turn my drink over to Dwight. The husband is obliged to drink for his wife. And Dwight is holding the Coleman lantern with his arm wrapped around it, and I said, 'Dwight, you are burning your arm.' " She mimed her husband peeling his forearm off the hot surface of the lantern. "And he said--very deliberately--'So I am.' "

  When the Heaths came back to New Haven, they had a bottle of the Camba's rum analyzed and learned that it was a hundred and eighty proof. It was laboratory alcohol--the concentration that scientists use to fix tissue. No one had ever heard of anyone drinking it. This was the first of the astonishing findings of the Heaths' research--and, predictably, no one believed it at first.

  "One of the world's leading physiologists of alcohol was at the Yale center," Heath recalled. "His name was Leon Greenberg. He said to me, 'Hey, you spin a good yarn. But you couldn't really have drunk that stuff.' And he needled me just enough that he knew he would get a response. So I said, 'You want me to drink it? I have a bottle.' So one Saturday I drank some under controlled conditions. He was taking blood samples every twenty minutes, and, sure enough, I did drink it, the way I said I'd drunk it."

  Greenberg had an ambulance ready to take Heath home. But Heath decided to walk. Anna was waiting up for him in the third-floor walkup they rented, in an old fraternity house. "I was hanging out the window waiting for him, and there's the ambulance driving along the street, very slowly, and next to it is Dwight. He waves, and he looks fine. Then he walks up the three flights of stairs and says, 'Ahh, I'm drunk,' and falls flat on his face. He was out for three hours."

  The bigger surprise was what happened when the Camba drank. The Camba had weekly benders with laboratory-proof alcohol, and, Dwight Heath said, "There was no social pathology--none. No arguments, no disputes, no sexual aggression, no verbal aggression. There was pleasant conversation or silence." On the Brown University campus, a few blocks away, beer--which is to Camba rum approximately what a peashooter is to a bazooka--was known to reduce the student population to a raging hormonal frenzy on Friday nights. "The drinking didn't interfere with work," Heath went on. "It didn't bring in the police. And there was no alcoholism, either."

  3.

  What Heath found among the Camba is hard to believe. We regard alcohol's behavioral effects as inevitable. Alcohol disinhibits, we assume, as reliably as caffeine enlivens. It gradually unlocks the set of psychological constraints that keep our behavior in check, and makes us do things that we would not ordinarily do. It's a drug, after all.

  But, after Heath's work on the Camba, anthropologists began to take note of all the puzzling ways in which alcohol wasn't reliable in its effects. In the classic 1969 work "Drunken Comportment," for example, the anthropologists Craig MacAndrew and Robert B. Edgerton describe an encounter that Edgerton had while studying a tribe in central Kenya. One of the tribesmen, he was told, was "very dangerous" and "totally beyond control" after he had been drinking, and one day Edgerton ran across the man:

I heard a commotion, and saw people running past me. One young man stopped and urged me to flee because this dangerous drunk was coming down the path attacking all whom he met. As I was about to take this advice and leave, the drunk burst wildly into the clearing where I was sitting. I stood up, ready to run, but much to my surprise, the man calmed down, and as he walked slowly past me, he greeted me in polite, even deferential terms, before he turned and dashed away. I later learned that in the course of his "drunken rage" that day he had beaten two men, pushed down a small boy, and eviscerated a goat with a large knife.

  The authors include a similar case from Ralph Beals's work among the Mixe Indians of Oaxaca, Mexico:

The Mixe indulge in frequent fist fights, especially while drunk. Although I probably saw several hundred, I saw no weapons used, although nearly all men carried machetes and many carried rifles. Most fights start with a drunken quarrel. When the pitch of voices reaches a certain point, everyone expects a fight. The men hold out their weapons to the onlookers, and then begin to fight with their fists, swinging wildly until one falls down [at which point] the victor helps his opponent to his feet and usually they embrace each other.

  The angry Kenyan tribesman was disinhibited toward his own people but inhibited toward Edgerton. Alcohol turned the Mixe into aggressive street fighters, but they retained the presence of mind to "hold out their weapons to the onlookers." Something that truly disinhibits ought to be indiscriminate in its effects. That's not the picture of alcohol that these anthropologists have given us. (MacAndrew and Edgerton, in one of their book's many wry asides, point out that we are all acquainted with people who can hold their liquor. "In the absence of anything observably untoward in such a one's drunken comportment," they ask, "are we seriously to presume that he is devoid of inhibitions?")

  Psychologists have encountered the same kinds of perplexities when they have set out to investigate the effects of drunkenness. One common belief is that alcohol causes "self-inflation." It makes us see ourselves through rose-tinted glasses. Oddly, though, it doesn't make us view everything about ourselves through rose-tinted glasses. When the psychologists Claude Steele and Mahzarin Banaji gave a group of people a personality questionnaire while they were sober and then again when they were drunk, they found that the only personality aspects that were inflated by drinking were those where there was a gap between real and ideal states. If you are good-looking and the world agrees that you are good-looking, drinking doesn't make you think you're even better-looking. Drinking only makes you feel you're better-looking if you think you're good-looking and the world doesn't agree.

  Alcohol is also commonly believed to reduce anxiety. That's what a disinhibiting agent should do: relax us and make the world go away. Yet this effect also turns out to be selective. Put a stressed-out drinker in front of an exciting football game and he'll forget his troubles. But put him in a quiet bar somewhere, all by himself, and he'll grow more anxious.

  Steele and his colleague Robert Josephs's explanation is that we've misread the effects of alcohol on the brain. Its principal effect is to narrow our emotional and mental field of vision. It causes, they write, "a state of shortsightedness in which superficially understood, immediate aspects of experience have a disproportionate influence on behavior and emotion."

  Alcohol makes the thing in the foreground even more salient and the thing in the background disappear. That's why drinking makes you think you are attractive when the world thinks otherwise: the alcohol removes the little constraining voice from the outside world that normally keeps our self-assessments in check. Drinking relaxes the man watching football because the game is front and center, and alcohol makes every secondary consideration fade away. But in a quiet bar his problems are front and center--and every potentially comforting or mitigating thought recedes. Drunkenness is not disinhibition. Drunkenness is myopia.

  Myopia theory changes how we understand drunkenness. Disinhibition suggests that the drinker is increasingly insensitive to his environment--that he is in the grip of an autonomous physiological process. Myopia theory, on the contrary, says that the drinker is, in some respects, increasingly sensitive to his environment: he is at the mercy of whatever is in front of him.

  A group of Canadian psychologists led by Tara MacDonald recently went into a series of bars and made the patrons read a short vignette. They had to imagine that they had met an attractive person at a bar, walked him or her home, and ended up in bed--only to discover that neither of them had a condom. The subjects were then asked to respond on a scale of one (very unlikely) to nine (very likely) to the proposition: "If I were in this situation, I would have sex." You'd think that the subjects who had been drinking heavily would be more likely to say that they would have sex--and that's exactly what happened. The drunk people came in at 5.36, on average, on the nine-point scale. The sober people came in at 3.91. The drinkers couldn't sort through the long-term consequences of unprotected sex. But then MacDonald went back to the bars and stamped the hands of some of the patrons with the phrase "AIDS kills." Drinkers with the hand stamp were slightly less likely than the sober people to want to have sex in that situation: they couldn't sort through the kinds of rationalizations necessary to set aside the risk of AIDS. Where norms and standards are clear and consistent, the drinker can become more rule-bound than his sober counterpart.

  In other words, the frat boys drinking in a bar on a Friday night don't have to be loud and rowdy. They are responding to the signals sent by their immediate environment--by the pulsing music, by the crush of people, by the dimmed light, by the countless movies and television shows and general cultural expectations that say that young men in a bar with pulsing music on a Friday night have permission to be loud and rowdy. "Persons learn about drunkenness what their societies import to them, and comporting themselves in consonance with these understandings, they become living confirmations of their society's teachings," MacAndrew and Edgerton conclude. "Since societies, like individuals, get the sorts of drunken comportment that they allow, they deserve what they get."

  4.

  This is what connects the examples of Montero and New Haven. On the face of it, the towns are at opposite ends of the spectrum. The Camba got drunk every weekend on laboratory-grade alcohol. The Italians drank wine, in civil amounts, every day. The Italian example is healthy and laudable. The Camba's fiestas were excessive and surely took a long-term physical toll. But both communities understood the importance of rules and structure. Camba society, Dwight Heath says, was marked by a singular lack of "communal expression." They were itinerant farmworkers. Kinship ties were weak. Their daily labor tended to be solitary and the hours long. There were few neighborhood or civic groups. Those weekly drinking parties were not chaotic revels; they were the heart of Camba community life. They had a function, and the elaborate rituals--one bottle at a time, the toasting, the sitting in a circle--served to give the Camba's drinking a clear structure.

  In the late nineteen-forties, Phyllis Williams and Robert Straus, two sociologists at Yale, selected ten first- and second-generation Italian-Americans from New Haven to keep diaries detailing their drinking behavior, and their entries show how well that community understood this lesson as well. Here is one of their subjects, Philomena Sappio, a forty-year-old hairdresser from an island in the Bay of Naples, describing what she drank one week in October of 1948:

Fri.--Today for dinner 4 oz. of wine [noon]. In the evening, I had fish with 8 oz. of wine [6 P.M.].

Sat.--Today I did not feel like drinking at all. Neither beer nor any other alcohol. I drank coffee and water.

Sun.--For dinner I made lasagna at noon, and had 8 oz. of wine. In the evening, I had company and took one glass of liqueur [1 oz. strega] with my company. For supper--I did not have supper because I wasn't hungry.

Mon.--At dinner I drank coffee, at supper 6 oz. of wine [5 P.M.].

Tues.--At dinner, 4 oz. wine [noon]. One of my friends and her husband took me and my daughter out this evening in a restaurant for supper. We had a splendid supper. I drank 1 oz. of vermouth [5:30 P.M.] and 12 oz. of wine [6 P.M.].

Wed.--For dinner, 4 oz. of wine [noon] and for supper 6 oz. of wine [6 P.M.].

Thurs.--At noon, coffee and at supper, 6 oz. of wine [6 P.M.].

Fri.--Today at noon I drank orange juice; at supper in the evening [6 P.M.] 8 oz. of wine.

  Sappio drinks almost every day, unless she isn't feeling well. She almost always drinks wine. She drinks only at mealtimes. She rarely has more than a glass--except on a special occasion, as when she and her daughter are out with friends at a restaurant.

  Here is another of Williams and Straus's subjects--Carmine Trotta, aged sixty, born in a village outside Salerno, married to a girl from his village, father of three, proprietor of a small grocery store, resident of an exclusively Italian neighborhood:

Fri.--I do not generally eat anything for breakfast if I have a heavy supper the night before. I leave out eggnog and only take coffee with whisky because I like to have a little in the morning with coffee or with eggnog or a few crackers.

Mon.--When I drink whisky before going to bed I always put it in a glass of water. . . .

Wed.--Today is my day off from business, so I [drank] some beer because it was very hot. I never drink beer when I am working because I don't like the smell of beer on my breath for my customers.

Thurs.--Every time that I buy a bottle of whisky I always divide same. One half at home and one half in my shop.

  Sappio and Trotta do not drink for the same purpose as the Camba: alcohol has no larger social or emotional reward. It's food, consumed according to the same quotidian rhythms as pasta or cheese. But the content of the rules matters less than the fact of the rule, the existence of a drinking regimen that both encourages and constrains alcohol's use. "I went to visit one of my friends this evening," Sappio writes. "We saw television and she offered me 6 oz. of wine to drink, and it was good [9 P.M.]." She did not say that her friend put the bottle on the table or offered her a second glass. Evidently, she brought out one glass of wine for each of them, and they drank together, because one glass is what you had, in the Italian neighborhoods of New Haven, at 9 P.M. while watching television.

  5.

  Why can't we all drink like the Italians of New Haven? The flood of immigrants who came to the United States in the nineteenth century brought with them a wealth of cultural models, some of which were clearly superior to the patterns of their new host--and, in a perfect world, the rest of us would have adopted the best ways of the newcomers. It hasn't worked out that way, though. Americans did not learn to drink like Italians. On the contrary, when researchers followed up on Italian-Americans, they found that by the third and fourth generations they were, increasingly, drinking like everyone else.

  There is something about the cultural dimension of social problems that eludes us. When confronted with the rowdy youth in the bar, we are happy to raise his drinking age, to tax his beer, to punish him if he drives under the influence, and to push him into treatment if his habit becomes an addiction. But we are reluctant to provide him with a positive and constructive example of how to drink. The consequences of that failure are considerable, because, in the end, culture is a more powerful tool in dealing with drinking than medicine, economics, or the law. For all we know, Philomena Sappio could have had within her genome a grave susceptibility to alcohol. Because she lived in the protective world of New Haven's immigrant Italian community, however, it would never have become a problem. Today, she would be at the mercy of her own inherent weaknesses. Nowhere in the multitude of messages and signals sent by popular culture and social institutions about drinking is there any consensus about what drinking is supposed to mean.

  "Mind if I vent for a while?" a woman asks her husband, in one popular--and depressingly typical--beer ad. He is sitting on the couch. She has just come home from work. He replies, "Mind? I'd prefer it!" And he jumps up, goes to the refrigerator, and retrieves two cans of Coors Light--a brand that comes with a special vent intended to make pouring the beer easier. "Let's vent!" he cries out. She looks at him oddly: "What are you talking about?" "I'm talking about venting!" he replies, as she turns away in disgust. "What are you talking about?" The voice-over intones, "The vented wide-mouthed can from Coors Light. It lets in air for a smooth, refreshing pour." Even the Camba, for all their excesses, would never have been so foolish as to pretend that you could have a conversation about drinking and talk only about the can.


Success and RISK

  How entrepreneurs really succeed.

  1.

  In 1969, Ted Turner wanted to buy a television station. He was thirty years old. He had inherited from his father a billboard business, which was doing well. But he was bored, and television seemed exciting. "He knew absolutely nothing about it," one of Turner's many biographers, Christian Williams, writes in "Lead, Follow or Get Out of the Way" (1981). "It would be fun to risk everything he had built, scare the hell out of everybody, and get back in the front seat of the roller coaster."

  The station in question was WJRJ, Channel 17, in Atlanta. It was an independent station on the UHF band, the lonely part of the television spectrum which viewers needed a special antenna to find. It was housed in a run-down cinder-block building near a funeral home, leading to the joke that it was at death's door. The equipment was falling apart. The staff was incompetent. It had no decent programming to speak of, and it was losing more than half a million dollars a year. Turner's lawyer, Tench Coxe, and his accountant, Irwin Mazo, were firmly opposed to the idea. "We tried to make it clear that--yes--this thing might work, but if it doesn't everything will collapse," Mazo said, years later. "Everything you've got will be gone. . . . It wasn't just us, either. Everybody told him not to do it."

  Turner didn't listen. He was Captain Courageous, the man with nerves of steel who went on to win the America's Cup, take on the networks, marry a movie star, and become a billionaire. He dressed like a cowboy. He gave the impression of signing contracts without looking at them. He was a drinker, a yeller, a man of unstoppable urges and impulses, the embodiment of the entrepreneur as risk-taker. He bought the station, and so began one of the great broadcasting empires of the twentieth century.

  What is sometimes forgotten amid the mythology, however, is that Turner wasn't the proprietor of any old billboard company. He had inherited the largest outdoor-advertising firm in the South, and billboards, in the nineteen-sixties and seventies, were enormously lucrative. They benefitted from favorable tax-depreciation rules, they didn't require much capital investment, and they produced rivers of cash. WJRJ's losses could be used to offset the taxes on the profits of Turner's billboard business. A television station, furthermore, fit very nicely into his existing business. Television was about selling ads, and Turner was very experienced at ad-selling. WJRJ may have been a virtual unknown in the Atlanta market, but Turner had billboards all over the city that were blank about fifteen per cent of the time. He could advertise his new station free. As for programming, Turner had a fix for that, too. In those days, the networks offered their local affiliates a full slate of shows, and whenever an affiliate wanted to broadcast local programming, such as sports or news, the national shows were preëmpted. Turner realized that he could persuade the networks in New York to let him have whatever programming their affiliates weren't running. That's exactly what happened. "When we reached the point of having four preempted NBC shows running in our daytime lineup," Turner writes in his autobiography, "Call Me Ted" (2008), "I had our people put up some billboards saying 'THE NBC NETWORK MOVES TO CHANNEL 17.' "

  Williams writes that Turner was "attracted to the risk" of the deal, but it seems just as plausible to say that he was attracted by the deal's lack of risk. "We don't want to put it all on the line, because the result can't possibly be worth the risk," Mazo recalls warning Turner. Put it all on the line? The purchase price for WJRJ was $2.5 million. Similar properties in that era went for many times that, and Turner paid with a stock swap engineered in such a way that he didn't have to put a penny down. Within two years, the station was breaking even. By 1973, it was making a million dollars in profit.

  In a recent study, "From Predators to Icons," the French scholars Michel Villette and Catherine Vuillermot set out to discover what successful entrepreneurs have in common. They present case histories of businessmen who built their own empires--ranging from Sam Walton, of Wal-Mart, to Bernard Arnault, of the luxury-goods conglomerate L.V.M.H.--and chart what they consider the typical course of a successful entrepreneur's career. There is almost always, they conclude, a moment of great capital accumulation--a particular transaction that catapults him into prominence. The entrepreneur has access to that deal by virtue of occupying a "structural hole," a niche that gives him a unique perspective on a particular market. Villette and Vuillermot go on, "The businessman looks for partners to a transaction who do not have the same definition as he of the value of the goods exchanged, that is, who undervalue what they sell to him or overvalue what they buy from him in comparison to his own evaluation." He moves decisively. He repeats the good deal over and over again, until the opportunity closes, and--most crucially--his focus throughout that sequence is on hedging his bets and minimizing his chances of failure. The truly successful businessman, in Villette and Vuillermot's telling, is anything but a risk-taker. He is a predator, and predators seek to incur the least risk possible while hunting.

  Giovanni Agnelli, the founder of Fiat, financed his young company with the money of investors--who were "subsequently excluded from the company by a maneuver by Agnelli," the authors point out. Bernard Arnault took over the Boussac group at a personal cost of forty million francs, which was a fraction of the "immediate resale value of the assets." The French industrialist Vincent Bolloré "took charge of the failing family company for almost nothing with other people's money." George Eastman, the founder of Kodak, shifted the financial risk of his new enterprise to his family and to his wealthy friend Henry Strong. IKEA's founder, Ingvar Kamprad, arranged to get his furniture made in Communist Poland for half of what it would cost him in Sweden. Marcel Dassault, the French aviation pioneer, did a study for the French Army that pointed out the value of propellers, and then took over a propeller manufacturer. When he started making planes for the military, he made sure he was paid in advance.

  People like Dassault and Eastman and Arnault and Turner are all successful entrepreneurs, businessmen whose insights and decisions have transformed the economy, but their entrepreneurial spirit could not have less in common with that of the daring risk-taker of popular imagination. Would we so revere risk-taking if we realized that the people who are supposedly taking bold risks in the cause of entrepreneurship are actually doing no such thing?

  2.

  The most successful entrepreneur on Wall Street--certainly of the past decade and perhaps even of the postwar era--is a hedge-fund manager named John Paulson. He started a small money-management business in the nineteen-nineties and built it into a juggernaut, and Gregory Zuckerman's recent account of Paulson's triumph, "The Greatest Trade Ever," offers a fascinating perspective on the predator thesis.

  Paulson grew up in middle-class Queens, the child of an immigrant father. His career on Wall Street started relatively slowly. He launched his firm in 1994, when he was nearly forty years old, specializing in merger arbitrage. By 2004, Paulson was managing about two billion dollars of other people's money, putting him in the middle ranks of hedge funds. He was, Zuckerman writes, a "solid investor, careful and decidedly unspectacular." The particular kinds of deal he did were "among the safest forms of investing." One of Paulson's mentors was an investor named Marty Gruss, and, Zuckerman writes, "the ideal Gruss investment had limited risk but held the promise of a potential fortune. Marty Gruss drilled a maxim into Paulson: 'Watch the downside; the upside will take care of itself.' At his firm, he asked his analysts repeatedly, 'How much can we lose on this trade?' " Long after he became wealthy, he would take the bus to his offices in midtown, and the train out to his summer house on Long Island. He was known for getting around the Hamptons on his bicycle.

  By 2004-05, Paulson was increasingly suspicious of the real-estate boom. He decided to short the mortgage market, using a financial tool known as the credit-default swap, or C.D.S. A credit-default swap is like an insurance policy. Wall Street banks combined hundreds of mortgages together in bundles, and investors could buy insurance on any of the bundles they chose. Suppose I put together a bundle of ten mortgages totalling a million dollars. I could sell you a one-year C.D.S. policy on that bundle for, say, a hundred thousand dollars. If after the year was up the ten homeowners holding those mortgages were all making their monthly payments, I'd pocket your hundred thousand. If, however, those homeowners all defaulted, I'd owe you the full value of the bundle--a million dollars. Throughout the boom, countless banks and investment firms sold C.D.S. policies on securities backed by subprime loans, happily pocketing the annual premiums in the belief that there was little chance of ever having to make good on the contract. Paulson, as often as not, was the one on the other side of the trade. He bought C.D.S. contracts by the truckload, and, when he ran out of money, he found new investors, raising billions of new dollars so he could buy even more. By the time the crash came, he was holding insurance on some twenty-five billion dollars' worth of subprime mortgages.
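
  The asymmetry in that arrangement is easier to see laid out as plain arithmetic. What follows is a minimal sketch, in Python, of the hypothetical contract described above--a million-dollar bundle insured for a hundred thousand dollars a year. The figures are the illustrative ones from this example, not the terms of any actual trade:

bundle_value = 1_000_000       # face value of the ten bundled mortgages
annual_premium = 100_000       # what the buyer of the insurance pays per year

def seller_outcome(defaulted):
    # The seller pockets the premium but owes the full bundle if the loans default.
    return annual_premium - (bundle_value if defaulted else 0)

def buyer_outcome(defaulted):
    # The buyer pays the premium and collects the full bundle if the loans default.
    return (bundle_value if defaulted else 0) - annual_premium

print(seller_outcome(False))   #  100000: the happy case the sellers were counting on
print(seller_outcome(True))    # -900000: what the seller is out when the loans fail
print(buyer_outcome(True))     #  900000: the buyer's side of the same trade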

  Was Paulson's trade risky? Conventional wisdom said that it was. This kind of deal is known, in Wall Street parlance, as a "negative-carry" trade, and, as Zuckerman writes, negative-carry trades are a "maneuver that investment pros detest almost as much as high taxes and coach-class seating." Their problem with negative-carry is that if the trade doesn't pay off quickly it can become ruinously expensive. It's one thing if I pay you a hundred thousand dollars for one year's insurance on a million dollars' worth of mortgages, and the mortgages go belly up after six months. But what if I pay premiums for two years, and the bubble still hasn't burst? Then I'm out two hundred thousand dollars, with nothing to show for my efforts. And what if the bubble hasn't burst after three years? Now I have a very nervous group of investors. To win at a negative-carry trade, you have not only to correctly predict the presence of a bubble but also to correctly predict when the bubble is about to burst.

  At one point before the crash, Zuckerman writes, a trader at Morgan Stanley "hung up the phone after yet another Paulson order and turned to a colleague in disbelief. 'This guy is nuts,' he said with a chuckle, amazed that Paulson was agreeing to make so many annual insurance payments. 'He's just going to pay it all out?' " Wall Street thought that Paulson was crazy.

  But Paulson wasn't crazy at all. In 2006, he had his firm undertake a rigorous analysis of the housing market, led by Paulson's associate Paolo Pellegrini. At that point, it was unclear whether rising housing prices represented a bubble or a legitimate phenomenon. Pellegrini concluded that housing prices had risen on average 1.4 per cent annually between 1975 and 2000, once inflation had been accounted for. In the next five years, though, they had risen seven per cent a year--to the point where they would have to fall by forty per cent to be back in line with historical trends. That fact left Paulson certain that he was looking at a bubble.

  Paulson's next concern was with the volatility of the housing market. Was this bubble resilient? Or was everything poised to come crashing down? Zuckerman tells how Pellegrini and another Paulson associate, Sihan Shu, "purchased enormous databases tracking the historic performance of more than six million mortgages in various parts of the country." Thus equipped,

they crunched the numbers, tinkered with logarithms and logistic functions, and ran different scenarios, trying to figure out what would happen if housing prices stopped rising. Their findings seemed surprising: Even if prices just flatlined, homeowners would feel so much financial pressure that it would result in losses of 7 percent of the value of a typical pool of subprime mortgages. And if home prices fell 5 percent, it would lead to losses as high as 17 percent.

  This was a crucial finding. Most people at the time believed that widespread defaults on mortgages were a function of some combination of structural economic factors such as unemployment rates, interest rates, and regional economic health. That's why so many on Wall Street were happy to sell Paulson C.D.S. policies: they thought it would take a perfect storm to bring the market to its knees. But Pellegrini's data showed that the bubble was being inflated by a single, rickety factor--rising home prices. It wouldn't take much for the bubble to burst.

  Paulson then looked at what buying disaster insurance on mortgages would cost. C.D.S. contracts can sometimes be prohibitively expensive. In the months leading up to General Motors' recent bankruptcy, for example, a year's insurance on a million of the carmaker's bonds sold for eight hundred thousand dollars. If Paulson had to pay anything like that amount, there wouldn't be much room for error. To his amazement, though, he found that to insure a million dollars of mortgages would cost him just ten thousand dollars--and this was for some of the most dubious and high-risk subprime mortgages. Paulson didn't even need a general housing-market collapse to make his money. He needed only the most vulnerable of all homeowners to start defaulting. It was a classic asymmetrical trade. If Paulson raised a billion dollars from investors, he could buy a year's worth of insurance on twelve billion dollars of subprime loans for a hundred and twenty million. That's an outlay of twelve per cent up front. But, Zuckerman explains,

because premiums on CDS contracts, like those on any other insurance product, are paid out over time, the new fund could keep most of its money in the bank until the CDS bills came due, and thereby earn about 5 percent a year. That would cut the annual cost to the fund to a more reasonable 7 percent. Since Paulson would charge 1 percent a year as a management fee, the most an investor could lose would be 8 percent a year. . . . And the upside? If Paulson purchased CDS contracts that fully protected $12 billion of subprime mortgage bonds and the bonds somehow became worthless, Paulson & Co. would make a cool $12 billion.

  "There's never been an opportunity like this," Paulson gushed to a colleague, as he made one bet after another. By "never," he meant never ever--not in his lifetime and not in anyone else's, either. In one of the book's many memorable scenes, Zuckerman describes how a five-point decline in what's called the ABX index (a measure of mortgage health) once made Paulson $1.25 billion in one morning. In 2007 alone, Paulson & Co. took in fifteen billion dollars in profits, of which four billion went directly into Paulson's pocket. In 2008, his firm made five billion dollars. Rarely in human history has anyone made so much money is so short a time.

  What Paulson's story makes clear is how different the predator is from our conventional notion of the successful businessman. The risk-taking model suggests that the entrepreneur's chief advantage is one of temperament--he's braver than the rest of us are. In the predator model, the entrepreneur's advantage is analytical--he's better at figuring out a sure thing than the rest of us. Paulson looked at the same marketplace as everyone else on Wall Street did. But he saw a different pattern. As an outsider, he had fresh eyes, and his line of investing made him a lot more comfortable with negative-carry trades than his competitors were. He looked for and found partners to the transaction who did not have the same definition as he of the value of the goods exchanged--that is, the banks selling credit-default swaps for a penny on the dollar--and he exploited that advantage ruthlessly. At one point, incredibly, Paulson got together with some investment banks to assemble bundles of the most absurdly toxic mortgages--which the banks then sold to some hapless investors and Paulson then promptly bet against. As Zuckerman points out, this is the equivalent of a game of football in which the defense calls the plays for the offense. It's how a nerd would play football, not a jock.

  This is exactly how Turner pulled off another of his legendary early deals--his 1976 acquisition of the Atlanta Braves baseball team. Turner's Channel 17 was the Braves' local broadcaster, having acquired the rights four years before--a brilliant move, as it turned out, because it forced every Braves fan in the region to go out and buy a UHF antenna. (Well before ESPN and Rupert Murdoch's Sky TV, Turner had realized how important live sports programming could be in building a television brand.) The team was losing a million dollars a year, and the owners wanted ten million dollars to sell. That was four times the price of Channel 17. "I had no idea how I could afford it," Turner told one of his biographers, although by this point the reader is wise to his aw-shucks modesty. First, he didn't pay ten million dollars. He talked the Braves into taking a million down, and the rest over eight or so years. Second, he didn't end up paying the million down. Somewhat mysteriously, Turner reports that he found a million dollars on the team's books--money the previous owners somehow didn't realize they had--and so, he says, "I bought it using its own money, which was quite a trick." He now owed nine million dollars. But Turner had already been paying the Braves six hundred thousand dollars a year for the rights to broadcast sixty of the team's games. What the deal consisted of, then, was his paying an additional six hundred thousand dollars or so a year, for eight years: in return, he would get the rights to all a hundred and sixty-two of the team's games, plus the team itself.

  You and I might not have made that deal. But that's not because Turner is a risk-taker and we are cowards. It's because Turner is a cold-blooded bargainer who could find a million dollars in someone's back pocket that the person didn't know he had. Once you get past the more flamboyant aspects of Turner's personal and sporting life, in fact, there is little evidence that he had any real appetite for risk at all. In his memoir, Turner tells us that when he was starting out in the family business his father, Ed, bought another billboard firm, called General Outdoor. That was the acquisition that launched the Turner company as a major advertising player in the South, and it involved taking on a sizable amount of debt. Young Ted had no qualms, intellectually, about the decision. He could do the math. There were substantial economies of scale in the advertising business: the bigger you got, the lower your costs were, and paying off the debt from the General Outdoor purchase, Ted Turner realized, probably wasn't going to be a problem. But Turner's father did something that Turner, when he was building his empire, always went to extraordinary lengths to avoid: he put his own capital into the deal. In the highly unlikely event that it didn't work out, Turner Advertising would be crippled. It was a good deal, not a perfect one, and that niggling imperfection, along with the toll that the uncertainty was taking on his father, left Turner worried sick. "During the first six months or so after the General Outdoor acquisition my weight dropped from 180 pounds to 135," he writes. "I developed a pre-ulcerative condition and my doctor made me swear off coffee. I'd get so tired and agitated that one of my eyelids developed a twitch."

  Zuckerman profiles John Paulson alongside three others who made the same subprime bet--Greg Lippmann, a trader at Deutsche Bank; Jeffrey Greene, a real-estate mogul in Los Angeles; and Michael Burry, who ran a hedge fund in Silicon Valley--and finds the same pattern. All were supremely confident of their decision. All had done their homework. All had swooped down, like perfect predators, on a marketplace anomaly. But these were not men temperamentally suited to risk-taking. They worked so hard to find the sure thing because anything short of that gave them ulcers. Here is Zuckerman on Burry, as he waited for his trade to pan out:

In a tailspin, Burry withdrew from his friends, family, and employees. Each morning, Burry walked into his firm and made a beeline to his office, head down, locking the door behind him. He didn't emerge all day, not even to eat or use the bathroom. His remaining employees, who were still pulling for Burry, turned worried. Sometimes he got into the office so early, and kept the door closed for so long, that when his staff left at the end of the day, they were unsure if their boss had ever come in. Other times, Burry pounded his fists on his desk, trying to release his tension, as heavy-metal music blasted from nearby speakers.

  3.

  Paulson's story also casts a harsh light on the prevailing assumptions behind corporate compensation policies. One of the main arguments for the generous stock options that are so often given to C.E.O.s is that they are necessary to encourage risk-taking in the corporate suite. This notion comes from what is known as "agency theory," which Freek Vermeulen, of the London Business School, calls "one of the few academic theories in management academia that has actually influenced the world of management practice." Agency theory, Vermeulen observes, "says that managers are inherently risk-averse; much more risk-averse than shareholders would like them to be. And the theory prescribes that you should give them stock options, rather than stock, to stimulate them to take more risk." Why do shareholders want managers to take more risks? Because they want stodgy companies to be more entrepreneurial, and taking risks is what everyone says that entrepreneurs do.

  The result has been to turn executives into risk-takers. Paulson, for his part, was stunned at the reckless behavior of his Wall Street counterparts. Some of the mortgage bundles he was betting against--collections of some of the sketchiest subprime loans--were paying the investors who bought them six-per-cent interest. Treasury bonds, the safest investment in the world, were paying almost five per cent at that point. Nor could he comprehend why so many banks were willing to sell him C.D.S. insurance at such low prices. Why would someone, in the middle of a housing bubble, demand only one cent on the dollar? At the end of 2006, Merrill Lynch paid $1.3 billion for First Franklin Financial, one of the biggest subprime lenders in the country, bringing the total value of subprime mortgages on its books to eleven billion dollars. Paulson was so risk-averse that he didn't so much as put a toe in the water of subprime-mortgage default swaps until Pellegrini had done months of analysis. But Merrill Lynch bought First Franklin even though the firm's own economists were predicting that housing prices were about to drop by as much as five per cent. "It just doesn't make sense," an incredulous Paulson told his friend Howard Gurvitch. "These are supposedly the smart people."

  The economist Scott Shane, in his book "The Illusions of Entrepreneurship," makes a similar argument. Yes, he says, many entrepreneurs take plenty of risks--but those are generally the failed entrepreneurs, not the success stories. The failures violate all kinds of established principles of new-business formation. New-business success is clearly correlated with the size of initial capitalization. But failed entrepreneurs tend to be wildly undercapitalized. The data show that organizing as a corporation is best. But failed entrepreneurs tend to organize as sole proprietorships. Writing a business plan is a must; failed entrepreneurs rarely take that step. Taking over an existing business is always the best bet; failed entrepreneurs prefer to start from scratch. Ninety per cent of the fastest-growing companies in the country sell to other businesses; failed entrepreneurs usually try selling to consumers, and, rather than serving customers that other businesses have missed, they chase the same people as their competitors do. The list goes on: they underemphasize marketing; they don't understand the importance of financial controls; they try to compete on price. Shane concedes that some of these risks are unavoidable: would-be entrepreneurs take them because they have no choice. But a good many of these risks reflect a lack of preparation or foresight.

  4.

  Shane's description of the pattern of entrepreneurial failure brings to mind the Harvard psychologist David McClelland's famous experiment with kindergarten children in the nineteen-fifties. McClelland watched a group of kids play ringtoss--throwing a hoop over a pole. The children who played the game in the riskiest manner, who stood so far from the pole that success was unlikely, also scored lowest on what he called "achievement motive," that is, the desire to succeed. (Another group of low scorers were at the other extreme, standing so close to the pole that the game ceased to be a game at all.) Taking excessive risks was, then, a psychologically protective strategy: if you stood far enough back from the pole, no one could possibly blame you if you failed. These children went out of their way to take a "professional" risk in order to avoid a personal risk. That's what companies are buying with their bloated C.E.O. stock-options packages--gambles so wild that the gambler can lose without jeopardizing his social standing within the corporate world. "As long as the music is playing, you've got to get up and dance," the now departed C.E.O. of Citigroup, Charles Prince, notoriously said, as his company continued to pile one dubious investment on another. He was more afraid of being a wallflower than he was of imperilling his firm.

  The successful entrepreneur takes the opposite tack. Villette and Vuillermot point out that the predator is often quite happy to put his reputation on the line in the pursuit of the sure thing. Ingvar Kamprad, of IKEA, went to Poland in the nineteen-sixties to get his furniture manufactured. Since Polish labor was inexpensive, it gave Kamprad a huge price advantage. But doing business with a Communist country at the height of the Cold War was a scandal. Sam Walton financed his first retailing venture, in Newport, Arkansas, with money from his wealthy in-laws. That approach was safer than turning to a bank, especially since Walton was forced out of Newport and had to go back to his wife's family for another round. But you can imagine that it made for some tense moments at family reunions for a while. Deutsche Bank's Lippmann, meanwhile, was called Chicken Little and Bubble Boy to his face for his insistence that the mortgage market was going to burst.

  Why are predators willing to endure this kind of personal abuse? Perhaps they are sufficiently secure and confident that they don't need public approval. Or perhaps they are so caught up in their own calculations that they don't notice. The simplest explanation, though, is that it's just another manifestation of their relentlessly rational pursuit of the sure thing. If an awkward family reunion was the price Walton had to pay for a guaranteed line of credit, then so be it. He went out of his way to take a personal risk in order to avoid a professional risk. Reputation, after all, is a commodity that trades in the marketplace at a significant and often excessive premium. The predator shorts the dancers, and goes long on the wallflowers.

  5.

  When Pellegrini finally finished his research on the mortgage market--proving how profoundly inflated home prices had become--he rushed in to show his findings to his boss. Zuckerman writes:

"This is unbelievable!" Paulson said, unable to take his eyes off the chart. A mischievous smile formed on his face, as if Pellegrini had shared a secret no one else was privy to. Paulson sat back in his chair and turned to Pellegrini. "This is our bubble! This is proof. Now we can prove it!" Paulson said. Pellegrini grinned, unable to mask his pride. The chart was Paulson's Rosetta stone, the key to making sense of the entire housing market. Years later, he would keep it atop a pile of papers on his desk, showing it off to his clients and updating it each month with new data, like a car collector gently waxing and caressing a prized antique auto. . . . "I still look at it. I love that chart," Paulson says.

  There are a number of moments like this in "The Greatest Trade Ever," when it becomes clear just how much Paulson enjoyed his work. Yes, he wanted to make money. But he was fabulously wealthy long before he tackled the mortgage business. His real motivation was the challenge of figuring out a particularly knotty problem. He was a kid with a puzzle.

  This is consistent with the one undisputed finding in all the research on entrepreneurship: people who work for themselves are far happier than the rest of us. Shane says that the average person would have to earn two and a half times as much to be as happy working for someone else as he would be working for himself. And people who like what they do are profoundly conservative. When the sociologists Hongwei Xu and Martin Ruef asked a large sample of entrepreneurs and non-entrepreneurs to choose among three alternatives--a business with a potential profit of five million dollars with a twenty-per-cent chance of success, or one with a profit of two million with a fifty-per-cent chance of success, or one with a profit of $1.25 million with an eighty-per-cent chance of success--it was the entrepreneurs who were more likely to go with the third, safe choice. They weren't dazzled by the chance of making five million dollars. They were drawn to the eighty-per-cent chance of getting to do what they love doing. The predator is a supremely rational actor. But, deep down, he is also a romantic, motivated by the simple joy he finds in his work.

  In "Call Me Ted," Turner tells the story of one of his first great traumas. When Turner was twenty-four, his father committed suicide. He had been depressed and troubled for some months, and one day after breakfast he went upstairs and shot himself. After the funeral, it emerged that the day before his death Turner's father had sold the crown jewels of the family business--the General Outdoor properties--to a man named Bob Naegele. Turner was grief-stricken. But he fought back. He hired away the General Outdoor leasing department. He began "jumping" the company's leases--that is, persuading the people who owned the real estate on which the General Outdoor billboards sat to cancel the leases and sign up with Turner Advertising. Then he flew to Palm Springs and strong-armed Naegele into giving back the business. Turner the rational actor negotiated the deal. But it was Turner the romantic who had the will, at the moment of his greatest grief, to fight back. What Turner understood was that none of his grand ambitions were possible without the billboard cash machine. He had felt the joy that comes with figuring out a particularly knotty problem, and he couldn't give that up. Naegele, by the way, asked for two hundred thousand dollars, which Turner didn't have. But Turner realized that for someone in Naegele's tax bracket a flat payment like that made no sense. He countered with two hundred thousand dollars in Turner Advertising stock. "So far so good," Turner writes in his autobiography. "I had kept the company out of Naegele's hands and it didn't cost me a single dollar of cash." Of course it didn't. He's a predator. Why on earth would he take a risk like that?

  How different are dogfighting and football?

  1.

  One evening in August, Kyle Turley was at a bar in Nashville with his wife and some friends. It was one of the countless little places in the city that play live music. He'd ordered a beer, but was just sipping it, because he was driving home. He had eaten an hour and a half earlier. Suddenly, he felt a sensation of heat. He was light-headed, and began to sweat. He had been having episodes like that with increasing frequency during the past year--headaches, nausea. One month, he had vertigo every day, bouts in which he felt as if he were stuck to a wall. But this was worse. He asked his wife if he could sit on her stool for a moment. The warmup band was still playing, and he remembers saying, "I'm just going to take a nap right here until the next band comes on." Then he was lying on the floor, and someone was standing over him. "The guy was freaking out," Turley recalled. "He was saying, 'Damn, man, I couldn't find a pulse,' and my wife said, 'No, no. You were breathing.' I'm, like, 'What? What?' "

  They picked him up. "We went out in the parking lot, and I just lost it," Turley went on. "I started puking everywhere. I couldn't stop. I got in the car, still puking. My wife, she was really scared, because I had never passed out like that before, and I started becoming really paranoid. I went into a panic. We get to the emergency room. I started to lose control. My limbs were shaking, and I couldn't speak. I was conscious, but I couldn't speak the words I wanted to say."

  Turley is six feet five. He is thirty-four years old, with a square jaw and blue eyes. For nine years, before he retired, in 2007, he was an offensive lineman in the National Football League. He knew all the stories about former football players. Mike Webster, the longtime Pittsburgh Steeler and one of the greatest players in N.F.L. history, ended his life a recluse, sleeping on the floor of the Pittsburgh Amtrak station. Another former Pittsburgh Steeler, Terry Long, drifted into chaos and killed himself four years ago by drinking antifreeze. Andre Waters, a former defensive back for the Philadelphia Eagles, sank into depression and pleaded with his girlfriend--"I need help, somebody help me"--before shooting himself in the head. There were men with aching knees and backs and hands, from all those years of playing football. But their real problem was with their heads, the one part of their body that got hit over and over again.

  "Lately, I've tried to break it "I remember, every season, multiple occasions where I'd hit someone so hard that my eyes went cross-eyed, and they wouldn't come uncrossed for a full series of plays. You are just out there, trying to hit the guy in the middle, because there are three of them. You don't remember much. There are the cases where you hit a guy and you'd get into a collision where everything goes off. You're dazed. And there are the others where you are involved in a big, long drive. You start on your own five-yard line, and drive all the way down the field--fifteen, eighteen plays in a row sometimes. Every play: collision, collision, collision. By the time you get to the other end of the field, you're seeing spots. You feel like you are going to black out. Literally, these white explosions--boom, boom, boom--lights getting dimmer and brighter, dimmer and brighter.

  "Then, there was the time when I got knocked unconscious. That was in St. Louis, in 2003. My wife said that I was out a minute or two on the field. But I was gone for about four hours after that. It was the last play of the third quarter. We were playing the Packers. I got hit in the back of the head. I saw it on film a little while afterward. I was running downfield, made a block on a guy. We fell to the ground. A guy was chasing the play, a little guy, a defensive back, and he jumped over me as I was coming up, and he kneed me right in the back of the head. Boom!

  "They sat me down on the bench. I remember Marshall Faulk coming up and joking with me, because he knew that I was messed up. That's what happens in the N.F.L: 'Oooh. You got effed up. Oooh.' The trainer came up to me and said, 'Kyle, let's take you to the locker room.' I remember looking up at a clock, and there was only a minute and a half left in the game--and I had no idea that much time had elapsed. I showered and took all my gear off. I was sitting at my locker. I don't remember anything. When I came back, after being hospitalized, the guys were joking with me because Georgia Frontiere"--then the team's owner--"came in the locker room, and they said I was butt-ass naked and I gave her a big hug. They were dying laughing, and I was, like, 'Are you serious? I did that?'

  "They cleared me for practice that Thursday. I probably shouldn't have. I don't know what damage I did from that, because my head was really hurting. But when you're coming off an injury you're frustrated. I wanted to play the next game. I was just so mad that this happened to me that I'm overdoing it. I was just going after guys in practice. I was really trying to use my head more, because I was so frustrated, and the coaches on the sidelines are, like, 'Yeah. We're going to win this game. He's going to lead the team.' That's football. You're told either that you're hurt or that you're injured. There is no middle ground. If you are hurt, you can play. If you are injured, you can't, and the line is whether you can walk and if you can put on a helmet and pads."

  Turley said that he loved playing football so much that he would do it all again. Then he began talking about what he had gone through in the past year. The thing that scared him most about that night at the bar was that it felt exactly like the time he was knocked unconscious. "It was identical," he said. "It was my worst episode ever."

  2.

  In August of 2007, one of the highest-paid players in professional football, the quarterback Michael Vick, pleaded guilty to involvement in a dogfighting ring. The police raided one of his properties, a farm outside Richmond, Virginia, and found the bodies of dead dogs buried on the premises, along with evidence that some of the animals there had been tortured and electrocuted. Vick was suspended from football. He was sentenced to twenty-three months in prison. The dogs on his farm were seized by the court, and the most damaged were sent to an animal sanctuary in Utah for rehabilitation. When Vick applied for reinstatement to the National Football League, this summer, he was asked to undergo psychiatric testing. He then met with the commissioner of the league, Roger Goodell, for four and a half hours, so that Goodell could be sure that he was genuinely remorseful.

  "I probably considered every alternative that I could think of," Goodell told reporters, when he finally allowed Vick back into the league. "I reached out to an awful lot of people to get their views--not only on what was right for the young man but also what was right for our society and the N.F.L."

  Goodell's job entails dealing with players who have used drugs, driven drunk and killed people, fired handguns in night clubs, and consorted with thugs and accused murderers. But he clearly felt what many Americans felt as well--that dogfighting was a moral offense of a different order.

  Here is a description of a dogfight given by the sociologists Rhonda Evans and Craig Forsyth in "The Social Milieu of Dogmen and Dogfights," an article they published some years ago in the journal Deviant Behavior. The fight took place in Louisiana between a local dog, Black, owned by a man named L.G., and Snow, whose owner, Rick, had come from Arizona:

The handlers release their dogs and Snow and Black lunge at one another. Snow rears up and overpowers Black, but Black manages to come back with a quick locking of the jaws on Snow's neck. The crowd is cheering wildly and yelling out bets. Once a dog gets a lock on the other, they will hold on with all their might. The dogs flail back and forth and all the while Black maintains her hold.

  In a dogfight, whenever one of the dogs "turns"--makes a submissive gesture with its head--the two animals are separated and taken back to their corners. Each dog, in alternation, then "scratches"--is released to charge at its opponent. After that first break, it is Snow's turn to scratch. She races toward Black:

Snow goes straight for the throat and grabs hold with her razor-sharp teeth. Almost immediately, blood flows from Black's throat. Despite a serious injury to the throat, Black manages to continue fighting back. They are relentless, each battling the other and neither willing to accept defeat. This fighting continues for an hour. [Finally, the referee] gives the third and final pit call. It is Black's turn to scratch and she is severely wounded. Black manages to crawl across the pit to meet her opponent. Snow attacks Black and she is too weak to fight back. L.G. realizes that this is it for Black and calls the fight. Snow is declared the winner.

  Afterward, Snow's owner collects his winnings; L.G. carries Black from the ring. "Her back legs are broken and blood is gushing from her throat," Evans and Forsyth write. "A shot rings out barely heard over the noise in the barn. Black's body is wrapped up and carried by her owner to his vehicle."

  It's the shot ringing out that seals the case against dogfighting. L.G. willingly submitted his dog to a contest that culminated in her suffering and destruction. And why? For the entertainment of an audience and the chance of a payday. In the nineteenth century, dogfighting was widely accepted by the American public. But we no longer find that kind of transaction morally acceptable in a sport. "I was not aware of dogfighting and the terrible things that happen around dogfighting," Goodell said, explaining why he responded so sternly in the Vick case. One wonders whether, had he spent as much time talking to Kyle Turley as he did to Michael Vick, he'd start to have similar doubts about his own sport.

  3.

  In 2003, a seventy-two-year-old patient at the Veterans Hospital in Bedford, Massachusetts, died, fifteen years after receiving a diagnosis of dementia. Patients in the hospital's dementia ward are routinely autopsied, as part of the V.A.'s research efforts, so the man's brain was removed and "fixed" in a formaldehyde solution. A laboratory technician placed a large slab of the man's cerebral tissue on a microtome--essentially, a sophisticated meat slicer--and, working along the coronal plane, cut off dozens of fifty-micron shavings, less than a hairbreadth thick. The shavings were then immunostained--bathed in a special reagent that would mark the presence of abnormal proteins with a bright, telltale red or brown stain on the surface of the tissue. Afterward, each slice was smoothed out and placed on a slide.

  The stained tissue of Alzheimer's patients typically shows the two trademarks of the disease--distinctive patterns of the proteins beta-amyloid and tau. Beta-amyloid is thought to lay the groundwork for dementia. Tau marks the critical second stage of the disease: it's the protein that steadily builds up in brain cells, shutting them down and ultimately killing them. An immunostain of an Alzheimer's patient looks, under the microscope, as if the tissue had been hit with a shotgun blast: the red and brown marks, corresponding to amyloid and tau, dot the entire surface. But this patient's brain was different. There was damage only to specific surface regions of his brain, and the stains for amyloid came back negative. "This was all tau," Ann McKee, who runs the hospital's neuropathology laboratory, said. "There was not even a whiff of amyloid. And it was the most extraordinary damage. It was one of those cases that really took you aback." The patient may have been in an Alzheimer's facility, and may have looked and acted as if he had Alzheimer's. But McKee realized that he had a different condition, called chronic traumatic encephalopathy (C.T.E.), which is a progressive neurological disorder found in people who have suffered some kind of brain trauma. C.T.E. has many of the same manifestations as Alzheimer's: it begins with behavioral and personality changes, followed by disinhibition and irritability, before moving on to dementia. And C.T.E. appears later in life as well, because it takes a long time for the initial trauma to give rise to nerve-cell breakdown and death. But C.T.E. isn't the result of an endogenous disease. 's the result of injury. The patient, it turned out, had been a boxer in his youth. He had suffered from dementia for fifteen years because, decades earlier, he'd been hit too many times in the head.

  McKee's laboratory does the neuropathology work for both the giant Framingham heart study, which has been running since 1948, and Boston University's New England Centenarian Study, which analyzes the brains of people who are unusually long-lived. "I'm looking at brains constantly," McKee said. "Then I ran across another one. I saw it and said, 'Wow, it looks just like the last case.' This time, there was no known history of boxing. But then I called the family, and heard that the guy had been a boxer in his twenties." You can't see tau except in an autopsy, and you can't see it in an autopsy unless you do a very particular kind of screen. So now that McKee had seen two cases, in short order, she began to wonder: how many people who we assume have Alzheimer's--a condition of mysterious origin--are actually victims of preventable brain trauma?

  McKee linked up with an activist named Chris Nowinski, a former college football player and professional wrestler who runs a group called the Sports Legacy Institute, in Boston. In his football and wrestling careers, Nowinski suffered six concussions (that he can remember), the last of which had such severe side effects that he has become a full-time crusader against brain injuries in sports. Nowinski told McKee that he would help her track down more brains of ex-athletes. Whenever he read an obituary of someone who had played in a contact sport, he'd call up the family and try to persuade them to send the player's brain to Bedford. Usually, they said no. Sometimes they said yes. The first brain McKee received was from a man in his mid-forties who had played as a linebacker in the N.F.L. for ten years. He accidentally shot himself while cleaning a gun. He had at least three concussions in college, and eight in the pros. In the years before his death, he'd had memory lapses, and had become more volatile. McKee immunostained samples of his brain tissue, and saw big splotches of tau all over the frontal and temporal lobes. If he hadn't had the accident, he would almost certainly have ended up in a dementia ward.

  Nowinski found her another ex-football player. McKee saw the same thing. She has now examined the brains of sixteen ex-athletes, most of them ex-football players. Some had long careers and some played only in college. Some died of dementia. Some died of unrelated causes. Some were old. Some were young. Most were linemen or linebackers, although there was one wide receiver. In one case, a man who had been a linebacker for sixteen years, you could see, without the aid of magnification, that there was trouble: there was a shiny tan layer of scar tissue, right on the surface of the frontal lobe, where the brain had repeatedly slammed into the skull. It was the kind of scar you'd get only if you used your head as a battering ram. You could also see that some of the openings in the brain were larger than you'd expect, as if the surrounding tissue had died and shrunk away. In other cases, everything seemed entirely normal until you looked under the microscope and saw the brown ribbons of tau. But all sixteen of the ex-athlete brains that McKee had examined--those of the two boxers, plus the ones that Nowinski had found for her--had something in common: every one had abnormal tau.

  The other major researcher looking at athletes and C.T.E. is the neuropathologist Bennet Omalu. He diagnosed the first known case of C.T.E. in an ex-N.F.L. player back in September of 2002, when he autopsied the former Pittsburgh Steelers center Mike Webster. He also found C.T.E. in the former Philadelphia Eagles defensive back Andre Waters, and in the former Steelers linemen Terry Long and Justin Strzelczyk, the latter of whom was killed when he drove the wrong way down a freeway and crashed his car, at ninety miles per hour, into a tank truck. Omalu has only once failed to find C.T.E. in a professional football player, and that was a twenty-four-year-old running back who had played in the N.F.L. for only two years.

  "There is something wrong with this group as a cohort," Omalu says. "They forget things. They have slurred speech. I have had an N.F.L. player come up to me at a funeral and tell me he can't find his way home. I have wives who call me and say, 'My husband was a very good man. Now he drinks all the time. I don't know why his behavior changed.' I have wives call me and say, 'My husband was a nice guy. Now he's getting abusive.' I had someone call me and say, 'My husband went back to law school after football and became a lawyer. Now he can't do his job. People are suing him.' "

  McKee and Omalu are trying to make sense of the cases they've seen so far. At least some of the players are thought to have used steroids, which has led to the suggestion that brain injury might in some way be enhanced by drug use. Many of the players also share a genetic risk factor for neurodegenerative diseases, so perhaps deposits of tau are the result of brain trauma coupled with the weakened ability of the brain to repair itself. McKee says that she will need to see at least fifty cases before she can draw any firm conclusions. In the meantime, late last month the University of Michigan's Institute for Social Research released the findings of an N.F.L.-funded phone survey of just over a thousand randomly selected retired N.F.L. players--all of whom had played in the league for at least three seasons. Self-reported studies are notoriously unreliable instruments, but, even so, the results were alarming. Of those players who were older than fifty, 6.1 per cent reported that they had received a diagnosis of "dementia, Alzheimer's disease, or other memory-related disease." That's five times higher than the national average for that age group. For players between the ages of thirty and forty-nine, the reported rate was nineteen times the national average. (The N.F.L. has distributed five million dollars to former players with dementia.)

  "A long time ago, someone suggested that the [C.T.E. rate] in boxers was twenty per cent," McKee told me. "I think it's probably higher than that among boxers, and I also suspect that it's going to end up being higher than that among football players as well. Why? Because every brain I've seen has this. To get this number in a sample this small is really unusual, and the findings are so far out of the norm. I only can say that because I have looked at thousands of brains for a long time. This isn't something that you just see. I did the same exact thing for all the individuals from the Framingham heart study. We study them until they die. I run these exact same proteins, make these same slides--and we never see this."

  McKee's laboratory occupies a warren of rooms, in what looks like an old officers' quarters on the V.A. campus. In one of the rooms, there is an enormous refrigerator, filled with brains packed away in hundreds of plastic containers. Nearby is a tray with small piles of brain slices. They look just like the ginger shavings that come with an order of sushi. Now McKee went to the room next to her office, sat down behind a microscope, and inserted one of the immunostained slides under the lens.

  "This is Tom McHale," she said. "He started out playing for Cornell. Then he went to Tampa Bay. He was the man who died of substance abuse at the age of forty-five. I only got fragments of the brain. But it's just showing huge accumulations of tau for a forty-five-year-old--ridiculously abnormal."

  She placed another slide under the microscope. "This individual was forty-nine years old. A football player. Cognitively intact. He never had any rage behavior. He had the distinctive abnormalities. Look at the hypothalamus." It was dark with tau. She put another slide in. "This guy was in his mid-sixties," she said. "He died of an unrelated medical condition. His name is Walter Hilgenberg. Look at the hippocampus. It's wall-to-wall tangles. Even in a bad case of Alzheimer's, you don't see that." The brown pigment of the tau stain ran around the edge of the tissue sample in a thick, dark band. "It's like a big river."

  McKee got up and walked across the corridor, back to her office. "There's one last thing," she said. She pulled out a large photographic blowup of a brain-tissue sample. "This is a kid. I'm not allowed to talk about how he died. He was a good student. This is his brain. He's eighteen years old. He played football. He'd been playing football for a couple of years." She pointed to a series of dark spots on the image, where the stain had marked the presence of something abnormal. "He's got all this tau. This is frontal and this is insular. Very close to insular. Those same vulnerable regions." This was a teen-ager, and already his brain showed the kind of decay that is usually associated with old age. "This is completely inappropriate," she said. "You don't see tau like this in an eighteen-year-old. You don't see tau like this in a fifty-year-old."

  McKee is a longtime football fan. She is from Wisconsin. She had two statuettes of Brett Favre, the former Green Bay Packers quarterback, on her bookshelf. On the wall was a picture of a robust young man. It was McKee's son--nineteen years old, six feet three. If he had a chance to join the N.F.L., I asked her, what would she advise him? "I'd say, 'Don't. Not if you want to have a life after football.' "

  4.

  At the core of the C.T.E. research is a critical question: is the kind of injury being uncovered by McKee and Omalu incidental to the game of football or inherent in it? Part of what makes dogfighting so repulsive is the understanding that violence and injury cannot be removed from the sport. It's a feature of the sport that dogs almost always get hurt. Something like stock-car racing, by contrast, is dangerous, but not unavoidably so.

  In 2000 and 2001, four drivers in Nascar's élite Sprint Cup Series were killed in crashes, including the legendary Dale Earnhardt. In response, Nascar mandated stronger seats, better seat belts and harnesses, and ignition kill switches, and completed the installation of expensive new barriers on the walls of its racetracks, which can absorb the force of a crash much better than concrete. The result is that, in the past eight years, no one has died in Nascar's three national racing series. Stock-car fans are sometimes caricatured as bloodthirsty, eagerly awaiting the next spectacular crash. But there is little blood these days in Nascar crashes. Last year, at Texas Motor Speedway, Michael McDowell hit an oil slick, slammed head first into the wall at a hundred and eighty miles per hour, flipped over and over, leaving much of his car in pieces on the track, and, when the vehicle finally came to a stop, crawled out of the wreckage and walked away. He raced again the next day. So what is football? Is it dogfighting or is it stock-car racing?

  Football faced a version of this question a hundred years ago, after a series of ugly incidents. In 1905, President Theodore Roosevelt called an emergency summit at the White House, alarmed, as the historian John Sayle Watterson writes, "that the brutality of the prize ring had invaded college football and might end up destroying it." Columbia University dropped the sport entirely. A professor at the University of Chicago called it a "boy-killing, man-mutilating, money-making, education-prostituting, gladiatorial sport." In December of 1905, the presidents of twelve prominent colleges met in New York and came within one vote of abolishing the game. But the main objection at the time was to a style of play--densely and dangerously packed offensive strategies--that, it turns out, could be largely corrected with rule changes, like the legalization of the forward pass and the doubling of the first-down distance from five yards to ten. Today, when we consider subtler and more insidious forms of injury, it's far from clear whether the problem is the style of play or the play itself.

  Take the experience of a young defensive lineman for the University of North Carolina football team, who suffered two concussions during the 2004 season. His case is one of a number studied by Kevin Guskiewicz, who runs the university's Sports Concussion Research Program. For the past five seasons, Guskiewicz and his team have tracked every one of the football team's practices and games using a system called HITS, in which six sensors are placed inside the helmet of every player on the field, measuring the force and location of every blow he receives to the head. Using the HITS data, Guskiewicz was able to reconstruct precisely what happened each time the player was injured.

  "The first concussion was during preseason. The team was doing two-a-days," he said, referring to the habit of practicing in both the morning and the evening in the preseason. "It was August 9th, 9:55 A.M. He has an 80-g hit to the front of his head. About ten minutes later, he has a 98-g acceleration to the front of his head." To put those numbers in perspective, Guskiewicz explained, if you drove your car into a wall at twenty-five miles per hour and you weren't wearing your seat belt, the force of your head hitting the windshield would be around 100 gs: in effect, the player had two car accidents that morning. He survived both without incident. "In the evening session, he experiences this 64-g hit to the same spot, the front of the head. Still not reporting anything. And then this happens." On his laptop, Guskiewicz ran the video from the practice session. It was a simple drill: the lineman squaring off against an offensive player who wore the number 76. The other player ran toward the lineman and brushed past him, while delivering a glancing blow to the defender's helmet. "Seventy-six does a little quick elbow. It's 63 gs, the lowest of the four, but he sustains a concussion."

  "The second injury was nine weeks later," Guskiewicz continued. "He's now recovered from the initial injury. It's a game out in Utah. In warmups, he takes a 76-g blow to the front of his head. Then, on the very first play of the game, on kickoff, he gets popped in the earhole. It's a 102-g impact. He's part of the wedge." He pointed to the screen, where the player was blocking on a kickoff: "Right " The player stumbled toward the sideline. "His symptoms were significantly worse than the first injury." Two days later, during an evaluation in Guskiewicz's clinic, he had to have a towel put over his head because he couldn't stand the light. He also had difficulty staying awake. He was sidelined for sixteen days.

  When we think about football, we worry about the dangers posed by the heat and the fury of competition. Yet the HITS data suggest that practice--the routine part of the sport--can be as dangerous as the games themselves. We also tend to focus on the dramatic helmet-to-helmet hits that signal an aggressive and reckless style of play. Those kinds of hits can be policed. But what sidelined the U.N.C. player, the first time around, was an accidental and seemingly innocuous elbow, and none of the blows he suffered that day would have been flagged by a referee as illegal. Most important, though, is what Guskiewicz found when he reviewed all the data for the lineman on that first day in training camp. He didn't just suffer those four big blows. He was hit in the head thirty-one times that day. What seems to have caused his concussion, in other words, was his cumulative exposure. And why was the second concussion--in the game at Utah--so much more serious than the first? It's not because that hit to the side of the head was especially dramatic; it was that it came after the 76-g blow in warmup, which, in turn, followed the concussion in August, which was itself the consequence of the thirty prior hits that day, and the hits the day before that, and the day before that, and on and on, perhaps back to his high-school playing days.

  This is a crucial point. Much of the attention in the football world, in the past few years, has been on concussions--on diagnosing, managing, and preventing them--and on figuring out how many concussions a player can have before he should call it quits. But a football player's real issue isn't simply with repetitive concussive trauma. It is, as the concussion specialist Robert Cantu argues, with repetitive subconcussive trauma. It's not just the handful of big hits that matter. It's lots of little hits, too.

  That's why, Cantu says, so many of the ex-players who have been given a diagnosis of C.T.E. were linemen: line play lends itself to lots of little hits. The HITS data suggest that, in an average football season, a lineman could get struck in the head a thousand times, which means that a ten-year N.F.L. veteran, when you bring in his college and high-school playing days, could well have been hit in the head eighteen thousand times: that's thousands of jarring blows that shake the brain from front to back and side to side, stretching and weakening and tearing the connections among nerve cells, and making the brain increasingly vulnerable to long-term damage. People with C.T.E., Cantu says, "aren't necessarily people with a high, recognized concussion history. But they are individuals who collided heads on every play--repetitively doing this, year after year, under levels that were tolerable for them to continue to play."

  But if C.T.E. is really about lots of little hits, what can be done about it? Turley says that it's impossible for an offensive lineman to do his job without "using his head." The position calls for the player to begin in a crouch and then collide with the opposing lineman when the ball is snapped. Helmet-to-helmet contact is inevitable. Nowinski, who played football for Harvard, says that "proper" tackling technique is supposed to involve a player driving into his opponent with his shoulder. "The problem," he says, "is that, if you're a defender and you're trying to tackle someone and you decide to pick a side, you're giving the other guy a way to go--and people will start running around you." Would better helmets help? Perhaps. And there have been better models introduced that absorb more of the shock from a hit. But, Nowinski says, the better helmets have become--and the more invulnerable they have made the player seem--the more athletes have been inclined to play recklessly.

  "People love technological solutions," Nowinski went on. "When I give speeches, the first question is always: 'What about these new helmets I hear about?' What most people don't realize is that we are decades, if not forever, from having a helmet that would fix the problem. I mean, you have two men running into each other at full speed and you think a little bit of plastic and padding could absorb that 150 gs of force?"

  At one point, while he was discussing his research, Guskiewicz showed a videotape from a 1997 college football game between Arizona and Oregon. In one sequence, a player from Oregon viciously tackles an Arizona player, bringing his head up onto the opposing player's chin and sending his helmet flying with the force of the blow. To look at it, 'd think that the Arizona player would be knocked unconscious. Instead, he bounces back up. "This guy does not sustain a concussion," Guskiewicz said. "He has a lip laceration. Lower lip, that's it. Now, same game, twenty minutes later." He showed a clip of an Arizona defensive back making a dramatic tackle. He jumps up, and, as he does so, a teammate of his chest-bumps him in celebration. The defensive back falls and hits his head on the ground. "That's a Grade 2 concussion," Guskiewicz said. "It's the fall to the ground, combined with the bounce off the turf."

  The force of the first hit was infinitely greater than the second. But the difference is that the first player saw that he was about to be hit and tensed his neck, which limited the sharp back-and-forth jolt of the head that sends the brain crashing against the sides of the skull. In essence, he was being hit not in the head but in the head, neck, and torso--an area with an effective mass three times greater. In the second case, the player didn't see the hit coming. His head took the full force of the blow all by itself. That's why he suffered a concussion. But how do you insure, in a game like football, that a player is never taken by surprise?

  Guskiewicz and his colleagues have come up with what they believe is a much better method of understanding concussion. They have done a full cognitive workup of the players on the U.N.C. team, so that they can track whatever effect might arise from the hits each player accumulates during his four years. U.N.C.'s new coach, Butch Davis, has sharply cut back on full-contact practices, reducing the toll on the players' heads. Guskiewicz says his data show that a disproportionate number of serious head impacts happen on kickoffs, so he wonders whether it might make sense, in theory, anyway, to dispense with them altogether. But, like everyone else who's worried about football, he still has no idea what the inherent risks of the game are. What if you did everything you could, and banned kickoffs and full-contact practices and used the most state-of-the-art techniques for diagnosing and treating concussion, and behaved as responsibly as Nascar has in the past several years--and players were still getting too many dangerous little hits to the head?

  After the tape session, Guskiewicz and one of his colleagues, Jason Mihalik, went outside to watch the U.N.C. football team practice, a short walk down the hill from their office. Only when you see football at close range is it possible to understand the dimensions of the brain-injury problem. The players were huge--much larger than you imagine them being. They moved at astonishing speeds for people of that size, and, long before you saw them, you heard them: the sound of one two-hundred-and-fifty-pound man colliding with another echoed around the practice facility. Mihalik and Guskiewicz walked over to a small building, just off to the side of the field. On the floor was a laptop inside a black storage crate. Next to the computer was an antenna that received the signals from the sensors inside the players' helmets. Mihalik crouched down and began paging through the data. In one column, the HITS software listed the top hits of the practice up to that point, and every few moments the screen would refresh, reflecting the plays that had just been run on the field. Forty-five minutes into practice, the top eight head blows on the field measured 82 gs, 79 gs, 75 gs, 79 gs, 67 gs, 60 gs, 57 gs, and 53 gs. One player, a running back, had received both the 79 gs and the 60 gs, as well as another hit, measuring 27.9 gs. This wasn't a full-contact practice. It was "shells." The players wore only helmets and shoulder pads, and still there were mini car crashes happening all over the field.

  5.

  The most damaged, scarred, and belligerent of Michael Vick's dogs--the hardest cases--were sent to the Best Friends Animal Sanctuary, on a thirty-seven-hundred-acre spread in the canyons of southern Utah. They were housed in a specially modified octagon, a one-story, climate-controlled cottage, ringed by individual dog runs. The dogs were given a final walk at 11 P.M. and woken up at 7 A.M., to introduce them to a routine. They were hand-fed. In the early months, the staff took turns sleeping in the octagon--sometimes in the middle, sometimes in a cot in one of the runs--so that someone would be with the dogs twenty-four hours a day. Twenty-two of Vick's pit bulls came to Best Friends in January of 2008, and all but five of them are still there.

  Ray lunged at his handlers when he first came to Best Friends. He can't be with other dogs. Ellen lies on the ground and wants her stomach scratched, and when the caregivers slept in the octagon she licked them all night long. Her face is lopsided, as if it had been damaged from fighting. She can't be with other dogs, either. Georgia has a broken tail, and her legs and snout are covered with scars. She has no teeth. At some point, in her early life, they had been surgically removed. The court-ordered evaluation of the Vick dogs labelled Meryl, a medium-sized brown-and-white pit-bull mix, "human aggressive," meaning that she is never allowed to be taken out of the Best Friends facility. "She had a hard time meeting people--she would preëmpt anyone coming by charging and snapping at them," Ann Allums, one of the Best Friends dog trainers, said, as she walked around Meryl's octagon, on a recent fall day.

  She opened the gate to Meryl's dog run and crouched down on the ground next to her. She hugged the dog, and began playfully wrestling with her, as Meryl's tail thumped happily. "She really doesn't mind new people," Allums said. "s very happy and loving. I feel totally comfortable with her. I can grab and kiss her." She gave Meryl another hug. "I am building a relationship," she said. "She needed to see that when people were around bad things would not happen."

  What happens at Best Friends represents, by any measure, an extravagant gesture. These are dogs that will never live a normal life. But the kind of crime embodied by dogfighting is so morally repellent that it demands an extravagant gesture in response. In a fighting dog, the quality that is prized above all others is the willingness to persevere, even in the face of injury and pain. A dog that will not do that is labelled a "cur," and abandoned. A dog that keeps charging at its opponent is said to possess "gameness," and game dogs are revered.

  In one way or another, plenty of organizations select for gameness. The Marine Corps does so, and so does medicine, when it puts young doctors through the exhausting rigors of residency. But those who select for gameness have a responsibility not to abuse that trust: if you have men in your charge who would jump off a cliff for you, you cannot march them to the edge of the cliff--and dogfighting fails this test. Gameness, Carl Semencic argues, in "The World of Fighting Dogs" (1984), is no more than a dog's "desire to please an owner at any expense to itself." The owners, Semencic goes on,

understand this desire to please on the part of the dog and capitalize on it. At any organized pit fight in which two dogs are really going at each other wholeheartedly, one can observe the owner of each dog changing his position at pit-side in order to be in sight of his dog at all times. The owner knows that seeing his master rooting him on will make a dog work all the harder to please its master.

  This is why Michael Vick's dogs weren't euthanized. The betrayal of loyalty requires an act of social reparation.

  Professional football players, too, are selected for gameness. When Kyle Turley was knocked unconscious, in that game against the Packers, he returned to practice four days later because, he said, "I didn't want to miss a game." Once, in the years when he was still playing, he woke up and fell into a wall as he got out of bed. "I start puking all over," he recalled. "So I said to my wife, 'Take me to practice.' I didn't want to miss practice." The same season that he was knocked unconscious, he began to have pain in his hips. He received three cortisone shots, and kept playing. At the end of the season, he discovered that he had a herniated disk. He underwent surgery, and four months later was back at training camp. "They put me in full-contact practice from day one," he said. "After the first day, I knew I wasn't right. They told me, 'You've had the surgery. You're fine. You should just fight through it.' It's like you're programmed. You've got to go without question--I'm a warrior. I can block that out of my mind. I go out, two days later. Full contact. Two-a-days. My back locks up again. I had re-herniated the same disk that got operated on four months ago, and bulged the disk above it." As one of Turley's old coaches once said, "He plays the game as it should be played, all out," which is to say that he put the game above his own well-being.

  Turley says he was once in the training room after a game with a young linebacker who had suffered a vicious hit on a kickoff return. "We were in the cold tub, which is, like, forty-five degrees, and he starts passing out. In the cold tub. I don't know anyone who has ever passed out in the cold tub. That's supposed to wake you up. And I'm, like, slapping his face. 'Richie! Wake up!' He said, 'What, what? I'm cool.' I said, 'You've got a concussion. You have to go to the hospital.' He said, 'You know, man, I'm fine.' " He wasn't fine, though. That moment in the cold tub represented a betrayal of trust. He had taken the hit on behalf of his team. He was then left to pass out in the cold tub, and to deal--ten and twenty years down the road--with the consequences. No amount of money or assurances about risk freely assumed can change the fact that, in this moment, an essential bond had been broken. What football must confront, in the end, is not just the problem of injuries or scientific findings. It is the fact that there is something profoundly awry in the relationship between the players and the game.

  "Let's assume that Dr. Omalu and the others are right," Ira Casson, who co-chairs an N.F.L. committee on brain injury, said. "What should we be doing differently? We asked Dr. McKee this when she came down. And she was honest, and said, 'I don't know how to answer that.' No one has any suggestions--assuming that you aren't saying no more football, because, let's be honest, that's not going to happen." Casson began to talk about the research on the connection between C.T.E. and boxing. It had been known for eighty years. Boxers ran a twenty-per-cent risk of dementia. Yet boxers continue to box. Why? Because people still go to boxing matches.

  "We certainly know from boxers that the incidence of C.T.E. is related to the length of your career," he went on. "So if you want to apply that to football--and I'm not saying it does apply--then you'd have to let people play six years and then stop. If it comes to that, maybe we'll have to think about that. On the other hand, nobody's willing to do this in boxing. Why would a boxer at the height of his career, six or seven years in, stop fighting, just when he's making million-dollar paydays?" He shrugged. "It's a violent game. I suppose if you want to you could play touch football or flag football. For me, as a Jewish kid from Long Island, I'd be just as happy if we did that. But I don't know if the fans would be happy with that. So what else do you do?"

  Casson is right. There is nothing else to be done, not so long as fans stand and cheer. We are in love with football players, with their courage and grit, and nothing else--neither considerations of science nor those of morality--can compete with the destructive power of that love.

  In "Dogmen and Dogfights," Evans and Forsyth write:

When one views a staged dog fight between pit bulls for the first time, the most macabre aspect of the event is that the only sounds you hear from these dogs are those of crunching bones and cartilage. The dogs rip and tear at each other; their blood, urine and saliva splatter the sides of the pit and clothes of the handlers. . . . The emotions of the dogs are conspicuous, but not so striking, even to themselves, are the passions of the owners of the dogs. Whether they hug a winner or in the rare case, destroy a dying loser, whether they walk away from the carcass or lay crying over it, their fondness for these fighters is manifest.

  Atticus Finch and the limits of Southern liberalism.

  1.

  In 1954, when James (Big Jim) Folsom was running for a second term as governor of Alabama, he drove to Clayton, in Barbour County, to meet a powerful local probate judge. This was in the heart of the Deep South, at a time when Jim Crow was in full effect. In Barbour County, the races did not mix, and white men were expected to uphold the privileges of their gender and color. But when his car pulled up to the curb, where the judge was waiting, Folsom spotted two black men on the sidewalk. He jumped out, shook their hands heartily, and only then turned to the stunned judge. "All men are just alike," Folsom liked to say.

  Big Jim Folsom was six feet eight inches tall, and had the looks of a movie star. He was a prodigious drinker, and a brilliant campaigner, who travelled around the state with a hillbilly string band called the Strawberry Pickers. The press referred to him (not always affectionately) as Kissin' Jim, for his habit of grabbing the prettiest woman at hand. Folsom was far and away the dominant figure in postwar Alabama politics--and he was a prime example of that now rare species of progressive Southern populist.

  Folsom would end his speeches by brandishing a corn-shuck mop and promising a spring cleaning of the state capitol. He was against the Big Mules, as the entrenched corporate interests were known. He worked to extend the vote to disenfranchised blacks. He wanted to equalize salaries between white and black schoolteachers. He routinely commuted the death sentences of blacks convicted in what he believed were less than fair trials. He made no attempt to segregate the crowd at his inaugural address. "Y'all come," he would say to one and all, making a proud and lonely stand for racial justice.

  Big Jim Folsom left office in 1959. The next year, a young Southern woman published a novel set in mid-century Alabama about one man's proud and lonely stand for racial justice. The woman was Harper Lee and the novel was "To Kill a Mockingbird," and one way to make sense of Lee's classic--and of a controversy that is swirling around the book on the eve of its fiftieth anniversary--is to start with Big Jim Folsom.

  2.

  The Alabama of Folsom--and Lee--was marked by a profound localism. Political scientists call it the "friends and neighbors" effect. "Alabama voters rarely identified with candidates on the basis of issues," George Sims writes in his biography of Folsom, "The Little Man's Best Friend." "Instead, they tended to give greatest support to the candidate whose home was nearest their own." Alabama was made up of "island communities," each dominated by a small clique of power brokers, known as a "courthouse ring." There were no Republicans to speak of in the Alabama of that era, only Democrats. Politics was not ideological. It was personal. What it meant to be a racial moderate, in that context, was to push for an informal accommodation between black and white.

  "Big Jim did not seek a fundamental shift of political power or a revolution in social mores," Sims says. Folsom operated out of a sense of noblesse oblige: privileged whites, he believed, ought to "adopt a more humanitarian attitude" toward blacks. When the black Harlem congressman Adam Clayton Powell, Jr., came to Montgomery, on a voter-registration drive, Folsom invited him to the Governor's Mansion for a Scotch-and-soda. That was simply good manners. Whenever he was accused of being too friendly to black people, Folsom shrugged. His assumption was that Negroes were citizens, just like anyone else. "I just never did get all excited about our colored brothers," he once said. "We have had them here for three hundred years and we will have them for another three hundred years."

  Folsom was not a civil-rights activist. Activists were interested in using the full, impersonal force of the law to compel equality. In fact, the Supreme Court's landmark desegregation ruling in Brown v. Board of Education ended Folsom's career, because the racial backlash that it created drove moderates off the political stage. The historian Michael Klarman writes, "Virtually no southern politician could survive in this political environment without toeing the massive resistance line, and in most states politicians competed to occupy the most extreme position on the racial spectrum." Folsom lost his job to the segregationist John Patterson, who then gave way to the radical George Wallace. In Birmingham, which was quietly liberalizing through the early nineteen-fifties, Bull Connor (who notoriously set police dogs on civil-rights marchers in the nineteen-sixties) had been in political exile. It was the Brown decision that brought him back. Old-style Southern liberalism--gradual and paternalistic--crumbled in the face of liberalism in the form of an urgent demand for formal equality. Activism proved incompatible with Folsomism.

  On what side was Harper Lee's Atticus Finch? Finch defended Tom Robinson, the black man falsely accused of what in nineteen-thirties Alabama was the gravest of sins, the rape of a white woman. In the years since, he has become a role model for the legal profession. But he's much closer to Folsom's side of the race question than he is to the civil-rights activists who were arriving in the South as Lee wrote her novel.

  Think about the scene that serves as the book's centerpiece. Finch is at the front of the courtroom with Robinson. The jury files in. In the balcony, the book's narrator--Finch's daughter, Jean Louise, or Scout, as she's known--shuts her eyes. "Guilty," the first of the jurors says. "Guilty," the second says, and down the line: "guilty, guilty, guilty." Finch gathers his papers into his briefcase. He says a quiet word to his client, gathers his coat off the back of his chair, and walks, head bowed, out of the courtroom.

  "Someone was punching me, but I was reluctant to take my eyes from the people below us, and from the image of Atticus's lonely walk down the aisle," Scout relates, in one of American literature's most moving passages:

"Miss Jean Louise?"
I looked around. They were standing. All around us and in the balcony on the opposite wall, the Negroes were getting to their feet. Reverend Sykes's voice was as distant as Judge Taylor's:
"Miss Jean Louise, stand up. Your father's passin'."

  If Finch were a civil-rights hero, he would be brimming with rage at the unjust verdict. But he isn't. He's not Thurgood Marshall looking for racial salvation through the law. He's Jim Folsom, looking for racial salvation through hearts and minds.

  "If you can learn a simple trick, Scout, you'll get along a lot better with all kinds of folks," Finch tells his daughter. "You never really understand a person until you consider things from his point of view . . . until you climb into his skin and walk around in it." He is never anything but gracious to his neighbor Mrs. Dubose, even though she considers him a "nigger-lover." He forgives the townsfolk of Maycomb for the same reason. They are suffering from a "sickness," he tells Scout--the inability to see a black man as a real person. All men, he believes, are just alike.

  3.

  Here is where the criticism of Finch begins, because the hearts-and-minds approach is about accommodation, not reform. At one point, Scout asks him if it is O.K. to hate Hitler. Finch answers, firmly, that it is not O.K. to hate anyone. Really? Not even Hitler? When his children bring up the subject of the Ku Klux Klan's presence in Maycomb, he shrugs: "Way back about nineteen-twenty there was a Klan, but it was a political organization more than anything. Besides, they couldn't find anyone to scare. They paraded by Mr. Sam Levy's house one night, but Sam just stood on his porch and told 'em things had come to a pretty pass. . . . Sam made 'em so ashamed of themselves they went away." Someone in Finch's historical position would surely have been aware of the lynching of Leo Frank in Marietta, Georgia, in 1915. Frank was convicted, on dubious evidence, of murdering a thirteen-year-old girl, Mary Phagan. The prosecutor in the case compared Frank to Judas Iscariot, and the crowd outside the courthouse shouted, "Hang the Jew!" Anti-Semitism of the most virulent kind was embedded in the social fabric of the Old South. But Finch does not want to deal with the existence of anti-Semitism. He wants to believe in the fantasy of Sam Levy, down the street, giving the Klan a good scolding.

  In the middle of the novel, after Tom Robinson's arrest, Finch spends the night in front of the Maycomb jail, concerned that a mob might come down and try to take matters into its own hands. Sure enough, one does, led by a poor white farmer, Walter Cunningham. The mob eventually scatters, and the next morning Finch tries to explain the night's events to Scout. Here again is a test for Finch's high-minded equanimity. He likes Walter Cunningham. Cunningham is, to his mind, the right sort of poor white farmer: a man who refuses a W.P.A. handout and who scrupulously repays Finch for legal work with a load of stove wood, a sack of hickory nuts, and a crate of smilax and holly. Against this, Finch must weigh the fact that Cunningham also leads lynch mobs against black people. So what does he do? Once again, he puts personal ties first. Cunningham, Finch tells his daughter, is "basically a good man," who "just has his blind spots along with the rest of us." Blind spots? As the legal scholar Monroe Freedman has written, "It just happens that Cunningham's blind spot (along with the rest of us?) is a homicidal hatred of black people."

  Finch will stand up to racists. He'll use his moral authority to shame them into silence. He will leave the judge standing on the sidewalk while he shakes hands with Negroes. What he will not do is look at the problem of racism outside the immediate context of Mr. Cunningham, Mr. Levy, and the island community of Maycomb, Alabama.

  Folsom was the same way. He knew the frailties of his fellow-Alabamians when it came to race. But he could not grasp that those frailties were more than personal--that racism had a structural dimension. After he was elected governor a second time, in 1955, Folsom organized the first inaugural ball for blacks in Alabama's history. That's a very nice gesture. Yet it doesn't undermine segregation to give Negroes their own party. It makes it more palatable. Folsom's focus on the personal was also the reason that he was blindsided by Brown. He simply didn't have an answer to the Court's blunt and principled conclusion that separate was not equal. For a long time, Folsom simply ducked questions about integration. When he could no longer duck, he wriggled. And the wriggling wasn't attractive. Sims writes:

In the spring of 1955, he repeated portions of his campaign program that touched the issue of desegregation tangentially and claimed that he had already made his position "plain, simple, and clear." He frequently repeated his pledge that he would not force black children to go to school with white children. It was an ambiguous promise that sounded like the words of a segregationist without specifically opposing segregation. Speaking to the Alabama Education Association in 1955, the governor recommended a school construction bond issue and implied that the money would help prolong segregation by improving the physical facilities of Negro schools.

  4.

  One of Atticus Finch's strongest critics has been the legal scholar Steven Lubet, and Lubet's arguments are a good example of how badly the brand of Southern populism Finch represents has aged over the past fifty years. Lubet's focus is the main event of "To Kill a Mockingbird"--Finch's defense of Tom Robinson. In "Reconstructing Atticus Finch," in the Michigan Law Review, Lubet points out that Finch does not have a strong case. The putative rape victim, Mayella Ewell, has bruises on her face, and the supporting testimony of her father, Robert E. Lee Ewell. Robinson concedes that he was inside the Ewell house, and that some kind of sexual activity took place. The only potentially exculpatory evidence Finch can come up with is that Mayella's bruises are on the right side of her face while Robinson's left arm, owing to a childhood injury, is useless. Finch presents this fact with great fanfare. But, as Lubet argues, it's not exactly clear why a strong right-handed man can't hit a much smaller woman on the right side of her face. Couldn't she have turned her head? Couldn't he have hit her with a backhanded motion? Given the situation, Finch designs his defense, Lubet says, "to exploit a virtual catalog of misconceptions and fallacies about rape, each one calculated to heighten mistrust of the female complainant."

  Here is the crucial moment of Robinson's testimony. Under Finch's patient prodding, he has described how he was walking by the Ewell property when Mayella asked him to come inside, to help her dismantle a piece of furniture. The house, usually crowded with Mayella's numerous sisters and brothers, was empty. "I say where the chillun?" Robinson testifies, "an' she says--she was laughin', sort of--she says they all gone to town to get ice creams. She says, 'Took me a slap year to save seb'm nickels, but I done it. They all gone to town.' " She then asked him to stand on a chair and get a box down from the chifforobe. She "hugged him" around the waist. Robinson goes on:

"She reached up an' kissed me 'side of th' face. She says she never kissed a grown man before an' she might as well kiss a nigger. She says what her papa do to her don't count. She says, 'Kiss me back nigger.' I say Miss Mayella lemme outa here an' tried to run but she got her back to the door an' I'da had to push her. I didn't wanta harm her, Mr. Finch, an' I say lemme pass, but just when I say it Mr. Ewell yonder hollered through th' window."
"What did he say?"
. . . Tom Robinson shut his eyes tight. "He says you goddam whore, I'll kill ya."

  Mayella plotted for a year, saving her pennies so she could clear the house of her siblings. Then she lay in wait for Robinson, in the fervent hope that he would come by that morning. "She knew full well the enormity of her offense," Finch tells the jury, in his summation, "but because her desires were stronger than the code she was breaking, she persisted in breaking it." For a woman to be portrayed as a sexual aggressor in the Jim Crow South was a devastating charge. Lubet writes:

The "she wanted it" defense in this case was particularly harsh. Here is what it said about Mayella: She was so starved for sex that she spent an entire year scheming for a way to make it happen. She was desperate for a man, any man. She repeatedly grabbed at Tom and wouldn't let him go, barring the door when he respectfully tried to disentangle himself. And in case Mayella had any dignity left after all that, it had to be insinuated that she had sex with her father.

  It is useful, once again, to consider Finch's conduct in the light of the historical South of his time. The scholar Lisa Lindquist Dorr has examined two hundred and eighty-eight cases of black-on-white rape that occurred in Virginia between 1900 and 1960. Seventeen of the accused were killed through "extra legal violence"--that is to say, lynched. Fifty were executed. Forty-eight were given the maximum sentence. Fifty-two were sentenced to prison terms of five years or less, on charges ranging from rape and murder to robbery, assault and battery, or "annoying a white woman." Thirty-five either were acquitted or had their charges dismissed. A not inconsiderable number had their sentences commuted by the governor.

  Justice was administered unequally in the South: Dorr points out that of the dozens of rapists in Virginia who were sentenced to death between 1908 and 1963 (Virginia being one of the few states where both rape and attempted rape were capital crimes) none were white. Nonetheless, those statistics suggest that race was not always the overriding consideration in rape trials. "White men did not always automatically leap to the defense of white women," Dorr writes. "Some white men reluctantly sided with black men against white women whose class or sexual history they found suspect. Sometimes whites trusted the word of black men whose families they had known for generations over the sworn testimony of white women whose backgrounds were unknown or (even worse) known and despised. White women retained their status as innocent victim only as long as they followed the dictates of middle-class morality."

  One of Dorr's examples is John Mays, Jr., a black juvenile sentenced in 1923 to an eighteen-year prison term for the attempted rape of a white girl. His employer, A. A. Sizer, petitioned the Virginia governor for clemency, arguing that Mays, who was religious and educated, "comes of our best negro stock." His victim, meanwhile, "comes from our lowest breed of poor whites. . . . Her mother is utterly immoral and without principle; and this child has been accustomed from her very babyhood to behold scenes of the grossest immorality. None of our welfare work affects her, she is brazenly immoral."

  The reference to the mother was important. "Though Sizer did not directly impugn the victim herself, direct evidence was unnecessary during the heyday of eugenic family studies," Dorr writes. "The victim, coming from the same inferior 'stock,' would likely share her mother's moral character." The argument worked: Mays was released from prison in 1930.

  This is essentially the defense that Atticus Finch fashions for his client. Robinson is the churchgoer, the "good Negro." Mayella, by contrast, comes from the town's lowest breed of poor whites. "Every town the size of Maycomb had families like the Ewells," Scout tells us. "No truant officers could keep their numerous offspring in school; no public health officer could free them from congenital defects, various worms, and the diseases indigenous to filthy surroundings." They live in a shack behind the town dump, with windows that "were merely open spaces in the walls, which in the summertime were covered with greasy strips of cheesecloth to keep out the varmints that feasted on Maycomb's refuse." Bob Ewell is described as a "little bantam cock of a man" with a face as red as his neck, so unaccustomed to polite society that cleaning up for the trial leaves him with a "scalded look; as if an overnight soaking had deprived him of protective layers of dirt." His daughter, the complainant, is a "thick-bodied girl accustomed to strenuous labor." The Ewells are trash. When the defense insinuates that Mayella is the victim of incest at the hands of her father, it is not to make her a sympathetic figure. It is, in the eugenicist spirit of the times, to impugn her credibility--to do what A. A. Sizer did in the John Mays case: The victim, coming from the same inferior stock, would likely share her father's moral character. "I won't try to scare you for a while," Finch says, when he begins his cross-examination of Mayella. Then he adds, with polite menace, "Not yet."

  We are back in the embrace of Folsomism. Finch wants his white, male jurors to do the right thing. But as a good Jim Crow liberal he dare not challenge the foundations of their privilege. Instead, Finch does what lawyers for black men did in those days. He encourages them to swap one of their prejudices for another.

  5.

  One of George Orwell's finest essays takes Charles Dickens to task for his lack of "constructive suggestions." Dickens was a powerful critic of Victorian England, a proud and lonely voice in the campaign for social reform. But, as Orwell points out, there was little substance to Dickens's complaints. "He attacks the law, parliamentary government, the educational system and so forth, without ever clearly suggesting what he would put in their places," Orwell writes. "There is no clear sign that he wants the existing order to be overthrown, or that he believes it would make very much difference if it were overthrown. For in reality his target is not so much society as 'human nature.' " Dickens sought "a change of spirit rather than a change in structure."

  Orwell didn't think that Dickens should have written different novels; he loved Dickens. But he understood that Dickens bore the ideological marks of his time and place. His class did not see the English social order as tyrannical, worthy of being overthrown. Dickens thought that large contradictions could be tamed through small moments of justice. He believed in the power of changing hearts, and that's what you believe in, Orwell says, if you "do not wish to endanger the status quo."

  But in cases where the status quo involves systemic injustice this is no more than a temporary strategy. Eventually, such injustice requires more than a change of heart. "What in the world am I ever going to do with the Niggers?" Jim Folsom once muttered, when the backlash against Brown began to engulf his political career. The argument over race had risen to such a pitch that it could no longer be alleviated by gesture and symbolism--by separate but equal inaugural balls and hearty handshakes--and he was lost.

  Finch's moral test comes at the end of "To Kill a Mockingbird." Bob Ewell has been humiliated by the Robinson trial. In revenge, he attacks Scout and her brother on Halloween night. Boo Radley, the reclusive neighbor of the Finches, comes to the children's defense, and in the scuffle Radley kills Ewell. Sheriff Tate brings the news to Finch, and persuades him to lie about what actually happened; the story will be that Ewell inadvertently stabbed himself in the scuffle. As the Sheriff explains:

Maybe you'll say it's my duty to tell the town all about it and not hush it up. Know what'd happen then? All the ladies in Maycomb includin' my wife'd be knocking on his door bringing angel food cakes. To my way of thinkin', Mr. Finch, taking the one man who's done you and this town a great service an' draggin' him with his shy ways into the limelight--to me, that's a sin. It's a sin and I'm not about to have it on my head. If it was any other man it'd be different. But not this man, Mr. Finch.

  The courthouse ring had spoken. Maycomb would go back to the way it had always been.

  "Scout," Finch says to his daughter, after he and Sheriff Tate have cut their little side deal. "Mr. Ewell fell on his knife. Can you possibly understand?"

  Understand what? That her father and the Sheriff have decided to obstruct justice in the name of saving their beloved neighbor the burden of angel-food cake? Atticus Finch is faced with jurors who have one set of standards for white people like the Ewells and another set for black folk like Tom Robinson. His response is to adopt one set of standards for respectable whites like Boo Radley and another for white trash like Bob Ewell. A book that we thought instructed us about the world tells us, instead, about the limitations of Jim Crow liberalism in Maycomb, Alabama.

  Banks, battles, and the psychology of overconfidence.

  1.

  In 1996, an investor named Henry de Kwiatkowski sued Bear Stearns for negligence and breach of fiduciary duty. De Kwiatkowski had made--and then lost--hundreds of millions of dollars by betting on the direction of the dollar, and he blamed his bankers for his reversals. The district court ruled in de Kwiatkowski's favor, ultimately awarding him $164.5 million in damages. But Bear Stearns appealed--successfully--and in William D. Cohan's engrossing account of the fall of Bear Stearns, "House of Cards," the firm's former chairman and C.E.O. Jimmy Cayne tells the story of what happened on the day of the hearing:

  Their lead lawyer turned out to be about a 300-pound fag from Long Island . . . a really irritating guy who had cross-examined me and tried to kick the shit out of me in the lower court trial. Now when we walk into the courtroom for the appeal, they're arguing another case and we have to wait until they're finished. And I stopped this guy. I had to take a piss. I went into the bathroom to take a piss and came back and sat down. Then I see my blood enemy stand up and he's going to the bathroom. So I wait till he passes and then I follow him in and it's just he and I in the bathroom. And I said to him, "Today you're going to get your ass kicked, big." He ran out of the room. He thought I might have wanted to start it right there and then.

  At the time Cayne said this, Bear Stearns had spectacularly collapsed. The eighty-five-year-old investment bank, with its shiny new billion-dollar headquarters and its storied history, was swallowed whole by J. P. Morgan Chase. Cayne himself had lost close to a billion dollars. His reputation--forty years in the making--was in ruins, especially when it came out that, during Bear's final, critical months, he'd spent an inordinate amount of time on the golf course.

  Did Cayne think long and hard about how he wanted to make his case to Cohan? He must have. Cayne understood selling; he started out as a photocopier salesman, working the nine-hundred-mile stretch between Boise and Salt Lake City, and ended up among the highest-paid executives in banking. He was known as one of the savviest men on the Street, a master tactician, a brilliant gamesman. "Jimmy had it all," Bill Bamber, a former Bear senior managing director, writes in "Bear Trap: The Fall of Bear Stearns and the Panic of 2008" (a book co-written by Andrew Spencer). "The ability to read an opponent. The ability to objectively analyze his own strengths and weaknesses. . . . He knew how to exploit others' weaknesses--and their strengths, for that matter--as a way to further his own gain. He knew when to take his losses and live to fight another day."

  Cohan asked Cayne about the last days of Bear Stearns, in the spring of 2008. Wall Street had become so spooked by rumors about the firm's financial status that investors withdrew their capital, and no one would lend Bear the money required for its day-to-day operations. The bank received some government money, via J. P. Morgan. But Timothy Geithner, then the head of the New York Federal Reserve Bank, didn't open the Fed's so-called "discount window" to investment banks until J. P. Morgan's acquisition of Bear was under way. What did Cayne think of Geithner? Picture the scene. The journalist in one chair, Cayne in another. Between them, a tape recorder. And the savviest man on Wall Street sets out to salvage his good name:

The audacity of that prick in front of the American people announcing he was deciding whether or not a firm of this stature and this whatever was good enough to get a loan. Like he was the determining factor, and it's like a flea on his back, floating down underneath the Golden Gate Bridge, getting a hard-on, saying, "Raise the bridge." This guy thinks he's got a big dick. He's got nothing, except maybe a boyfriend.

  2.

  Since the beginning of the financial crisis, there have been two principal explanations for why so many banks made such disastrous decisions. The first is structural. Regulators did not regulate. Institutions failed to function as they should. Rules and guidelines were either inadequate or ignored. The second explanation is that Wall Street was incompetent, that the traders and investors didn't know enough, that they made extravagant bets without understanding the consequences. But the first wave of postmortems on the crash suggests a third possibility: that the roots of Wall Street's crisis were not structural or cognitive so much as they were psychological.

  In "Military Misfortunes," the historians Eliot Cohen and John Gooch offer, as a textbook example of this kind of failure, the British-led invasion of Gallipoli, in 1915. Gallipoli is a peninsula in southern Turkey, jutting out into the Aegean. The British hoped that by landing an army there they could make an end run around the stalemate on the Western Front, and give themselves a clear shot at the soft underbelly of Germany. It was a brilliant and daring strategy. "In my judgment, it would have produced a far greater effect upon the whole conduct of the war than anything [else]," the British Prime Minister H. H. Asquith later concluded. But the invasion ended in disaster, and Cohen and Gooch find the roots of that disaster in the curious complacency displayed by the British.

  The invasion required a large-scale amphibious landing, something the British had little experience with. It then required combat against a foe dug into ravines and rocky outcroppings and hills and thickly vegetated landscapes that Cohen and Gooch call "one of the finest natural fortresses in the world." Yet the British never bothered to draw up a formal plan of operations. The British military leadership had originally estimated that the Allies would need a hundred and fifty thousand troops to take Gallipoli. Only seventy thousand were sent. The British troops should have had artillery--more than three hundred guns. They took a hundred and eighteen, and, for the most part, neglected to bring howitzers, trench mortars, or grenades. Command of the landing at Suvla--the most critical element of the attack--was given to Frederick Stopford, a retired officer whose experience was largely administrative. Stopford had two days during which he had a ten-to-one advantage over the Turks and could easily have seized the highlands overlooking the bay. Instead, his troops lingered on the beach, while Stopford lounged offshore, aboard a command ship. Winston Churchill later described the scene as "the placid, prudent, elderly English gentleman with his 20,000 men spread around the beaches, the front lines sitting on the tops of shallow trenches, smoking and cooking, with here and there an occasional rifle shot, others bathing by hundreds in the bright blue bay where, disturbed hardly by a single shell, floated the great ships of war." When word of Stopford's ineptitude reached the British commander, Sir Ian Hamilton, he rushed to Suvla Bay to intercede--although "rushed" may not be quite the right word here, since Hamilton had chosen to set up his command post on an island an hour away and it took him a good while to find a boat to take him to the scene.

  Cohen and Gooch ascribe the disaster at Gallipoli to a failure to adapt--a failure to take into account how reality did not conform to their expectations. And behind that failure to adapt was a deeply psychological problem: the British simply couldn't wrap their heads around the fact that they might have to adapt. "Let me bring my lads face to face with Turks in the open field," Hamilton wrote in his diary before the attack. "We must beat them every time because British volunteer soldiers are superior individuals to Anatolians, Syrians or Arabs and are animated with a superior ideal and an equal joy in battle."

  Hamilton was not a fool. Cohen and Gooch call him an experienced and "brilliant commander who was also a first-rate trainer of men and a good organizer." Nor was he entirely wrong in his assessments. The British probably were a superior fighting force. Certainly they were more numerous, especially when they held that ten-to-one advantage at Suvla Bay. Hamilton, it seems clear, was simply overconfident--and one of the things that happen to us when we become overconfident is that we start to blur the line between the kinds of things that we can control and the kinds of things that we can't. The psychologist Ellen Langer once had subjects engage in a betting game against either a self-assured, well-dressed opponent or a shy and badly dressed opponent (in Langer's delightful phrasing, the "dapper" or the "schnook" condition), and she found that her subjects bet far more aggressively when they played against the schnook. They looked at their awkward opponent and thought, I'm better than he is. Yet the game was pure chance: all the players did was draw cards at random from a deck, and see who had the high hand. This is called the "illusion of control": confidence spills over from areas where it may be warranted ("I'm savvier than that schnook") to areas where it isn't warranted at all ("and that means I'm going to draw higher cards").

  At Gallipoli, the British acted as if their avowed superiority over the Turks gave them superiority over all aspects of the contest. They neglected to take into account the fact that the morning sun would be directly in the eyes of the troops as they stormed ashore. They didn't bring enough water. They didn't factor in the harsh terrain. "The attack was based on two assumptions," Cohen and Gooch write, "both of which turned out to be unwise: that the only really difficult part of the operation would be getting ashore, after which the Turks could easily be pushed off the peninsula; and that the main obstacles to a happy landing would be provided by the enemy."

  Most people are inclined to use moral terms to describe overconfidence--terms like "arrogance" or "hubris." But psychologists tend to regard overconfidence as a state as much as a trait. The British at Gallipoli were victims of a situation that promoted overconfidence. Langer didn't say that it was only arrogant gamblers who upped their bets in the presence of the schnook. She argues that this is what competition does to all of us; because ability makes a difference in competitions of skill, we make the mistake of thinking that it must also make a difference in competitions of pure chance. Other studies have reached similar conclusions. As novices, we don't trust our judgment. Then we have some success, and begin to feel a little surer of ourselves. Finally, we get to the top of our game and succumb to the trap of thinking that there's nothing we can't master. As we get older and more experienced, we overestimate the accuracy of our judgments, especially when the task before us is difficult and when we're involved with something of great personal importance. The British were overconfident at Gallipoli not because Gallipoli didn't matter but, paradoxically, because it did; it was a high-stakes contest, of daunting complexity, and it is often in those circumstances that overconfidence takes root.

  Several years ago, a team headed by the psychologist Mark Fenton-O'Creevy created a computer program that mimicked the ups and downs of an index like the Dow, and recruited, as subjects, members of a highly paid profession. As the line moved across the screen, Fenton-O'Creevy asked his subjects to press a series of buttons, which, they were told, might or might not affect the course of the line. At the end of the session, they were asked to rate their effectiveness in moving the line upward. The buttons had no effect at all on the line. But many of the players were convinced that their manipulation of the buttons made the index go up and up. The world these people inhabited was competitive and stressful and complex. They had been given every reason to be confident in their own judgments. If they sat down next to you, with a tape recorder, it wouldn't take much for them to believe that they had you in the palm of their hand. They were traders at an investment bank.

  3.

  The high-water mark for Bear Stearns was 2003. The dollar was falling. A wave of scandals had just swept through the financial industry. The stock market was in a swoon. But Bear Stearns was an exception. In the first quarter of that year, its earnings jumped fifty-five per cent. Its return on equity was the highest on Wall Street. The firm's mortgage business was booming. Since Bear Stearns's founding, in 1923, it had always been a kind of also-ran to its more blue-chip counterparts, like Goldman Sachs and Morgan Stanley. But that year Fortune named it the best financial company to work for. "We are hitting on all 99 cylinders," Jimmy Cayne told a reporter for the Times, in the spring of that year, "so you have to ask yourself, What can we do better? And I just can't decide what that might be." He went on, "Everyone says that when the markets turn around, we will suffer. But let me tell you, we are going to surprise some people this time around. Bear Stearns is a great place to be."

  With the benefit of hindsight, Cayne's words read like the purest hubris. But in 2003 they would have seemed banal. These are the kinds of things that bankers say. More precisely--and here is where psychological failure becomes more problematic still--these are the kinds of things that bankers are expected to say. Investment banks are able to borrow billions of dollars and make huge trades because, at the end of the day, their counterparties believe they are capable of making good on their promises. Wall Street is a confidence game, in the strictest sense of that phrase.

  This is what social scientists mean when they say that human overconfidence can be an "adaptive trait." "In conflicts involving mutual assessment, an exaggerated assessment of the probability of winning increases the probability of winning," Richard Wrangham, a biological anthropologist at Harvard, writes. "Selection therefore favors this form of overconfidence." Winners know how to bluff. And who bluffs the best? The person who, instead of pretending to be stronger than he is, actually believes himself to be stronger than he is. According to Wrangham, self-deception reduces the chances of "behavioral leakage"; that is, of "inadvertently revealing the truth through an inappropriate behavior." This much is in keeping with what some psychologists have been telling us for years--that it can be useful to be especially optimistic about how attractive our spouse is, or how marketable our new idea is. In the words of the social psychologist Roy Baumeister, humans have an "optimal margin of illusion."

  If you were a Wall Street C.E.O., there were two potential lessons to be drawn from the collapse of Bear Stearns. The first was that Jimmy Cayne was overconfident. The second was that Jimmy Cayne wasn't overconfident enough. Bear Stearns did not collapse, after all, simply because it had made bad bets. Until very close to the end, the firm had a capital cushion of more than seventeen billion dollars. The problem was that when, in early 2008, Cayne and his colleagues stood up and said that Bear was a great place to be, the rest of Wall Street no longer believed them. Clients withdrew their money, and lenders withheld funding. As the run on Bear Stearns worsened, J. P. Morgan and the Fed threw the bank a lifeline--a multibillion-dollar line of credit. But confidence matters so much on Wall Street that the lifeline had the opposite of its intended effect. As Bamber writes:

This line-of-credit, the stop-gap measure that was supposed to solve the problem that hadn't really existed in the first place had done nothing but worsen it. When we started the week, we had no liquidity issues. But because people had said that we did have problems with our capital, it became true, even though it wasn't true when people started saying it. . . . So we were forced to find capital to offset the losses we'd sustained because somebody decided we didn't have capital when we really did. So when we finally got more capital to replace the capital we'd lost, people took that as a bad sign and pointed to the fact that we'd had no capital and had to get a loan to cover it, even when we did have the capital they said we didn't have.

  Of course, one reason that overconfidence is so difficult to eradicate from expert fields like finance is that, at least some of the time, it's useful to be overconfident--or, more precisely, sometimes the only way to get out of the problems caused by overconfidence is to be even more overconfident.

  From an individual perspective, it is hard to distinguish between the times when excessive optimism is good and the times when it isn't. All that we can say unequivocally is that overconfidence is, as Wrangham puts it, "globally maladaptive." When one opponent bluffs, he can score an easy victory. But when everyone bluffs, Wrangham writes, rivals end up "escalating conflicts that only one can win and suffering higher costs than they should if assessment were accurate." The British didn't just think the Turks would lose in Gallipoli; they thought that Belgium would prove to be an obstacle to Germany's advance, and that the Russians would crush the Germans in the east. The French, for their part, planned to be at the Rhine within six weeks of the start of the war, while the Germans predicted that by that point they would be on the outskirts of Paris. Every side in the First World War was bluffing, with the resolve and skill that only the deluded are capable of, and the results, of course, were catastrophic.

  4.

  Jimmy Cayne grew up in Chicago, the son of a patent lawyer. He wanted to be a bookie, but he realized that it wasn't quite respectable enough. He went to Purdue University to study mechanical engineering--and became hooked on bridge. His grades suffered, and he never graduated. He got married in 1956 and was divorced within four years. "At this time, he was one of the best bridge players in Chicago," his ex-brother-in-law told Cohan. "In fact, that's the reason for the divorce. There was no other woman or anything like that. The co-respondent in their divorce was bridge. He spent all of his time playing bridge--every night. He wasn't home." He was selling scrap metal in those days, and, Cohan says, he would fall asleep on the job, exhausted from playing cards. In 1964, he moved to New York to become a professional bridge player. It was bridge that led him to his second wife, and to a job interview with Alan (Ace) Greenberg, then a senior executive at Bear Stearns. When Cayne told Greenberg that he was a bridge player, Cayne tells Cohan, "you could see the electric light bulb." Cayne goes on:

[Greenberg] says, "How well do you play?" I said, "I play well." He said, "Like how well?" I said, "I play quite well." He says, "You don't understand." I said, "Yeah, I do. I understand. Mr. Greenberg, if you study bridge the rest of your life, if you play with the best partners and you achieve your potential, you will never play bridge like I play bridge."

  Right then and there, Cayne says, Greenberg offered him a job.

  Twenty years later, the scene was repeated with Warren Spector, who went on to become a co-president of the firm. Spector had been a bridge champion as a student, and Cayne somehow heard about it. "Suddenly, out of nowhere there's a bridge player at Bear Stearns on the bond desk," Cayne recalls. Spector tells Cohan, "He called me up and said, 'Are you a bridge player?' I said, 'I used to be.' So bridge was something that he, Ace, and I all shared and talked about." As reports circulated that two of Bear Stearns's hedge funds were going under--a failure that started the bank on its long, downward spiral into collapse--Spector and Cayne were attending the Spingold K.O. bridge tournament, in Nashville. The Wall Street Journal reported that, of the twenty-one workdays that month, Cayne was out of the office for nearly half of them.

  It makes sense that there should be an affinity between bridge and the business of Wall Street. Bridge is a contest between teams, each of which competes over a contract--how many tricks they think they can win in a given hand. Winning requires knowledge of the cards, an accurate sense of probabilities, steely nerves, and the ability to assess an opponent's psychology. Bridge is Wall Street in miniature, and the reason the light bulb went on when Greenberg looked at Cayne, and Cayne looked at Spector, is surely that they assumed that bridge skills could be transferred to the trading floor--that being good at the game version of Wall Street was a reasonable proxy for being good at the real-life version of Wall Street.

  It isn't, however. In bridge, there is such a thing as expertise unencumbered by bias. That's because, as the psychologist Gideon Keren points out, bridge involves "related items with continuous feedback." It has rules and boundaries and situations that repeat themselves and clear patterns that develop--and when a player makes a mistake of overconfidence he or she learns of the consequences of that mistake almost immediately. In other words, it's a game. But running an investment bank is not, in this sense, a game: it is not a closed world with a limited set of possibilities. It is an open world where one day a calamity can happen that no one had dreamed could happen, and where you can make a mistake of overconfidence and not personally feel the consequences for years and years--if at all. Perhaps this is part of why we play games: there is something intoxicating about pure expertise, and the real mastery we can attain around a card table or behind the wheel of a racecar emboldens us when we move into the more complex realms. "I'm good at that. I must be good at this, too," we tell ourselves, forgetting that in wars and on Wall Street there is no such thing as absolute expertise, that every step taken toward mastery brings with it an increased risk of mastery's curse. Cayne must have come back from the Spingold bridge tournament fortified in his belief in his own infallibility. And the striking thing about his conversations with Cohan is that nothing that had happened since seemed to have shaken that belief.

  "When I left," Cayne told Cohan, speaking of his final day at Bear Stearns, "I had three different meetings. The first was with the president's advisory group, which was about eighty people. There wasn't a dry eye. Standing ovation. I was crying." Until the very end, he evidently saw the world that he wanted to see. "The second meeting was with the retail sales force on the Web," he goes on. "Standing ovation. And the third was a partners' meeting that night for me to tell them that I was stepping down. Standing ovation, of the whole auditorium."

  Is free the future?

  1.

  At a hearing on Capitol Hill in May, James Moroney, the publisher of the Dallas Morning News, told Congress about negotiations he'd just had with the online retailer Amazon. The idea was to license his newspaper's content to the Kindle, Amazon's new electronic reader. "They want seventy per cent of the subscription revenue," Moroney testified. "I get thirty per cent, they get seventy per cent. On top of that, they have said we get the right to republish your intellectual property to any portable device." The idea was that if a Kindle subscription to the Dallas Morning News cost ten dollars a month, seven dollars of that belonged to Amazon, the provider of the gadget on which the news was read, and just three dollars belonged to the newspaper, the provider of an expensive and ever-changing variety of editorial content. The people at Amazon valued the newspaper's contribution so little, in fact, that they felt they ought then to be able to license it to anyone else they wanted. Another witness at the hearing, Arianna Huffington, of the Huffington Post, said that she thought the Kindle could provide a business model to save the beleaguered newspaper industry. Moroney disagreed. "I get thirty per cent and they get the right to license my content to any portable device--not just ones made by Amazon?" He was incredulous. "That, to me, is not a model."

  Had James Moroney read Chris Anderson's new book, "Free: The Future of a Radical Price" (Hyperion; $26.99), Amazon's offer might not have seemed quite so surprising. Anderson is the editor of Wired and the author of the 2006 best-seller "The Long Tail," and "Free" is essentially an extended elaboration of Stewart Brand's famous declaration that "information wants to be free." The digital age, Anderson argues, is exerting an inexorable downward pressure on the prices of all things "made of ideas." Anderson does not consider this a passing trend. Rather, he seems to think of it as an iron law: "In the digital realm you can try to keep Free at bay with laws and locks, but eventually the force of economic gravity will win." To musicians who believe that their music is being pirated, Anderson is blunt. They should stop complaining, and capitalize on the added exposure that piracy provides by making money through touring, merchandise sales, and "yes, the sale of some of [their] music to people who still want CDs or prefer to buy their music online." To the Dallas Morning News, he would say the same thing. Newspapers need to accept that content is never again going to be worth what they want it to be worth, and reinvent their business. "Out of the bloodbath will come a new role for professional journalists," he predicts, and he goes on:

  There may be more of them, not fewer, as the ability to participate in journalism extends beyond the credentialed halls of traditional media. But they may be paid far less, and for many it won't be a full time job at all. Journalism as a profession will share the stage with journalism as an avocation. Meanwhile, others may use their skills to teach and organize amateurs to do a better job covering their own communities, becoming more editor/coach than writer. If so, leveraging the Free--paying people to get other people to write for non-monetary rewards--may not be the enemy of professional journalists. Instead, it may be their salvation.

  Anderson is very good at paragraphs like this--with its reassuring arc from "bloodbath" to "salvation." His advice is pithy, his tone uncompromising, and his subject matter perfectly timed for a moment when old-line content providers are desperate for answers. That said, it is not entirely clear what distinction is being marked between "paying people to get other people to write" and paying people to write. If you can afford to pay someone to get other people to write, why can't you pay people to write? It would be nice to know, as well, just how a business goes about reorganizing itself around getting people to work for "non-monetary rewards." Does he mean that the New York Times should be staffed by volunteers, like Meals on Wheels? Anderson's reference to people who "prefer to buy their music online" carries the faint suggestion that refraining from theft should be considered a mere preference. And then there is his insistence that the relentless downward pressure on prices represents an iron law of the digital economy. Why is it a law? Free is just another price, and prices are set by individual actors, in accordance with the aggregated particulars of marketplace power. "Information wants to be free," Anderson tells us, "in the same way that life wants to spread and water wants to run downhill." But information can't actually want anything, can it? Amazon wants the information in the Dallas paper to be free, because that way Amazon makes more money. Why are the self-interested motives of powerful companies being elevated to a philosophical principle? But we are getting ahead of ourselves.

  2.

  Anderson's argument begins with a technological trend. The cost of the building blocks of all electronic activity--storage, processing, and bandwidth--has fallen so far that it is now approaching zero. In 1961, Anderson says, a single transistor was ten dollars. In 1963, it was five dollars. By 1968, it was one dollar. Today, Intel will sell you two billion transistors for eleven hundred dollars--meaning that the cost of a single transistor is now about 0.000055 cents.
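  Anderson's arithmetic is easy to reproduce. Here is a minimal sketch in Python that uses only the two figures he cites, the eleven-hundred-dollar price and the two-billion-transistor count; nothing else is assumed:

    # Reproducing the per-transistor arithmetic from the figures quoted above.
    price_in_dollars = 1100.0        # Anderson's price for the batch
    transistors = 2_000_000_000      # two billion transistors in that batch

    cents_per_transistor = price_in_dollars / transistors * 100
    print(f"{cents_per_transistor:.6f} cents per transistor")  # prints 0.000055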

  Anderson's second point is that when prices hit zero extraordinary things happen. Anderson describes an experiment conducted by the M.I.T. behavioral economist Dan Ariely, the author of "Predictably Irrational." Ariely offered a group of subjects a choice between two kinds of chocolate--Hershey's Kisses, for one cent, and Lindt truffles, for fifteen cents. Three-quarters of the subjects chose the truffles. Then he redid the experiment, reducing the price of both chocolates by one cent. The Kisses were now free. What happened? The order of preference was reversed. Sixty-nine per cent of the subjects chose the Kisses. The price difference between the two chocolates was exactly the same, but that magic word "free" has the power to create a consumer stampede. Amazon has had the same experience with its offer of free shipping for orders over twenty-five dollars. The idea is to induce you to buy a second book, if your first book comes in at less than the twenty-five-dollar threshold. And that's exactly what it does. In France, however, the offer was mistakenly set at the equivalent of twenty cents--and consumers didn't buy the second book. "From the consumer's perspective, there is a huge difference between cheap and free," Anderson writes. "Give a product away, and it can go viral. Charge a single cent for it and you're in an entirely different business. . . . The truth is that zero is one market and any other price is another."

  Since the falling costs of digital technology let you make as much stuff as you want, Anderson argues, and the magic of the word "free" creates instant demand among consumers, Free (Anderson honors it with a capital) represents an enormous business opportunity. Companies ought to be able to make huge amounts of money "around" the thing being given away--as Google gives away its search and e-mail and makes its money on advertising.

  Anderson cautions that this philosophy of embracing the Free involves moving from a "scarcity" mind-set to an "abundance" mind-set. Giving something away means that a lot of it will be wasted. But because it costs almost nothing to make things, digitally, we can afford to be wasteful. The elaborate mechanisms we set up to monitor and judge the quality of content are, Anderson thinks, artifacts of an era of scarcity: we had to worry about how to allocate scarce resources like newsprint and shelf space and broadcast time. Not anymore. Look at YouTube, he says, the free video archive owned by Google. YouTube lets anyone post a video to its site free, and lets anyone watch a video on its site free, and it doesn't have to pass judgment on the quality of the videos it archives. "Nobody is deciding whether a video is good enough to justify the scarce channel space it takes, because there is no scarce channel space," he writes, and goes on:

Distribution is now close enough to free to round down. Today, it costs about $0.25 to stream one hour of video to one person. Next year, it will be $0.15. A year later it will be less than a dime. Which is why YouTube's founders decided to give it away. . . . The result is both messy and runs counter to every instinct of a television professional, but this is what abundance both requires and demands.

  There are four strands of argument here: a technological claim (digital infrastructure is effectively Free), a psychological claim (consumers love Free), a procedural claim (Free means never having to make a judgment), and a commercial claim (the market created by the technological Free and the psychological Free can make you a lot of money). The only problem is that in the middle of laying out what he sees as the new business model of the digital age Anderson is forced to admit that one of his main case studies, YouTube, "has so far failed to make any money for Google."

  Why is that? Because of the very principles of Free that Anderson so energetically celebrates. When you let people upload and download as many videos as they want, lots of them will take you up on the offer. That's the magic of Free psychology: an estimated seventy-five billion videos will be served up by YouTube this year. Although the magic of Free technology means that the cost of serving up each video is "close enough to free to round down," "close enough to free" multiplied by seventy-five billion is still a very large number. A recent report by Credit Suisse estimates that YouTube's bandwidth costs in 2009 will be three hundred and sixty million dollars. In the case of YouTube, the effects of technological Free and psychological Free work against each other.
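  Put the two figures in that paragraph together and both halves of the problem are visible at once. A rough sketch in Python follows; the only inputs are the Credit Suisse bandwidth estimate and the estimated number of videos served, and it assumes, crudely, that every video costs the same to deliver:

    # Back-of-the-envelope: the per-video cost implied by the figures above,
    # and what it adds up to at YouTube's scale. Treats every video as equally
    # costly to serve, which is of course a simplification.
    bandwidth_cost = 360_000_000      # Credit Suisse's 2009 bandwidth estimate, in dollars
    videos_served = 75_000_000_000    # estimated videos served this year

    per_video = bandwidth_cost / videos_served
    print(f"${per_video:.4f} per video")                  # about half a cent: "close enough to free"
    print(f"${per_video * videos_served:,.0f} in total")  # ...times seventy-five billion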

  So how does YouTube bring in revenue? Well, it tries to sell advertisements alongside its videos. The problem is that the videos attracted by psychological Free--pirated material, cat videos, and other forms of user-generated content--are not the sort of thing that advertisers want to be associated with. In order to sell advertising, YouTube has had to buy the rights to professionally produced content, such as television shows and movies. Credit Suisse put the cost of those licenses in 2009 at roughly two hundred and sixty million dollars. For Anderson, YouTube illustrates the principle that Free removes the necessity of aesthetic judgment. (As he puts it, YouTube proves that "crap is in the eye of the beholder.") But, in order to make money, YouTube has been obliged to pay for programs that aren't crap. To recap: YouTube is a great example of Free, except that Free technology ends up not being Free because of the way consumers respond to Free, fatally compromising YouTube's ability to make money around Free, and forcing it to retreat from the "abundance thinking" that lies at the heart of Free. Credit Suisse estimates that YouTube will lose close to half a billion dollars this year. If it were a bank, it would be eligible for TARP funds.

  3.

  Anderson begins the second part of his book by quoting Lewis Strauss, the former head of the Atomic Energy Commission, who famously predicted in the mid-nineteen-fifties that "our children will enjoy in their homes electrical energy too cheap to meter."

  "What if Strauss had been right?" Anderson wonders, and then diligently sorts through the implications: as much fresh water as you could want, no reliance on fossil fuels, no global warming, abundant agricultural production. Anderson wants to take "too cheap to meter" seriously, because he believes that we are on the cusp of our own "too cheap to meter" revolution with computer processing, storage, and bandwidth. But here is the second and broader problem with Anderson's argument: he is asking the wrong question. It is pointless to wonder what would have happened if Strauss's prediction had come true while rushing past the reasons that it could not have come true.

  Strauss's optimism was driven by the fuel cost of nuclear energy--which was so low compared with its fossil-fuel counterparts that he considered it (to borrow Anderson's phrase) close enough to free to round down. Generating and distributing electricity, however, requires a vast and expensive infrastructure of transmission lines and power plants--and it is this infrastructure that accounts for most of the cost of electricity. Fuel prices are only a small part of that. As Gordon Dean, Strauss's predecessor at the A.E.C., wrote, "Even if coal were mined and distributed free to electric generating plants today, the reduction in your monthly electricity bill would amount to but twenty per cent, so great is the cost of the plant itself and the distribution system."

  This is the kind of error that technological utopians make. They assume that their particular scientific revolution will wipe away all traces of its predecessors--that if you change the fuel you change the whole system. Strauss went on to forecast "an age of peace," jumping from atoms to human hearts. "As the world of chips and glass fibers and wireless waves goes, so goes the rest of the world," Kevin Kelly, another Wired visionary, proclaimed at the start of his 1998 digital manifesto, "New Rules for the New Economy," offering up the same non sequitur. And now comes Anderson. "The more products are made of ideas, rather than stuff, the faster they can get cheap," he writes, and we know what's coming next: "However, this is not limited to digital products." Just look at the pharmaceutical industry, he says. Genetic engineering means that drug development is poised to follow the same learning curve of the digital world, to "accelerate in performance while it drops in price."

  But, like Strauss, he's forgotten about the plants and the power lines. The expensive part of making drugs has never been what happens in the laboratory. It's what happens after the laboratory: the clinical testing, which can take years and cost hundreds of millions of dollars. In the pharmaceutical world, what's more, companies have chosen to use the potential of new technology to do something very different from their counterparts in Silicon Valley. They've been trying to find a way to serve smaller and smaller markets--to create medicines tailored to very specific subpopulations and strains of diseases--and smaller markets often mean higher prices. The biotechnology company Genzyme spent five hundred million dollars developing the drug Myozyme, which is intended for a condition, Pompe disease, that afflicts fewer than ten thousand people worldwide. That's the quintessential modern drug: a high-tech, targeted remedy that took a very long and costly path to market. Myozyme is priced at three hundred thousand dollars a year. Genzyme isn't a mining company: its real assets are intellectual property--information, not stuff. But, in this case, information does not want to be free. It wants to be really, really expensive.

  And there's plenty of other information out there that has chosen to run in the opposite direction from Free. The Times gives away its content on its Web site. But the Wall Street Journal has found that more than a million subscribers are quite happy to pay for the privilege of reading online. Broadcast television--the original practitioner of Free--is struggling. But premium cable, with its stiff monthly charges for specialty content, is doing just fine. Apple may soon make more money selling iPhone downloads (ideas) than it does from the iPhone itself (stuff). The company could one day give away the iPhone to boost downloads; it could give away the downloads to boost iPhone sales; or it could continue to do what it does now, and charge for both. Who knows? The only iron law here is the one too obvious to write a book about, which is that the digital age has so transformed the ways in which things are made and sold that there are no iron laws.

  When underdogs break the rules.

1.

  When Vivek Ranadivé decided to coach his daughter Anjali's basketball team, he settled on two principles. The first was that he would never raise his voice. This was National Junior Basketball--the Little League of basketball. The team was made up mostly of twelve-year-olds, and twelve-year-olds, he knew from experience, did not respond well to shouting. He would conduct business on the basketball court, he decided, the same way he conducted business at his software firm. He would speak calmly and softly, and convince the girls of the wisdom of his approach with appeals to reason and common sense.

  The second principle was more important. Ranadivé was puzzled by the way Americans played basketball. He is from Mumbai. He grew up with cricket and soccer. He would never forget the first time he saw a basketball game. He thought it was mindless. Team A would score and then immediately retreat to its own end of the court. Team B would inbound the ball and dribble it into Team A's end, where Team A was patiently waiting. Then the process would reverse itself. A basketball court was ninety-four feet long. But most of the time a team defended only about twenty-four feet of that, conceding the other seventy feet. Occasionally, teams would play a full-court press--that is, they would contest their opponent's attempt to advance the ball up the court. But they would do it for only a few minutes at a time. It was as if there were a kind of conspiracy in the basketball world about the way the game ought to be played, and Ranadivé thought that that conspiracy had the effect of widening the gap between good teams and weak teams. Good teams, after all, had players who were tall and could dribble and shoot well; they could crisply execute their carefully prepared plays in their opponent's end. Why, then, did weak teams play in a way that made it easy for good teams to do the very things that made them so good?

  Ranadivé looked at his girls. Morgan and Julia were serious basketball players. But Nicky, Angela, Dani, Holly, Annika, and his own daughter, Anjali, had never played the game before. They weren't all that tall. They couldn't shoot. They weren't particularly adept at dribbling. They were not the sort who played pickup games at the playground every evening. Most of them were, as Ranadivé says, "little blond girls" from Menlo Park and Redwood City, the heart of Silicon Valley. These were the daughters of computer programmers and people with graduate degrees. They worked on science projects, and read books, and went on ski vacations with their parents, and dreamed about growing up to be marine biologists. Ranadivé knew that if they played the conventional way--if they let their opponents dribble the ball up the court without opposition--they would almost certainly lose to the girls for whom basketball was a passion. Ranadivé came to America as a seventeen-year-old, with fifty dollars in his pocket. He was not one to accept losing easily. His second principle, then, was that his team would play a real full-court press, every game, all the time. The team ended up at the national championships. "It was really random," Anjali Ranadivé said. "I mean, my father had never played basketball before."

  2.

  David's victory over Goliath, in the Biblical account, is held to be an anomaly. It was not. Davids win all the time. The political scientist Ivan Arreguín-Toft recently looked at every war fought in the past two hundred years between strong and weak combatants. The Goliaths, he found, won in 71.5 per cent of the cases. That is a remarkable fact. Arreguín-Toft was analyzing conflicts in which one side was at least ten times as powerful--in terms of armed might and population--as its opponent, and even in those lopsided contests the underdog won almost a third of the time.

  In the Biblical story of David and Goliath, David initially put on a coat of mail and a brass helmet and girded himself with a sword: he prepared to wage a conventional battle of swords against Goliath. But then he stopped. "I cannot walk in these, for I am unused to it," he said (in Robert Alter's translation), and picked up those five smooth stones. What happened, Arreguín-Toft wondered, when the underdogs likewise acknowledged their weakness and chose an unconventional strategy? He went back and re-analyzed his data. In those cases, David's winning percentage went from 28.5 to 63.6. When underdogs choose not to play by Goliath's rules, they win, Arreguín-Toft concluded, "even when everything we think we know about power says they shouldn't."

  Consider the way T. E. Lawrence (or, as he is better known, Lawrence of Arabia) led the revolt against the Ottoman Army occupying Arabia near the end of the First World War. The British were helping the Arabs in their uprising, and the initial focus was Medina, the city at the end of a long railroad that the Turks had built, running south from Damascus and down through the Hejaz desert. The Turks had amassed a large force in Medina, and the British leadership wanted Lawrence to gather the Arabs and destroy the Turkish garrison there, before the Turks could threaten the entire region.

  But when Lawrence looked at his ragtag band of Bedouin fighters he realized that a direct attack on Medina would never succeed. And why did taking the city matter, anyway? The Turks sat in Medina "on the defensive, immobile." There were so many of them, consuming so much food and fuel and water, that they could hardly make a major move across the desert. Instead of attacking the Turks at their point of strength, Lawrence reasoned, he ought to attack them where they were weak--along the vast, largely unguarded length of railway line that was their connection to Damascus. Instead of focussing his attention on Medina, he should wage war over the broadest territory possible.

  The Bedouins under Lawrence's command were not, in conventional terms, skilled troops. They were nomads. Sir Reginald Wingate, one of the British commanders in the region, called them "an untrained rabble, most of whom have never fired a rifle." But they were tough and they were mobile. The typical Bedouin soldier carried no more than a rifle, a hundred rounds of ammunition, forty-five pounds of flour, and a pint of drinking water, which meant that he could travel as much as a hundred and ten miles a day across the desert, even in summer. "Our cards were speed and time, not hitting power," Lawrence wrote. "Our largest available resources were the tribesmen, men quite unused to formal warfare, whose assets were movement, endurance, individual intelligence, knowledge of the country, courage." The eighteenth-century general Maurice de Saxe famously said that the art of war was about legs, not arms, and Lawrence's troops were all legs. In one typical stretch, in the spring of 1917, his men dynamited sixty rails and cut a telegraph line at Buair on March 24th, sabotaged a train and twenty-five rails at Abu al-Naam on March 25th, dynamited fifteen rails and cut a telegraph line at Istabl Antar on March 27th, raided a Turkish garrison and derailed a train on March 29th, returned to Buair and sabotaged the railway line again on March 31st, dynamited eleven rails at Hediah on April 3rd, raided the train line in the area of Wadi Dhaiji on April 4th and 5th, and attacked twice on April 6th.

  Lawrence's masterstroke was an assault on the port town of Aqaba. The Turks expected an attack from British ships patrolling the waters of the Gulf of Aqaba to the west. Lawrence decided to attack from the east instead, coming at the city from the unprotected desert, and to do that he led his men on an audacious, six-hundred-mile loop--up from the Hejaz, north into the Syrian desert, and then back down toward Aqaba. This was in summer, through some of the most inhospitable land in the Middle East, and Lawrence tacked on a side trip to the outskirts of Damascus, in order to mislead the Turks about his intentions. "This year the valley seemed creeping with horned vipers and puff-adders, cobras and black snakes," Lawrence writes in "Seven Pillars of Wisdom" of one stage in the journey:

We could not lightly draw water after dark, for there were snakes swimming in the pools or clustering in knots around their brinks. Twice puff-adders came twisting into the alert ring of our debating coffee-circle. Three of our men died of bites; four recovered after great fear and pain, and a swelling of the poisoned limb. Howeitat treatment was to bind up the part with snake-skin plaster and read chapters of the Koran to the sufferer until he died.

  When they finally arrived at Aqaba, Lawrence's band of several hundred warriors killed or captured twelve hundred Turks, and lost only two men. The Turks simply did not think that their opponent would be mad enough to come at them from the desert. This was Lawrence's great insight. David can beat Goliath by substituting effort for ability--and substituting effort for ability turns out to be a winning formula for underdogs in all walks of life, including little blond-haired girls on the basketball court.

  3.

  Vivek Ranadivé is an elegant man, slender and fine-boned, with impeccable manners and a languorous walk. His father was a pilot who was jailed by Indira Gandhi, he says, because he wouldn't stop challenging the safety of India's planes. Ranadivé went to M.I.T., because he saw a documentary on the school and decided that it was perfect for him. This was in the nineteen-seventies, when going abroad for undergraduate study required the Indian government to authorize the release of foreign currency, and Ranadivé camped outside the office of the governor of the Reserve Bank of India until he got his way. The Ranadivés are relentless.

  In 1985, Ranadivé founded a software company in Silicon Valley devoted to what in the computer world is known as "real time" processing. If a businessman waits until the end of the month to collect and count his receipts, he's "batch processing." There is a gap between the events in the company--sales--and his understanding of those events. Wall Street used to be the same way. The information on which a trader based his decisions was scattered across a number of databases. The trader would collect information from here and there, collate and analyze it, and then make a trade. What Ranadivé's company, TIBCO, did was to consolidate those databases into one stream, so that the trader could collect all the data he wanted instantaneously. Batch processing was replaced by real-time processing. Today, TIBCO's software powers most of the trading floors on Wall Street.
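  The distinction Ranadivé is drawing can be sketched in a few lines of code. What follows is only a toy illustration of the idea, not TIBCO's software: the batch function learns about its events after the fact, while the real-time function reacts to each event the moment it arrives. The names and numbers are invented.

    # Toy contrast between batch and real-time processing of a stream of sales.
    # Purely illustrative; the function names and figures are made up.
    def batch_report(sales):
        """Batch: collect the events first, act on the total later."""
        return sum(sales)

    def real_time_watch(sales, threshold):
        """Real time: respond to each event as it arrives, with no lag."""
        running = 0.0
        for amount in sales:
            running += amount
            if running >= threshold:
                print(f"threshold crossed at {running}")  # the response is immediate
                break

    todays_sales = [120.0, 75.5, 310.0, 42.0]
    print(batch_report(todays_sales))     # 547.5, but known only at the end of the day
    real_time_watch(todays_sales, 400.0)  # fires the moment the running total passes 400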

  Ranadivé views this move from batch to real time as a sort of holy mission. The shift, to his mind, is one of kind, not just of degree. "We've been working with some airlines," he said. "You know, when you get on a plane and your bag doesn't, they actually know right away that it's not there. But no one tells you, and a big part of that is that they don't have all their information in one place. There are passenger systems that know where the passenger is. There are aircraft and maintenance systems that track where the plane is and what kind of shape it's in. Then, there are baggage systems and ticketing systems--and they're all separate. So you land, you wait at the baggage terminal, and it doesn't show up." Everything bad that happens in that scenario, Ranadivé maintains, happens because of the lag between the event (the luggage doesn't make it onto the plane) and the response (the airline tells you that your luggage didn't make the plane). The lag is why you're angry. The lag is why you had to wait, fruitlessly, at baggage claim. The lag is why you vow never to fly that airline again. Put all the databases together, and there's no lag. "What we can do is send you a text message the moment we know your bag didn't make it," Ranadivé said, "telling you we'll ship it to your house."

  A few years ago, Ranadivé wrote a paper arguing that even the Federal Reserve ought to make its decisions in real time--not once every month or two. "Everything in the world is now real time," he said. "So when a certain type of shoe isn't selling at your corner shop, it's not six months before the guy in China finds out. It's almost instantaneous, thanks to my software. The world runs in real time, but government runs in batch. Every few months, it adjusts. Its mission is to keep the temperature comfortable in the economy, and, if you were to do things the government's way in your house, then every few months you'd turn the heater either on or off, overheating or underheating your house." Ranadivé argued that we ought to put the economic data that the Fed uses into a big stream, and write a computer program that sifts through those data, the moment they are collected, and makes immediate, incremental adjustments to interest rates and the money supply. "It can all be automated," he said. "Look, we've had only one soft landing since the Second World War. Basically, we've got it wrong every single time."
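  Strip away the macroeconomics and what Ranadivé is describing is a thermostat: a loop that nudges a dial a little, continuously, as each reading arrives, instead of deliberating every few weeks. The sketch below illustrates only that structure; the readings, the target, and the adjustment rule are all invented, and bear no resemblance to actual monetary policy:

    # A thermostat-style feedback loop, in the spirit of Ranadivé's argument.
    # Every number and rule here is invented purely for illustration.
    def nudge(rate, reading, target=2.0, step=0.05):
        """Make a small, immediate adjustment instead of a periodic deliberation."""
        if reading > target:
            return rate + step
        if reading < target:
            return rate - step
        return rate

    rate = 3.0
    for reading in [2.4, 2.6, 2.1, 1.8, 1.7]:  # a stream of incoming data points
        rate = nudge(rate, reading)
        print(f"reading {reading:.1f} -> rate {rate:.2f}")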

  You can imagine what someone like Alan Greenspan or Ben Bernanke might say about that idea. Such people are powerfully invested in the notion of the Fed as a Solomonic body: that pause of five or eight weeks between economic adjustments seems central to the process of deliberation. To Ranadivé, though, "deliberation" just prettifies the difficulties created by lag. The Fed has to deliberate because it's several weeks behind, the same way the airline has to bow and scrape and apologize because it waited forty-five minutes to tell you something that it could have told you the instant you stepped off the plane.

  Is it any wonder that Ranadivé looked at the way basketball was played and found it mindless? A professional basketball game was forty-eight minutes long, divided up into alternating possessions of roughly twenty seconds: back and forth, back and forth. But a good half of each twenty-second increment was typically taken up with preliminaries and formalities. The point guard dribbled the ball up the court. He stood above the top of the key, about twenty-four feet from the opposing team's basket. He called out a play that the team had choreographed a hundred times in practice. It was only then that the defending team sprang into action, actively contesting each pass and shot. Actual basketball took up only half of that twenty-second interval, so that a game's real length was not forty-eight minutes but something closer to twenty-four minutes--and that twenty-four minutes of activity took place within a narrowly circumscribed area. It was as formal and as convention-bound as an eighteenth-century quadrille. The supporters of that dance said that the defensive players had to run back to their own end, in order to compose themselves for the arrival of the other team. But the reason they had to compose themselves, surely, was that by retreating they allowed the offense to execute a play that it had practiced to perfection. Basketball was batch!

  Insurgents, though, operate in real time. Lawrence hit the Turks, in that stretch in the spring of 1917, nearly every day, because he knew that the more he accelerated the pace of combat the more the war became a battle of endurance--and endurance battles favor the insurgent. "And it happened as the Philistine arose and was drawing near David that David hastened and ran out from the lines toward the Philistine," the Bible says. "And he reached his hand into the pouch and took from there a stone and slung it and struck the Philistine in his forehead." The second sentence--the slingshot part--is what made David famous. But the first sentence matters just as much. David broke the rhythm of the encounter. He speeded it up. "The sudden astonishment when David sprints forward must have frozen Goliath, making him a better target," the poet and critic Robert Pinsky writes in "The Life of David." Pinsky calls David a "point guard ready to flick the basketball here or there." David pressed. That's what Davids do when they want to beat Goliaths.

  4.

  Ranadivé's basketball team played in the National Junior Basketball seventh-and-eighth-grade division, representing Redwood City. The girls practiced at Paye's Place, a gym in nearby San Carlos. Because Ranadivé had never played basketball, he recruited a series of experts to help him. The first was Roger Craig, the former all-pro running back for the San Francisco 49ers, who is also TIBCO's director of business development. As a football player, Craig was legendary for the off-season hill workouts he put himself through. Most of his N.F.L. teammates are now hobbling around golf courses. He has run seven marathons. After Craig signed on, he recruited his daughter Rometra, who played Division I basketball at Duke and U.S.C. Rometra was the kind of person you assigned to guard your opponent's best player in order to shut her down. The girls loved Rometra. "She has always been like my big sister," Anjali Ranadivé said. "It was so awesome to have her along."

  Redwood City's strategy was built around the two deadlines that all basketball teams must meet in order to advance the ball. The first is the inbounds pass. When one team scores, a player from the other team takes the ball out of bounds and has five seconds to pass it to a teammate on the court. If that deadline is missed, the ball goes to the other team. Usually, that's not an issue, because teams don't contest the inbounds pass. They run back to their own end. Redwood City did not. Each girl on the team closely shadowed her counterpart. When some teams play the press, the defender plays behind the offensive player she's guarding, to impede her once she catches the ball. The Redwood City girls, by contrast, played in front of their opponents, to prevent them from catching the inbounds pass in the first place. And they didn't guard the player throwing the ball in. Why bother? Ranadivé used that extra player as a floater, who could serve as a second defender against the other team's best player. "Think about football," Ranadivé said. "The quarterback can run with the ball. He has the whole field to throw to, and it's still damned difficult to complete a pass." Basketball was harder. A smaller court. A five-second deadline. A heavier, bigger ball. As often as not, the teams Redwood City was playing against simply couldn't make the inbounds pass within the five-second limit. Or the inbounding player, panicked by the thought that her five seconds were about to be up, would throw the ball away. Or her pass would be intercepted by one of the Redwood City players. Ranadivé's girls were maniacal.

  The second deadline requires a team to advance the ball across mid-court, into its opponent's end, within ten seconds, and if Redwood City's opponents met the first deadline the girls would turn their attention to the second. They would descend on the girl who caught the inbounds pass and "trap" her. Anjali was the designated trapper. She'd sprint over and double-team the dribbler, stretching her long arms high and wide. Maybe she'd steal the ball. Maybe the other player would throw it away in a panic--or get bottled up and stalled, so that the ref would end up blowing the whistle. "When we first started out, no one knew how to play defense or anything," Anjali said. "So my dad said the whole game long, 'Your job is to guard someone and make sure they never get the ball on inbounds plays.' It's the best feeling in the world to steal the ball from someone. We would press and steal, and do that over and over again. It made people so nervous. There were teams that were a lot better than us, that had been playing a long time, and we would beat them."

  The Redwood City players would jump ahead 4-0, 6-0, 8-0, 12-0. One time, they led 25-0. Because they typically got the ball underneath their opponent's basket, they rarely had to take low-percentage, long-range shots that required skill and practice. They shot layups. In one of the few games that Redwood City lost that year, only four of the team's players showed up. They pressed anyway. Why not? They lost by three points.

  "What that defense did for us is that we could hide our weaknesses," Rometra Craig said. She helped out once Redwood City advanced to the regional championships. "We could hide the fact that we didn't have good outside shooters. We could hide the fact that we didn't have the tallest lineup, because as long as we played hard on defense we were getting steals and getting easy layups. I was honest with the girls. I told them, 'We're not the best basketball team out there.' But they understood their roles." A twelve-year-old girl would go to war for Rometra. "They were awesome," she said.

  Lawrence attacked the Turks where they were weak--the railroad--and not where they were strong, Medina. Redwood City attacked the inbounds pass, the point in a game where a great team is as vulnerable as a weak one. Lawrence extended the battlefield over as large an area as possible. So did the girls of Redwood City. They defended all ninety-four feet. The full-court press is legs, not arms. It supplants ability with effort. It is basketball for those "quite unused to formal warfare, whose assets were movement, endurance, individual intelligence . . . courage."

  "It's an exhausting strategy," Roger Craig said. He and Ranadivé were in a TIBCO conference room, reminiscing about their dream season. Ranadivé was at the whiteboard, diagramming the intricacies of the Redwood City press. Craig was sitting at the table.

  "My girls had to be more fit than the others," Ranadivé said.

  "He used to make them run," Craig said, nodding approvingly.

  "We followed soccer strategy in practice," Ranadivé said. "I would make them run and run and run. I couldn't teach them skills in that short period of time, and so all we did was make sure they were fit and had some basic understanding of the game. That's why attitude plays such a big role in this, because you're going to get tired." He turned to Craig. "What was our cheer again?"

  The two men thought for a moment, then shouted out happily, in unison, "One, two, three, ATTITUDE!"

  That was it! The whole Redwood City philosophy was based on a willingness to try harder than anyone else.

  "One time, some new girls joined the team," Ranadivé said, "and so in the first practice I had I was telling them, 'Look, this is what we're going to do,' and I showed them. I said, 'It's all about attitude.' And there was this one new girl on the team, and I was worried that she wouldn't get the whole attitude thing. Then we did the cheer and she said, 'No, no, it's not One, two three, ATTITUDE. It's One, two, three, attitude HAH ' "--at which point Ranadivé and Craig burst out laughing.

  5.

  In January of 1971, the Fordham University Rams played a basketball game against the University of Massachusetts Redmen. The game was in Amherst, at the legendary arena known as the Cage, where the Redmen hadn't lost since December of 1969. Their record was -1. The Redmen's star was none other than Julius Erving--Dr. J. The UMass team was very, very good. Fordham, by contrast, was a team of scrappy kids from the Bronx and Brooklyn. Their center had torn up his knee the first week of the season, which meant that their tallest player was six feet five. Their starting forward--and forwards are typically almost as tall as centers--was Charlie Yelverton, who was six feet two. But from the opening buzzer the Rams launched a full-court press, and never let up. "We jumped out to a thirteen-to-six lead, and it was a war the rest of the way," Digger Phelps, the Fordham coach at the time, recalls. "These were tough city kids. We played you ninety-four feet. We knew that sooner or later we were going to make you crack." Phelps sent in one indefatigable Irish or Italian kid from the Bronx after another to guard Erving, and, one by one, the indefatigable Irish and Italian kids fouled out. None of them were as good as Erving. It didn't matter. Fordham won, 87-79.

  In the world of basketball, there is one story after another like this about legendary games where David used the full-court press to beat Goliath. Yet the puzzle of the press is that it has never become popular. People look at upsets like Fordham over UMass and call them flukes. Basketball sages point out that the press can be beaten by a well-coached team with adept ball handlers and astute passers--and that is true. Ranadivé readily admitted that all an opposing team had to do to beat Redwood City was press back: the girls were not good enough to handle their own medicine. Playing insurgent basketball did not guarantee victory. It was simply the best chance an underdog had of beating Goliath. If Fordham had played UMass the conventional way, it would have lost by thirty points. And yet somehow that lesson has escaped the basketball establishment.

  What did Digger Phelps do, the season after his stunning upset of UMass? He never used the full-court press the same way again. The UMass coach, Jack Leaman, was humbled in his own gym by a bunch of street kids. Did he learn from his defeat and use the press himself the next time he had a team of underdogs? He did not.

  The only person who seemed to have absorbed the lessons of that game was a skinny little guard on the UMass freshman team named Rick Pitino. He didn't play that day. He watched, and his eyes grew wide. Even now, thirty-eight years later, he can name, from memory, nearly every player on the Fordham team: Yelverton, Sullivan, Mainor, Charles, Zambetti. "They came in with the most unbelievable pressing team I'd ever seen," Pitino said. "Five guys between six feet five and six feet. It was unbelievable how they covered ground. I studied it. There is no way they should have beaten us. Nobody beat us at the Cage."

  Pitino became the head coach at Boston University in 1978, when he was twenty-five years old, and used the press to take the school to its first N.C.A.A. tournament appearance in twenty-four years. At his next head-coaching stop, Providence College, Pitino took over a team that had gone 11-20 the year before. The players were short and almost entirely devoid of talent--a carbon copy of the Fordham Rams. They pressed, and ended up one game away from playing for the national championship. At the University of Kentucky, in the mid-nineteen-nineties, Pitino took his team to the Final Four three times--and won a national championship--with full-court pressure, and then rode the full-court press back to the Final Four in 2005, as the coach at the University of Louisville. This year, his Louisville team entered the N.C.A.A. tournament ranked No. 1 in the land. College coaches of Pitino's calibre typically have had numerous players who have gone on to be bona-fide all-stars at the professional level. In his many years of coaching, Pitino has had one, Antoine Walker. It doesn't matter. Every year, he racks up more and more victories.

  "The greatest example of the press I've ever coached was my Kentucky team in '96, when we played L.S.U.," Pitino said. He was at the athletic building at the University of Louisville, in a small room filled with television screens, where he watches tapes of opponents' games. "Do we have that tape?" Pitino called out to an assistant. He pulled a chair up close to one of the monitors. The game began with Kentucky stealing the ball from L.S.U., deep in L.S.U.'s end. Immediately, the ball was passed to Antoine Walker, who cut to the basket for a layup. L.S.U. got the ball back. Kentucky stole it again. Another easy basket by Walker. "Walker had almost thirty points at halftime," Pitino said. "He dunked it almost every time. When we steal, he just runs to the basket." The Kentucky players were lightning quick and long-armed, and swarmed around the L.S.U. players, arms flailing. It was mayhem. Five minutes in, it was clear that L.S.U. was panicking.

  Pitino trains his players to look for what he calls the "rush state" in their opponents--that moment when the player with the ball is shaken out of his tempo--and L.S.U. could not find a way to get out of the rush state. "See if you find one play that L.S.U. managed to run," Pitino said. You couldn't. The L.S.U. players struggled to get the ball inbounds, and, if they did that, they struggled to get the ball over mid-court, and on those occasions when they managed both those things they were too overwhelmed and exhausted to execute their offense the way they had been trained to. "We had eighty-six points at halftime," Pitino went on--eighty-six points being, of course, what college basketball teams typically score in an entire game. "And I think we'd forced twenty-three turnovers at halftime," twenty-three turnovers being what college basketball teams might force in two games. "I love watching this," Pitino said. He had a faraway look in his eyes. "Every day, you dream about getting a team like this again." So why are there no more than a handful of college teams who use the full-court press the way Pitino does?

  Arreguín-Toft found the same puzzling pattern. When an underdog fought like David, he usually won. But most of the time underdogs didn't fight like David. Of the two hundred and two lopsided conflicts in Arreguín-Toft's database, the underdog chose to go toe to toe with Goliath the conventional way a hundred and fifty-two times--and lost a hundred and nineteen times. In 1809, the Peruvians fought the Spanish straight up and lost; in 1816, the Georgians fought the Russians straight up and lost; in 1817, the Pindaris fought the British straight up and lost; in the Kandyan rebellion of 1817, the Sri Lankans fought the British straight up and lost; in 1823, the Burmese chose to fight the British straight up and lost. The list of failures was endless. In the nineteen-forties, the Communist insurgency in Vietnam bedevilled the French until, in 1951, the Viet Minh strategist Vo Nguyen Giap switched to conventional warfare--and promptly suffered a series of defeats. George Washington did the same in the American Revolution, abandoning the guerrilla tactics that had served the colonists so well in the conflict's early stages. "As quickly as he could," William Polk writes in "Violent Politics," a history of unconventional warfare, Washington "devoted his energies to creating a British-type army, the Continental Line. As a result, he was defeated time after time and almost lost the war."

  It makes no sense, unless you think back to that Kentucky-L.S.U. game and to Lawrence's long march across the desert to Aqaba. It is easier to dress soldiers in bright uniforms and have them march to the sound of a fife-and-drum corps than it is to have them ride six hundred miles through the desert on the back of a camel. It is easier to retreat and compose yourself after every score than swarm about, arms flailing. We tell ourselves that skill is the precious resource and effort is the commodity. It's the other way around. Effort can trump ability--legs, in Saxe's formulation, can overpower arms--because relentless effort is in fact something rarer than the ability to engage in some finely tuned act of motor coördination.

  "I have so many coaches come in every year to learn the press," Pitino said. Louisville was the Mecca for all those Davids trying to learn how to beat Goliaths. "Then they e-mail me. They tell me they can't do it. They don't know if they have the bench. They don't know if the players can last." Pitino shook his head. "We practice every day for two hours " " he went on. "The players are moving almost ninety-eight per cent of the practice. We spend very little time talking. When we make our corrections"--that is, when Pitino and his coaches stop play to give instruction--"they are seven-second corrections, so that our heart rate never rests. We are always working." Seven seconds! The coaches who came to Louisville sat in the stands and watched that ceaseless activity and despaired. The prospect of playing by David's rules was too daunting. They would rather lose.

  6.

  In 1981, a computer scientist from Stanford University named Doug Lenat entered the Traveller Trillion Credit Squadron tournament, in San Mateo, California. It was a war game. The contestants had been given several volumes of rules, well beforehand, and had been asked to design their own fleet of warships with a mythical budget of a trillion dollars. The fleets then squared off against one another in the course of a weekend. "Imagine this enormous auditorium area with tables, and at each table people are paired off," Lenat said. "The winners go on and advance. The losers get eliminated, and the field gets smaller and smaller, and the audience gets larger and larger."

  Lenat had developed an artificial-intelligence program that he called Eurisko, and he decided to feed his program the rules of the tournament. Lenat did not give Eurisko any advice or steer the program in any particular strategic direction. He was not a war-gamer. He simply let Eurisko figure things out for itself. For about a month, for ten hours every night on a hundred computers at Xerox PARC, in Palo Alto, Eurisko ground away at the problem, until it came out with an answer. Most teams fielded some version of a traditional naval fleet--an array of ships of various sizes, each well defended against enemy attack. Eurisko thought differently. "The program came up with a strategy of spending the trillion on an astronomical number of small ships like P.T. boats, with powerful weapons but absolutely no defense and no mobility," Lenat said. "They just sat there. Basically, if they were hit once they would sink. And what happened is that the enemy would take its shots, and every one of those shots would sink our ships. But it didn't matter, because we had so many." Lenat won the tournament in a runaway.

  The next year, Lenat entered once more, only this time the rules had changed. Fleets could no longer just sit there. Now one of the criteria of success in battle was fleet "agility." Eurisko went back to work. "What Eurisko did was say that if any of our ships got damaged it would sink itself--and that would raise fleet agility back up again," Lenat said. Eurisko won again.

  Eurisko was an underdog. The other gamers were people steeped in military strategy and history. They were the sort who could tell you how Wellington had outfoxed Napoleon at Waterloo, or what exactly happened at Antietam. They had been raised on Dungeons and Dragons. They were insiders. Eurisko, on the other hand, knew nothing but the rule book. It had no common sense. As Lenat points out, a human being understands the meaning of the sentences "Johnny robbed a bank. He is now serving twenty years in prison," but Eurisko could not, because as a computer it was perfectly literal; it could not fill in the missing step--"Johnny was caught, tried, and convicted." Eurisko was an outsider. But it was precisely that outsiderness that led to Eurisko's victory: not knowing the conventions of the game turned out to be an advantage.

  "Eurisko was exposing the fact that any finite set of rules is going to be a very incomplete approximation of reality," Lenat explained. "What the other entrants were doing was filling in the holes in the rules with real-world, realistic answers. But Eurisko didn't have that kind of preconception, partly because it didn't know enough about the world." So it found solutions that were, as Lenat freely admits, "socially horrifying": send a thousand defenseless and immobile ships into battle; sink your own ships the moment they get damaged.

  This is the second half of the insurgent's creed. Insurgents work harder than Goliath. But their other advantage is that they will do what is "socially horrifying"--they will challenge the conventions about how battles are supposed to be fought. All the things that distinguish the ideal basketball player are acts of skill and coördination. When the game becomes about effort over ability, it becomes unrecognizable--a shocking mixture of broken plays and flailing limbs and usually competent players panicking and throwing the ball out of bounds. You have to be outside the establishment--a foreigner new to the game or a skinny kid from New York at the end of the bench--to have the audacity to play it that way. George Washington couldn't do it. His dream, before the war, was to be a British Army officer, finely turned out in a red coat and brass buttons. He found the guerrillas who had served the American Revolution so well to be "an exceeding dirty and nasty people." He couldn't fight the establishment, because he was the establishment.

  T. E. Lawrence, by contrast, was the farthest thing from a proper British Army officer. He did not graduate with honors from Sandhurst. He was an archeologist by trade, a dreamy poet. He wore sandals and full Bedouin dress when he went to see his military superiors. He spoke Arabic like a native, and handled a camel as if he had been riding one all his life. And David, let's not forget, was a shepherd. He came at Goliath with a slingshot and staff because those were the tools of his trade. He didn't know that duels with Philistines were supposed to proceed formally, with the crossing of swords. "When the lion or the bear would come and carry off a sheep from the herd, I would go out after him and strike him down and rescue it from his clutches," David explained to Saul. He brought a shepherd's rules to the battlefield.

  The price that the outsider pays for being so heedless of custom is, of course, the disapproval of the insider. Why did the Ivy League schools of the nineteen-twenties limit the admission of Jewish immigrants? Because they were the establishment and the Jews were the insurgents, scrambling and pressing and playing by immigrant rules that must have seemed to the Wasp élite of the time to be socially horrifying. "Their accomplishment is well over a hundred per cent of their ability on account of their tremendous energy and ambition," the dean of Columbia College said of the insurgents from Brooklyn, the Bronx, and the Lower East Side. He wasn't being complimentary. Goliath does not simply dwarf David. He brings the full force of social convention against him; he has contempt for David.

  "In the beginning, everyone laughed at our fleet," Lenat said. "It was really embarrassing. People felt sorry for us. But somewhere around the third round they stopped laughing, and some time around the fourth round they started complaining to the judges. When we won again, some people got very angry, and the tournament directors basically said that it was not really in the spirit of the tournament to have these weird computer-designed fleets winning. They said that if we entered again they would stop having the tournament. I decided the best thing to do was to graciously bow out."

  It isn't surprising that the tournament directors found Eurisko's strategies beyond the pale. It's wrong to sink your own ships, they believed. And they were right. But let's remember who made that rule: Goliath. And let's remember why Goliath made that rule: when the world has to play on Goliath's terms, Goliath wins.

  7.

  The trouble for Redwood City started early in the regular season. The opposing coaches began to get angry. There was a sense that Redwood City wasn't playing fair--that it wasn't right to use the full-court press against twelve-year-old girls, who were just beginning to grasp the rudiments of the game. The point of basketball, the dissenting chorus said, was to learn basketball skills. Of course, you could as easily argue that in playing the press a twelve-year-old girl learned something much more valuable--that effort can trump ability and that conventions are made to be challenged. But the coaches on the other side of Redwood City's lopsided scores were disinclined to be so philosophical.

  "There was one guy who wanted to have a fight with me in the parking lot," Ranadivé said. "He was this big guy. He obviously played football and basketball himself, and he saw that skinny, foreign guy beating him at his own game. He wanted to beat me up."

  Roger Craig says that he was sometimes startled by what he saw. "The other coaches would be screaming at their girls, humiliating them, shouting at them. They would say to the refs--'That's a foul! That's a foul!' But we weren't fouling. We were just playing aggressive defense."

  "My girls were all blond-haired white girls," Ranadivé said. "My daughter is the closest we have to a black girl, because she's half-Indian. One time, we were playing this all-black team from East San Jose. They had been playing for years. These were born-with-a-basketball girls. We were just crushing them. We were up something like twenty to zero. We wouldn't even let them inbound the ball, and the coach got so mad that he took a chair and threw it. He started screaming at his girls, and of course the more you scream at girls that age the more nervous they get." Ranadivé shook his head: never, ever raise your voice. "Finally, the ref physically threw him out of the building. I was afraid. I think he couldn't stand it because here were all these blond-haired girls who were clearly inferior players, and we were killing them."

  At the nationals, the Redwood City girls won their first two games. In the third round, their opponents were from somewhere deep in Orange County. Redwood City had to play them on their own court, and the opponents supplied their own referee as well. The game was at eight o'clock in the morning. The Redwood City players left their hotel at six, to beat the traffic. It was downhill from there. The referee did not believe in "One, two, three, attitude HAH." He didn't think that playing to deny the inbounds pass was basketball. He began calling one foul after another.

"They were touch fouls," Craig said. Ticky-tacky stuff. The memory was painful.

"My girls didn't understand," Ranadivé said. "The ref called something like four times as many fouls on us as on the other team."

"People were booing," Craig said. "It was bad."

"A two-to-one ratio is understandable, but a ratio of four to one?" Ranadivé shook his head.

"One girl fouled out."

"We didn't get blown out. There was still a chance to win. But . . ."

  Ranadivé called the press off. He had to. The Redwood City players retreated to their own end, and passively watched as their opponents advanced down the court. They did not run. They paused and deliberated between each possession. They played basketball the way basketball is supposed to be played, and they lost--but not before making Goliath wonder whether he was a giant, after all.

  How do we hire when we can't tell who's right for the job?

  1.

  On the day of the big football game between the University of Missouri Tigers and the Cowboys of Oklahoma State, a football scout named Dan Shonka sat in his hotel, in Columbia, Missouri, with a portable DVD player. Shonka has worked for three National Football League teams. Before that, he was a football coach, and before that he played linebacker--although, he says, "that was three knee operations and a hundred pounds ago." Every year, he evaluates somewhere between eight hundred and twelve hundred players around the country, helping professional teams decide whom to choose in the college draft, which means that over the last thirty years he has probably seen as many football games as anyone else in America. In his DVD player was his homework for the evening's big game--an edited video of the Tigers' previous contest, against the University of Nebraska Cornhuskers.

  Shonka methodically made his way through the video, stopping and rewinding whenever he saw something that caught his eye. He liked Jeremy Maclin and Chase Coffman, two of the Mizzou receivers. He loved William Moore, the team's bruising strong safety. But, most of all, he was interested in the Tigers' quarterback and star, a stocky, strong-armed senior named Chase Daniel.

  "I like to see that the quarterback can hit a receiver in stride, so he doesn't have to slow for the ball," Shonka began. He had a stack of evaluation forms next to him and, as he watched the game, he was charting and grading every throw that Daniel made. "Then judgment. Hey, if it's not there, throw it away and play another day. Will he stand in there and take a hit, with a guy breathing down his face? Will he be able to step right in there, throw, and still take that hit? Does the guy throw better when he's in the pocket, or does he throw equally well when he's on the move? You want a great competitor. Durability. Can they hold up, their strength, toughness? Can they make big plays? Can they lead a team down the field and score late in the game? Can they see the field? When your team's way ahead, that's fine. But when you're getting your ass kicked I want to see what you're going to do."

  He pointed to his screen. Daniel had thrown a dart, and, just as he did, a defensive player had hit him squarely. "See how he popped up?" Shonka said. "He stood right there and threw the ball in the face of that rush. This kid has got a lot of courage." Daniel was six feet tall and weighed two hundred and twenty-five pounds: thick through the chest and trunk. He carried himself with a self-assurance that bordered on cockiness. He threw quickly and in rhythm. He nimbly evaded defenders. He made short throws with touch and longer throws with accuracy. By the game's end, he had completed an astonishing seventy-eight per cent of his passes, and handed Nebraska its worst home defeat in fifty-three years. "He can zip it," Shonka said. "He can really gun, when he has to." Shonka had seen all the promising college quarterbacks, charted and graded their throws, and to his mind Daniel was special: "He might be one of the best college quarterbacks in the country."

  But then Shonka began to talk about when he was on the staff of the Philadelphia Eagles, in 1999. Five quarterbacks were taken in the first round of the college draft that year, and each looked as promising as Chase Daniel did now. But only one of them, Donovan McNabb, ended up fulfilling that promise. Of the rest, one descended into mediocrity after a decent start. Two were complete busts, and the last was so awful that after failing out of the N.F.L. he ended up failing out of the Canadian Football League as well.

  The year before, the same thing happened with Ryan Leaf, who was the Chase Daniel of 1998. The San Diego Chargers made him the second player taken over all in the draft, and gave him an eleven-million-dollar signing bonus. Leaf turned out to be terrible. In 2002, it was Joey Harrington's turn. Harrington was a golden boy out of the University of Oregon, and the third player taken in the draft. Shonka still can't get over what happened to him.

  "I tell you, I saw Joey live," he said. "This guy threw lasers, he could throw under tight spots, he had the arm strength, he had the size, he had the intelligence." Shonka got as misty as a two-hundred-and-eighty-pound ex-linebacker in a black tracksuit can get. "He's a concert pianist, you know? I really--I mean, I really--liked Joey." And yet Harrington's career consisted of a failed stint with the Detroit Lions and a slide into obscurity. Shonka looked back at the screen, where the young man he felt might be the best quarterback in the country was marching his team up and down the field. "How will that ability translate to the National Football League?" He shook his head slowly. "Shoot."

  This is the quarterback problem. There are certain jobs where almost nothing you can learn about candidates before they start predicts how they'll do once they're hired. So how do we know whom to choose in cases like that? In recent years, a number of fields have begun to wrestle with this problem, but none with such profound social consequences as the profession of teaching.

  2.

  One of the most important tools in contemporary educational research is "value added" analysis. It uses standardized test scores to look at how much the academic performance of students in a given teacher's classroom changes between the beginning and the end of the school year. Suppose that Mrs. Brown and Mr. Smith both teach a classroom of third graders who score at the fiftieth percentile on math and reading tests on the first day of school, in September. When the students are retested, in June, Mrs. Brown's class scores at the seventieth percentile, while Mr. Smith's students have fallen to the fortieth percentile. That change in the students' rankings, value-added theory says, is a meaningful indicator of how much more effective Mrs. Brown is as a teacher than Mr. Smith.
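
  For readers who want to see the arithmetic behind that comparison, here is a minimal sketch in Python of the basic value-added idea. It is only an illustration: it averages each classroom's change in test scores over the year, using made-up names and numbers; the statistical models researchers actually use are far more elaborate, adjusting for prior achievement and other factors.

# A toy illustration of value-added scoring; all names and numbers are hypothetical.

def average_gain(fall_scores, spring_scores):
    """Average per-student change in test score over one school year."""
    gains = [spring - fall for fall, spring in zip(fall_scores, spring_scores)]
    return sum(gains) / len(gains)

# Hypothetical percentile scores for two third-grade classrooms.
brown_fall, brown_spring = [50, 49, 52, 51], [70, 68, 72, 70]
smith_fall, smith_spring = [50, 51, 49, 50], [40, 42, 39, 41]

print("Mrs. Brown:", average_gain(brown_fall, brown_spring))  # roughly +20 points
print("Mr. Smith:", average_gain(smith_fall, smith_spring))   # roughly -10 points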

  It's only a crude measure, of course. A teacher is not solely responsible for how much is learned in a classroom, and not everything of value that a teacher imparts to his or her students can be captured on a standardized test. Nonetheless, if you follow Brown and Smith for three or four years, their effect on their students' test scores starts to become predictable: with enough data, it is possible to identify who the very good teachers are and who the very poor teachers are. What's more--and this is the finding that has galvanized the educational world--the difference between good teachers and poor teachers turns out to be vast.

  Eric Hanushek, an economist at Stanford, estimates that the students of a very bad teacher will learn, on average, half a year's worth of material in one school year. The students in the class of a very good teacher will learn a year and a half's worth of material. That difference amounts to a year's worth of learning in a single year. Teacher effects dwarf school effects: your child is actually better off in a "bad" school with an excellent teacher than in an excellent school with a bad teacher. Teacher effects are also much stronger than class-size effects. You'd have to cut the average class almost in half to get the same boost that you'd get if you switched from an average teacher to a teacher in the eighty-fifth percentile. And remember that a good teacher costs as much as an average one, whereas halving class size would require that you build twice as many classrooms and hire twice as many teachers.

  Hanushek recently did a back-of-the-envelope calculation about what even a rudimentary focus on teacher quality could mean for the United States. If you rank the countries of the world in terms of the academic performance of their schoolchildren, the U.S. is just below average, half a standard deviation below a clump of relatively high-performing countries like Canada and Belgium. According to Hanushek, the U.S. could close that gap simply by replacing the bottom six per cent to ten per cent of public-school teachers with teachers of average quality. After years of worrying about issues like school funding levels, class size, and curriculum design, many reformers have come to the conclusion that nothing matters more than finding people with the potential to be great teachers. But there's a hitch: no one knows what a person with the potential to be a great teacher looks like. The school system has a quarterback problem.

  3.

  Kickoff time for Missouri's game against Oklahoma State was seven o'clock. It was a perfect evening for football: cloudless skies and a light fall breeze. For hours, fans had been tailgating in the parking lots around the stadium. Cars lined the roads leading to the university, many with fuzzy yellow-and-black Tiger tails hanging from their trunks. It was one of Mizzou's biggest games in years. The Tigers were undefeated, and had a chance to become the No. 1 college football team in the country. Shonka made his way through the milling crowds and took a seat in the press box. Below him, the players on the field looked like pieces on a chessboard.

  The Tigers held the ball first. Chase Daniel stood a good seven yards behind his offensive line. He had five receivers, two to his left and three to his right, spaced from one side of the field to the other. His linemen were widely spaced as well. In play after play, Daniel caught the snap from his center, planted his feet, and threw the ball in quick seven- and eight-yard diagonal passes to one of his five receivers.

  The style of offense that the Tigers run is called the "spread," and most of the top quarterbacks in college football--the players who will be drafted into the pros--are spread quarterbacks. By spacing out the offensive linemen and wide receivers, the system makes it easy for the quarterback to figure out the intentions of the opposing defense before the ball is snapped: he can look up and down the line, "read" the defense, and decide where to throw the ball before anyone has moved a muscle. Daniel had been playing in the spread since high school; he was its master. "Look how quickly he gets the ball out," Shonka said. "You can hardly go a thousand and one, a thousand and two, and it's out of his hand. He knows right where he's going. When everyone is spread out like that, the defense can't disguise its coverage. Chase knows right away what they are going to do. The system simplifies the quarterback's decisions."

  But for Shonka this didn't help matters. It had always been hard to predict how a college quarterback would fare in the pros. The professional game was, simply, faster and more complicated. With the advent of the spread, though, the correspondence between the two levels of play had broken down almost entirely. N.F.L. teams don't run the spread. They can't. The defenders in the pros are so much faster than their college counterparts that they would shoot through those big gaps in the offensive line and flatten the quarterback. In the N.F.L., the offensive line is bunched closely together. Daniel wouldn't have five receivers. Most of the time, he'd have just three or four. He wouldn't have the luxury of standing seven yards behind the center, planting his feet, and knowing instantly where to throw. He'd have to crouch right behind the center, take the snap directly, and run backward before planting his feet to throw. The onrushing defenders wouldn't be seven yards away. They would be all around him, from the start. The defense would no longer have to show its hand, because the field would not be so spread out. It could now disguise its intentions. Daniel wouldn't be able to read the defense before the snap was taken. He'd have to read it in the seconds after the play began.

  "In the spread, you see a lot of guys wide open," Shonka said. "But when a guy like Chase goes to the N.F.L. he's never going to see his receivers that open--only in some rare case, like someone slips or there's a bust in the coverage. When that ball's leaving your hands in the pros, if you don't use your eyes to move the defender a little bit, they'll break on the ball and intercept it. The athletic ability that they're playing against in the league is unbelievable."

  As Shonka talked, Daniel was moving his team down the field. But he was almost always throwing those quick, diagonal passes. In the N.F.L., he would have to do much more than that--he would have to throw long, vertical passes over the top of the defense. Could he make that kind of throw? Shonka didn't know. There was also the matter of his height. Six feet was fine in a spread system, where the big gaps in the offensive line gave Daniel plenty of opportunity to throw the ball and see downfield. But in the N.F.L. there wouldn't be gaps, and the linemen rushing at him would be six-five, not six-one.

  "I wonder," Shonka went on. "Can he see? Can he be productive in a new kind of offense? How will he handle that? I'd like to see him set up quickly from center. I'd like to see his ability to read coverages that are not in the spread. I'd like to see him in the pocket. I'd like to see him move his feet. I'd like to see him do a deep dig, or deep comeback. You know, like a throw twenty to twenty-five yards down the field."

  It was clear that Shonka didn't feel the same hesitancy in evaluating the other Mizzou stars--the safety Moore, the receivers Maclin and Coffman. The game that they would play in the pros would also be different from the game they were playing in college, but the difference was merely one of degree. They had succeeded at Missouri because they were strong and fast and skilled, and these traits translate in kind to professional football.

  A college quarterback joining the N.F.L., by contrast, has to learn to play an entirely new game. Shonka began to talk about Tim Couch, the quarterback taken first in that legendary draft of 1999. Couch set every record imaginable in his years at the University of Kentucky. "They used to put five garbage cans on the field," Shonka recalled, shaking his head, "and Couch would stand there and throw and just drop the ball into every one." But Couch was a flop in the pros. It wasn't that professional quarterbacks didn't need to be accurate. It was that the kind of accuracy required to do the job well could be measured only in a real N.F.L. game.

  Similarly, all quarterbacks drafted into the pros are required to take an I.Q. test--the Wonderlic Personnel Test. The theory behind the test is that the pro game is so much more cognitively demanding than the college game that high intelligence should be a good predictor of success. But when the economists David Berri and Rob Simmons analyzed the scores--which are routinely leaked to the press--they found that Wonderlic scores are all but useless as predictors. Of the five quarterbacks taken in round one of the 1999 draft, Donovan McNabb, the only one with a shot at the Hall of Fame, had the lowest Wonderlic score. And who else had I.Q. scores in the same range as McNabb? Dan Marino and Terry Bradshaw, two of the greatest quarterbacks ever to play the game.

  We're used to dealing with prediction problems by going back and looking for better predictors. We now realize that being a good doctor requires the ability to communicate, listen, and empathize--and so there is increasing pressure on medical schools to pay attention to interpersonal skills as well as to test scores. We can have better physicians if we're just smarter about how we choose medical-school students. But no one is saying that Dan Shonka is somehow missing some key ingredient in his analysis; that if he were only more perceptive he could predict Chase Daniel's career trajectory. The problem with picking quarterbacks is that Chase Daniel's performance can't be predicted. The job he's being groomed for is so particular and specialized that there is no way to know who will succeed at it and who won't. In fact, Berri and Simmons found no connection between where a quarterback was taken in the draft--that is, how highly he was rated on the basis of his college performance--and how well he played in the pros.

  The entire time that Chase Daniel was on the field against Oklahoma State, his backup, Chase Patton, stood on the sidelines, watching. Patton didn't play a single down. In his four years at Missouri, up to that point, he had thrown a total of twenty-six passes. And yet there were people in Shonka's world who thought that Patton would end up as a better professional quarterback than Daniel. The week of the Oklahoma State game, the national sports magazine ESPN even put the two players on its cover, with the title "CHASE DANIEL MIGHT WIN THE HEISMAN"--referring to the trophy given to college football's best player--"HIS BACKUP COULD WIN THE SUPER BOWL." Why did everyone like Patton so much? It wasn't clear. Maybe he looked good in practice. Maybe it was because this season in the N.F.L. a quarterback who had also never started a single college game is playing superbly for the New England Patriots. It sounds absurd to put an athlete on the cover of a magazine for no particular reason. But perhaps that's just the quarterback problem taken to an extreme. If college performance doesn't tell us anything, why shouldn't we value someone who hasn't had the chance to play as highly as someone who plays as well as anyone in the land?

  4.

  Picture a young preschool teacher, sitting on a classroom floor surrounded by seven children. She is holding an alphabet book, and working through the letters with the children, one by one: " 'A' is for apple. . . . 'C' is for cow." The session was taped, and the videotape is being watched by a group of experts, who are charting and grading each of the teacher's moves.

  After thirty seconds, the leader of the group--Bob Pianta, the dean of the University of Virginia's Curry School of Education--stops the tape. He points to two little girls on the right side of the circle. They are unusually active, leaning into the circle and reaching out to touch the book.

  "What I'm struck by is how lively the affect is in this room," Pianta said. "One of the things the teacher is doing is creating a holding space for that. And what distinguishes her from other teachers is that she flexibly allows the kids to move and point to the book. She's not rigidly forcing the kids to sit back."

  Pianta's team has developed a system for evaluating various competencies relating to student-teacher interaction. Among them is "regard for student perspective"; that is, a teacher's knack for allowing students some flexibility in how they become engaged in the classroom. Pianta stopped and rewound the tape twice, until what the teacher had managed to achieve became plain: the children were active, but somehow the class hadn't become a free-for-all.

  "A lesser teacher would have responded to the kids' leaning over as misbehavior," Pianta went on. " 'We can't do this right now. You need to be sitting still.' She would have turned this off."

  Bridget Hamre, one of Pianta's colleagues, chimed in: "These are three- and four-year-olds. At this age, when kids show their engagement it's not like the way we show our engagement, where we look alert. They're leaning forward and wriggling. That's their way of doing it. And a good teacher doesn't interpret that as bad behavior. You can see how hard it is to teach new teachers this idea, because the minute you teach them to have regard for the student's perspective, they think you have to give up control of the classroom."

  The lesson continued. Pianta pointed out how the teacher managed to personalize the material. " 'C' is for cow" turned into a short discussion of which of the kids had ever visited a farm. "Almost every time a child says something, she responds to it, which is what we describe as teacher sensitivity," Hamre said.

  The teacher then asked the children if anyone's name began with that letter. "Calvin," a boy named Calvin says. The teacher nods, and says, "Calvin starts with 'C.' " A little girl in the middle says, "Me!" The teacher turns to her. "Your name's Venisha. Letter 'V.' Venisha."

  It was a key moment. Of all the teacher elements analyzed by the Virginia group, feedback--a direct, personal response by a teacher to a specific statement by a student--seems to be most closely linked to academic success. Not only did the teacher catch the "Me!" amid the wiggling and tumult; she addressed it directly.

  "Mind you, that's not great feedback," Hamre said. "High-quality feedback is where there is a back-and-forth exchange to get a deeper understanding." The perfect way to handle that moment would have been for the teacher to pause and pull out Venisha's name card, point to the letter "V," show her how different it is from "C," and make the class sound out both letters. But the teacher didn't do that--either because it didn't occur to her or because she was distracted by the wiggling of the girls to her right.

  "On the other hand, she could have completely ignored the girl, which happens a lot," Hamre went on. "The other thing that happens a lot is the teacher will just say, 'You're wrong.' Yes-no feedback is probably the predominant kind of feedback, which provides almost no information for the kid in terms of learning."

  Pianta showed another tape, of a nearly identical situation: a circle of preschoolers around a teacher. The lesson was about how we can tell when someone is happy or sad. The teacher began by acting out a short conversation between two hand puppets, Henrietta and Twiggle: Twiggle is sad until Henrietta shares some watermelon with him.

  "The idea that the teacher is trying to get across is that you can tell by looking at somebody's face how they're feeling, whether they're feeling sad or happy," Hamre said. "What kids of this age tend to say is you can tell how they're feeling because of something that happened to them. They lost their puppy and that's why they're sad. They don't really get this idea. So she's been challenged, and she's struggling."

  The teacher begins, "Remember when we did something and we drew our face?" She touches her face, pointing out her eyes and mouth. "When somebody is happy, their face tells us that they're happy. And their eyes tell us." The children look on blankly. The teacher plunges on: "Watch, watch." She smiles broadly. "This is happy! How can you tell that I'm happy? Look at my face. Tell me what changes about my face when I'm happy. No, no, look at my face. . . . No. . . ."

  A little girl next to her says, "Eyes," providing the teacher with an opportunity to use one of her students to draw the lesson out. But the teacher doesn't hear her. Again, she asks, "What's changed about my face?" She smiles and she frowns, as if she can reach the children by sheer force of repetition. Pianta stopped the tape. One problem, he pointed out, was that Henrietta made Twiggle happy by sharing watermelon with him, which doesn't illustrate what the lesson is about.

  "You know, a better way to handle this would be to anchor something around the kids," Pianta said. "She should ask, 'What makes you feel happy?' The kids could answer. Then she could say, 'Show me your face when you have that feeling? O.K., what does So-and-So's face look like? Now tell me what makes you sad. Show me your face when you're sad. Oh, look, her face changed!' You've basically made the point. And then you could have the kids practice, or something. But this is going to go nowhere."

  "What's changed about my face?" the teacher repeated, for what seemed like the hundredth time. One boy leaned forward into the circle, trying to engage himself in the lesson, in the way that little children do. His eyes were on the teacher. "Sit up!" she snapped at him.

  As Pianta played one tape after another, the patterns started to become clear. Here was a teacher who read out sentences, in a spelling test, and every sentence came from her own life--"I went to a wedding last week"--which meant she was missing an opportunity to say something that engaged her students. Another teacher walked over to a computer to do a PowerPoint presentation, only to realize that she hadn't turned it on. As she waited for it to boot up, the classroom slid into chaos.

  Then there was the superstar--a young high-school math teacher, in jeans and a green polo shirt. "So let's see," he began, standing up at the blackboard. "Special right triangles. We're going to do practice with this, just throwing out ideas." He drew two triangles. "Label the length of the side, if you can. If you can't, we'll all do it." He was talking and moving quickly, which Pianta said might be interpreted as a bad thing, because this was trigonometry. It wasn't easy material. But his energy seemed to infect the class. And all the time he offered the promise of help. If you can't, we'll all do it. In a corner of the room was a student named Ben, who'd evidently missed a few classes. "See what you can remember, Ben," the teacher said. Ben was lost. The teacher quickly went to his side: "I'm going to give you a way to get to it." He made a quick suggestion: "How about that?" Ben went back to work. The teacher slipped over to the student next to Ben, and glanced at her work. "That's all right!" He went to a third student, then a fourth. Two and a half minutes into the lesson--the length of time it took that subpar teacher to turn on the computer--he had already laid out the problem, checked in with nearly every student in the class, and was back at the blackboard, to take the lesson a step further.

  "In a group like this, the standard m.o. would be: he's at the board, broadcasting to the kids, and has no idea who knows what he's doing and who doesn't know," Pianta said. "But he's giving individualized feedback. He's off the charts on feedback." Pianta and his team watched in awe.

  5.

  Educational-reform efforts typically start with a push for higher standards for teachers--that is, for the academic and cognitive requirements for entering the profession to be as stiff as possible. But after you've watched Pianta's tapes, and seen how complex the elements of effective teaching are, this emphasis on book smarts suddenly seems peculiar. The preschool teacher with the alphabet book was sensitive to her students' needs and knew how to let the two girls on the right wiggle and squirm without disrupting the rest of the students; the trigonometry teacher knew how to complete a circuit of his classroom in two and a half minutes and make every student feel as if he or she were getting the teacher's personal attention. But these aren't cognitive skills.

  A group of researchers--Thomas J. Kane, an economist at Harvard's school of education; Douglas Staiger, an economist at Dartmouth; and Robert Gordon, a policy analyst at the Center for American Progress--have investigated whether it helps to have a teacher who has earned a teaching certification or a master's degree. Both are expensive, time-consuming credentials that almost every district expects teachers to acquire; neither makes a difference in the classroom. Test scores, graduate degrees, and certifications--as much as they appear related to teaching prowess--turn out to be about as useful in predicting success as having a quarterback throw footballs into a bunch of garbage cans.

  Another educational researcher, Jacob Kounin, once did an analysis of "desist" events, in which a teacher has to stop some kind of misbehavior. In one instance, "Mary leans toward the table to her right and whispers to Jane. Both she and Jane giggle. The teacher says, 'Mary and Jane, stop that!' " That's a desist event. But how a teacher desists--her tone of voice, her attitudes, her choice of words--appears to make no difference at all in maintaining an orderly classroom. How can that be? Kounin went back over the videotape and noticed that forty-five seconds before Mary whispered to Jane, Lucy and John had started whispering. Then Robert had noticed and joined in, making Jane giggle, whereupon Jane said something to John. Then Mary whispered to Jane. It was a contagious chain of misbehavior, and what really was significant was not how a teacher stopped the deviancy at the end of the chain but whether she was able to stop the chain before it started. Kounin called that ability "withitness," which he defined as "a teacher's communicating to the children by her actual behavior (rather than by verbally announcing: 'I know what's going on') that she knows what the children are doing, or has the proverbial 'eyes in the back of her head.' " It stands to reason that to be a great teacher you have to have withitness. But how do you know whether someone has withitness until she stands up in front of a classroom of twenty-five wiggly Janes, Lucys, Johns, and Roberts and tries to impose order?

  6.

  Perhaps no profession has taken the implications of the quarterback problem more seriously than the financial-advice field, and the experience of financial advisers is a useful guide to what could happen in teaching as well. There are no formal qualifications for entering the field except a college degree. Financial-services firms don't look for only the best students, or require graduate degrees or specify a list of prerequisites. No one knows beforehand what makes a high-performing financial adviser different from a low-performing one, so the field throws the door wide open.

  "A question I ask is, 'Give me a typical day,' " Ed Deutschlander, the co-president of North Star Resource Group, in Minneapolis, says. "If that person says, 'I get up at five-thirty, hit the gym, go to the library, go to class, go to my job, do homework until eleven,' that person has a chance." Deutschlander, in other words, begins by looking for the same general traits that every corporate recruiter looks for.

  Deutschlander says that last year his firm interviewed about a thousand people, and found forty-nine it liked, a ratio of twenty interviewees to one candidate. Those candidates were put through a four-month "training camp," in which they tried to act like real financial advisers. "They should be able to obtain in that four-month period a minimum of ten official clients," Deutschlander said. "If someone can obtain ten clients, and is able to maintain a minimum of ten meetings a week, that means that person has gathered over a hundred introductions in that four-month period. Then we know that person is at least fast enough to play this game."

  Of the forty-nine people invited to the training camp, twenty-three made the cut and were hired as apprentice advisers. Then the real sorting began. "Even with the top performers, it really takes three to four years to see whether someone can make it," Deutschlander says. "You're just scratching the surface at the beginning. Four years from now, I expect to hang on to at least thirty to forty per cent of that twenty-three."

  People like Deutschlander are referred to as gatekeepers, a title that suggests that those at the door of a profession are expected to discriminate--to select who gets through the gate and who doesn't. But Deutschlander sees his role as keeping the gate as wide open as possible: to find ten new financial advisers, he's willing to interview a thousand people. The equivalent of that approach, in the N.F.L., would be for a team to give up trying to figure out who the "best" college quarterback is, and, instead, try out three or four "good" candidates.

  In teaching, the implications are even more profound. They suggest that we shouldn't be raising standards. We should be lowering them, because there is no point in raising standards if standards don't track with what we care about. Teaching should be open to anyone with a pulse and a college degree--and teachers should be judged after they have started their jobs, not before. That means that the profession needs to start the equivalent of Ed Deutschlander's training camp. It needs an apprenticeship system that allows candidates to be rigorously evaluated. Kane and Staiger have calculated that, given the enormous differences between the top and the bottom of the profession, you'd probably have to try out four candidates to find one good teacher. That means tenure can't be routinely awarded, the way it is now. Currently, the salary structure of the teaching profession is highly rigid, and that would also have to change in a world where we want to rate teachers on their actual performance. An apprentice should get apprentice wages. But if we find eighty-fifth-percentile teachers who can teach a year and a half's material in one year, we're going to have to pay them a lot--both because we want them to stay and because the only way to get people to try out for what will suddenly be a high-risk profession is to offer those who survive the winnowing a healthy reward.

  Is this solution to teaching's quarterback problem politically possible? Taxpayers might well balk at the costs of trying out four teachers to find one good one. Teachers' unions have been resistant to even the slightest move away from the current tenure arrangement. But all the reformers want is for the teaching profession to copy what firms like North Star have been doing for years. Deutschlander interviews a thousand people to find ten advisers. He spends large amounts of money to figure out who has the particular mixture of abilities to do the job. "Between hard and soft costs," he says, "most firms sink between a hundred thousand dollars and two hundred and fifty thousand dollars on someone in their first three or four years," and in most cases, of course, that investment comes to naught. But, if you were willing to make that kind of investment and show that kind of patience, you wound up with a truly high-performing financial adviser. "We have a hundred and twenty-five full-time advisers," Deutschlander says. "Last year, we had seventy-one of them qualify for the Million Dollar Round Table"--the industry's association of its most successful practitioners. "We're seventy-one out of a hundred and twenty-five in that élite group." What does it say about a society that it devotes more care and patience to the selection of those who handle its money than of those who handle its children?

  7.

  Midway through the fourth quarter of the Oklahoma State-Missouri game, the Tigers were in trouble. For the first time all year, they were behind late in the game. They needed to score, or they'd lose any chance of a national championship. Daniel took the snap from his center, and planted his feet to pass. His receivers were covered. He began to run. The Oklahoma State defenders closed in on him. He was under pressure, something that rarely happened to him in the spread. Desperate, he heaved the ball downfield, right into the arms of a Cowboy defender.

  Shonka jumped up. "That's not like him!" he cried out. "He doesn't throw stuff up like that."

  Next to Shonka, a scout for the Kansas City Chiefs looked crestfallen. "Chase never throws something up for grabs!"

  Can underprivileged outsiders have an advantage?

  1.

  Sidney Weinberg was born in 1891, one of eleven children of Pincus Weinberg, a struggling Polish-born liquor wholesaler and bootlegger in Brooklyn. Sidney was short, a "Kewpie doll," as the New Yorker writer E. J. Kahn, Jr., described him, "in constant danger of being swallowed whole by executive-size chairs." He pronounced his name "Wine-boig." He left school at fifteen. He had scars on his back from knife fights in his preteen days, when he sold evening newspapers at the Hamilton Avenue terminus of the Manhattan-Brooklyn ferry.

  At sixteen, he made a visit to Wall Street, keeping an eye out for a "nice-looking, tall building," as he later recalled. He picked 43 Exchange Place, where he started at the top floor and worked his way down, asking at every office, "Want a boy?" By the end of the day, he had reached the third-floor offices of a small brokerage house. There were no openings. He returned the next morning, claimed he had been told to come back, and bluffed himself into a job assisting the janitor, for three dollars a week. The small brokerage house was Goldman Sachs.

  From that point, Charles Ellis tells us in a new book, "The Partnership: The Making of Goldman Sachs," Weinberg's rise was inexorable. Early on, he was asked to carry a flagpole on the trolley uptown to the Sachs family's town house. The door was opened by Paul Sachs, the grandson of the firm's founder, and Sachs took a shine to him. Weinberg was soon promoted to the mailroom, which he promptly reorganized. Sachs sent him to Browne's Business College, in Brooklyn, to learn penmanship. By 1925, the firm had bought him a seat on the New York Stock Exchange. By 1927, he had made partner. By 1930, he was a senior partner, and for the next thirty-nine years--until his death, in 1969--Weinberg was Goldman Sachs, turning it from a floundering, mid-tier partnership into the premier investment bank in the world.

  2.

  The rags-to-riches story--that staple of American biography--has over the years been given two very different interpretations. The nineteenth-century version stressed the value of compensating for disadvantage. If you wanted to end up on top, the thinking went, it was better to start at the bottom, because it was there that you learned the discipline and motivation essential for success. "New York merchants preferred to hire country boys, on the theory that they worked harder, and were more resolute, obedient, and cheerful than native New Yorkers," Irvin G. Wyllie wrote in his 1954 study "The Self-Made Man in America." Andrew Carnegie, whose personal history was the defining self-made-man narrative of the nineteenth century, insisted that there was an advantage to being "cradled, nursed and reared in the stimulating school of poverty." According to Carnegie, "It is not from the sons of the millionaire or the noble that the world receives its teachers, its martyrs, its inventors, its statesmen, its poets, or even its men of affairs. It is from the cottage of the poor that all these spring."

  Today, that interpretation has been reversed. Success is seen as a matter of capitalizing on socioeconomic advantage, not compensating for disadvantage. The mechanisms of social mobility--scholarships, affirmative action, housing vouchers, Head Start--all involve attempts to convert the poor from chronic outsiders to insiders, to rescue them from what is assumed to be a hopeless state. Nowadays, we don't learn from poverty, we escape from poverty, and a book like Ellis's history of Goldman Sachs is an almost perfect case study of how we have come to believe social mobility operates. Six hundred pages of Ellis's book are devoted to the modern-day Goldman, the firm that symbolized the golden era of Wall Street. From the boom years of the nineteen-eighties through the great banking bubble of the past decade, Goldman brought impeccably credentialled members of the cognitive and socioeconomic élite to Wall Street, where they conjured up fantastically complex deals and made enormous fortunes. The opening seventy-two pages of the book, however, the chapters covering the Sidney Weinberg years, seem as though they belong to a different era. The man who created what we know as Goldman Sachs was a poor, uneducated member of a despised minority--and his story is so remarkable that perhaps only Andrew Carnegie could make sense of it.

  3.

  Weinberg was not a financial wizard. His gifts were social. In his heyday, Weinberg served as a director on thirty-one corporate boards. He averaged two hundred and fifty committee or board meetings a year, and when he was not in meetings he would often take a steam at the Hotel Biltmore's Turkish baths with the likes of Robert Woodruff, of Coca-Cola, and Bernard Gimbel, of Gimbels. During the Depression, Weinberg served on Franklin Roosevelt's Business Advisory and Planning Council, and F.D.R. dubbed him the Politician, for his skill at mediating among contentious parties. He spent the war years as the vice-chairman of the War Production Board, where he was known as the Body Snatcher, because of the way he persuaded promising young business executives to join the war effort. (Weinberg seems to have been the first to realize that signing up promising young executives for public service during the war was the surest way to sign them up as clients after the war.)

  When Ford Motor Company decided to go public, in the mid-nineteen-fifties, in what remains one of the world's biggest initial public offerings, both major parties in the hugely complicated transaction--the Ford family and the Ford Foundation--wanted Weinberg to represent them. He was Mr. Wall Street. "In his role as the power behind the throne," E. J. Kahn wrote in a New Yorker Profile of Weinberg, fifty years ago, "he probably comes as close as Bernard Baruch to embodying the popular conception of Bernard Baruch." Kahn went on:

There is hardly a prominent corporation executive of whom he cannot--and, indeed, does not--say, "He's an intimate close personal friend of mine." . . . Industrialists who want information about other industrialists automatically turn to Weinberg, much as merchants consult credit-rating agencies. His end of many telephone conversations consists of fragments like "Who? . . . Of course I know him. Intimately. . . . Used to be Under-Secretary of the Treasury. . . . O.K., I'll have him call you."

  This gregariousness is what we expect of the head of an investment bank. Wall Street--particularly the clubby Wall Street of the early and middle part of the twentieth century--was a relationship business: you got to do the stock offering for Continental Can because you knew the head of Continental Can. We further assume that businesses based on social ties reward cultural insiders. That's one of the reasons we no longer think of poverty as being useful in the nineteenth-century sense; no matter how hard you work, or how disciplined you are, it is difficult to overcome the socially marginalizing effects of an impoverished background. In order to do the stock offering for Continental Can, you need to know the head of Continental Can, and in order to know the head of Continental Can it really helps to have been his classmate at Yale.

  But Weinberg wasn't Yale. He was P.S. 13. Nor did he try to pretend that he was an insider. He did the opposite. "You'll have to make that plainer," he would say. "I'm just a dumb, uneducated kid from Brooklyn." He bought a modest house in Scarsdale in the nineteen-twenties, and lived there the rest of his life. He took the subway. He may have worked closely with the White House, but this was the Roosevelt White House, in the nineteen-thirties, at a time when none of the Old Guard on Wall Street were New Dealers. Weinberg would talk about his public school as if it were Princeton, and as a joke he would buy up Phi Beta Kappa keys from pawnshops and hand them out to visitors like party favors. His savvy was such that Roosevelt wanted to make him Ambassador to the Soviet Union, and his grasp of the intricacies of Wall Street was so shrewd that his phone never stopped ringing. But as often as he could he reminded his peers that he was from the other side of the tracks.

  At one board meeting, Ellis writes, "a long presentation was being made that was overloaded with dull, detailed statistics. Number after number was read off. When the droning presenter finally paused for breath, Weinberg jumped up, waving his papers in mock triumph, to call out 'Bingo!' " The immigrant's best strategy, in the famous adage, is to think Yiddish and dress British. Weinberg thought British and dressed Yiddish.

  Why did that strategy work? This is the great mystery of Weinberg's career, and it's hard to escape the conclusion that Carnegie was on to something: there are times when being an outsider is precisely what makes you a good insider. It's not difficult to imagine, for example, that the head of Continental Can liked the fact that Weinberg was from nothing, in the same way that New York City employers preferred country boys to city boys. That C.E.O. dwelled in a world with lots of people who went to Yale and then to Wall Street; he knew that some of them were good at what they did and some of them were just well connected, and separating the able from the incompetent wasn't always easy. Weinberg made it out of Brooklyn; how could he not be good?

  Weinberg's outsiderness also allowed him to play the classic "middleman minority" role. One of the reasons that the Parsi in India, the East Asians in Africa, the Chinese in Southeast Asia, and the Lebanese in the Caribbean, among others, have been so successful, sociologists argue, is that they are decoupled from the communities in which they operate. If you are a Malaysian in Malaysia, or a Kenyan in Kenya, or an African-American in Watts, and you want to run a grocery store, you start with a handicap: you have friends and relatives who want jobs, or discounts. You can't deny credit or collect a debt from your neighbor, because he's your neighbor, and your social and business lives are tied up together. As the anthropologist Brian Foster writes of commerce in Thailand:

A trader who was subject to the traditional social obligations and constraints would find it very difficult to run a viable business. If, for example, he were fully part of the village society and subject to the constraints of the society, he would be expected to be generous in the traditional way to those in need. It would be difficult for him to refuse credit, and it would not be possible to collect debts. . . . The inherent conflict of interest in a face-to-face market transaction would make proper etiquette impossible or would at least strain it severely, which is an important factor in Thai social relations.

  The minority has none of those constraints. He's free to keep social and financial considerations separate. He can call a bad debt a bad debt, or a bad customer a bad customer, without worrying about the social implications of his honesty.

  Weinberg was decoupled from the business establishment in the same way, and that seems to have been a big part of what drew executives to him. The chairman of General Foods avowed, "Sidney is the only man I know who could ever say to me in the middle of a board meeting, as he did once, 'I don't think you're very bright,' and somehow give me the feeling that I'd been paid a compliment." That Weinberg could make a rebuke seem like a compliment is testament to his charm. That he felt free to deliver the rebuke in the first place is testament to his sociological position. You can't tell the chairman of General Foods that he's an idiot if you were his classmate at Yale. But you can if you're Pincus Weinberg's son from Brooklyn. Truthtelling is easier from a position of cultural distance.

  Here is Ellis on Weinberg, again:

Shortly after he was elected a director of General Electric, he was called upon by Philip D. Reed, GE's chairman of the board, to address a group of company officials at a banquet at the Waldorf Astoria. In presenting Weinberg, Reed said . . . that he hoped Mr. Weinberg felt, as he felt, that GE was the greatest outfit in the greatest industry in the greatest country in the world. Weinberg got to his feet. "I'll string along with your chairman about this being the greatest country," he began. "And I guess I'll even buy that part about the electrical industry. But as to GE's being the greatest business in the field, why, I'm damned if I'll commit myself until I've had a look-see." Then he sat down to vigorous applause.

  At G.E., Weinberg's irreverence was cherished. During the Second World War, a top Vichy official, Admiral Jean-François Darlan, visited the White House. Darlan was classic French military, imperious and entitled, and was thought to have Nazi sympathies. Protocol dictated that the Allies treat Darlan with civility, and everyone did--save for Weinberg. The outsider felt perfectly free to say what everyone else wanted to but could not, and in so doing surely endeared himself to the whole room. "When it was time to leave," Ellis writes, "Weinberg reached into his pocket as he came to the front door, pulled out a quarter, and handed it to the resplendently uniformed admiral, saying, 'Here, boy, get me a cab.'"

  The idea that outsiders can profit by virtue of their outsiderness runs contrary to our understanding of minorities. "Think Yiddish, dress British" presumes that the outsider is better off cloaking his differences. But there are clearly also times and places where minorities benefit by asserting and even exaggerating their otherness. The Berkeley historian Yuri Slezkine argues, in "The Jewish Century" (2004), that Yiddish did not evolve typically: if you study its form and structure, you discover its deliberate and fundamental artificiality--it is the language of people who are interested, in Slezkine's words, in "the maintenance of difference, the conscious preservation of the self and thus of strangeness."

  Similarly, in field work in a Malaysian village, the anthropologist L. A. Peter Gosling observed a Chinese shopkeeper who

appeared to be considerably acculturated to Malay culture, and was scrupulously sensitive to Malays in every way, including the normal wearing of a sarong, quiet and polite Malay speech, and a humble and affable manner. However, at harvest time when he would go to the field to collect crops on which he had advanced credit, he would put on his Chinese costume of shorts and undershirt, and speak in a much more abrupt fashion, acting, as one Malay farmer put it, "just like a Chinese." This behavior was to insure that he would not be treated like a fellow Malay who might be expected to be more generous on price or credit terms.

  Is this what Weinberg was up to with his constant references to P.S. 13? Ellis's book repeats stories about Weinberg from Lisa Endlich's 1999 history, "Goldman Sachs: The Culture of Success," which in turn repeats stories about Weinberg from Kahn's Profile, which in turn--one imagines--repeats stories honed by Weinberg and his friends over the years. And what is clear when you read those stories is how obviously they are stories: anecdotes clearly constructed for strategic effect. According to Ellis:

A friend told of Weinberg's being the guest of honor at J. P. Morgan's luncheon table, where the following exchange occurred: "Mr. Weinberg, I presume you served in the last war?"
"Yes, sir, I was in the war--in the navy."
"What were you in the navy?"
"Cook, Second Class."
Morgan was delighted.

  Of course, J. P. Morgan wasn't actually delighted. He died in 1913, before the First World War started. So he wasn't the mogul at the table. But you can understand why Weinberg would want to pretend that he was. And although Weinberg did a stint as a cook (on account of poor eyesight), he quickly got himself transferred to the Office of Naval Intelligence, and then spent most of the war heading up the inspection of all vessels using the port of Norfolk. But you can understand why that little bit of additional history doesn't fit, either.

  Here's another one:

The heir to a large retailing fortune once spent a night in Scarsdale with the Weinbergs and retired early. After Weinberg and his wife, whose only servant was a cook, had emptied the ashtrays and picked up the glasses, they noticed that their guest had put his suit and shoes outside his bedroom door. Amused, Weinberg took the suit and shoes down to the kitchen, cleaned the shoes, brushed the suit, and put them back. The following day, as the guest was leaving, he handed Weinberg a five dollar bill and asked him to pass it along to the butler who had taken such excellent care of things. Weinberg thanked him gravely and pocketed the money.

  Let's see: we're supposed to believe that the retailing heir has dinner at the modest Weinberg residence in Scarsdale and never once sees a butler, and doesn't see a butler in the morning, either, and yet somehow remains convinced that there's a butler around. Did he imagine the butler was hiding in a closet? No matter. This is another of those stories which Weinberg needed to tell, and his audience needed to hear.

  4.

  It's one thing to argue that being an outsider can be strategically useful. But Andrew Carnegie went farther. He believed that poverty provided a better preparation for success than wealth did; that, at root, compensating for disadvantage was more useful, developmentally, than capitalizing on advantage.

  This idea is both familiar and perplexing. Consider the curious fact that many successful entrepreneurs suffer from serious learning disabilities. Paul Orfalea, the founder of the Kinko's chain, was a D student who failed two grades, was expelled from four schools, and graduated at the bottom of his high-school class. "In third grade, the only word I could read was 'the,' " he says. "I used to keep track of where the group was reading by following from one 'the' to the next." Richard Branson, the British billionaire who started the Virgin empire, dropped out of school at fifteen after struggling with reading and writing. "I was always bottom of the class," he has said. John Chambers, who built the Silicon Valley firm Cisco into a hundred-billion-dollar corporation, has trouble reading e-mail. One of the pioneers of the cellular-phone industry, Craig McCaw, is dyslexic, as is Charles Schwab, the founder of the discount brokerage house that bears his name. When the business-school professor Julie Logan surveyed a group of American small-business owners recently, she found that thirty-five per cent of them self-identified as dyslexic.

  That is a remarkable statistic. Dyslexia affects the very skills that lie at the center of an individual's ability to manage the modern world. Yet Schwab and Orfalea and Chambers and Branson seem to have made up for their disabilities, in the same way that the poor, in Carnegie's view, can make up for their poverty. Because of their difficulties with reading and writing, they were forced to develop superior oral-communication and problem-solving skills. Because they had to rely on others to help them navigate the written word, they became adept at delegating authority. In one study, conducted in Britain, eighty per cent of dyslexic entrepreneurs were found to have held the position of captain in a high-school sport, versus twenty-seven per cent of non-dyslexic entrepreneurs. They compensated for their academic shortcomings by developing superior social skills, and, when they reached the workplace, those compensatory skills gave them an enormous head start. "I didn't have a lot of self-confidence as a kid," Orfalea said once, in an interview. "And that is for the good. If you have a healthy dose of rejection in your life, you are going to have to figure out how to do it your way."

  There's no question that we are less than comfortable with the claims that people like Schwab and Orfalea make on behalf of their disabilities. As impressive as their success has been, none of us would go so far as to wish dyslexia on our own children. If a disproportionately high number of entrepreneurs are dyslexic, so are a disproportionately high number of prisoners. Systems in which people compensate for disadvantage seem to us unacceptably Darwinian. The stronger get stronger, and the weaker get even weaker. The man who boasts of walking seven miles to school, barefoot, every morning, happily drives his own grandchildren ten blocks in an S.U.V. We have become convinced that the surest path to success for our children involves providing them with a carefully optimized educational experience: the "best" schools, the most highly educated teachers, the smallest classrooms, the shiniest facilities, the greatest variety of colors in the art-room paint box. But one need only look at countries where schoolchildren outperform their American counterparts--despite larger classes, shabbier schools, and smaller budgets--to wonder if our wholesale embrace of the advantages of advantages isn't as simplistic as Carnegie's wholesale embrace of the advantages of disadvantages.

  In his Profile, Kahn tells the story of a C.E.O. retreat that Weinberg attended, organized by Averell Harriman. It was at Sun Valley, Harriman's ski resort, where, Kahn writes, it emerged that Weinberg had never skied before:

Several corporation presidents pooled their cash resources to bet him twenty-five dollars that he could not ski down the steepest and longest slope in the area. Weinberg was approaching fifty but game. "I got hold of an instructor named Franz Something or Fritz Something and had a thirty-minute lesson," he says. "Then I rode up to the top of the mountain. It took me half a day to come down, and I finished with only one ski, and for two weeks I was black and blue all over, but I won the bet."

  Here you have the Waspy élite of corporate America, off in their mountain idyll, subjecting the little Jew from Brooklyn to a bit of boarding-school hazing. (In a reminder of the anti-Semitism permeating Weinberg's world, Ellis tells us that, in the Depression, Manufacturers Trust, a predominantly Jewish company, had to agree to install a Gentile as C.E.O. as a condition of being rescued by a coalition of banks.) It is also possible, though, to read that story as highlighting the determination of the Brooklyn kid who'll be damned if he's going to let himself lose a bet to those smirking C.E.O.s. One imagines that Weinberg told that tale the first way to his wife, and the second way to his buddies in the Biltmore steam room. And when he tried to get out of bed the next morning it probably occurred to him that sometimes being humiliated provides a pretty good opportunity to show a lodge full of potential clients that you would ski down a mountain for them.

  Twenty years later, Weinberg had his greatest score, handling the initial public offering for Ford Motor Company, which was founded, of course, by that odious anti-Semite Henry Ford. Did taking the business prick Weinberg's conscience? Maybe so. But he probably realized that the unstated premise behind the idea that the Jews control all the banks is that Jews are really good bankers. The first was a stereotype that oppressed; the second was a stereotype that, if you were smart about it, you could use to win a few clients. If you're trying to build an empire, you work with what you have.

  5.

  In 1918, Henry Goldman, one of the senior partners of Goldman Sachs, quit the firm in a dispute over Liberty Bonds. Goldman was a Germanophile, who objected to aiding the Allied war effort. (This is the same Henry Goldman who later bought the twelve-year-old Yehudi Menuhin a Stradivarius and Albert Einstein a yacht.) The Sachs brothers--Walter and Arthur--were desperate for a replacement, and they settled, finally, on a young man named Waddill Catchings, a close friend of Arthur Sachs from Harvard. He had worked at Sullivan & Cromwell, Wall Street's great patrician law firm. He had industrial experience, having reorganized several companies, and "on top of all that," Ellis tells us, "Catchings was one of the most talented, charming, handsome, well-educated, and upwardly mobile people in Wall Street."

  Catchings's bold idea was to create a huge investment trust, called the Goldman Sachs Trading Corporation. It was a precursor to today's hedge funds; it borrowed heavily to buy controlling stakes in groups of corporations. The fund was originally intended to be twenty-five million dollars, but then Catchings, swept up in the boom market of the nineteen-twenties, doubled it to fifty million, doubled it again to a hundred million, then merged the Goldman fund with another fund and added two subsidiary trusts, until G.S.T.C. controlled half a billion dollars in assets.

  "Walter and Arthur Sachs were travelling in Europe during the summer of 1929," Ellis writes. "In Italy they learned of the deals Catchings was doing on his own, and Walter Sachs got worried. On his return to New York, he went straight to Catchings' apartment in the Plaza Hotel to urge greater caution. But Catchings, still caught up in the bull-market euphoria, was unmoved. "The trouble with you, Walter, is that you've no imagination," he said.

  Why do we equate genius with precocity?

  1.

  Ben Fountain was an associate in the real-estate practice at the Dallas offices of Akin, Gump, Strauss, Hauer & Feld, just a few years out of law school, when he decided he wanted to write fiction. The only thing Fountain had ever published was a law-review article. His literary training consisted of a handful of creative-writing classes in college. He had tried to write when he came home at night from work, but usually he was too tired to do much. He decided to quit his job.

  "I was tremendously apprehensive," Fountain recalls. "I felt like I'd stepped off a cliff and I didn't know if the parachute was going to open. Nobody wants to waste their life, and I was doing well at the practice of law. I could have had a good career. And my parents were very proud of me--my dad was so proud of me. . . . It was crazy."

  He began his new life on a February morning--a Monday. He sat down at his kitchen table at 7:30 A.M. He made a plan. Every day, he would write until lunchtime. Then he would lie down on the floor for twenty minutes to rest his mind. Then he would return to work for a few more hours. He was a lawyer. He had discipline. "I figured out very early on that if I didn't get my writing done I felt terrible. So I always got my writing done. I treated it like a job. I did not procrastinate." His first story was about a stockbroker who uses inside information and crosses a moral line. It was sixty pages long and took him three months to write. When he finished that story, he went back to work and wrote another--and then another.

  In his first year, Fountain sold two stories. He gained confidence. He wrote a novel. He decided it wasn't very good, and he ended up putting it in a drawer. Then came what he describes as his dark period, when he adjusted his expectations and started again. He got a short story published in Harper's. A New York literary agent saw it and signed him up. He put together a collection of short stories titled "Brief Encounters with Che Guevara," and Ecco, a HarperCollins imprint, published it. The reviews were sensational. The Times Book Review called it "heartbreaking." It won the Hemingway Foundation/PEN award. It was named a No. 1 Book Sense Pick. It made major regional best-seller lists, was named one of the best books of the year by the San Francisco Chronicle, the Chicago Tribune, and Kirkus Reviews, and drew comparisons to Graham Greene, Evelyn Waugh, Robert Stone, and John le Carré.

  Ben Fountain's rise sounds like a familiar story: the young man from the provinces suddenly takes the literary world by storm. But Ben Fountain's success was far from sudden. He quit his job at Akin, Gump in 1988. For every story he published in those early years, he had at least thirty rejections. The novel that he put away in a drawer took him four years. The dark period lasted for the entire second half of the nineteen-nineties. His breakthrough with "Brief Encounters" came in 2006, eighteen years after he first sat down to write at his kitchen table. The "young" writer from the provinces took the literary world by storm at the age of forty-eight.

  2.

  Genius, in the popular conception, is inextricably tied up with precocity--doing something truly creative, we're inclined to think, requires the freshness and exuberance and energy of youth. Orson Welles made his masterpiece, "Citizen Kane," at twenty-five. Herman Melville wrote a book a year through his late twenties, culminating, at age thirty-two, with "Moby-Dick." Mozart wrote his breakthrough Piano Concerto No. 9 in E-Flat-Major at the age of twenty-one. In some creative forms, like lyric poetry, the importance of precocity has hardened into an iron law. How old was T. S. Eliot when he wrote "The Love Song of J. Alfred Prufrock" ("I grow old . . . I grow old")? Twenty-three. "Poets peak young," the creativity researcher James Kaufman maintains. Mihály Csíkszentmihályi, the author of "Flow," agrees: "The most creative lyric verse is believed to be that written by the young." According to the Harvard psychologist Howard Gardner, a leading authority on creativity, "Lyric poetry is a domain where talent is discovered early, burns brightly, and then peters out at an early age."

  A few years ago, an economist at the University of Chicago named David Galenson decided to find out whether this assumption about creativity was true. He looked through forty-seven major poetry anthologies published since 1980 and counted the poems that appear most frequently. Some people, of course, would quarrel with the notion that literary merit can be quantified. But Galenson simply wanted to poll a broad cross-section of literary scholars about which poems they felt were the most important in the American canon. The top eleven are, in order, T. S. Eliot's "Prufrock," Robert Lowell's "Skunk Hour," Robert Frost's "Stopping by Woods on a Snowy Evening," William Carlos Williams's "Red Wheelbarrow," Elizabeth Bishop's "The Fish," Ezra Pound's "The River Merchant's Wife," Sylvia Plath's "Daddy," Pound's "In a Station of the Metro," Frost's "Mending Wall," Wallace Stevens's "The Snow Man," and Williams's "The Dance." Those eleven were composed at the ages of twenty-three, forty-one, forty-eight, forty, twenty-nine, thirty, thirty, twenty-eight, thirty-eight, forty-two, and fifty-nine, respectively. There is no evidence, Galenson concluded, for the notion that lyric poetry is a young person's game. Some poets do their best work at the beginning of their careers. Others do their best work decades later. Forty-two per cent of Frost's anthologized poems were written after the age of fifty. For Williams, it's forty-four per cent. For Stevens, it's forty-nine per cent.

  The same was true of film, Galenson points out in his study "Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity." Yes, there was Orson Welles, peaking as a director at twenty-five. But then there was Alfred Hitchcock, who made "Dial M for Murder," "Rear Window," "To Catch a Thief," "The Trouble with Harry," "Vertigo," "North by Northwest," and "Psycho"--one of the greatest runs by a director in history--between his fifty-fourth and sixty-first birthdays. Mark Twain published "Adventures of Huckleberry Finn" at forty-nine. Daniel Defoe wrote "Robinson Crusoe" at fifty-eight.

  The examples that Galenson could not get out of his head, however, were Picasso and Cézanne. He was an art lover, and he knew their stories well. Picasso was the incandescent prodigy. His career as a serious artist began with a masterpiece, "Evocation: The Burial of Casagemas," produced at age twenty. In short order, he painted many of the greatest works of his career--including "Les Demoiselles d'Avignon," at the age of twenty-six. Picasso fit our usual ideas about genius perfectly.

  Cézanne didn't. If you go to the Cézanne room at the Musée d'Orsay, in Paris--the finest collection of Cézannes in the world--the array of masterpieces you'll find along the back wall were all painted at the end of his career. Galenson did a simple economic analysis, tabulating the prices paid at auction for paintings by Picasso and Cézanne with the ages at which they created those works. A painting done by Picasso in his mid-twenties was worth, he found, an average of four times as much as a painting done in his sixties. For Cézanne, the opposite was true. The paintings he created in his mid-sixties were valued fifteen times as highly as the paintings he created as a young man. The freshness, exuberance, and energy of youth did little for Cézanne. He was a late bloomer--and for some reason in our accounting of genius and creativity we have forgotten to make sense of the Cézannes of the world.

  3.

  The first day that Ben Fountain sat down to write at his kitchen table went well. He knew how the story about the stockbroker was supposed to start. But the second day, he says, he "completely freaked out." He didn't know how to describe things. He felt as if he were back in first grade. He didn't have a fully formed vision, waiting to be emptied onto the page. "I had to create a mental image of a building, a room, a façade, haircut, clothes--just really basic things," he says. "I realized I didn't have the facility to put those into words. I started going out and buying visual dictionaries, architectural dictionaries, and going to school on those."

  He began to collect articles about things he was interested in, and before long he realized that he had developed a fascination with Haiti. "The Haiti file just kept getting bigger and bigger," Fountain says. "And I thought, O.K., here's my novel. For a month or two I said I really don't need to go there, I can imagine everything. But after a couple of months I thought, Yeah, you've got to go there, and so I went, in April or May of '91."

  He spoke little French, let alone Haitian Creole. He had never been abroad. Nor did he know anyone in Haiti. "I got to the hotel, walked up the stairs, and there was this guy standing at the top of the stairs," Fountain recalls. "He said, 'My name is Pierre. You need a guide.' I said, 'You're sure as hell right, I do.' He was a very genuine person, and he realized pretty quickly I didn't want to go see the girls, I didn't want drugs, I didn't want any of that other stuff," Fountain went on. "And then it was, boom, 'I can take you there. I can take you to this person.' "

  Fountain was riveted by Haiti. "It's like a laboratory, almost," he says. "Everything that's gone on in the last five hundred years--colonialism, race, power, politics, ecological disasters--it's all there in very concentrated form. And also I just felt, viscerally, pretty comfortable there." He made more trips to Haiti, sometimes for a week, sometimes for two weeks. He made friends. He invited them to visit him in Dallas. ("You haven't lived until you've had Haitians stay in your house," Fountain says.) "I mean, I was involved. I couldn't just walk away. There's this very nonrational, nonlinear part of the whole process. I had a pretty specific time era that I was writing about, and certain things that I needed to know. But there were other things I didn't really need to know. I met a fellow who was with Save the Children, and he was on the Central Plateau, which takes about twelve hours to get to on a bus, and I had no reason to go there. But I went up there. Suffered on that bus, and ate dust. It was a hard trip, but it was a glorious trip. It had nothing to do with the book, but it wasn't wasted knowledge."

  In "Brief Encounters with Che Guevara," four of the stories are about Haiti, and they are the strongest in the collection. They feel like Haiti; they feel as if they've been written from the inside looking out, not the outside looking in. "After the novel was done, I don't know, I just felt like there was more for me, and I could keep going, keep going deeper there," Fountain recalls. "Always there's something--always something--here for me. How many times have I been? At least thirty times."

  Prodigies like Picasso, Galenson argues, rarely engage in that kind of open-ended exploration. They tend to be "conceptual," Galenson says, in the sense that they start with a clear idea of where they want to go, and then they execute it. "I can hardly understand the importance given to the word 'research,' " Picasso once said in an interview with the artist Marius de Zayas. "In my opinion, to search means nothing in painting. To find is the thing." He continued, "The several manners I have used in my art must not be considered as an evolution or as steps toward an unknown ideal of painting. . . . I have never made trials or experiments."

  But late bloomers, Galenson says, tend to work the other way around. Their approach is experimental. "Their goals are imprecise, so their procedure is tentative and incremental," Galenson writes in "Old Masters and Young Geniuses," and he goes on:

The imprecision of their goals means that these artists rarely feel they have succeeded, and their careers are consequently often dominated by the pursuit of a single objective. These artists repeat themselves, painting the same subject many times, and gradually changing its treatment in an experimental process of trial and error. Each work leads to the next, and none is generally privileged over others, so experimental painters rarely make specific preparatory sketches or plans for a painting. They consider the production of a painting as a process of searching, in which they aim to discover the image in the course of making it; they typically believe that learning is a more important goal than making finished paintings. Experimental artists build their skills gradually over the course of their careers, improving their work slowly over long periods. These artists are perfectionists and are typically plagued by frustration at their inability to achieve their goal.

  Where Picasso wanted to find, not search, Cézanne said the opposite: "I seek in painting."

  An experimental innovator would go back to Haiti thirty times. That's how that kind of mind figures out what it wants to do. When Cézanne was painting a portrait of the critic Gustave Geffroy, he made him endure eighty sittings, over three months, before announcing the project a failure. (The result is one of that string of masterpieces in the Musée d'Orsay.) When Cézanne painted his dealer, Ambroise Vollard, he made Vollard arrive at eight in the morning and sit on a rickety platform until eleven-thirty, without a break, on a hundred and fifty occasions--before abandoning the portrait. He would paint a scene, then repaint it, then paint it again. He was notorious for slashing his canvases to pieces in fits of frustration.

  Mark Twain was the same way. Galenson quotes the literary critic Franklin Rogers on Twain's trial-and-error method: "His routine procedure seems to have been to start a novel with some structural plan which ordinarily soon proved defective, whereupon he would cast about for a new plot which would overcome the difficulty, rewrite what he had already written, and then push on until some new defect forced him to repeat the process once again." Twain fiddled and despaired and revised and gave up on "Huckleberry Finn" so many times that the book took him nearly a decade to complete. The Cézannes of the world bloom late not as a result of some defect in character, or distraction, or lack of ambition, but because the kind of creativity that proceeds through trial and error necessarily takes a long time to come to fruition.

  One of the best stories in "Brief Encounters" is called "Near-Extinct Birds of the Central Cordillera." It's about an ornithologist taken hostage by the FARC guerrillas of Colombia. Like so much of Fountain's work, it reads with an easy grace. But there was nothing easy or graceful about its creation. "I struggled with that story," Fountain says. "I always try to do too much. I mean, I probably wrote five hundred pages of it in various incarnations." Fountain is at work right now on a novel. It was supposed to come out this year. It's late.

  4.

  Galenson's idea that creativity can be divided into these types--conceptual and experimental--has a number of important implications. For example, we sometimes think of late bloomers as late starters. They don't realize they're good at something until they're fifty, so of course they achieve late in life. But that's not quite right. Cézanne was painting almost as early as Picasso was. We also sometimes think of them as artists who are discovered late; the world is just slow to appreciate their gifts. In both cases, the assumption is that the prodigy and the late bloomer are fundamentally the same, and that late blooming is simply genius under conditions of market failure. What Galenson's argument suggests is something else--that late bloomers bloom late because they simply aren't much good until late in their careers.

  "All these qualities of his inner vision were continually hampered and obstructed by Cézanne's incapacity to give sufficient verisimilitude to the personae of his drama," the great English art critic Roger Fry wrote of the early Cézanne. "With all his rare endowments, he happened to lack the comparatively common gift of illustration, the gift that any draughtsman for the illustrated papers learns in a school of commercial art; whereas, to realize such visions as Cézanne's required this gift in high degree." In other words, the young Cézanne couldn't draw. Of "The Banquet," which Cézanne painted at thirty-one, Fry writes, "It is no use to deny that Cézanne has made a very poor job of it." Fry goes on, "More happily endowed and more integral personalities have been able to express themselves harmoniously from the very first. But such rich, complex, and conflicting natures as Cézanne's require a long period of fermentation." Cézanne was trying something so elusive that he couldn't master it until he'd spent decades practicing.

  This is the vexing lesson of Fountain's long attempt to get noticed by the literary world. On the road to great achievement, the late bloomer will resemble a failure: while the late bloomer is revising and despairing and changing course and slashing canvases to ribbons after months or years, what he or she produces will look like the kind of thing produced by the artist who will never bloom at all. Prodigies are easy. They advertise their genius from the get-go. Late bloomers are hard. They require forbearance and blind faith. (Let's just be thankful that Cézanne didn't have a guidance counsellor in high school who looked at his primitive sketches and told him to try accounting.) Whenever we find a late bloomer, we can't but wonder how many others like him or her we have thwarted because we prematurely judged their talents. But we also have to accept that there's nothing we can do about it. How can we ever know which of the failures will end up blooming?

  Not long after meeting Ben Fountain, I went to see the novelist Jonathan Safran Foer, the author of the 2002 best-seller "Everything Is Illuminated." Fountain is a graying man, slight and modest, who looks, in the words of a friend of his, like a "golf pro from Augusta, Georgia." Foer is in his early thirties and looks barely old enough to drink. Fountain has a softness to him, as if years of struggle have worn away whatever sharp edges he once had. Foer gives the impression that if you touched him while he was in full conversational flight you would get an electric shock.

  "I came to writing really by the back door," Foer said. "My wife is a writer, and she grew up keeping journals--you know, parents said, 'Lights out, time for bed,' and she had a little flashlight under the covers, reading books. I don't think I read a book until much later than other people. I just wasn't interested in it."

  Foer went to Princeton and took a creative-writing class in his freshman year with Joyce Carol Oates. It was, he explains, "sort of on a whim, maybe out of a sense that I should have a diverse course load." He'd never written a story before. "I didn't really think anything of it, to be honest, but halfway through the semester I arrived to class early one day, and she said, 'Oh, I'm glad I have this chance to talk to you. I'm a fan of your writing.' And it was a real revelation for me."

  Oates told him that he had the most important of writerly qualities, which was energy. He had been writing fifteen pages a week for that class, an entire story for each seminar. "Why does a dam with a crack in it leak so much?" he said, with a laugh. "There was just something in me, there was like a pressure."

  As a sophomore, he took another creative-writing class. During the following summer, he went to Europe. He wanted to find the village in Ukraine where his grandfather had come from. After the trip, he went to Prague. There he read Kafka, as any literary undergraduate would, and sat down at his computer.

  "I was just writing," he said. "I didn't know that I was writing until it was happening. I didn't go with the intention of writing a book. I wrote three hundred pages in ten weeks. I really wrote. I'd never done it like that."

  It was a novel about a boy named Jonathan Safran Foer who visits the village in Ukraine where his grandfather had come from. Those three hundred pages were the first draft of "Everything Is Illuminated"--the exquisite and extraordinary novel that established Foer as one of the most distinctive literary voices of his generation. He was nineteen years old.

  Foer began to talk about the other way of writing books, where you painstakingly honed your craft, over years and years. "I couldn't do that," he said. He seemed puzzled by it. It was clear that he had no understanding of how being an experimental innovator would work. "I mean, imagine if the craft you're trying to learn is to be an original. How could you learn the craft of being an original?"

  He began to describe his visit to Ukraine. "I went to the shtetl where my family came from. It's called Trachimbrod, the name I use in the book. It's a real place. But you know what's funny? It's the single piece of research that made its way into the book." He wrote the first sentence, and he was proud of it, and then he went back and forth in his mind about where to go next. "I spent the first week just having this debate with myself about what to do with this first sentence. And once I made the decision, I felt liberated to just create--and it was very explosive after that."

  If you read "Everything Is Illuminated," you end up with the same feeling you get when you read "Brief Encounters with Che Guevara"--the sense of transport you experience when a work of literature draws you into its own world. Both are works of art. It's just that, as artists, Fountain and Foer could not be less alike. Fountain went to Haiti thirty times. Foer went to Trachimbrod just once. "I mean, it was nothing," Foer said. "I had absolutely no experience there at all. It was just a springboard for my book. It was like an empty swimming pool that had to be filled up." Total time spent getting inspiration for his novel: three days.

  5.

  Ben Fountain did not make the decision to quit the law and become a writer all by himself. He is married and has a family. He met his wife, Sharon, when they were both in law school at Duke. When he was doing real-estate work at Akin, Gump, she was on the partner track in the tax practice at Thompson & Knight. The two actually worked in the same building in downtown Dallas. They got married in 1985, and had a son in April of 1987. Sharie, as Fountain calls her, took four months of maternity leave before returning to work. She made partner by the end of that year.

  "We had our son in a day care downtown," she recalls. "We would drive in together, one of us would take him to day care, the other one would go to work. One of us would pick him up, and then, somewhere around eight o'clock at night, we would have him bathed, in bed, and then we hadn't even eaten yet, and we'd be looking at each other, going, 'This is just the beginning.' " She made a face. "That went on for maybe a month or two, and Ben's like, 'I don't know how people do this.' We both agreed that continuing at that pace was probably going to make us all miserable. Ben said to me, 'Do you want to stay home?' Well, I was pretty happy in my job, and he wasn't, so as far as I was concerned it didn't make any sense for me to stay home. And I didn't have anything besides practicing law that I really wanted to do, and he did. So I said, 'Look, can we do this in a way that we can still have some day care and so you can write?' And so we did that."

  Ben could start writing at seven-thirty in the morning because Sharie took their son to day care. He stopped working in the afternoon because that was when he had to pick him up, and then he did the shopping and the household chores. In 1989, they had a second child, a daughter. Fountain was a full-fledged North Dallas stay-at-home dad.

  "When Ben first did this, we talked about the fact that it might not work, and we talked about, generally, 'When will we know that it really isn't working?' and I'd say, 'Well, give it ten years,' " Sharie recalled. To her, ten years didn't seem unreasonable. "It takes a while to decide whether you like something or not," she says. And when ten years became twelve and then fourteen and then sixteen, and the kids were off in high school, she stood by him, because, even during that long stretch when Ben had nothing published at all, she was confident that he was getting better. She was fine with the trips to Haiti, too. "I can't imagine writing a novel about a place you haven't at least tried to visit," she says. She even went with him once, and on the way into town from the airport there were people burning tires in the middle of the road.

  "I was making pretty decent money, and we didn't need two incomes," Sharie went on. She has a calm, unflappable quality about her. "I mean, it would have been nice, but we could live on one."

  Sharie was Ben's wife. But she was also--to borrow a term from long ago--his patron. That word has a condescending edge to it today, because we think it far more appropriate for artists (and everyone else for that matter) to be supported by the marketplace. But the marketplace works only for people like Jonathan Safran Foer, whose art emerges, fully realized, at the beginning of their career, or Picasso, whose talent was so blindingly obvious that an art dealer offered him a hundred-and-fifty-franc-a-month stipend the minute he got to Paris, at age twenty. If you are the type of creative mind that starts without a plan, and has to experiment and learn by doing, you need someone to see you through the long and difficult time it takes for your art to reach its true level.

  This is what is so instructive about any biography of Cézanne. Accounts of his life start out being about Cézanne, and then quickly turn into the story of Cézanne's circle. First and foremost is always his best friend from childhood, the writer Émile Zola, who convinces the awkward misfit from the provinces to come to Paris, and who serves as his guardian and protector and coach through the long, lean years.

  Here is Zola, already in Paris, in a letter to the young Cézanne back in Provence. Note the tone, more paternal than fraternal:

You ask me an odd question. Of course one can work here, as anywhere else, if one has the will. Paris offers, further, an advantage you can't find elsewhere: the museums in which you can study the old masters from 11 to 4. This is how you must divide your time. From 6 to 11 you go to a studio to paint from a live model; you have lunch, then from 12 to 4 you copy, in the Louvre or the Luxembourg, whatever masterpiece you like. That will make up nine hours of work. I think that ought to be enough.

  Zola goes on, detailing exactly how Cézanne could manage financially on a monthly stipend of a hundred and twenty-five francs:

I'll reckon out for you what you should spend. A room at 20 francs a month; lunch at 18 sous and dinner at 22, which makes two francs a day, or 60 francs a month. . . . Then you have the studio to pay for: the Atelier Suisse, one of the least expensive, charges, I think, 10 francs. Add 10 francs for canvas, brushes, colors; that makes 100. So you'll have 25 francs left for laundry, light, the thousand little needs that turn up.

  Camille Pissarro was the next critical figure in Cézanne's life. It was Pissarro who took Cézanne under his wing and taught him how to be a painter. For years, there would be periods in which they went off into the country and worked side by side.

  Then there was Ambroise Vollard, the sponsor of Cézanne's first one-man show, at the age of fifty-six. At the urging of Pissarro, Renoir, Degas, and Monet, Vollard hunted down Cézanne in Aix. He spotted a still-life in a tree, where it had been flung by Cézanne in disgust. He poked around the town, putting the word out that he was in the market for Cézanne's canvases. In "Lost Earth: A Life of Cézanne," the biographer Philip Callow writes about what happened next:

Before long someone appeared at his hotel with an object wrapped in a cloth. He sold the picture for 150 francs, which inspired him to trot back to his house with the dealer to inspect several more magnificent Cézannes. Vollard paid a thousand francs for the job lot, then on the way out was nearly hit on the head by a canvas that had been overlooked, dropped out the window by the man's wife. All the pictures had been gathering dust, half buried in a pile of junk in the attic.

  All this came before Vollard agreed to sit a hundred and fifty times, from eight in the morning to eleven-thirty, without a break, for a picture that Cézanne disgustedly abandoned. Once, Vollard recounted in his memoir, he fell asleep, and toppled off the makeshift platform. Cézanne berated him, incensed: "Does an apple move?" This is called friendship.

  Finally, there was Cézanne's father, the banker Louis-Auguste. From the time Cézanne first left Aix, at the age of twenty-two, Louis-Auguste paid his bills, even when Cézanne gave every indication of being nothing more than a failed dilettante. But for Zola, Cézanne would have remained an unhappy banker's son in Provence; but for Pissarro, he would never have learned how to paint; but for Vollard (at the urging of Pissarro, Renoir, Degas, and Monet), his canvases would have rotted away in some attic; and, but for his father, Cézanne's long apprenticeship would have been a financial impossibility. That is an extraordinary list of patrons. The first three--Zola, Pissarro, and Vollard--would have been famous even if Cézanne never existed, and the fourth was an unusually gifted entrepreneur who left Cézanne four hundred thousand francs when he died. Cézanne didn't just have help. He had a dream team in his corner.

  This is the final lesson of the late bloomer: his or her success is highly contingent on the efforts of others. In biographies of Cézanne, Louis-Auguste invariably comes across as a kind of grumpy philistine, who didn't appreciate his son's genius. But Louis-Auguste didn't have to support Cézanne all those years. He would have been within his rights to make his son get a real job, just as Sharie might well have said no to her husband's repeated trips to the chaos of Haiti. She could have argued that she had some right to the life style of her profession and status--that she deserved to drive a BMW, which is what power couples in North Dallas drive, instead of a Honda Accord, which is what she settled for.

  But she believed in her husband's art, or perhaps, more simply, she believed in her husband, the same way Zola and Pissarro and Vollard and--in his own, querulous way--Louis-Auguste must have believed in Cézanne. Late bloomers' stories are invariably love stories, and this may be why we have such difficulty with them. We'd like to think that mundane matters like loyalty, steadfastness, and the willingness to keep writing checks to support what looks like failure have nothing to do with something as rarefied as genius. But sometimes genius is anything but rarefied; sometimes it's just the thing that emerges after twenty years of working at your kitchen table.

  Who says big ideas are rare?

  1.

  Nathan Myhrvold met Jack Horner on the set of the "Jurassic Park" sequel in 1996. Horner is an eminent paleontologist, and was a consultant on the movie. Myhrvold was there because he really likes dinosaurs. Between takes, the two men got to talking, and Horner asked Myhrvold if he was interested in funding dinosaur expeditions.

  Myhrvold is of Nordic extraction, and he looks every bit the bearded, fair-haired Viking--not so much the tall, ferocious kind who raped and pillaged as the impish, roly-poly kind who stayed home by the fjords trying to turn lead into gold. He is gregarious, enthusiastic, and nerdy on an epic scale. He graduated from high school at fourteen. He started Microsoft's research division, leaving, in 1999, with hundreds of millions. He is obsessed with aperiodic tile patterns. (Imagine a floor tiled in a pattern that never repeats.) When Myhrvold built his own house, on the shores of Lake Washington, outside Seattle--a vast, silvery hypermodernist structure described by his wife as the place in the sci-fi movie where the aliens live--he embedded some sixty aperiodic patterns in the walls, floors, and ceilings. His front garden is planted entirely with vegetation from the Mesozoic era. ("If the 'Jurassic Park' thing happens," he says, "this is where the dinosaurs will come to eat.") One of the scholarly achievements he is proudest of is a paper he co-wrote proving that it was theoretically possible for sauropods--his favorite kind of dinosaur--to have snapped their tails back and forth faster than the speed of sound. How could he say no to the great Jack Horner?

  "What you do on a dinosaur expedition is you hike and look at the ground," Myhrvold explains. "You find bones sticking out of the dirt and, once you see something, you dig." In Montana, which is prime dinosaur country, people had been hiking around and looking for bones for at least a hundred years. But Horner wanted to keep trying. So he and Myhrvold put together a number of teams, totalling as many as fifty people. They crossed the Fort Peck reservoir in boats, and began to explore the Montana badlands in earnest. They went out for weeks at a time, several times a year. They flew equipment in on helicopters. They mapped the full dinosaur ecology--bringing in specialists from other disciplines. And they found dinosaur bones by the truckload.

  Once, a team member came across a bone sticking out from the bottom of a recently eroded cliff. It took Horner's field crew three summers to dig it out, and when they broke the bone open a black, gooey substance trickled out--a discovery that led Myhrvold and his friend Lowell Wood on a twenty-minute digression at dinner one night about how, given enough goo and a sufficient number of chicken embryos, they could "make another one."

  There was also Myhrvold's own find: a line of vertebrae, as big as apples, just lying on the ground in front of him. "It was seven years ago. It was a bunch of bones from a fairly rare dinosaur called a thescelosaurus. I said, 'Oh, my God!' I was walking with Jack and my son. Then Jack said, 'Look, there's a bone in the side of the hill.' And we look at it, and it's a piece of a jawbone with a tooth the size of a banana. It was a T. rex skull. There was nothing else it could possibly be."

  People weren't finding dinosaur bones, and they assumed that it was because they were rare. But--and almost everything that Myhrvold has been up to during the past half decade follows from this fact--it was our fault. We didn't look hard enough.

  Myhrvold gave the skeleton to the Smithsonian. It's called the N. rex. "Our expeditions have found more T. rex than anyone else in the world," Myhrvold said. "From 1909 to 1999, the world found eighteen T. rex specimens. From 1999 until now, we've found nine more." Myhrvold has the kind of laugh that scatters pigeons. "We have dominant T. rex market share."

  2.

  In 1874, Alexander Graham Bell spent the summer with his parents in Brantford, Ontario. He was twenty-seven years old, and employed as a speech therapist in Boston. But his real interest was solving the puzzle of what he then called the "harmonic telegraph." In Boston, he had tinkered obsessively with tuning forks and electromagnetic coils, often staying up all night when he was in the grip of an idea. When he went to Brantford, he brought with him an actual human ear, taken from a cadaver and preserved, to which he attached a pen, so that he could record the vibration of the ear's bones when he spoke into it.

  One day, Bell went for a walk on a bluff overlooking the Grand River, near his parents' house. In a recent biography of Bell, "Reluctant Genius," Charlotte Gray writes:

A large tree had blown down here, creating a natural and completely private belvedere, which [he] had dubbed his "dreaming place." Slouched on a wicker chair, his hands in his pockets, he stared unseeing at the swiftly flowing river below him. Far from the bustle of Boston and the pressure of competition from other eager inventors, he mulled over everything he had discovered about sound.

  In that moment, Bell knew the answer to the puzzle of the harmonic telegraph. Electric currents could convey sound along a wire if they undulated in accordance with the sound waves. Back in Boston, he hired a research assistant, Thomas Watson. He turned his attic into a laboratory, and redoubled his efforts. Then, on March 10, 1876, he set up one end of his crude prototype in his bedroom, and had Watson take the other end to the room next door. Bell, always prone to clumsiness, spilled acid on his clothes. "Mr. Watson, come here," he cried out. Watson came--but only because he had heard Bell on the receiver, plain as day. The telephone was born.

  In 1999, when Nathan Myhrvold left Microsoft and struck out on his own, he set himself an unusual goal. He wanted to see whether the kind of insight that leads to invention could be engineered. He formed a company called Intellectual Ventures. He raised hundreds of millions of dollars. He hired the smartest people he knew. It was not a venture-capital firm. Venture capitalists fund insights--that is, they let the magical process that generates new ideas take its course, and then they jump in. Myhrvold wanted to make insights--to come up with ideas, patent them, and then license them to interested companies. He thought that if he brought lots of very clever people together he could reconstruct that moment by the Grand River.

  One rainy day last November, Myhrvold held an "invention session," as he calls such meetings, on the technology of self-assembly. What if it was possible to break a complex piece of machinery into a thousand pieces and then, at some predetermined moment, have the machine put itself back together again? That had to be useful. But for what?

  The meeting, like many of Myhrvold's sessions, was held in a conference room in the Intellectual Ventures laboratory, a big warehouse in an industrial park across Lake Washington from Seattle: plasma TV screens on the walls, a long table furnished with bottles of Diet Pepsi and big bowls of cashews.

  Chairing the meeting was Casey Tegreene, an electrical engineer with a law degree, who is the chief patent counsel for I.V. He stood at one end of the table. Myhrvold was at the opposite end. Next to him was Edward Jung, whom Myhrvold met at Microsoft. Jung is lean and sleek, with closely cropped fine black hair. Once, he spent twenty-two days walking across Texas with nothing but a bedroll, a flashlight, and a rifle, from Big Bend, in the west, to Houston, where he was going to deliver a paper at a biology conference. On the other side of the table from Jung was Lowell Wood, an imposing man with graying red hair and an enormous head. Three or four pens were crammed into his shirt pocket. The screen saver on his laptop was a picture of Stonehenge.

  "You know how musicians will say, 'My teacher was So-and-So, and his teacher was So-and-So,' right back to Beethoven?" Myhrvold says. "So Lowell was the great protégé of Edward Teller. He was at Lawrence Livermore. He was the technical director of Star Wars." Myhrvold and Wood have known each other since Myhrvold was a teen-ager and Wood interviewed him for a graduate fellowship called the Hertz. "If you want to know what Nathan was like at that age," Wood said, "look at that ball of fire now and scale that up by eight or ten decibels." Wood bent the rules for Myhrvold; the Hertz was supposed to be for research in real-world problems. Myhrvold's field at that point, quantum cosmology, involved the application of quantum mechanics to the period just after the big bang, which means, as Myhrvold likes to say, that he had no interest in the universe a microsecond after its creation.

  The chairman of the chemistry department at Stanford, Richard Zare, had flown in for the day, as had Eric Leuthardt, a young neurosurgeon from Washington University, in St. Louis, who is a regular at I.V. sessions. At the back was a sombre, bearded man named Rod Hyde, who had been Wood's protégé at Lawrence Livermore.

  Tegreene began. "There really aren't any rules," he told everyone. "We may start out talking about refined plastics and end up talking about shoes, and that's O.K."

  He started in on the "prep." In the previous weeks, he and his staff had reviewed the relevant scientific literature and recent patent filings in order to come up with a short briefing on what was and wasn't known about self-assembly. A short BBC documentary was shown, on the early work of the scientist Lionel Penrose. Richard Zare passed around a set of what looked like ceramic dice. Leuthardt drew elaborate diagrams of the spine on the blackboard. Self-assembly was very useful in eye-of-the-needle problems--in cases where you had to get something very large through a very small hole--and Leuthardt wondered if it might be helpful in minimally invasive surgery.

  The conversation went in fits and starts. "I'm asking a simple question and getting a long-winded answer," Jung said at one point, quietly. Wood played the role of devil's advocate. During a break, Myhrvold announced that he had just bought a CAT scanner, on an Internet auction site.

  "I put in a minimum bid of twenty-nine hundred dollars," he said. There was much murmuring and nodding around the room. Myhrvold's friends, like Myhrvold, seemed to be of the opinion that there is no downside to having a CAT scanner, especially if you can get it for twenty-nine hundred dollars.

  Before long, self-assembly was put aside and the talk swung to how to improve X-rays, and then to the puzzling phenomenon of soldiers in Iraq who survive a bomb blast only to die a few days later of a stroke. Wood thought it was a shock wave, penetrating the soldiers' helmets and surging through their brains, tearing blood vessels away from tissue. "Lowell is the living example of something better than the Internet," Jung said after the meeting was over. "On the Internet, you can search for whatever you want, but you have to know the right terms. With Lowell, you just give him a concept, and this stuff pops out."

  Leuthardt, the neurosurgeon, thought that Wood's argument was unconvincing. The two went back and forth, arguing about how you could make a helmet that would better protect soldiers.

  "We should be careful how much mental energy we spend on this," Leuthardt said, after a few minutes.

  Wood started talking about the particular properties of bullets with tungsten cores.

  "Shouldn't someone tell the Pentagon?" a voice said, only half jokingly, from the back of the room.

  3.

  How useful is it to have a group of really smart people brainstorm for a day? When Myhrvold started out, his expectations were modest. Although he wanted insights like Alexander Graham Bell's, Bell was clearly one in a million, a genius who went on to have ideas in an extraordinary number of areas--sound recording, flight, lasers, tetrahedral construction, and hydrofoil boats, to name a few. The telephone was his obsession. He approached it from a unique perspective, that of a speech therapist. He had put in years of preparation before that moment by the Grand River, and it was impossible to know what unconscious associations triggered his great insight. Invention has its own algorithm: genius, obsession, serendipity, and epiphany in some unknowable combination. How can you put that in a bottle?

  But then, in August of 2003, I.V. held its first invention session, and it was a revelation. "Afterward, Nathan kept saying, 'There are so many inventions,' " Wood recalled. "He thought if we came up with a half-dozen good ideas it would be great, and we came up with somewhere between fifty and a hundred. I said to him, 'But you had eight people in that room who are seasoned inventors. Weren't you expecting a multiplier effect?' And he said, 'Yeah, but it was more than multiplicity.' Not even Nathan had any idea of what it was going to be like."

  The original expectation was that I.V. would file a hundred patents a year. Currently, it's filing five hundred a year. It has a backlog of three thousand ideas. Wood said that he once attended a two-day invention session presided over by Jung, and after the first day the group went out to dinner. "So Edward took his people out, plus me," Wood said. "And the eight of us sat down at a table and the attorney said, 'Do you mind if I record the evening?' And we all said no, of course not. We sat there. It was a long dinner. I thought we were lightly chewing the rag. But the next day the attorney comes up with eight single-spaced pages flagging thirty-six different inventions from dinner. Dinner."

  And the kinds of ideas the group came up with weren't trivial. Intellectual Ventures just had a patent issued on automatic, battery-powered glasses, with a tiny video camera that reads the image off the retina and adjusts the fluid-filled lenses accordingly, up to ten times a second. It just licensed off a cluster of its patents, for eighty million dollars. It has invented new kinds of techniques for making microchips and improving jet engines; it has proposed a way to custom-tailor the mesh "sleeve" that neurosurgeons can use to repair aneurysms.

  Bill Gates, whose company, Microsoft, is one of the major investors in Intellectual Ventures, says, "I can give you fifty examples of ideas they've had where, if you take just one of them, you'd have a startup company right there." Gates has participated in a number of invention sessions, and, with other members of the Gates Foundation, meets every few months with Myhrvold to brainstorm about things like malaria or H.I.V. "Nathan sent over a hundred scientific papers beforehand," Gates said of the last such meeting. "The amount of reading was huge. But it was fantastic. There's this idea they have where you can track moving things by counting wing beats. So you could build a mosquito fence and clear an entire area. They had some ideas about super-thermoses, so you wouldn't need refrigerators for certain things. They also came up with this idea to stop hurricanes. Basically, the waves in the ocean have energy, and you use that to lower the temperature differential. I'm not saying it necessarily is going to work. But it's just an example of something where you go, Wow."

  One of the sessions that Gates participated in was on the possibility of resuscitating nuclear energy. "Teller had this idea way back when that you could make a very safe, passive nuclear reactor," Myhrvold explained. "No moving parts. Proliferation-resistant. Dead simple. Every serious nuclear accident involves operator error, so you want to eliminate the operator altogether. Lowell and Rod and others wrote a paper on it once. So we did several sessions on it."

  The plant, as they conceived it, would produce something like one to three gigawatts of power, which is enough to serve a medium-sized city. The reactor core would be no more than several metres wide and about ten metres long. It would be enclosed in a sealed, armored box. The box would work for thirty years, without need for refuelling. Wood's idea was that the box would run on thorium, which is a very common, mildly radioactive metal. (The world has roughly a hundred-thousand-year supply, he figures.) Myhrvold's idea was that it should run on spent fuel from existing power plants. "Waste has negative cost," Myhrvold said. "This is how we make this idea politically and regulatorily attractive. Lowell and I had a monthlong no-holds-barred nuclear-physics battle. He didn't believe waste would work. It turns out it does." Myhrvold grinned. "He concedes it now."

  It was a long-shot idea, easily fifteen years from reality, if it became a reality at all. It was just a tantalizing idea at this point, but who wasn't interested in seeing where it would lead? "We have thirty guys working on it," he went on. "I have more people doing cutting-edge nuclear work than General Electric. We're looking for someone to partner with us, because this is a huge undertaking. We took out an ad in Nuclear News, which is the big trade journal. It looks like something from The Onion: 'Intellectual Ventures interested in nuclear-core designer and fission specialist.' And, no, the F.B.I. hasn't come knocking." He lowered his voice to a stage whisper. "Lowell is known to them."

  It was the dinosaur-bone story all over again. You sent a proper search team into territory where people had been looking for a hundred years, and, lo and behold, there's a T. rex tooth the size of a banana. Ideas weren't precious. They were everywhere, which suggested that maybe the extraordinary process that we thought was necessary for invention--genius, obsession, serendipity, epiphany--wasn't necessary at all.

  4.

  In June of 1876, a few months after he shouted out, "Mr. Watson, come here," Alexander Graham Bell took his device to the World's Fair in Philadelphia. There, before an audience that included the emperor of Brazil, he gave his most famous public performance. The emperor accompanied Bell's assistant, Willie Hubbard, to an upper gallery, where the receiver had been placed, leaving Bell with his transmitter. Below them, and out of sight, Bell began to talk. "A storm of emotions crossed the Brazilian emperor's face--uncertainty, amazement, elation," Charlotte Gray writes. "Lifting his head from the receiver . . . he gave Willie a huge grin and said, 'This thing speaks!' " Gray continues:

Soon a steady stream of portly, middle-aged men were clambering into the gallery, stripping off their jackets, and bending their ears to the receiver. "For an hour or more," Willie remembered, "all took turns in talking and listening, testing the line in every possible way, evidently looking for some trickery, or thinking that the sound was carried through the air. . . . It seemed to be nearly all too wonderful for belief."

  Bell was not the only one to give a presentation on the telephone at the Philadelphia Exhibition, however. Someone else spoke first. His name was Elisha Gray. Gray never had an epiphany overlooking the Grand River. Few have claimed that Gray was a genius. He does not seem to have been obsessive, or to have routinely stayed up all night while in the grip of an idea--although we don't really know, because, unlike Bell, he has never been the subject of a full-length biography. Gray was simply a very adept inventor. He was the author of a number of discoveries relating to the telegraph industry, including a self-adjusting relay that solved the problem of circuits sticking open or shut, and a telegraph printer--a precursor of what was later called the Teletype machine. He worked closely with Western Union. He had a very capable partner named Enos Barton, with whom he formed a company that later became the Western Electric Company and its offshoot Graybar (of Graybar Building fame). And Gray was working on the telephone at the same time that Bell was. In fact, the two filed notice with the Patent Office in Washington, D.C., on the same day--February 14, 1876. Bell went on to make telephones with the company that later became A. T. & T. Gray went on to make telephones in partnership with Western Union and Thomas Edison, and--until Gray's team was forced to settle a lawsuit with Bell's company--the general consensus was that Gray and Edison's telephone was better than Bell's telephone.

  In order to get one of the greatest inventions of the modern age, in other words, we thought we needed the solitary genius. But if Alexander Graham Bell had fallen into the Grand River and drowned that day back in Brantford, the world would still have had the telephone, the only difference being that the telephone company would have been nicknamed Ma Gray, not Ma Bell.

  5.

  This phenomenon of simultaneous discovery--what science historians call "multiples"--turns out to be extremely common. One of the first comprehensive lists of multiples was put together by William Ogburn and Dorothy Thomas, in 1922, and they found a hundred and forty-eight major scientific discoveries that fit the multiple pattern. Newton and Leibniz both discovered calculus. Charles Darwin and Alfred Russel Wallace both discovered evolution. Three mathematicians "invented" decimal fractions. Oxygen was discovered by Joseph Priestley, in Wiltshire, in 1774, and by Carl Wilhelm Scheele, in Uppsala, a year earlier. Color photography was invented at the same time by Charles Cros and by Louis Ducos du Hauron, in France. Logarithms were invented by John Napier and Henry Briggs in Britain, and by Joost Bürgi in Switzerland.

  "There were four independent discoveries of sunspots, all in 1611; namely, by Galileo in Italy, Scheiner in Germany, Fabricius in Holland and Harriott in England," Ogburn and Thomas note, and they continue:

The law of the conservation of energy, so significant in science and philosophy, was formulated four times independently in 1847, by Joule, Thomson, Colding and Helmholz. They had been anticipated by Robert Mayer in 1842. There seem to have been at least six different inventors of the thermometer and no less than nine claimants of the invention of the telescope. Typewriting machines were invented simultaneously in England and in America by several individuals in these countries. The steamboat is claimed as the "exclusive" discovery of Fulton, Jouffroy, Rumsey, Stevens and Symmington.

  For Ogburn and Thomas, the sheer number of multiples could mean only one thing: scientific discoveries must, in some sense, be inevitable. They must be in the air, products of the intellectual climate of a specific time and place. It should not surprise us, then, that calculus was invented by two people at the same moment in history. Pascal and Descartes had already laid the foundations. The Englishman John Wallis had pushed the state of knowledge still further. Newton's teacher was Isaac Barrow, who had studied in Italy, and knew the critical work of Torricelli and Cavalieri. Leibniz knew Pascal's and Descartes's work from his time in Paris. He was close to a German named Henry Oldenburg, who, now living in London, had taken it upon himself to catalogue the latest findings of the English mathematicians. Leibniz and Newton may never have actually sat down together and shared their work in detail. But they occupied a common intellectual milieu. "All the basic work was done--someone just needed to take the next step and put it together," Jason Bardi writes in "The Calculus Wars," a history of the idea's development. "If Newton and Leibniz had not discovered it, someone else would have." Calculus was in the air.

  Of course, that is not the way Newton saw it. He had done his calculus work in the mid-sixteen-sixties, but never published it. And after Leibniz came out with his calculus, in the sixteen-eighties, people in Newton's circle accused Leibniz of stealing his work, setting off one of the great scientific scandals of the seventeenth century. That is the inevitable human response. We're reluctant to believe that great discoveries are in the air. We want to believe that great discoveries are in our heads--and to each party in the multiple the presence of the other party is invariably cause for suspicion.

  Thus the biographer Robert Bruce, in "Bell: Alexander Graham Bell and the Conquest of Solitude," casts a skeptical eye on Elisha Gray. Was it entirely coincidence, he asks, that the two filed on exactly the same day? "If Gray had prevailed in the end," he goes on,

Bell and his partners, along with fanciers of the underdog, would have suspected chicanery. After all, Gray did not put his concept on paper nor even mention it to anyone until he had spent nearly a month in Washington making frequent visits to the Patent Office, and until Bell's notarized specifications had for several days been the admiration of at least some of "the people in the Patent Office." . . . It is easier to believe that a conception already forming in Gray's mind was precipitated by rumors of what Bell was about to patent, than to believe that chance alone brought Gray to inspiration and action at that precise moment.

  In "The Telephone Gambit," Seth Shulman makes the opposite case. Just before Bell had his famous conversation with Watson, Shulman points out, he visited the Patent Office in Washington. And the transmitter design that Bell immediately sketched in his notebook upon his return to Boston was identical to the sketch of the transmitter that Gray had submitted to the Patent Office. This could not be coincidence, Shulman concludes, and thereupon constructs an ingenious (and, it should be said, highly entertaining) revisionist account of Bell's invention, complete with allegations of corruption and romantic turmoil. Bell's telephone, he writes, is "one of the most consequential thefts in history."

  But surely Gray and Bell occupied their scientific moment in the same way that Leibniz and Newton did. They arrived at electric speech by more or less the same pathway. They were trying to find a way to send more than one message at a time along a telegraph wire--which was then one of the central technological problems of the day. They had read the same essential sources--particularly the work of Philipp Reis, the German physicist who had come startlingly close to building a working telephone back in the early eighteen-sixties. The arguments of Bruce and Shulman suppose that great ideas are precious. It is too much for them to imagine that a discovery as remarkable as the telephone could arise in two places at once. But five people came up with the steamboat, and nine people came up with the telescope, and, if Gray had fallen into the Grand River along with Bell, some Joe Smith somewhere would likely have come up with the telephone instead and Ma Smith would have run the show. Good ideas are out there for anyone with the wit and the will to find them, which is how a group of people can sit down to dinner, put their minds to it, and end up with eight single-spaced pages of ideas.

  6.

  Last March, Myhrvold decided to do an invention session with Eric Leuthardt and several other physicians in St. Louis. Rod Hyde came, along with a scientist from M.I.T. named Ed Boyden. Wood was there as well.

  "Lowell came in looking like the Cheshire Cat," Myhrvold recalled. "He said, 'I have a question for everyone. You have a tumor, and the tumor becomes metastatic, and it sheds metastatic cancer cells. How long do those circulate in the bloodstream before they land?' And we all said, 'We don't know. Ten times?' 'No,' he said. 'As many as a million times.' Isn't that amazing? If you had no time, you'd be screwed. But it turns out that these cells are in your blood for as long as a year before they land somewhere. What that says is that you've got a chance to intercept them."

  How did Wood come to this conclusion? He had run across a stray fact in a recent issue of The New England Journal of Medicine. "It was an article that talked about, at one point, the number of cancer cells per millilitre of blood," he said. "And I looked at that figure and said, 'Something's wrong here. That can't possibly be true.' The number was incredibly high. Too high. It has to be one cell in a hundred litres, not what they were saying--one cell in a millilitre. Yet they spoke of it so confidently. I clicked through to the references. It was a commonplace. There really were that many cancer cells."

  Wood did some arithmetic. He knew that human beings have only about five litres of blood. He knew that the heart pumps close to a hundred millilitres of blood per beat, which means that all of our blood circulates through our bloodstream in a matter of minutes. The New England Journal article was about metastatic breast cancer, and it seemed to Wood that when women die of metastatic breast cancer they don't die with thousands of tumors. The vast majority of circulating cancer cells don't do anything.
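
  Wood's reasoning can be checked with back-of-the-envelope arithmetic. The sketch below uses the figures given above--roughly five litres of blood and close to a hundred millilitres per beat--plus an assumed resting heart rate of about seventy beats a minute (an assumption added here); it illustrates the order of magnitude rather than reproducing Wood's own calculation.

```python
# Back-of-the-envelope check of how long one circuit of the bloodstream takes,
# and how many circuits a circulating cell could make in a year. Blood volume
# and stroke volume are the figures from the text; the heart rate is assumed.

blood_volume_ml = 5_000      # about five litres of blood
stroke_volume_ml = 100       # close to a hundred millilitres per beat
heart_rate_bpm = 70          # assumed resting heart rate

minutes_per_circuit = blood_volume_ml / (stroke_volume_ml * heart_rate_bpm)
print(f"one full circuit of the bloodstream: ~{minutes_per_circuit:.1f} minutes")

# A cell that stays in circulation for as long as a year goes around the loop
# on the order of hundreds of thousands of times--consistent with Wood's
# "as many as a million."
minutes_per_year = 365 * 24 * 60
circuits_per_year = minutes_per_year / minutes_per_circuit
print(f"circuits in a year: ~{circuits_per_year:,.0f}")
```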

  "It turns out that some small per cent of tumor cells are actually the deadly " "; he went on. " Tumor stem cells are what really initiate metastases. And isn't it astonishing that they have to turn over at least ten thousand times before they can find a happy home? You naïvely think it's once or twice or three times. Maybe five times at most. It isn't. In other words, metastatic cancer--the brand of cancer that kills us--is an amazingly hard thing to initiate. Which strongly suggests that if you tip things just a little bit you essentially turn off the process."

  That was the idea that Wood presented to the room in St. Louis. From there, the discussion raced ahead. Myhrvold and his inventors had already done a lot of thinking about using tiny optical filters capable of identifying and zapping microscopic particles. They also knew that finding cancer cells in blood is not hard. They're often the wrong size or the wrong shape. So what if you slid a tiny filter into a blood vessel of a cancer patient? "You don't have to intercept very much of the blood for it to work," Wood went on. "Maybe one ten-thousandth of it. The filter could be put in a little tiny vein in the back of the hand, because that's all you need. Or maybe I intercept all of the blood, but then it doesn't have to be a particularly efficient filter."

  Wood was a physicist, not a doctor, but that wasn't necessarily a liability, at this stage. "People in biology and medicine don't do arithmetic," he said. He wasn't being critical of biologists and physicians: this was, after all, a man who read medical journals for fun. He meant that the traditions of medicine encouraged qualitative observation and interpretation. But what physicists do--out of sheer force of habit and training--is measure things and compare measurements, and do the math to put measurements in context. At that moment, while reading The New England Journal, Wood had the advantages of someone looking at a familiar fact with a fresh perspective.

  That was also why Myhrvold had wanted to take his crew to St. Louis to meet with the surgeons. He likes to say that the only time a physicist and a brain surgeon meet is when the physicist is about to be cut open--and to his mind that made no sense. Surgeons had all kinds of problems that they didn't realize had solutions, and physicists had all kinds of solutions to things that they didn't realize were problems. At one point, Myhrvold asked the surgeons what, in a perfect world, would make their lives easier, and they said that they wanted an X-ray that went only skin deep. They wanted to know, before they made their first incision, what was just below the surface. When the Intellectual Ventures crew heard that, their response was amazement. "That's your dream? A subcutaneous X-ray? We can do that."

  Insight could be orchestrated: that was the lesson. If someone who knew how to make a filter had a conversation with someone who knew a lot about cancer and with someone who read the medical literature like a physicist, then maybe you could come up with a cancer treatment. It helped as well that Casey Tegreene had a law degree, Lowell Wood had spent his career dreaming up weapons for the government, Nathan Myhrvold was a ball of fire, Edward Jung had walked across Texas. They had different backgrounds and temperaments and perspectives, and if you gave them something to think about that they did not ordinarily think about--like hurricanes, or jet engines, or metastatic cancer--you were guaranteed a fresh set of eyes.

  There were drawbacks to this approach, of course. The outsider, not knowing what the insider knew, would make a lot of mistakes and chase down a lot of rabbit holes. Myhrvold admits that many of the ideas that come out of the invention sessions come to naught. After a session, the Ph.D.s on the I.V. staff examine each proposal closely and decide which ones are worth pursuing. They talk to outside experts; they reread the literature. Myhrvold isn't even willing to guess what his company's most promising inventions are. "That's a fool's game," he says. If ideas are cheap, there is no point in making predictions, or worrying about failures, or obsessing, like Newton and Leibniz, or Bell and Gray, over who was first. After I.V. came up with its cancer-filter idea, it discovered that there was a company, based in Rochester, that was already developing a cancer filter. Filters were a multiple. But so what? If I.V.'s design wasn't the best, Myhrvold had two thousand nine hundred and ninety-nine other ideas to pursue.

  In his living room, Myhrvold has a life-size T. rex skeleton, surrounded by all manner of other dinosaur artifacts. One of those is a cast of a nest of oviraptor eggs, each the size of an eggplant. You'd think a bird that big would have one egg, or maybe two. That's the general rule: the larger the animal, the lower the fecundity. But it didn't. For Myhrvold, it was one of the many ways in which dinosaurs could teach us about ourselves. "You know how many eggs were in that nest?" Myhrvold asked. "Thirty-two."

  7.

  In the nineteen-sixties, the sociologist Robert K. Merton wrote a famous essay on scientific discovery in which he raised the question of what the existence of multiples tells us about genius. No one is a partner to more multiples, he pointed out, than a genius, and he came to the conclusion that our romantic notion of the genius must be wrong. A scientific genius is not a person who does what no one else can do; he or she is someone who does what it takes many others to do. The genius is not a unique source of insight; he is merely an efficient source of insight. "Consider the case of Kelvin, by way of illustration," Merton writes, summarizing work he had done with his Columbia colleague Elinor Barber:

After examining some 400 of his 661 scientific communications and addresses . . . Dr. Elinor Barber and I find him testifying to at least 32 multiple discoveries in which he eventually found that his independent discoveries had also been made by others. These 32 multiples involved an aggregate of 30 other scientists, some, like Stokes, Green, Helmholtz, Cavendish, Clausius, Poincaré, Rayleigh, themselves men of undeniable genius, others, like Hankel, Pfaff, Homer Lane, Varley and Lamé, being men of talent, no doubt, but still not of the highest order. . . . For the hypothesis that each of these discoveries was destined to find expression, even if the genius of Kelvin had not obtained, there is the best of traditional proof: each was in fact made by others. Yet Kelvin's stature as a genius remains undiminished. For it required a considerable number of others to duplicate these 32 discoveries which Kelvin himself made.

  This is, surely, what an invention session is: it is Hankel, Pfaff, Homer Lane, Varley, and Lamé in a room together, and if you have them on your staff you can get a big chunk of Kelvin's discoveries, without ever needing to have Kelvin--which is fortunate, because, although there are plenty of Homer Lanes, Varleys, and Pfaffs in the world, there are very few Kelvins.

  Merton's observation about scientific geniuses is clearly not true of artistic geniuses, however. You can't pool the talents of a dozen Salieris and get Mozart's Requiem. You can't put together a committee of really talented art students and get Matisse's "La Danse." A work of artistic genius is singular, and all the arguments over calculus, the accusations back and forth between the Bell and the Gray camps, and our persistent inability to come to terms with the existence of multiples are the result of our misplaced desire to impose the paradigm of artistic invention on a world where it doesn't belong. Shakespeare owned Hamlet because he created him, as none other before or since could. Alexander Graham Bell owned the telephone only because his patent application landed on the examiner's desk a few hours before Gray's. The first kind of creation was sui generis; the second could be re-created in a warehouse outside Seattle.

  This is a confusing distinction, because we use the same words to describe both kinds of inventors, and the brilliant scientist is every bit as dazzling in person as the brilliant playwright. The unavoidable first response to Myhrvold and his crew is to think of them as a kind of dream team, but, of course, the fact that they invent as prodigiously and effortlessly as they do is evidence that they are not a dream team at all. You could put together an Intellectual Ventures in Los Angeles, if you wanted to, and Chicago, and New York and Baltimore, and anywhere you could find enough imagination, a fresh set of eyes, and a room full of Varleys and Pfaffs.

  The statistician Stephen Stigler once wrote an elegant essay about the futility of the practice of eponymy in science--that is, the practice of naming a scientific discovery after its inventor. That's another idea inappropriately borrowed from the cultural realm. As Stigler pointed out, "It can be found that Laplace employed Fourier Transforms in print before Fourier published on the topic, that Lagrange presented Laplace Transforms before Laplace began his scientific career, that Poisson published the Cauchy distribution in 1824, twenty-nine years before Cauchy touched on it in an incidental manner, and that Bienaymé stated and proved the Chebychev Inequality a decade before and in greater generality than Chebychev's first work on the topic." For that matter, the Pythagorean theorem was known before Pythagoras; Gaussian distributions were not discovered by Gauss. The examples were so legion that Stigler declared the existence of Stigler's Law: "No scientific discovery is named after its original discoverer." There are just too many people with an equal shot at those ideas floating out there in the ether. We think we're pinning medals on heroes. In fact, we're pinning tails on donkeys.

  Stigler's Law was true, Stigler gleefully pointed out, even of Stigler's Law itself. The idea that credit does not align with discovery, he reveals at the very end of his essay, was in fact first put forth by Merton. "We may expect," Stigler concluded, "that in years to come, Robert K. Merton, and his colleagues and students, will provide us with answers to these and other questions regarding eponymy, completing what, but for the Law, would be called the Merton Theory of the reward system of science."

  8.

  In April, Lowell Wood was on the East Coast for a meeting of the Hertz Foundation fellows in Woods Hole. Afterward, he came to New York to make a pilgrimage to the American Museum of Natural History. He had just half a day, so he began right away in the Dinosaur Halls. He spent what he later described as a "ridiculously prolonged" period of time at the first station in the Ornithischian Hall--the ankylosaurus shrine. He knew it by heart. His next stop was the dimetrodon, the progenitor of Mammalia. This was a family tradition. When Wood first took his daughter to the museum, she dubbed the fossil "Great Grand-Uncle Dimetrodon," and they always paid their respects to it. Next, he visited a glyptodont; this creature was the only truly armored mammal, a fact of great significance to a former weaponeer.

  What I.Q. doesn't tell you about race.

  If what I.Q. tests measure is immutable and innate, what explains the Flynn effect--the steady rise in scores across generations?

  1.

  One Saturday in November of 1984, James Flynn, a social scientist at the University of Otago, in New Zealand, received a large package in the mail. It was from a colleague in Utrecht, and it contained the results of I.Q. tests given to two generations of Dutch eighteen-year-olds. When Flynn looked through the data, he found something puzzling. The Dutch eighteen-year-olds from the nineteen-eighties scored better than those who took the same tests in the nineteen-fifties--and not just slightly better, much better.

  Curious, Flynn sent out some letters. He collected intelligence-test results from Europe, from North America, from Asia, and from the developing world, until he had data for almost thirty countries. In every case, the story was pretty much the same. I.Q.s around the world appeared to be rising by 0.3 points per year, or three points per decade, for as far back as the tests had been administered. For some reason, human beings seemed to be getting smarter.

  Flynn has been writing about the implications of his findings--now known as the Flynn effect--for almost twenty-five years. His books consist of a series of plainly stated statistical observations, in support of deceptively modest conclusions, and the evidence in support of his original observation is now so overwhelming that the Flynn effect has moved from theory to fact. What remains uncertain is how to make sense of the Flynn effect. If an American born in the nineteen-thirties has an I.Q. of 100, the Flynn effect says that his children will have I.Q.s of 108, and his grandchildren I.Q.s of close to 120--more than a standard deviation higher. If we work in the opposite direction, the typical teen-ager of today, with an I.Q. of 100, would have had grandparents with average I.Q.s of 82--seemingly below the threshold necessary to graduate from high school. And, if we go back even farther, the Flynn effect puts the average I.Q.s of the schoolchildren of 1900 at around 70, which is to suggest, bizarrely, that a century ago the United States was populated largely by people who today would be considered mentally retarded.
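
  The extrapolation behind those figures can be worked out directly. The sketch below uses the rate quoted above--three-tenths of a point per year--and assumes a generation of roughly thirty years (the generation length is an assumption added here); it roughly reproduces the numbers in the preceding paragraph.

```python
# Worked version of the Flynn-effect extrapolation. The rate of 0.3 points per
# year comes from the text; the thirty-year generation is an assumption, and a
# few years either way shifts the results by a point or two.

rate_per_year = 0.3
years_per_generation = 30  # assumed

gain = rate_per_year * years_per_generation  # about nine points per generation

# Forward from someone born in the nineteen-thirties with an I.Q. of 100:
print(f"children: ~{100 + gain:.0f}")            # ~109
print(f"grandchildren: ~{100 + 2 * gain:.0f}")   # ~118

# Backward from a teen-ager of today with an I.Q. of 100:
print(f"grandparents: ~{100 - 2 * gain:.0f}")    # ~82
print(f"a century earlier: ~{100 - rate_per_year * 100:.0f}")  # ~70
```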

  2.

  For almost as long as there have been I.Q. tests, there have been I.Q. fundamentalists. H. H. Goddard, in the early years of the past century, established the idea that intelligence could be measured along a single, linear scale. One of his particular contributions was to coin the word "moron." "The people who are doing the drudgery are, as a rule, in their proper places," he wrote. Goddard was followed by Lewis Terman, in the nineteen-twenties, who rounded up the California children with the highest I.Q.s, and confidently predicted that they would sit at the top of every profession. In 1969, the psychometrician Arthur Jensen argued that programs like Head Start, which tried to boost the academic performance of minority children, were doomed to failure, because I.Q. was so heavily genetic; and in 1994, Richard Herrnstein and Charles Murray published their best-selling hereditarian primer "The Bell Curve," which argued that blacks were innately inferior in intelligence to whites. To the I.Q. fundamentalist, two things are beyond dispute: first, that I.Q. tests measure some hard and identifiable trait that predicts the quality of our thinking; and, second, that this trait is stable--that is, it is determined by our genes and largely impervious to environmental influences.

  This is what James Watson, the co-discoverer of DNA, meant when he told an English newspaper recently that he was "inherently gloomy" about the prospects for Africa. From the perspective of an I.Q. fundamentalist, the fact that Africans score lower than Europeans on I.Q. tests suggests an ineradicable cognitive disability. In the controversy that followed, Watson was defended by the journalist William Saletan, in a three-part series for the online magazine Slate. Drawing heavily on the work of J. Philippe Rushton--a psychologist who specializes in comparing the circumference of what he calls the Negroid brain with the length of the Negroid penis--Saletan took the fundamentalist position to its logical conclusion. To erase the difference between blacks and whites, Saletan wrote, would probably require vigorous interbreeding between the races, or some kind of corrective genetic engineering aimed at upgrading African stock. "Economic and cultural theories have failed to explain most of the pattern," Saletan declared, claiming to have been "soaking [his] head in each side's computations and arguments." One argument that Saletan never soaked his head in, however, was Flynn's, because what Flynn discovered in his mailbox upsets the certainties upon which I.Q. fundamentalism rests. If whatever the thing is that I.Q. tests measure can jump so much in a generation, it can't be all that immutable and it doesn't look all that innate.

  The very fact that average I.Q.s shift over time ought to create a "crisis of confidence," Flynn writes in "What Is Intelligence?" (Cambridge; $22), his latest attempt to puzzle through the implications of his discovery. "How could such huge gains be intelligence gains? Either the children of today were far brighter than their parents or, at least in some circumstances, I.Q. tests were not good measures of intelligence."

  3.

  The best way to understand why I.Q.s rise, Flynn argues, is to look at one of the most widely used I.Q. tests, the so-called WISC (for Wechsler Intelligence Scale for Children). The WISC is composed of ten subtests, each of which measures a different aspect of I.Q. Flynn points out that scores in some of the categories--those measuring general knowledge, say, or vocabulary or the ability to do basic arithmetic--have risen only modestly over time. The big gains on the WISC are largely in the category known as "similarities," where you get questions such as "In what way are 'dogs' and 'rabbits' alike?" Today, we tend to give what, for the purposes of I.Q. tests, is the right answer: dogs and rabbits are both mammals. A nineteenth-century American would have said that "you use dogs to hunt rabbits."

  "If the everyday world is your cognitive home, it is not natural to detach abstractions and logic and the hypothetical from their concrete referents," Flynn writes. Our great-grandparents may have been perfectly intelligent. But they would have done poorly on I.Q. tests because they did not participate in the twentieth century's great cognitive revolution, in which we learned to sort experience according to a new set of abstract categories. In Flynn's phrase, we have now had to put on "scientific spectacles," which enable us to make sense of the WISC questions about similarities. To say that Dutch I.Q. scores rose substantially between 1952 and 1982 was another way of saying that the Netherlands in 1982 was, in at least certain respects, much more cognitively demanding than the Netherlands in 1952. An I.Q., in other words, measures not so much how smart we are as how modern we are.

  This is a critical distinction. When the children of Southern Italian immigrants were given I.Q. tests in the early part of the past century, for example, they recorded median scores in the high seventies and low eighties, a full standard deviation below their American and Western European counterparts. Southern Italians did as poorly on I.Q. tests as Hispanics and blacks did. As you can imagine, there was much concerned talk at the time about the genetic inferiority of Italian stock, of the inadvisability of letting so many second-class immigrants into the United States, and of the squalor that seemed endemic to Italian urban neighborhoods. Sound familiar? These days, when talk turns to the supposed genetic differences in the intelligence of certain races, Southern Italians have disappeared from the discussion. "Did their genes begin to mutate somewhere in the 1930s?" the psychologists Seymour Sarason and John Doris ask, in their account of the Italian experience. "Or is it possible that somewhere in the 1920s, if not earlier, the sociocultural history of Italo-Americans took a turn from the blacks and the Spanish Americans which permitted their assimilation into the general undifferentiated mass of Americans?"

  The psychologist Michael Cole and some colleagues once gave members of the Kpelle tribe, in Liberia, a version of the WISC similarities test: they took a basket of food, tools, containers, and clothing and asked the tribesmen to sort them into appropriate categories. To the frustration of the researchers, the Kpelle chose functional pairings. They put a potato and a knife together because a knife is used to cut a potato. "A wise man could only do such-and-such," they explained. Finally, the researchers asked, "How would a fool do it?" The tribesmen immediately re-sorted the items into the "right" categories. It can be argued that taxonomical categories are a developmental improvement--that is, that the Kpelle would be more likely to advance, technologically and scientifically, if they started to see the world that way. But to label them less intelligent than Westerners, on the basis of their performance on that test, is merely to state that they have different cognitive preferences and habits. And if I.Q. varies with habits of mind, which can be adopted or discarded in a generation, what, exactly, is all the fuss about?

  When I was growing up, my family would sometimes play Twenty Questions on long car trips. My father was one of those people who insist that the standard categories of animal, vegetable, and mineral be supplemented with a fourth category: "abstract." Abstract could mean something like "whatever it was that was going through my mind when we drove past the water tower fifty miles back." That abstract category sounds absurdly difficult, but it wasn't: it merely required that we ask a slightly different set of questions and grasp a slightly different set of conventions, and, after two or three rounds of practice, guessing the contents of someone's mind fifty miles ago becomes as easy as guessing Winston Churchill. (There is one exception. That was the trip on which my old roommate Tom Connell chose, as an abstraction, "the Unknown Soldier"--which allowed him legitimately and gleefully to answer "I have no idea" to almost every question. There were four of us playing. We gave up after an hour.) Flynn would say that my father was teaching his three sons how to put on scientific spectacles, and that extra practice probably bumped up all of our I.Q.s a few notches. But let's be clear about what this means. There's a world of difference between an I.Q. advantage that's genetic and one that depends on extended car time with Graham Gladwell.

  4.

  Flynn is a cautious and careful writer. Unlike many others in the I.Q. debates, he resists grand philosophizing. He comes back again and again to the fact that I.Q. scores are generated by paper-and-pencil tests--and making sense of those scores, he tells us, is a messy and complicated business that requires something closer to the skills of an accountant than to those of a philosopher.

  For instance, Flynn shows what happens when we recognize that I.Q. is not a freestanding number but a value attached to a specific time and a specific test. When an I.Q. test is created, he reminds us, it is calibrated or "normed" so that the test-takers in the fiftieth percentile--those exactly at the median--are assigned a score of 100. But since I.Q.s are always rising, the only way to keep that hundred-point benchmark is periodically to make the tests more difficult--to "renorm" them. The original WISC was normed in the late nineteen-forties. It was then renormed in the early nineteen-seventies, as the WISC-R; renormed a third time in the late eighties, as the WISC III; and renormed again a few years ago, as the WISC IV--with each version just a little harder than its predecessor. The notion that anyone "has" an I.Q. of a certain number, then, is meaningless unless you know which WISC he took, and when he took it, since there's a substantial difference between getting a 130 on the WISC IV and getting a 130 on the much easier WISC.
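
  A minimal sketch of what norming means in practice, assuming the usual convention that an I.Q. is a raw score rescaled so that the norming sample averages 100 with a standard deviation of 15: the raw scores and norms below are invented for illustration, and the point is only that the same performance earns a different I.Q. against easier and harder norms.

```python
# Illustration of norming, with made-up numbers. An I.Q. is a raw score
# rescaled against whatever sample the test was normed on, so the same raw
# score means different things on differently normed versions of a test.

def iq_from_raw(raw, norm_mean, norm_sd, scale_mean=100, scale_sd=15):
    """Convert a raw test score to an I.Q. relative to a norming sample."""
    return scale_mean + scale_sd * (raw - norm_mean) / norm_sd

# The same child, with the same raw score of 60:
print(iq_from_raw(60, norm_mean=50, norm_sd=10))  # 115.0 on older, easier norms
print(iq_from_raw(60, norm_mean=55, norm_sd=10))  # 107.5 on newer, tougher norms
```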

  This is not a trivial issue. I.Q. tests are used to diagnose people as mentally retarded, with a score of 70 generally taken to be the cutoff. You can imagine how the Flynn effect plays havoc with that system. In the nineteen-seventies and eighties, most states used the WISC-R to make their mental-retardation diagnoses. But since kids--even kids with disabilities--score a little higher every year, the number of children whose scores fell below 70 declined steadily through the end of the eighties. Then, in 1991, the WISC III was introduced, and suddenly the percentage of kids labelled retarded went up. The psychologists Tomoe Kanaya, Matthew Scullin, and Stephen Ceci estimated that, if every state had switched to the WISC III right away, the number of Americans labelled mentally retarded should have doubled.
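
  An illustration of why a renorming can move so many children across the cutoff: assume I.Q.s are roughly normally distributed with a standard deviation of 15, and that the switch to the tougher WISC III knocked about five points off observed scores. Those assumptions are added here for the sake of the sketch; this is not Kanaya, Scullin, and Ceci's actual model.

```python
# How a modest downward shift in observed scores changes the share of children
# who fall below the cutoff of 70. Normality, the 15-point standard deviation,
# and the 5-point shift are illustrative assumptions.

from statistics import NormalDist

sd, cutoff = 15, 70

before = NormalDist(mu=100, sigma=sd).cdf(cutoff)  # share below 70 on old norms
after = NormalDist(mu=95, sigma=sd).cdf(cutoff)    # same children, ~5 points lower

print(f"below 70 before renorming: {before:.1%}")  # ~2.3%
print(f"below 70 after renorming:  {after:.1%}")   # ~4.8%--roughly double
```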

  That is an extraordinary number. The diagnosis of mental disability is one of the most stigmatizing of all educational and occupational classifications--and yet, apparently, the chances of being burdened with that label are in no small degree a function of the point, in the life cycle of the WISC, at which a child happens to sit for his evaluation. "As far as I can determine, no clinical or school psychologists using the WISC over the relevant 25 years noticed that its criterion of mental retardation became more lenient over time," Flynn wrote, in a 2000 paper. "Yet no one drew the obvious moral about psychologists in the field: They simply were not making any systematic assessment of the I.Q. criterion for mental retardation."

  Flynn brings a similar precision to the question of whether Asians have a genetic advantage in I.Q., a possibility that has led to great excitement among I.Q. fundamentalists in recent years. Data showing that the Japanese had higher I.Q.s than people of European descent, for example, prompted the British psychometrician and eugenicist Richard Lynn to concoct an elaborate evolutionary explanation involving the Himalayas, really cold weather, premodern hunting practices, brain size, and specialized vowel sounds. The fact that the I.Q.s of Chinese-Americans also seemed to be elevated has led I.Q. fundamentalists to posit the existence of an international I.Q. pyramid, with Asians at the top, European whites next, and Hispanics and blacks at the bottom.

  Here was a question tailor-made for James Flynn's accounting skills. He looked first at Lynn's data, and realized that the comparison was skewed. Lynn was comparing American I.Q. estimates based on a representative sample of schoolchildren with Japanese estimates based on an upper-income, heavily urban sample. Recalculated, the Japanese average came in not at 106.6 but at 99.2. Then Flynn turned his attention to the Chinese-American estimates. They turned out to be based on a 1975 study in San Francisco's Chinatown using something called the Lorge-Thorndike Intelligence Test. But the Lorge-Thorndike test was normed in the nineteen-fifties. For children in the nineteen-seventies, it would have been a piece of cake. When the Chinese-American scores were reassessed using up-to-date intelligence metrics, Flynn found, they came in at 97 verbal and 100 nonverbal. Chinese-Americans had slightly lower I.Q.s than white Americans.

  The Asian-American success story had suddenly been turned on its head. The numbers now suggested, Flynn said, that they had succeeded not because of their higher I.Q.s but despite their lower I.Q.s. Asians were overachievers. In a nifty piece of statistical analysis, Flynn then worked out just how great that overachievement was. Among whites, virtually everyone who joins the ranks of the managerial, professional, and technical occupations has an I.Q. of 97 or above. Among Chinese-Americans, that threshold is 90. A Chinese-American with an I.Q. of 90, it would appear, does as much with it as a white American with an I.Q. of 97.

  There should be no great mystery about Asian achievement. It has to do with hard work and dedication to higher education, and belonging to a culture that stresses professional success. But Flynn makes one more observation. The children of that first successful wave of Asian-Americans really did have I.Q.s that were higher than everyone else's--coming in somewhere around 103. Having worked their way into the upper reaches of the occupational scale, and taken note of how much the professions value abstract thinking, Asian-American parents have evidently made sure that their own children wore scientific spectacles. "Chinese Americans are an ethnic group for whom high achievement preceded high I.Q. rather than the reverse," Flynn concludes, reminding us that in our discussions of the relationship between I.Q. and success we often confuse causes and effects. "It is not easy to view the history of their achievements without emotion," he writes. That is exactly right. To ascribe Asian success to some abstract number is to trivialize it.

  5.

  Two weeks ago, Flynn came to Manhattan to debate Charles Murray at a forum sponsored by the Manhattan Institute. Their subject was the black-white I.Q. gap in America. During the twenty-five years after the Second World War, that gap closed considerably. The I.Q.s of white Americans rose, as part of the general worldwide Flynn effect, but the I.Q.s of black Americans rose faster. Then, for a period of about twenty-five years, that trend stalled--and the question was why.

  Murray showed a series of PowerPoint slides, each representing different statistical formulations of the I.Q. gap. He appeared to be pessimistic that the racial difference would narrow in the future. "By the nineteen-seventies, you had gotten most of the juice out of the environment that you were going to get," he said. That gap, he seemed to think, reflected some inherent difference between the races. "Starting in the nineteen-seventies, to put it very crudely, you had a higher proportion of black kids being born to really dumb mothers," he said. When the debate's moderator, Jane Waldfogel, informed him that the most recent data showed that the race gap had begun to close again, Murray seemed unimpressed, as if the possibility that blacks could ever make further progress was inconceivable.

  Flynn took a different approach. The black-white gap, he pointed out, differs dramatically by age. He noted that the tests we have for measuring the cognitive functioning of infants, though admittedly crude, show the races to be almost the same. By age four, the average black I.Q. is 95.4--only four and a half points behind the average white I.Q. Then the real gap emerges: from age four through twenty-four, blacks lose six-tenths of a point a year, until their scores settle at 83.4.
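
  The arithmetic is easy to verify: starting at 95.4 at age four and losing six-tenths of a point a year over the next twenty years lands exactly on the figure Flynn cites.

```python
# Checking the age profile described above: 95.4 at age four, minus 0.6 points
# a year from age four through twenty-four.
start, rate, years = 95.4, 0.6, 24 - 4
print(f"{start - rate * years:.1f}")  # 83.4
```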

  That steady decline, Flynn said, did not resemble the usual pattern of genetic influence. Instead, it was exactly what you would expect, given the disparate cognitive environments that whites and blacks encounter as they grow older. Black children are more likely to be raised in single-parent homes than are white children--and single-parent homes are less cognitively complex than two-parent homes. The average I.Q. of first-grade students in schools that blacks attend is 95, which means that "kids who want to be above average don't have to aim as high." There were possibly adverse differences between black teen-age culture and white teen-age culture, and an enormous number of young black men are in jail--which is hardly the kind of environment in which someone would learn to put on scientific spectacles.

  Criminal profiling made easy.

  1.

  On November 16, 1940, workers at the Consolidated Edison building on West Sixty-fourth Street in Manhattan found a homemade pipe bomb on a windowsill. Attached was a note: "Con Edison crooks, this is for you." In September of 1941, a second bomb was found, on Nineteenth Street, just a few blocks from Con Edison's headquarters, near Union Square. It had been left in the street, wrapped in a sock. A few months later, the New York police received a letter promising to "bring the Con Edison to justice--they will pay for their dastardly deeds." Sixteen other letters followed, between 1941 and 1946, all written in block letters, many repeating the phrase "dastardly deeds" and all signed with the initials "F.P." In March of 1950, a third bomb--larger and more powerful than the others--was found on the lower level of Grand Central Terminal. The next was left in a phone booth at the New York Public Library. It exploded, as did one placed in a phone booth in Grand Central. In 1954, the Mad Bomber--as he came to be known--struck four times, once in Radio City Music Hall, sending shrapnel throughout the audience. In 1955, he struck six times. The city was in an uproar. The police were getting nowhere. Late in 1956, in desperation, Inspector Howard Finney, of the New York City Police Department's crime laboratory, and two plainclothesmen paid a visit to a psychiatrist by the name of James Brussel.

  Brussel was a Freudian. He lived on Twelfth Street, in the West Village, and smoked a pipe. In Mexico, early in his career, he had done counter-espionage work for the F.B.I. He wrote many books, including "Instant Shrink: How to Become an Expert Psychiatrist in Ten Easy Lessons." Finney put a stack of documents on Brussel's desk: photographs of unexploded bombs, pictures of devastation, photostats of F.P.'s neatly lettered missives. "I didn't miss the look in the two plainclothesmen's eyes," Brussel writes in his memoir, "Casebook of a Crime Psychiatrist." "I'd seen that look before, most often in the Army, on the faces of hard, old-line, field-grade officers who were sure this newfangled psychiatry business was all nonsense."

  He began to leaf through the case materials. For sixteen years, F.P. had been fixated on the notion that Con Ed had done him some terrible injustice. Clearly, he was clinically paranoid. But paranoia takes some time to develop. F.P. had been bombing since 1940, which suggested that he was now middle-aged. Brussel looked closely at the precise lettering of F.P.'s notes to the police. This was an orderly man. He would be cautious. His work record would be exemplary. Further, the language suggested some degree of education. But there was a stilted quality to the word choice and the phrasing. Con Edison was often referred to as "the Con Edison." And who still used the expression "dastardly deeds"? F.P. seemed to be foreign-born. Brussel looked closer at the letters, and noticed that all the letters were perfect block capitals, except the "W"s. They were misshapen, like two "U"s. To Brussel's eye, those "W"s looked like a pair of breasts. He flipped to the crime-scene descriptions. When F.P. planted his bombs in movie theatres, he would slit the underside of the seat with a knife and stuff his explosives into the upholstery. Didn't that seem like a symbolic act of penetrating a woman, or castrating a man--or perhaps both? F.P. had probably never progressed beyond the Oedipal stage. He was unmarried, a loner. Living with a mother figure. Brussel made another leap. F.P. was a Slav. Just as the use of a garrote would have suggested someone of Mediterranean extraction, the bomb-knife combination struck him as Eastern European. Some of the letters had been posted from Westchester County, but F.P. wouldn't have mailed the letters from his home town. Still, a number of cities in southeastern Connecticut had a large Slavic population. And didn't you have to pass through Westchester to get to the city from Connecticut?

  Brussel waited a moment, and then, in a scene that has become legendary among criminal profilers, he made a prediction:

"One more thing." I closed my eyes because I didn't want to see their reaction. I saw the Bomber: impeccably neat, absolutely proper. A man who would avoid the newer styles of clothing until long custom had made them conservative. I saw him clearly--much more clearly than the facts really warranted. I knew I was letting my imagination get the better of me, but I couldn't help it.
"One more " I said, my eyes closed tight. "When you catch him--and I have no doubt you will--he'll be wearing a double-breasted suit."
"Jesus!" one of the detectives whispered.
"And it will be buttoned," I said. I opened my eyes. Finney and his men were looking at each other.
"A double-breasted suit," said the Inspector.
"Yes."
"Buttoned."
"Yes."
He nodded. Without another word, they left.

  A month later, George Metesky was arrested by police in connection with the New York City bombings. His name had been changed from Milauskas. He lived in Waterbury, Connecticut, with his two older sisters. He was unmarried. He was unfailingly neat. He attended Mass regularly. He had been employed by Con Edison from 1929 to 1931, and claimed to have been injured on the job. When he opened the door to the police officers, he said, "I know why you fellows are here. You think I'm the Mad Bomber." It was midnight, and he was in his pajamas. The police asked that he get dressed. When he returned, his hair was combed into a pompadour and his shoes were newly shined. He was also wearing a double-breasted suit--buttoned.

  2.

  In a new book, "Inside the Mind of BTK," the eminent F.B.I. criminal profiler John Douglas tells the story of a serial killer who stalked the streets of Wichita, Kansas, in the nineteen-seventies and eighties. Douglas was the model for Agent Jack Crawford in "The Silence of the Lambs." He was the protégé of the pioneering F.B.I. profiler Howard Teten, who helped establish the bureau's Behavioral Science Unit, at Quantico, in 1972, and who was a protégé of Brussel--which, in the close-knit fraternity of profilers, is like being analyzed by the analyst who was analyzed by Freud. To Douglas, Brussel was the father of criminal profiling, and, in both style and logic, "Inside the Mind of BTK" pays homage to "Casebook of a Crime Psychiatrist" at every turn.

  "BTK" stood for "Bind, Torture, Kill"--the three words that the killer used to identify himself in his taunting notes to the Wichita police. He had struck first in January, 1974, when he killed thirty-eight-year-old Joseph Otero in his home, along with his wife, Julie, their son, Joey, and their eleven-year-old daughter, who was found hanging from a water pipe in the basement with semen on her leg. The following April, he stabbed a twenty-four-year-old woman. In March, 1977, he bound and strangled another young woman, and over the next few years he committed at least four more murders. The city of Wichita was in an uproar. The police were getting nowhere. In 1984, in desperation, two police detectives from Wichita paid a visit to Quantico.

  The meeting, Douglas writes, was held in a first-floor conference room of the F.B.I.'s forensic-science building. He was then nearly a decade into his career at the Behavioral Science Unit. His first two best-sellers, "Mindhunter: Inside the FBI's Elite Serial Crime Unit," and "Obsession: The FBI's Legendary Profiler Probes the Psyches of Killers, Rapists, and Stalkers and Their Victims and Tells How to Fight Back," were still in the future. Working a hundred and fifty cases a year, he was on the road constantly, but BTK was never far from his thoughts. "Some nights I'd lie awake asking myself, 'Who the hell is this BTK?' " he writes. "What makes a guy like this do what he does? What makes him tick?"

  Roy Hazelwood sat next to Douglas. A lean chain-smoker, Hazelwood specialized in sex crimes, and went on to write the best-sellers "Dark Dreams" and "The Evil That Men Do." Beside Hazelwood was an ex-Air Force pilot named Ron Walker. Walker, Douglas writes, was "whip smart" and an "exceptionally quick study." The three bureau men and the two detectives sat around a massive oak table. "The objective of our session was to keep moving forward until we ran out of juice," Douglas writes. They would rely on the typology developed by their colleague Robert Ressler, himself the author of the true-crime best-sellers "Whoever Fights Monsters" and "I Have Lived in the Monster." The goal was to paint a picture of the killer--of what sort of man BTK was, and what he did, and where he worked, and what he was like--and with that scene "Inside the Mind of BTK" begins.

  We are now so familiar with crime stories told through the eyes of the profiler that it is easy to lose sight of how audacious the genre is. The traditional detective story begins with the body and centers on the detective's search for the culprit. Leads are pursued. A net is cast, widening to encompass a bewilderingly diverse pool of suspects: the butler, the spurned lover, the embittered nephew, the shadowy European. That's a Whodunit. In the profiling genre, the net is narrowed. The crime scene doesn't initiate our search for the killer. It defines the killer for us. The profiler sifts through the case materials, looks off into the distance, and knows. "Generally, a psychiatrist can study a man and make a few reasonable predictions about what the man may do in the future--how he will react to such-and-such a stimulus, how he will behave in such-and-such a situation," Brussel writes. "What I have done is reverse the terms of the prophecy. By studying a man's deeds, I have deduced what kind of man he might be." Look for a middle-aged Slav in a double-breasted suit. Profiling stories aren't Whodunits; they're Hedunits.

  In the Hedunit, the profiler does not catch the criminal. That's for local law enforcement. He takes the meeting. Often, he doesn't write down his predictions. It's up to the visiting police officers to take notes. He does not feel the need to involve himself in the subsequent investigation, or even, it turns out, to justify his predictions. Once, Douglas tells us, he drove down to the local police station and offered his services in the case of an elderly woman who had been savagely beaten and sexually assaulted. The detectives working the crime were regular cops, and Douglas was a bureau guy, so you can imagine him perched on the edge of a desk, the others pulling up chairs around him.

  " 'Okay,' I said to the detectives. . . . 'Here's what I think,' " Douglas begins. "It's a sixteen- or seventeen-year-old high school kid. . . . He'll be disheveled-looking, he'll have scruffy hair, generally poorly groomed." He went on: a loner, kind of weird, no girlfriend, lots of bottled-up anger. He comes to the old lady's house. He knows she's alone. Maybe he's done odd jobs for her in the past. Douglas continues:

I pause in my narrative and tell them there's someone who meets this description out there. If they can find him, they've got their offender.
One detective looks at another. One of them starts to smile. "Are you a psychic, Douglas?"
"No," I say, "but my job would be a lot easier if I were."
"Because we had a psychic, Beverly Newton, in here a couple of weeks ago, and she said just about the same things."

  You might think that Douglas would bridle at that comparison. He is, after all, an agent of the Federal Bureau of Investigation, who studied with Teten, who studied with Brussel. He is an ace profiler, part of a team that restored the F.B.I.'s reputation for crime-fighting, inspired countless movies, television shows, and best-selling thrillers, and brought the modern tools of psychology to bear on the savagery of the criminal mind--and some cop is calling him a psychic. But Douglas doesn't object. Instead, he begins to muse on the ineffable origins of his insights, at which point the question arises of what exactly this mysterious art called profiling is, and whether it can be trusted. Douglas writes,

What I try to do with a case is to take in all the evidence I have to work with . . . and then put myself mentally and emotionally in the head of the offender. I try to think as he does. Exactly how this happens, I'm not sure, any more than the novelists such as Tom Harris who've consulted me over the years can say exactly how their characters come to life. If there's a psychic component to this, I won't run from it.

  3.

  In the late nineteen-seventies, John Douglas and his F.B.I. colleague Robert Ressler set out to interview the most notorious serial killers in the country. They started in California, since, as Douglas says, "California has always had more than its share of weird and spectacular crimes." On weekends and days off, over the next months, they stopped by one federal prison after another, until they had interviewed thirty-six murderers.

  Douglas and Ressler wanted to know whether there was a pattern that connected a killer's life and personality with the nature of his crimes. They were looking for what psychologists would call a homology, an agreement between character and action, and, after comparing what they learned from the killers with what they already knew about the characteristics of their murders, they became convinced that they'd found one.

  Serial killers, they concluded, fall into one of two categories. Some crime scenes show evidence of logic and planning. The victim has been hunted and selected, in order to fulfill a specific fantasy. The recruitment of the victim might involve a ruse or a con. The perpetrator maintains control throughout the offense. He takes his time with the victim, carefully enacting his fantasies. He is adaptable and mobile. He almost never leaves a weapon behind. He meticulously conceals the body. Douglas and Ressler, in their respective books, call that kind of crime "organized."

  In a "disorganized" crime, the victim isn't chosen logically. She's seemingly picked at random and "blitz-attacked," not stalked and coerced. The killer might grab a steak knife from the kitchen and leave the knife behind. The crime is so sloppily executed that the victim often has a chance to fight back. The crime might take place in a high-risk environment. "Moreover, the disorganized killer has no idea of, or interest in, the personalities of his victims," Ressler writes in "Whoever Fights Monsters." "He does not want to know who they are, and many times takes steps to obliterate their personalities by quickly knocking them unconscious or covering their faces or otherwise disfiguring them."

  Each of these styles, the argument goes, corresponds to a personality type. The organized killer is intelligent and articulate. He feels superior to those around him. The disorganized killer is unattractive and has a poor self-image. He often has some kind of disability. He's too strange and withdrawn to be married or have a girlfriend. If he doesn't live alone, he lives with his parents. He has pornography stashed in his closet. If he drives at all, his car is a wreck.

  "The crime scene is presumed to reflect the murderer's behavior and personality in much the same way as furnishings reveal the homeowner's character," we're told in a crime manual that Douglas and Ressler helped write. The more they learned, the more precise the associations became. If the victim was white, the killer would be white. If the victim was old, the killer would be sexually immature.

  "In our research, we discovered that . . . frequently serial offenders had failed in their efforts to join police departments and had taken jobs in related fields, such as security guard or night watchman," Douglas writes. Given that organized rapists were preoccupied with control, it made sense that they would be fascinated by the social institution that symbolizes control. Out of that insight came another prediction: "One of the things we began saying in some of our profiles was that the UNSUB"--the unknown subject--"would drive a policelike vehicle, say a Ford Crown Victoria or Chevrolet Caprice."

  4.

  On the surface, the F.B.I.'s system seems extraordinarily useful. Consider a case study widely used in the profiling literature. The body of a twenty-six-year-old special-education teacher was found on the roof of her Bronx apartment building. She was apparently abducted just after she left her house for work, at six-thirty in the morning. She had been beaten beyond recognition, and tied up with her stockings and belt. The killer had mutilated her sexual organs, chopped off her nipples, covered her body with bites, written obscenities across her abdomen, masturbated, and then defecated next to the body.

  Let's pretend that we're an F.B.I. profiler. First question: race. The victim is white, so let's call the offender white. Let's say he's in his mid-twenties to early thirties, which is when the thirty-six men in the F.B.I.'s sample started killing. Is the crime organized or disorganized? Disorganized, clearly. It's on a rooftop, in the Bronx, in broad daylight--high risk. So what is the killer doing in the building at six-thirty in the morning? He could be some kind of serviceman, or he could live in the neighborhood. Either way, he appears to be familiar with the building. He's disorganized, though, so he's not stable. If he is employed, it's blue-collar work, at best. He probably has a prior offense, having to do with violence or sex. His relationships with women will be either nonexistent or deeply troubled. And the mutilation and the defecation are so strange that he's probably mentally ill or has some kind of substance-abuse problem. How does that sound? As it turns out, it's spot-on. The killer was Carmine Calabro, age thirty, a single, unemployed, deeply troubled actor who, when he was not in a mental institution, lived with his widowed father on the fourth floor of the building where the murder took place.

  But how useful is that profile, really? The police already had Calabro on their list of suspects: if you're looking for the person who killed and mutilated someone on the roof, you don't really need a profiler to tell you to check out the dishevelled, mentally ill guy living with his father on the fourth floor.

  That's why the F.B.I.'s profilers have always tried to supplement the basic outlines of the organized/disorganized system with telling details--something that lets the police zero in on a suspect. In the early eighties, Douglas gave a presentation to a roomful of police officers and F.B.I. agents in Marin County about the Trailside Killer, who was murdering female hikers in the hills north of San Francisco. In Douglas's view, the killer was a classic "disorganized" offender--a blitz attacker, white, early to mid-thirties, blue collar, probably with "a history of bed-wetting, fire-starting, and cruelty to animals." Then he went back to how asocial the killer seemed. Why did all the killings take place in heavily wooded areas, miles from the road? Douglas reasoned that the killer required such seclusion because he had some condition that he was deeply self-conscious about. Was it something physical, like a missing limb? But then how could he hike miles into the woods and physically overpower his victims? Finally, it came to him: " 'Another thing,' I added after a pregnant pause, 'the killer will have a speech impediment.' "

  And so he did. Now, that's a useful detail. Or is it? Douglas then tells us that he pegged the killer's age as early thirties, and he turned out to be fifty. Detectives use profiles to narrow down the range of suspects. It doesn't do any good to get a specific detail right if you get general details wrong.

  In the case of Derrick Todd Lee, the Baton Rouge serial killer, the F.B.I. profile described the offender as a white male blue-collar worker, between twenty-five and thirty-five years old, who "wants to be seen as someone who is attractive and appealing to women." The profile went on, "However, his level of sophistication in interacting with women, especially women who are above him in the social strata, is low. Any contact he has had with women he has found attractive would be described by these women as 'awkward.' " The F.B.I. was right about the killer being a blue-collar male between twenty-five and thirty-five. But Lee turned out to be charming and outgoing, the sort to put on a cowboy hat and snakeskin boots and head for the bars. He was an extrovert with a number of girlfriends and a reputation as a ladies' man. And he wasn't white. He was black.

  A profile isn't a test, where you pass if you get most of the answers right. It's a portrait, and all the details have to cohere in some way if the image is to be helpful. In the mid-nineties, the British Home Office analyzed a hundred and eighty-four crimes, to see how many times profiles led to the arrest of a criminal. The profile worked in five of those cases. That's just 2.7 per cent, which makes sense if you consider the position of the detective on the receiving end of a profiler's list of conjectures. Do you believe the stuttering part? Or do you believe the thirty-year-old part? Or do you throw up your hands in frustration?

  5.

  There is a deeper problem with F.B.I. profiling. Douglas and Ressler didn't interview a representative sample of serial killers to come up with their typology. They talked to whoever happened to be in the neighborhood. Nor did they interview their subjects according to a standardized protocol. They just sat down and chatted, which isn't a particularly firm foundation for a psychological system. So you might wonder whether serial killers can really be categorized by their level of organization.

  Not long ago, a group of psychologists at the University of Liverpool decided to test the F.B.I.'s assumptions. First, they made a list of crime-scene characteristics generally considered to show organization: perhaps the victim was alive during the sex acts, or the body was posed in a certain way, or the murder weapon was missing, or the body was concealed, or torture and restraints were involved. Then they made a list of characteristics showing disorganization: perhaps the victim was beaten, the body was left in an isolated spot, the victim's belongings were scattered, or the murder weapon was improvised.

  If the F.B.I. was right, they reasoned, the crime-scene details on each of those two lists should "co-occur"--that is, if you see one or more organized traits in a crime, there should be a reasonably high probability of seeing other organized traits. When they looked at a sample of a hundred serial crimes, however, they couldn't find any support for the F.B.I.'s distinction. Crimes don't fall into one camp or the other. It turns out that they're almost always a mixture of a few key organized traits and a random array of disorganized traits. Laurence Alison, one of the leaders of the Liverpool group and the author of "The Forensic Psychologist's Casebook," told me, "The whole business is a lot more complicated than the F.B.I. imagines."
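  To make the logic of that test concrete, here is a minimal sketch, in Python, of the kind of co-occurrence check the Liverpool group describes. The trait names and the sample data are invented for illustration; this is not the group's actual method or dataset.

```python
# A minimal sketch of a co-occurrence test: each row is a crime, each column a
# crime-scene trait, and we ask whether "organized" traits tend to appear
# together. All data below is invented for illustration.
import itertools
import random

ORGANIZED = ["victim_alive_during_acts", "body_posed", "weapon_missing",
             "body_concealed", "restraints_used"]
DISORGANIZED = ["victim_beaten", "body_left_isolated",
                "belongings_scattered", "weapon_improvised"]
TRAITS = ORGANIZED + DISORGANIZED

random.seed(0)
# Invented sample of a hundred crimes: each trait present or absent at random.
crimes = [{t: random.random() < 0.4 for t in TRAITS} for _ in range(100)]

def cooccurrence(trait_a, trait_b, sample):
    """Probability that trait_b is present, given that trait_a is present."""
    with_a = [c for c in sample if c[trait_a]]
    if not with_a:
        return 0.0
    return sum(c[trait_b] for c in with_a) / len(with_a)

# If the organized/disorganized typology held, conditional probabilities within
# the ORGANIZED list would be noticeably higher than those that cross the two
# lists. With random data, as here, they will not be.
within = [cooccurrence(a, b, crimes) for a, b in itertools.permutations(ORGANIZED, 2)]
across = [cooccurrence(a, b, crimes) for a in ORGANIZED for b in DISORGANIZED]
print(f"mean co-occurrence within 'organized' traits: {sum(within)/len(within):.2f}")
print(f"mean co-occurrence across the two lists:      {sum(across)/len(across):.2f}")
```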

  Alison and another of his colleagues also looked at homology. If Douglas was right, then a certain kind of crime should correspond to a certain kind of criminal. So the Liverpool group selected a hundred stranger rapes in the United Kingdom, classifying them according to twenty-eight variables, such as whether a disguise was worn, whether compliments were given, whether there was binding, gagging, or blindfolding, whether there was apologizing or the theft of personal property, and so on. They then looked at whether the patterns in the crimes corresponded to attributes of the criminals--like age, type of employment, ethnicity, level of education, marital status, number of prior convictions, type of prior convictions, and drug use. Were rapists who bind, gag, and blindfold more like one another than they were like rapists who, say, compliment and apologize? The answer is no--not even slightly.

  "The fact is that different offenders can exhibit the same behaviors for completely different reasons," Brent Turvey, a forensic scientist who has been highly critical of the F.B.I.'s approach, says. "You've got a rapist who attacks a woman in the park and pulls her shirt up over her face. Why? What does that mean? There are ten different things it could mean. It could mean he "t want to see her. It could mean he doesn't want her to see him. It could mean he wants to see her breasts, he wants to imagine someone else, he wants to incapacitate her arms--all of those are possibilities. You can't just look at one behavior in isolation."

  A few years ago, Alison went back to the case of the teacher who was murdered on the roof of her building in the Bronx. He wanted to know why, if the F.B.I.'s approach to criminal profiling was based on such simplistic psychology, it continues to have such a sterling reputation. The answer, he suspected, lay in the way the profiles were written, and, sure enough, when he broke down the rooftop-killer analysis, sentence by sentence, he found that it was so full of unverifiable and contradictory and ambiguous language that it could support virtually any interpretation.

  Astrologers and psychics have known these tricks for years. The magician Ian Rowland, in his classic "The Full Facts Book of Cold Reading," itemizes them one by one, in what could easily serve as a manual for the beginner profiler. First is the Rainbow Ruse--the "statement which credits the client with both a personality trait and its opposite." ("I would say that on the whole you can be rather a quiet, self effacing type, but when the circumstances are right, you can be quite the life and soul of the party if the mood strikes you.") The Jacques Statement, named for the character in "As You Like It" who gives the Seven Ages of Man speech, tailors the prediction to the age of the subject. To someone in his late thirties or early forties, for example, the psychic says, "If you are honest about it, you often get to wondering what happened to all those dreams you had when you were younger." There is the Barnum Statement, the assertion so general that anyone would agree, and the Fuzzy Fact, the seemingly factual statement couched in a way that "leaves plenty of scope to be developed into something more specific." ("I can see a connection with Europe, possibly Britain, or it could be the warmer, Mediterranean part?") And that's only the start: there is the Greener Grass technique, the Diverted Question, the Russian Doll, Sugar Lumps, not to mention Forking and the Good Chance Guess--all of which, when put together in skillful combination, can convince even the most skeptical observer that he or she is in the presence of real insight.

  "Moving on to career matters, you don't work with children, do you?" Rowland will ask his subjects, in an example of what he dubs the "Vanishing Negative."

  No, I don't.

  "No, I thought not. That's not really your role."

  Of course, if the subject answers differently, there's another way to play the question: "Moving on to career matters, you don't work with children, do you?"

  I do, actually, part time.

  "Yes, I thought so."

  After Alison had analyzed the rooftop-killer profile, he decided to play a version of the cold-reading game. He gave the details of the crime, the profile prepared by the F.B.I., and a description of the offender to a group of senior police officers and forensic professionals in England. How did they find the profile? Highly accurate. Then Alison gave the same packet of case materials to another group of police officers, but this time he invented an imaginary offender, one who was altogether different from Calabro. The new killer was thirty-seven years old. He was an alcoholic. He had recently been laid off from his job with the water board, and had met the victim before on one of his rounds. What's more, Alison claimed, he had a history of violent relationships with women, and prior convictions for assault and burglary. How accurate did a group of experienced police officers find the F.B.I.'s profile when it was matched with the phony offender? Every bit as accurate as when it was matched to the real offender.

  James Brussel didn't really see the Mad Bomber in that pile of pictures and photostats, then. That was an illusion. As the literary scholar Donald Foster pointed out in his 2000 book "Author Unknown," Brussel cleaned up his predictions for his memoirs. He actually told the police to look for the bomber in White Plains, sending the N.Y.P.D.'s bomb unit on a wild goose chase in Westchester County, sifting through local records. Brussel also told the police to look for a man with a facial scar, which Metesky didn't have. He told them to look for a man with a night job, and Metesky had been largely unemployed since leaving Con Edison in 1931. He told them to look for someone between forty and fifty, and Metesky was over fifty. He told them to look for someone who was an "expert in civil or military ordnance" and the closest Metesky came to that was a brief stint in a machine shop. And Brussel, despite what he wrote in his memoir, never said that the Bomber would be a Slav. He actually told the police to look for a man "born and educated in Germany," a prediction so far off the mark that the Mad Bomber himself was moved to object. At the height of the police investigation, when the New York Journal American offered to print any communications from the Mad Bomber, Metesky wrote in huffily to say that "the nearest to my being 'Teutonic' is that my father boarded a liner in Hamburg for passage to this country--about sixty-five years ago."

  The true hero of the case wasn't Brussel; it was a woman named Alice Kelly, who had been assigned to go through Con Edison's personnel files. In January, 1957, she ran across an employee complaint from the early nineteen-thirties: a generator wiper at the Hell Gate plant had been knocked down by a backdraft of hot gases. The worker said that he was injured. The company said that he wasn't. And in the flood of angry letters from the ex-employee Kelly spotted a threat--to "take justice in my own hands"--that had appeared in one of the Mad Bomber's letters. The name on the file was George Metesky.

  Brussel did not really understand the mind of the Mad Bomber. He seems to have understood only that, if you make a great number of predictions, the ones that were wrong will soon be forgotten, and the ones that turn out to be true will make you famous. The Hedunit is not a triumph of forensic analysis. It's a party trick.

  6.

  "Here's where I'm at with this guy," Douglas said, kicking off the profiling session with which "Inside the Mind of BTK" begins. It was 1984. The killer was still at large. Douglas, Hazelwood, and Walker and the two detectives from Wichita were all seated around the oak table. Douglas took off his suit jacket and draped it over his chair. "Back when he started in 1974, he was in his mid-to-late twenties," Douglas began. "It's now ten years later, so that would put him in his mid-to-late thirties."

  It was Walker's turn: BTK had never engaged in any sexual penetration. That suggested to him someone with an "inadequate, immature sexual history." He would have a "lone-wolf type of personality. But he's not alone because he's shunned by others--it's because he chooses to be alone. . . . He can function in social settings, but only on the surface. He may have women friends he can talk to, but he'd feel very inadequate with a peer-group female." Hazelwood was next. BTK would be "heavily into masturbation." He went on, "Women who have had sex with this guy would describe him as aloof, uninvolved, the type who is more interested in her servicing him than the other way around."

  Douglas followed his lead. "The women he's been with are either many years younger, very naïve, or much older and depend on him as their meal ticket," he ventured. What's more, the profilers determined, BTK would drive a "decent" automobile, but it would be "nondescript."

  At this point, the insights began piling on. Douglas said he'd been thinking that BTK was married. But now maybe he was thinking he was divorced. He speculated that BTK was lower middle class, probably living in a rental. Walker felt BTK was in a "lower-paying white collar job, as opposed to blue collar." Hazelwood saw him as "middle class" and "articulate." The consensus was that his I.Q. was somewhere between 105 and 145. Douglas wondered whether he was connected with the military. Hazelwood called him a "now" person, who needed "instant gratification."

  Walker said that those who knew him "might say they remember him, but didn't really know much about him." Douglas then had a flash--"It was a sense, almost a knowing"--and said, "I wouldn't be surprised if, in the job he's in today, that he's wearing some sort of uniform. . . . This guy isn't mental. But he is crazy like a fox."

  They had been at it for almost six hours. The best minds in the F.B.I. had given the Wichita detectives a blueprint for their investigation. Look for an American male with a possible connection to the military. His I.Q. will be above 105. He will like to masturbate, and will be aloof and selfish in bed. He will drive a decent car. He will be a "now" person. He won't be comfortable with women. But he may have women friends. He will be a lone wolf. But he will be able to function in social settings. He won't be unmemorable. But he will be unknowable. He will be either never married, divorced, or married, and if he was or is married his wife will be younger or older. He may or may not live in a rental, and might be lower class, upper lower class, lower middle class or middle class. And he will be crazy like a fox, as opposed to being mental. If you're keeping score, that's a Jacques Statement, two Barnum Statements, four Rainbow Ruses, a Good Chance Guess, two predictions that aren't really predictions because they could never be verified--and nothing even close to the salient fact that BTK was a pillar of his community, the president of his church and the married father of two.

  Enron, intelligence, and the perils of too much information.

  1.

  On the afternoon of October 23, 2006, Jeffrey Skilling sat at a table at the front of a federal courtroom in Houston, Texas. He was wearing a navy-blue suit and a tie. He was fifty-two years old, but looked older. Huddled around him were eight lawyers from his defense team. Outside, television-satellite trucks were parked up and down the block.

  "We are here this afternoon," Judge Simeon Lake began, "for sentencing in United States of America versus Jeffrey K. Skilling, Criminal No. H-04-25." He addressed the defendant directly: "Mr. Skilling, you may now make a statement and present any information in mitigation."

  Skilling stood up. Enron, the company he had built into an energy-trading leviathan, had collapsed into bankruptcy almost exactly five years before. In May, he had been convicted by a jury of fraud. Under a settlement agreement, almost everything he owned had been turned over to a fund to compensate former shareholders.

  He spoke haltingly, stopping in mid-sentence. "In terms of remorse, Your Honor, I can't imagine more remorse," he said. He had "friends who have died, good men." He was innocent--"innocent of every one of these charges." He spoke for two or three minutes and sat down.

  Judge Lake called on Anne Beliveaux, who had worked for eighteen years as the senior administrative assistant in Enron's tax department. She was one of nine people who had asked to address the sentencing hearing.

  "How would you like to be facing living off of sixteen hundred dollars a month, and that is what I'm facing," she said to Skilling. Her retirement savings had been wiped out by the Enron bankruptcy. "And, Mr. Skilling, that only is because of greed, nothing but greed. And you should be ashamed of yourself."

  The next witness said that Skilling had destroyed a good company, the third witness that Enron had been undone by the misconduct of its management; another lashed out at Skilling directly. "Mr. Skilling has proven to be a liar, a thief, and a drunk," a woman named Dawn Powers Martin, a twenty-two-year veteran of Enron, told the court. "Mr. Skilling has cheated me and my daughter of our retirement dreams. Now it's his time to be robbed of his freedom to walk the earth as a free man." She turned to Skilling and said, "While you dine on Chateaubriand and champagne, my daughter and I clip grocery coupons and eat leftovers." And on and on it went.

  The Judge asked Skilling to rise.

  "The evidence established that the defendant repeatedly lied to investors, including Enron's own employees, about various aspects of Enron's business," the Judge said. He had no choice but to be harsh: Skilling would serve two hundred and ninety-two months in prison--twenty-four years. The man who headed a firm that Fortune ranked among the "most admired" in the world had received one of the heaviest sentences ever given to a white-collar criminal. He would leave prison an old man, if he left prison at all.

  "I only have one request, Your Honor," Daniel Petrocelli, Skilling's lawyer, said. "If he received ten fewer months, which shouldn't make a difference in terms of the goals of sentencing, if you do the math and you subtract fifteen per cent for good time, he then qualifies under Bureau of Prisons policies to be able to serve his time at a lower facility. Just a ten-month reduction in sentence . . ."

  It was a plea for leniency. Skilling wasn't a murderer or a rapist. He was a pillar of the Houston community, and a small adjustment in his sentence would keep him from spending the rest of his life among hardened criminals.

  "No," Judge Lake said.

  2.

  The national-security expert Gregory Treverton has famously made a distinction between puzzles and mysteries. Osama bin Laden's whereabouts are a puzzle. We can't find him because we don't have enough information. The key to the puzzle will probably come from someone close to bin Laden, and until we can find that source bin Laden will remain at large.

  The problem of what would happen in Iraq after the toppling of Saddam Hussein was, by contrast, a mystery. It wasn't a question that had a simple, factual answer. Mysteries require judgments and the assessment of uncertainty, and the hard part is not that we have too little information but that we have too much. The C.I.A. had a position on what a post-invasion Iraq would look like, and so did the Pentagon and the State Department and Colin Powell and Dick Cheney and any number of political scientists and journalists and think-tank fellows. For that matter, so did every cabdriver in Baghdad.

  The distinction is not trivial. If you consider the motivation and methods behind the attacks of September 11th to be mainly a puzzle, for instance, then the logical response is to increase the collection of intelligence, recruit more spies, add to the volume of information we have about Al Qaeda. If you consider September 11th a mystery, though, you'd have to wonder whether adding to the volume of information will only make things worse. You'd want to improve the analysis within the intelligence community; you'd want more thoughtful and skeptical people with the skills to look more closely at what we already know about Al Qaeda. You'd want to send the counterterrorism team from the C.I.A. on a golfing trip twice a month with the counterterrorism teams from the F.B.I. and the N.S.A. and the Defense Department, so they could get to know one another and compare notes.

  If things go wrong with a puzzle, identifying the culprit is easy: it's the person who withheld information. Mysteries, though, are a lot murkier: sometimes the information we've been given is inadequate, and sometimes we aren't very smart about making sense of what we've been given, and sometimes the question itself cannot be answered. Puzzles come to satisfying conclusions. Mysteries often don't.

  If you sat through the trial of Jeffrey Skilling, you'd think that the Enron scandal was a puzzle. The company, the prosecution said, conducted shady side deals that no one quite understood. Senior executives withheld critical information from investors. Skilling, the architect of the firm's strategy, was a liar, a thief, and a drunk. The central assumption of the Enron prosecution was the classic puzzle premise: we were not told enough.

  "This is a simple case, ladies and gentlemen," the lead prosecutor for the Department of Justice said in his closing arguments to the jury:

Because it's so simple, I'm probably going to end before my allotted time. It's black-and-white. Truth and lies. The shareholders, ladies and gentlemen, . . . buy a share of stock, and for that they're not entitled to much but they're entitled to the truth. They're entitled for the officers and employees of the company to put their interests ahead of their own. They're entitled to be told what the financial condition of the company is.

They are entitled to honesty, ladies and gentlemen.

  But the prosecutor was wrong. Enron wasn't really a puzzle. It was a mystery.

  3.

  In late July of 2000, Jonathan Weil, a reporter at the Dallas bureau of the Wall Street Journal, got a call from someone he knew in the investment-management business. Weil wrote the stock column, called "Heard in Texas," for the paper's regional edition, and he had been closely following the big energy firms based in Houston--Dynegy, El Paso, and Enron. His caller had a suggestion. "He said, 'You really ought to check out Enron and Dynegy and see where their earnings come from,' " Weil recalled. "So I did."

  Weil was interested in Enron's use of what is called mark-to-market accounting, which is a technique used by companies that engage in complicated financial trading. Suppose, for instance, that you are an energy company and you enter into a hundred-million-dollar contract with the state of California to deliver a billion kilowatt hours of electricity in 2016. How much is that contract worth? You aren't going to get paid for another ten years, and you aren't going to know until then whether you'll show a profit on the deal or a loss. Nonetheless, that hundred-million-dollar promise clearly matters to your bottom line. If electricity steadily drops in price over the next several years, the contract is going to become a hugely valuable asset. But if electricity starts to get more expensive as 2016 approaches, you could be out tens of millions of dollars. With mark-to-market accounting, you estimate how much revenue the deal is going to bring in and put that number in your books at the moment you sign the contract. If, down the line, the estimate changes, you adjust the balance sheet accordingly.

  When a company using mark-to-market accounting says it has made a profit of ten million dollars on revenues of a hundred million, then, it could mean one of two things. The company may actually have a hundred million dollars in its bank accounts, of which ten million will remain after it has paid its bills. Or it may be guessing that it will make ten million dollars on a deal where money may not actually change hands for years. Weil's source wanted him to see how much of the money Enron said it was making was "real."
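  A minimal sketch, assuming a single hypothetical contract, of how mark-to-market bookkeeping works in practice; the figures echo the ten-million-on-a-hundred-million example above and are not Enron's actual numbers.

```python
# A minimal sketch of the mark-to-market idea described above. The contract and
# the numbers are hypothetical; real energy-trading books are far more elaborate.

def mark_to_market(contract_value, estimated_cost):
    """Book the estimated profit on the day the deal is signed."""
    return contract_value - estimated_cost

# Day one: a $100 million contract, with delivery costs estimated at $90 million.
booked_profit = mark_to_market(100_000_000, 90_000_000)
print(f"profit booked at signing: {booked_profit:,} dollars")  # 10,000,000 -- no cash yet

# Two years later, electricity gets more expensive and the cost estimate rises,
# so the books take a hit even though no money has changed hands.
revised_profit = mark_to_market(100_000_000, 115_000_000)
adjustment = revised_profit - booked_profit
print(f"adjustment to the books:  {adjustment:,} dollars")     # -25,000,000
```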

  Weil got copies of the firm's annual reports and quarterly filings and began comparing the income statements and the cash-flow statements. "It took me a while to figure out everything I needed to," Weil said. "It probably took a good month or so. There was a lot of noise in the financial statements, and to zero in on this particular issue you needed to cut through a lot of that." Weil spoke to Thomas Linsmeier, then an accounting professor at Michigan State, and they talked about how some finance companies in the nineteen-nineties had used mark-to-market accounting on subprime loans--that is, loans made to higher-credit-risk consumers--and when the economy declined and consumers defaulted or paid off their loans more quickly than expected, the lenders suddenly realized that their estimates of how much money they were going to make were far too generous. Weil spoke to someone at the Financial Accounting Standards Board, to an analyst at the Moody's investment-rating agency, and to a dozen or so others. Then he went back to Enron's financial statements. His conclusions were sobering. In the second quarter of 2000, $747 million of the money Enron said it had made was "unrealized"--that is, it was money that executives thought they were going to make at some point in the future. If you took that imaginary money away, Enron had shown a significant loss in the second quarter. This was one of the most admired companies in the United States, a firm that was then valued by the stock market as the seventh-largest corporation in the country, and there was practically no cash coming into its coffers.

  Weil's story ran in the Journal on September 20, 2000. A few days later, it was read by a Wall Street financier named James Chanos. Chanos is a short-seller--an investor who tries to make money by betting that a company's stock will fall. "It pricked up my ears," Chanos said. "I read the 10-K and the 10-Q that first weekend," he went on, referring to the financial statements that public companies are required to file with federal regulators. "I went through it pretty quickly. I flagged right away the stuff that was questionable. I circled it. That was the first run-through. Then I flagged the pages and read the stuff I didn't understand, and reread it two or three times. I remember I spent a couple hours on it." Enron's profit margins and its return on equity were plunging, Chanos saw. Cash flow--the lifeblood of any business--had slowed to a trickle, and the company's rate of return was less than its cost of capital: it was as if you had borrowed money from the bank at nine-per-cent interest and invested it in a savings bond that paid you seven-per-cent interest. "They were basically liquidating themselves," Chanos said.

  In November of that year, Chanos began shorting Enron stock. Over the next few months, he spread the word that he thought the company was in trouble. He tipped off a reporter for Fortune, Bethany McLean. She read the same reports that Chanos and Weil had, and came to the same conclusion. Her story, under the headline "IS ENRON OVERPRICED?," ran in March of 2001. More and more journalists and analysts began taking a closer look at Enron, and the stock began to fall. In August, Skilling resigned. Enron's credit rating was downgraded. Banks became reluctant to lend Enron the money it needed to make its trades. By December, the company had filed for bankruptcy.

  Enron's downfall has been documented so extensively that it is easy to overlook how peculiar it was. Compare Enron, for instance, with Watergate, the prototypical scandal of the nineteen-seventies. To expose the White House coverup, Bob Woodward and Carl Bernstein used a source--Deep Throat--who had access to many secrets, and whose identity had to be concealed. He warned Woodward and Bernstein that their phones might be tapped. When Woodward wanted to meet with Deep Throat, he would move a flower pot with a red flag in it to the back of his apartment balcony. That evening, he would leave by the back stairs, take multiple taxis to make sure he wasn't being followed, and meet his source in an underground parking garage at 2 A.M. Here, from "All the President's Men," is Woodward's climactic encounter with Deep Throat:

"Okay," he said softly. "This is very serious. You can safely say that fifty people worked for the White House and CRP to play games and spy and sabotage and gather intelligence. Some of it is beyond belief, kicking at the opposition in every imaginable way."

Deep Throat nodded confirmation as Woodward ran down items on a list of tactics that he and Bernstein had heard were used against the political opposition: bugging, following people, false press leaks, fake letters, cancelling campaign rallies, investigating campaign workers' private lives, planting spies, stealing documents, planting provocateurs in political demonstrations.

"It's all in the files," Deep Throat said. "Justice and the Bureau know about it, even though it wasn't followed up."

Woodward was stunned. Fifty people directed by the White House and CRP to destroy the opposition, no holds barred?

Deep Throat nodded.

The White House had been willing to subvert--was that the right word?--the whole electoral process? Had actually gone ahead and tried to do it?

Another nod. Deep Throat looked queasy.

And hired fifty agents to do it?

"You can safely say more than fifty," Deep Throat said. Then he turned, walked up the ramp and out. It was nearly 6:00 a.m.

  Watergate was a classic puzzle: Woodward and Bernstein were searching for a buried secret, and Deep Throat was their guide.

  Did Jonathan Weil have a Deep Throat? Not really. He had a friend in the investment-management business with some suspicions about energy-trading companies like Enron, but the friend wasn't an insider. Nor did Weil's source direct him to files detailing the clandestine activities of the company. He just told Weil to read a series of public documents that had been prepared and distributed by Enron itself. Woodward met with his secret source in an underground parking garage in the hours before dawn. Weil called up an accounting expert at Michigan State.

  When Weil had finished his reporting, he called Enron for comment. "They had their chief accounting officer and six or seven people fly up to Dallas," Weil says. They met in a conference room at the Journal's offices. The Enron officials acknowledged that the money they said they earned was virtually all money that they hoped to earn. Weil and the Enron officials then had a long conversation about how certain Enron was about its estimates of future earnings. "They were telling me how brilliant the people who put together their mathematical models were," Weil says. "These were M.I.T. Ph.D.s. I said, 'Were your mathematical models last year telling you that the California electricity markets would be going berserk this year? No? Why not?' They said, 'Well, this is one of those crazy events.' It was late September, 2000, so I said, 'Who do you think is going to win? Bush or Gore?' They said, 'We don't know.' I said, 'Don't you think it will make a difference to the market whether you have an environmentalist Democrat in the White House or a Texas oil man?'" It was all very civil. "There was no dispute about the numbers," Weil went on. "There was only a difference in how you should interpret them."

  Of all the moments in the Enron unravelling, this meeting is surely the strangest. The prosecutor in the Enron case told the jury to send Jeffrey Skilling to prison because Enron had hidden the truth: You're "entitled to be told what the financial condition of the company is," the prosecutor had said. But what truth was Enron hiding here? Everything Weil learned for his Enron exposé came from Enron, and when he wanted to confirm his numbers the company's executives got on a plane and sat down with him in a conference room in Dallas.

  Nixon never went to see Woodward and Bernstein at the Washington Post. He hid in the White House.

  4.

  The second, and perhaps more consequential, problem with Enron's accounting was its heavy reliance on what are called special-purpose entities, or S.P.E.s.

  An S.P.E. works something like this. Your company isn't doing well; sales are down and you are heavily in debt. If you go to a bank to borrow a hundred million dollars, it will probably charge you an extremely high interest rate, if it agrees to lend to you at all. But you've got a bundle of oil leases that over the next four or five years are almost certain to bring in a hundred million dollars. So you hand them over to a partnership--the S.P.E.--that you have set up with some outside investors. The bank then lends a hundred million dollars to the partnership, and the partnership gives the money to you. That bit of financial maneuvering makes a big difference. This kind of transaction did not (at the time) have to be reported in the company's balance sheet. So a company could raise capital without increasing its indebtedness. And because the bank is almost certain the leases will generate enough money to pay off the loan, it's willing to lend its money at a much lower interest rate. S.P.E.s have become commonplace in corporate America.
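  Here is a minimal sketch of the balance-sheet effect just described, using invented figures; the point is only that the loan ends up on the partnership's books rather than the sponsoring company's.

```python
# A minimal sketch of the S.P.E. maneuver described above. The figures are
# hypothetical; the mechanics are reduced to a single transfer.

company = {"cash": 20_000_000, "debt": 400_000_000, "oil_leases": 100_000_000}

def raise_cash_via_spe(sponsor, lease_value, loan):
    """Move the leases into a partnership, which borrows and hands the cash back."""
    spe = {"oil_leases": lease_value, "debt": loan}   # the S.P.E.'s books
    sponsor["oil_leases"] -= lease_value              # the leases leave the sponsor
    sponsor["cash"] += loan                           # but the borrowed cash comes in
    # Note: sponsor["debt"] is unchanged -- at the time, the loan did not have to
    # appear on the sponsoring company's balance sheet.
    return spe

spe = raise_cash_via_spe(company, 100_000_000, 100_000_000)
print(company)   # cash up by $100 million, reported debt unchanged
print(spe)       # the debt lives here
```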

  Enron introduced all kinds of twists into the S.P.E. game. It didn't always put blue-chip assets into the partnerships--like oil leases that would reliably generate income. It sometimes sold off less than sterling assets. Nor did it always sell those assets to outsiders, who presumably would raise questions about the value of what they were buying. Enron had its own executives manage these partnerships. And the company would make the deals work--that is, get the partnerships and the banks to play along--by guaranteeing that, if whatever they had to sell declined in value, Enron would make up the difference with its own stock. In other words, Enron didn't sell parts of itself to an outside entity; it effectively sold parts of itself to itself--a strategy that was not only legally questionable but extraordinarily risky. It was Enron's tangle of financial obligations to the S.P.E.s that ended up triggering the collapse.

  When the prosecution in the Skilling case argued that the company had misled its investors, they were referring, in part, to these S.P.E.s. Enron's management, the argument went, had an obligation to reveal the extent to which it had staked its financial livelihood on these shadowy side deals. As the Powers Committee, a panel charged with investigating Enron's demise, noted, the company "failed to achieve a fundamental objective: they did not communicate the essence of the transactions in a sufficiently clear fashion to enable a reader of [Enron's] financial statements to understand what was going on." In short, we weren't told enough.

  Here again, though, the lessons of the Enron case aren't nearly so straightforward. The public became aware of the nature of these S.P.E.s through the reporting of several of Weil's colleagues at the Wall Street Journal--principally John Emshwiller and Rebecca Smith--starting in the late summer of 2001. And how was Emshwiller tipped off to Enron's problems? The same way Jonathan Weil and Jim Chanos were: he read what Enron had reported in its own public filings. Here is Emshwiller's epiphany, as described in Kurt Eichenwald's "Conspiracy of Fools," the definitive history of the Enron debacle. (Note the verb "scrounged," which Eichenwald uses to describe how Emshwiller found the relevant Enron documents. What he means by that is "downloaded.")

It was section eight, called "Related Party Transactions," that got John Emshwiller's juices flowing.

After being assigned to follow the Skilling resignation, Emshwiller had put in a request for an interview, then scrounged up a copy of Enron's most recent SEC filing in search of any nuggets.

What he found startled him. Words about some partnerships run by an unidentified "senior officer." Arcane stuff, maybe, but the numbers were huge. Enron reported more than $240 million in revenues in the first six months of the year from its dealings with them.

  Enron's S.P.E.s were, by any measure, evidence of extraordinary recklessness and incompetence. But you can't blame Enron for covering up the existence of its side deals. It didn't; it disclosed them. The argument against the company, then, is more accurately that it didn't tell its investors enough about its S.P.E.s. But what is enough? Enron had some three thousand S.P.E.s, and the paperwork for each one probably ran in excess of a thousand pages. It scarcely would have helped investors if Enron had made all three million pages public. What about an edited version of each deal? Steven Schwarcz, a professor at Duke Law School, recently examined a random sample of twenty S.P.E. disclosure statements from various corporations--that is, summaries of the deals put together for interested parties--and found that on average they ran to forty single-spaced pages. So a summary of Enron's S.P.E.s would have come to a hundred and twenty thousand single-spaced pages. What about a summary of all those summaries? That's what the bankruptcy examiner in the Enron case put together, and it took up a thousand pages. Well, then, what about a summary of the summary of the summaries? That's what the Powers Committee put together. The committee looked only at the "substance of the most significant transactions," and its accounting still ran to two hundred numbingly complicated pages and, as Schwarcz points out, that was "with the benefit of hindsight and with the assistance of some of the finest legal talent in the nation."

  A puzzle grows simpler with the addition of each new piece of information: if I tell you that Osama bin Laden is hiding in Peshawar, I make the problem of finding him an order of magnitude easier, and if I add that he's hiding in a neighborhood in the northwest corner of the city, the problem becomes simpler still. But here the rules seem different. According to the Powers report, many on Enron's board of directors failed to understand "the economic rationale, the consequences, and the risks" of their company's S.P.E. deals--and the directors sat in meetings where those deals were discussed in detail. In "Conspiracy of Fools," Eichenwald convincingly argues that Andrew Fastow, Enron's chief financial officer, didn't understand the full economic implications of the deals, either, and he was the one who put them together.

  "These were very, very sophisticated, complex transactions," says Anthony Catanach, who teaches accounting at the Villanova University School of Business and has written extensively on the Enron case. Referring to Enron's accounting firm, he said, "I'm not even sure any of Arthur Andersen's field staff at Enron would have been able to understand them, even if it was all in front of them. This is senior-management-type stuff. I spent two months looking at the Powers report, just diagramming it. These deals were really convoluted."

  Enron's S.P.E.s, it should be noted, would have been this hard to understand even if they were standard issue. S.P.E.s are by nature difficult. A company creates an S.P.E. because it wants to reassure banks about the risks of making a loan. To provide that reassurance, the company gives its lenders and partners very detailed information about a specific portion of its business. And the more certainty a company creates for the lender--the more guarantees and safeguards and explanations it writes into the deal--the less comprehensible the transaction becomes to outsiders. Schwarcz writes that Enron's disclosure was "necessarily imperfect." You can try to make financial transactions understandable by simplifying them, in which case you run the risk of smoothing over some of their potential risks, or you can try to disclose every potential pitfall, in which case you'll make the disclosure so unwieldy that no one will be able to understand it. To Schwarcz, all Enron proves is that in an age of increasing financial complexity the "disclosure paradigm"--the idea that the more a company tells us about its business, the better off we are--has become an anachronism.

  5.

  During the summer of 1943, Nazi propaganda broadcasts boasted that the German military had developed a devastating "super weapon." Immediately, the Allied intelligence services went to work. Spies confirmed that the Germans had built a secret weapons factory. Aerial photographs taken over northern France showed a strange new concrete installation pointed in the direction of England. The Allies were worried. Bombing missions were sent to try to disrupt the mysterious operation, and plans were drawn up to deal with the prospect of devastating new attacks on English cities. Nobody was sure, though, whether the weapon was real. There seemed to be weapons factories there, but it wasn't evident what was happening inside them. And there was a launching pad in northern France, but it might have been just a decoy, designed to distract the Allies from bombing real targets. The German secret weapon was a puzzle, and the Allies didn't have enough information to solve it. There was another way to think about the problem, though, which ultimately proved far more useful: treat the German secret weapon as a mystery.

  The mystery-solvers of the Second World War were small groups of analysts whose job was to listen to the overseas and domestic propaganda broadcasts of Japan and Germany. The British outfit had been around since shortly before the Second World War and was run by the BBC. The American operation was known as the Screwball Division, the historian Stephen Mercado writes, and in the early nineteen-forties had been housed in a nondescript office building on K Street, in Washington. The analysts listened to the same speeches that anyone with a shortwave radio could listen to. They simply sat at their desks with headphones on, working their way through hours and hours of Nazi broadcasts. Then they tried to figure out how what the Nazis said publicly--about, for instance, the possibility of a renewed offensive against Russia--revealed what they felt about, say, invading Russia. One journalist at the time described the propaganda analysts as "the greatest collection of individualists, international rolling stones, and slightly batty geniuses ever gathered together in one organization." And they had very definite thoughts about the Nazis' secret weapon.

  The German leadership, first of all, was boasting about the secret weapon in domestic broadcasts. That was important. Propaganda was supposed to boost morale. If the Nazi leadership said things that turned out to be misleading, its credibility would fall. When German U-boats started running into increasingly effective Allied resistance in the spring of 1943, for example, Joseph Goebbels, the Nazi minister of propaganda, tacitly acknowledged the bad news, switching his emphasis from trumpeting recent victories to predicting long-term success, and blaming the weather for hampering U-boat operations. Up to that point, Goebbels had never lied to his own people about that sort of news. So if he said that Germany had a devastating secret weapon it meant, in all likelihood, that Germany had a devastating secret weapon.

  Starting from that premise, the analysts then mined the Nazis' public pronouncements for more insights. It was, they concluded, "beyond reasonable doubt" that as of November, 1943, the weapon existed, that it was of an entirely new type, that it could not be easily countered, that it would produce striking results, and that it would shock the civilian population upon whom it would be used. It was, furthermore, "highly probable" that the Germans were past the experimental stage as of May of 1943, and that something had happened in August of that year that significantly delayed deployment. The analysts based this inference, in part, on the fact that, in August, the Nazis abruptly stopped mentioning their secret weapon for ten days, and that when they started again their threats took on a new, less certain, tone. Finally, it could be tentatively estimated that the weapon would be ready between the middle of January and the middle of April, with a month's margin of error on either side. That inference, in part, came from Nazi propaganda in late 1943, which suddenly became more serious and specific in tone, and it seemed unlikely that Goebbels would raise hopes in this way if he couldn't deliver within a few months. The secret weapon was the Nazis' fabled V-1 rocket, and virtually every one of the propaganda analysts' predictions turned out to be true.

  The political scientist Alexander George described the sequence of V-1 rocket inferences in his 1959 book "Propaganda Analysis," and the striking thing about his account is how contemporary it seems. The spies were fighting a nineteenth-century war. The analysts belonged to our age, and the lesson of their triumph is that the complex, uncertain issues that the modern world throws at us require the mystery paradigm.

  Diagnosing prostate cancer used to be a puzzle, for example: the doctor would do a rectal exam and feel for a lumpy tumor on the surface of the patient's prostate. These days, though, we don't wait for patients to develop the symptoms of prostate cancer. Doctors now regularly test middle-aged men for elevated levels of PSA, a substance associated with prostate changes, and, if the results look problematic, they use ultrasound imaging to take a picture of the prostate. Then they perform a biopsy, removing tiny slices of the gland and examining the extracted tissue under a microscope. Much of that flood of information, however, is inconclusive: elevated levels of PSA don't always mean that you have cancer, and normal levels of PSA don't always mean that you don't--and, in any case, there's debate about what constitutes a "normal" PSA level. Nor is the biopsy definitive: because what a pathologist is looking for is early evidence of cancer--and in many cases merely something that might one day turn into cancer--two equally skilled pathologists can easily look at the same sample and disagree about whether there is any cancer present. Even if they do agree, they may disagree about the benefits of treatment, given that most prostate cancers grow so slowly that they never cause problems. The urologist is now charged with the task of making sense of a maze of unreliable and conflicting claims. He is no longer confirming the presence of a malignancy. He's predicting it, and the certainties of his predecessors have been replaced with outcomes that can only be said to be "highly probable" or "tentatively estimated." What medical progress has meant for prostate cancer--and, as the physician H. Gilbert Welch argues in his book "Should I Be Tested for Cancer?," for virtually every other cancer as well--is the transformation of diagnosis from a puzzle to a mystery.

  That same transformation is happening in the intelligence world. During the Cold War, the broad context of our relationship with the Soviet bloc was stable and predictable. What we didn't know was details. As Gregory Treverton, a former vice-chair of the National Intelligence Council, writes in his book "Reshaping National Intelligence for an Age of Information":

Then the pressing questions that preoccupied intelligence were puzzles, ones that could, in principle, have been answered definitively if only the information had been available: How big was the Soviet economy? How many missiles did the Soviet Union have? Had it launched a "bolt from the blue" attack? These puzzles were intelligence's stock-in-trade during the Cold War.

  With the collapse of the Eastern bloc, Treverton and others have argued that the situation facing the intelligence community has turned upside down. Now most of the world is open, not closed. Intelligence officers aren't dependent on scraps from spies. They are inundated with information. Solving puzzles remains critical: we still want to know precisely where Osama bin Laden is hiding, where North Korea's nuclear-weapons facilities are situated. But mysteries increasingly take center stage. The stable and predictable divisions of East and West have been shattered. Now the task of the intelligence analyst is to help policymakers navigate the disorder. Several years ago, Admiral Bobby R. Inman was asked by a congressional commission what changes he thought would strengthen America's intelligence system. Inman used to head the National Security Agency, the nation's premier puzzle-solving authority, and was once the deputy director of the C.I.A. He was the embodiment of the Cold War intelligence structure. His answer: revive the State Department, the one part of the U.S. foreign-policy establishment that isn't considered to be in the intelligence business at all. In a post-Cold War world of "openly available information," Inman said, "what you need are observers with language ability, with understanding of the religions, cultures of the countries they're observing." Inman thought we needed fewer spies and more slightly batty geniuses.

  6.

  Enron revealed that the financial community needs to make the same transition. "In order for an economy to have an adequate system of financial reporting, it is not enough that companies make disclosures of financial information," the Yale law professor Jonathan Macey wrote in a landmark law-review article that encouraged many to rethink the Enron case. "In addition, it is vital that there be a set of financial intermediaries, who are at least as competent and sophisticated at receiving, processing, and interpreting financial information . . . as the companies are at delivering it." Puzzles are "transmitter-dependent"; they turn on what we are told. Mysteries are "receiver-dependent"; they turn on the skills of the listener, and Macey argues that, as Enron's business practices grew more complicated, it was Wall Street's responsibility to keep pace.

  Victor Fleischer, who teaches at the University of Colorado Law School, points out that one of the critical clues about Enron's condition lay in the fact that it paid no income tax in four of its last five years. Enron's use of mark-to-market accounting and S.P.E.s was an accounting game that made the company look as though it were earning far more money than it was. But the I.R.S. doesn't accept mark-to-market accounting; you pay tax on income when you actually receive that income. And, from the I.R.S.'s perspective, all of Enron's fantastically complex maneuvering around its S.P.E.s was, as Fleischer puts it, "a non-event": until the partnership actually sells the asset--and makes either a profit or a loss--an S.P.E. is just an accounting fiction. Enron wasn't paying any taxes because, in the eyes of the I.R.S., Enron wasn't making any money.

  If you looked at Enron from the perspective of the tax code, that is, you would have seen a very different picture of the company than if you had looked through the more traditional lens of the accounting profession. But in order to do that you would have to be trained in the tax code and be familiar with its particular conventions and intricacies, and know what questions to ask. "The fact of the gap between [Enron's] accounting income and taxable income was easily observed," Fleischer notes, but not the source of the gap. "The tax code requires special training."

  Woodward and Bernstein didn't have any special training. They were in their twenties at the time of Watergate. In "All the President's Men," they even joke about their inexperience: Woodward's expertise was mainly in office politics; Bernstein was a college dropout. But it hardly mattered, because coverups, whistle-blowers, secret tapes, and exposés--the principal elements of the puzzle--all require the application of energy and persistence, which are the virtues of youth. Mysteries demand experience and insight. Woodward and Bernstein would never have broken the Enron story.

  "There have been scandals in corporate history where people are really making stuff up, but this wasn't a criminal enterprise of that kind," Macey says. "Enron was vanishingly close, in my view, to having complied with the accounting rules. They were going over the edge, just a little bit. And this kind of financial fraud--where people are simply stretching the truth--falls into the area that analysts and short-sellers are supposed to ferret out. The truth wasn't hidden. But you'd have to look at their financial statements, and you would have to say to yourself, What's that about? It's almost as if they were saying, 'We're doing some really sleazy stuff in footnote 42, and if you want to know more about it ask us.' And that's the thing. Nobody did."

  Alexander George, in his history of propaganda analysis, looked at hundreds of the inferences drawn by the American analysts about the Nazis, and concluded that an astonishing eighty-one per cent of them were accurate. George's account, however, spends almost as much time on the propaganda analysts' failures as on their successes. It was the British, for example, who did the best work on the V-1 rocket problem. They systematically tracked the "occurrence and volume" of Nazi reprisal threats, which is how they were able to pinpoint things like the setback suffered by the V-1 program in August of 1943 (it turned out that Allied bombs had caused serious damage) and the date of the Nazi V-1 rocket launch. K Street's analysis was lacklustre in comparison. George writes that the Americans "did not develop analytical techniques and hypotheses of sufficient refinement," relying instead on "impressionistic" analysis. George was himself one of the slightly batty geniuses of K Street, and, of course, he could easily have excused his former colleagues. They never left their desks, after all. All they had to deal with was propaganda, and their big source was Goebbels, who was a liar, a thief, and a drunk. But that is puzzle thinking. In the case of puzzles, we put the offending target, the C.E.O., in jail for twenty-four years and assume that our work is done. Mysteries require that we revisit our list of culprits and be willing to spread the blame a little more broadly. Because if you can't find the truth in a mystery--even a mystery shrouded in propaganda--it's not just the fault of the propagandist. It's your fault as well.

  7.

  In the spring of 1998, Macey notes, a group of six students at Cornell University's business school decided to do their term project on Enron. "It was for an advanced financial-statement-analysis class taught by a guy at Cornell called Charles Lee, who is pretty famous in financial circles," one member of the group, Jay Krueger, recalls. In the first part of the semester, Lee had led his students through a series of intensive case studies, teaching them techniques and sophisticated tools to make sense of the vast amounts of information that companies disclose in their annual reports and S.E.C. filings. Then the students picked a company and went off on their own. "One of the second-years had a summer-internship interview with Enron, and he was very interested in the energy sector," Krueger went on. "So he said, 'Let's do them.' It was about a six-week project, half a semester. Lots of group meetings. It was a ratio analysis, which is pretty standard business-school fare. You know, take fifty different financial ratios, then lay that on top of every piece of information you could find out about the company, the businesses, how their performance compared to other competitors."
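  For readers curious what "ratio analysis" means in practice, here is a minimal sketch with invented figures; the ratios shown are a few standard ones, not a reconstruction of the Cornell group's fifty.

```python
# A minimal sketch of standard financial-ratio analysis, computed from invented
# figures rather than any company's actual statements.

def ratios(income_stmt, balance_sheet, cash_flow):
    return {
        "profit_margin": income_stmt["net_income"] / income_stmt["revenue"],
        "return_on_equity": income_stmt["net_income"] / balance_sheet["equity"],
        "debt_to_equity": balance_sheet["debt"] / balance_sheet["equity"],
        # The ratio that mattered most in the Enron story: how much of the
        # reported profit actually showed up as cash.
        "cash_flow_to_net_income":
            cash_flow["operating_cash_flow"] / income_stmt["net_income"],
    }

example = ratios(
    income_stmt={"revenue": 100_000_000, "net_income": 10_000_000},
    balance_sheet={"debt": 400_000_000, "equity": 120_000_000},
    cash_flow={"operating_cash_flow": 1_500_000},
)
for name, value in example.items():
    print(f"{name:>24}: {value:.2f}")
```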

  What if you built a machine to predict hit movies?

  1.

  One sunny afternoon not long ago, Dick Copaken sat in a booth at Daniel, one of those hushed, exclusive restaurants on Manhattan's Upper East Side where the waiters glide spectrally from table to table. He was wearing a starched button-down shirt and a blue blazer. Every strand of his thinning hair was in place, and he spoke calmly and slowly, his large pink Charlie Brown head bobbing along evenly as he did. Copaken spent many years as a partner at the white-shoe Washington, D.C., firm Covington & Burling, and he has a lawyer's gravitas. One of his best friends calls him, admiringly, "relentless." He likes to tell stories. Yet he is not, strictly, a storyteller, because storytellers are people who know when to leave things out, and Copaken never leaves anything out: each detail is adduced, considered, and laid on the table--and then adjusted and readjusted so that the corners of the new fact are flush with the corners of the fact that preceded it. This is especially true when Copaken is talking about things that he really cares about, such as questions of international law or his grandchildren or, most of all, the movies.

  Dick Copaken loves the movies. His friend Richard Light, a statistician at Harvard, remembers summer vacations on Cape Cod with the Copakens, when Copaken would take his children and the Light children to the movies every day. "Fourteen nights out of fourteen," Light said. "Dick would say at seven o'clock, 'Hey, who's up for the movies?' And, all by himself, he would take the six kids to the movies. The kids had the time of their lives. And Dick would come back and give, with a completely straight face, a rigorous analysis of how each movie was put together, and the direction and the special effects and the animation." This is a man who has seen two or three movies a week for the past fifty years, who has filed hundreds of plots and characters and scenes away in his mind, and at Daniel he was talking about a movie that touched him as much as any he'd ever seen.

  "Nobody's heard of it," he said, and he clearly regarded this fact as a minor tragedy. "It's called 'Dear Frankie.' I watched it on a Virgin Atlantic flight because it was the only movie they had that I hadn't already seen. I had very low expectations. But I was blown away." He began, in his lawyer-like manner, to lay out the plot. It takes place in Scotland. A woman has fled an abusive relationship with her infant son and is living in a port town. The boy, now nine, is deaf, and misses the father he has never known. His mother has told him that his father is a sailor on a ship that rarely comes to shore, and has suggested that he write his father letters. These she intercepts, and replies to, writing as if she were the father. One day, the boy finds out that what he thinks is his father's ship is coming to shore. The mother has to find a man to stand in for the father. She does. The two fall in love. Unexpectedly, the real father reëmerges. He's dying, and demands to see his son. The mother panics. Then the little boy reveals his secret: he knew about his mother's ruse all along.

  "I was in tears over this movie," Copaken said. "You know, sometimes when you see a movie in the air you're in such an out-of-body mood that things get exaggerated. So when I got home I sat down and saw it another time. I was bawling again, even though I knew what was coming." Copaken shook his head, and then looked away. His cheeks were flushed. His voice was suddenly thick. There he was, a buttoned-down corporate lawyer, in a hushed restaurant where there is practically a sign on the wall forbidding displays of human emotion--and he was crying, a third time. "That absolutely hits me," he said, his face still turned away. "He knew all along what the mother was doing." He stopped to collect himself. "I can't even retell the damn story without getting emotional."

  He tried to explain why he was crying. There was the little boy, first of all. He was just about the same age as Copaken's grandson Jacob. So maybe that was part of it. Perhaps, as well, he was reacting to the idea of an absent parent. His own parents, Albert and Silvia, ran a modest community-law practice in Kansas City, and would shut down their office whenever Copaken or his brother had any kind of school activity or performance. In the Copaken world, it was an iron law that parents had to be present. He told a story about representing the Marshall Islands in negotiations with the U.S. government during the Cold War. A missile-testing range on the island was considered to be strategically critical. The case was enormously complex--involving something like fifty federal agencies and five countries--and, just as the negotiations were scheduled to begin, Copaken learned of a conflict: his eldest daughter was performing the lead role in a sixth-grade production of "The Wiz." "I made an instant decision," Copaken said. He told the President of the Marshall Islands that his daughter had to come first. Half an hour passed. "I get a frantic call from the State Department, very high levels: 'Dick, I got a call from the President of the Marshall Islands. What's going on?' I told him. He said, 'Dick, are you putting in jeopardy the national security of the United States for a sixth-grade production?' " In the end, the negotiations were suspended while Copaken flew home from Hawaii. "The point is," Copaken said, "that absence at crucial moments has been a worry to me, and maybe this movie just grabbed at that issue."

  He stopped, seemingly dissatisfied. Was that really why he'd cried? Hollywood is awash in stories of bad fathers and abandoned children, and Copaken doesn't cry in fancy restaurants every time he thinks of one of them. When he tried to remember the last time he cried at the movies, he was stumped. So he must have been responding to something else, too--some detail, some unconscious emotional trigger in the combination of the mother and the boy and the Scottish seaside town and the ship and the hired surrogate and the dying father. To say that he cried at "Dear Frankie" because of that lonely fatherless boy was as inadequate as saying that people cried at the death of Princess Diana because she was a beautiful princess. Surely it mattered as well that she was killed in the company of her lover, a man distrusted by the Royal Family. Wasn't this "Romeo and Juliet"? And surely it mattered that she died in a tunnel, and that the tunnel was in Paris, and that she was chased by motorbikes, and that she was blond and her lover was dark--because each one of those additional narrative details has complicated emotional associations, and it is the subtle combination of all these associations that makes us laugh or choke up when we remember a certain movie, every single time, even when we're sitting in a fancy restaurant.

  Of course, the optimal combination of all those elements is a mystery. That's why it's so hard to make a really memorable movie, and why we reward so richly the few people who can. But suppose you really, really loved the movies, and suppose you were a relentless type, and suppose you used all of the skills you'd learned during the course of your career at the highest rungs of the law to put together an international team of story experts. Do you think you could figure it out?

  2.

  The most famous dictum about Hollywood belongs to the screenwriter William Goldman. "Nobody knows anything," Goldman wrote in "Adventures in the Screen Trade" a couple of decades ago. "Not one person in the entire motion picture field knows for a certainty what's going to work. Every time out it's a guess." One of the highest-grossing movies in history, "Raiders of the Lost Ark," was offered to every studio in Hollywood, Goldman writes, and every one of them turned it down except Paramount: "Why did Paramount say yes? Because nobody knows anything. And why did all the other studios say no? Because nobody knows anything. And why did Universal, the mightiest studio of all, pass on Star Wars? . . . Because nobody, nobody--not now, not ever--knows the least goddamn thing about what is or isn't going to work at the box office."

  What Goldman was saying was a version of something that has long been argued about art: that there is no way of getting beyond one's own impressions to arrive at some larger, objective truth. There are no rules to art, only the infinite variety of subjective experience. "Beauty is no quality in things themselves," the eighteenth-century Scottish philosopher David Hume wrote. "It exists merely in the mind which contemplates them; and each mind perceives a different beauty." Hume might as well have said that nobody knows anything.

  But Hume had a Scottish counterpart, Lord Kames, and Lord Kames was equally convinced that traits like beauty, sublimity, and grandeur were indeed reducible to a rational system of rules and precepts. He devised principles of congruity, propriety, and perspicuity: an elevated subject, for instance, must be expressed in elevated language; sound and signification should be in concordance; a woman was most attractive when in distress; depicted misfortunes must never occur by chance. He genuinely thought that the superiority of Virgil's hexameters to Horace's could be demonstrated with Euclidean precision, and for every Hume, it seems, there has always been a Kames--someone arguing that if nobody knows anything it is only because nobody's looking hard enough.

  In a small New York loft, just below Union Square, for example, there is a tech startup called Platinum Blue that consults for companies in the music business. Record executives have tended to be Humean: though they can tell you how they feel when they listen to a song, they don't believe anyone can know with confidence whether a song is going to be a hit, and, historically, fewer than twenty per cent of the songs picked as hits by music executives have fulfilled those expectations. Platinum Blue thinks it can do better. It has a proprietary computer program that uses "spectral deconvolution software" to measure the mathematical relationships among all of a song's structural components: melody, harmony, beat, tempo, rhythm, octave, pitch, chord progression, cadence, sonic brilliance, frequency, and so on. On the basis of that analysis, the firm believes it can predict whether a song is likely to become a hit with eighty-per-cent accuracy. Platinum Blue is staunchly Kamesian, and, if you have a field dominated by those who say there are no rules, it is almost inevitable that someone will come along and say that there are. The head of Platinum Blue is a man named Mike McCready, and the service he is providing for the music business is an exact model of what Dick Copaken would like to do for the movie business.

  McCready is in his thirties, baldish and laconic, with rectangular hipster glasses. His offices are in a large, open room, with a row of windows looking east, across the rooftops of downtown Manhattan. In the middle of the room is a conference table, and one morning recently McCready sat down and opened his laptop to demonstrate the Platinum Blue technology. On his screen was a cluster of thousands of white dots, resembling a cloud. This was a "map" of the songs his group had run through its software: each dot represented a single song, and each song was positioned in the cloud according to its particular mathematical signature. "You could have one piano sonata by Beethoven at this end and another one here," McCready said, pointing at the opposite end, "as long as they have completely different chord progressions and completely different melodic structures."

  McCready then hit a button on his computer, which had the effect of eliminating all the songs that had not made the Billboard Top 30 in the past five years. The screen went from an undifferentiated cloud to sixty discrete clusters. This is what the universe of hit songs from the past five years looks like structurally; hits come out of a small, predictable, and highly conserved set of mathematical patterns. "We take a new CD far in advance of its release date," McCready said. "We analyze all twelve tracks. Then we overlay them on top of the already existing hit clusters, and what we can tell a record company is which of those songs conform to the mathematical pattern of past hits. Now, that doesn't mean that they will be hits. But what we are saying is that, almost certainly, songs that fall outside these clusters will not be hits--regardless of how much they sound and feel like hit songs, and regardless of how positive your call-out research or focus-group research is." Four years ago, when McCready was working with a similar version of the program at a firm in Barcelona, he ran thirty just-released albums, chosen at random, through his system. One stood out. The computer said that nine of the fourteen songs on the album had clear hit potential--which was unheard of. Nobody in his group knew much about the artist or had even listened to the record before, but the numbers said the album was going to be big, and McCready and his crew were of the belief that numbers do not lie. "Right around that time, a local newspaper came by and asked us what we were doing," McCready said. "We explained the hit-prediction thing, and that we were really turned on to a record by this artist called Norah Jones." The record was "Come Away with Me." It went on to sell twenty million copies and win eight Grammy awards.
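
  Platinum Blue's software is proprietary, but the basic procedure McCready describes--reduce each song to a vector of structural measurements, then look for the tight clusters that past hits occupy--can be sketched with off-the-shelf clustering tools. Everything in the sketch below is an assumption for illustration: the random stand-in feature data, the number of features, and the choice of k-means clustering are not Platinum Blue's method, only a minimal analogue of it.

```python
# A minimal sketch of the "hit cluster" idea, assuming each song has
# already been reduced to a numeric feature vector (melody, harmony,
# tempo, and so on). This is illustrative, not Platinum Blue's code.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real measurements: one row per past
# Billboard hit, one column per structural feature.
hit_features = rng.normal(size=(1000, 12))

scaler = StandardScaler().fit(hit_features)
X = scaler.transform(hit_features)

# Group the hits into sixty clusters, per the article's description of
# the five-year Top 30 universe.
kmeans = KMeans(n_clusters=60, n_init=10, random_state=0).fit(X)

def nearest_cluster(song_vector):
    """Return the closest hit cluster and the distance to its centroid."""
    v = scaler.transform(song_vector.reshape(1, -1))
    distances = np.linalg.norm(kmeans.cluster_centers_ - v, axis=1)
    closest = int(np.argmin(distances))
    return closest, float(distances[closest])

cluster_id, distance = nearest_cluster(rng.normal(size=12))
print(f"nearest hit cluster: {cluster_id}, distance: {distance:.2f}")
```

  A new track that lands far from every centroid would be the kind of song McCready says "will not be hits--regardless of how much they sound and feel like hit songs."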

  3.

  The strength of McCready's analysis is its precision. This past spring, for instance, he analyzed "Crazy," by Gnarls Barkley. The computer calculated, first of all, the song's Hit Grade--that is, how close it was to the center of any of those sixty hit clusters. Its Hit Grade was 755, on a scale where anything above 700 is exceptional. The computer also found that "Crazy" belonged to the same hit cluster as Dido's "Thank You," James Blunt's "You're Beautiful," and Ashanti's "Baby," as well as older hits like "Let Me Be There," by Olivia Newton-John, and "One Sweet Day," by Mariah Carey, so that listeners who liked any of those songs would probably like "Crazy," too. Finally, the computer gave "Crazy" a Periodicity Grade--which refers to the fact that, at any given time, only twelve to fifteen hit clusters are "active," because from month to month the particular mathematical patterns that excite music listeners will shift around. "Crazy" 's periodicity score was 658--which suggested a very good fit with current tastes. The data said, in other words, that "Crazy" was almost certainly going to be huge--and, sure enough, it was.
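
  The article does not disclose how the Hit Grade or the Periodicity Grade is actually computed. One plausible reading, given the description "how close it was to the center of any of those sixty hit clusters," is a score that rises toward 1,000 as the distance to the nearest centroid falls, with the Periodicity Grade computed only against the clusters currently "active." The mapping from distance to score below is invented for illustration; only the 700-point "exceptional" threshold comes from the article.

```python
# Illustrative scoring in the spirit of the Hit Grade / Periodicity Grade
# described above. The distance-to-score mapping and the list of "active"
# clusters are assumptions, not the firm's formula.
import numpy as np

def hit_grade(distance_to_nearest_centroid, scale=0.5):
    """Map a distance (0 = dead centre of a hit cluster) onto a 0-1000 score."""
    return 1000.0 * np.exp(-scale * distance_to_nearest_centroid)

def periodicity_grade(nearest_cluster, distance, active_clusters, scale=0.5):
    """Score a song only against the clusters that are currently 'active'."""
    if nearest_cluster not in active_clusters:
        return 0.0
    return 1000.0 * np.exp(-scale * distance)

# A song sitting near the middle of cluster 31, with clusters 5, 31, and 44
# currently active, grades out as "exceptional" (above 700) on both measures.
print(hit_grade(0.4))                           # roughly 819
print(periodicity_grade(31, 0.4, {5, 31, 44}))  # roughly 819
```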

  If "Crazy" hadn't scored so high, though, the Platinum Blue people would have given the song's producers broad suggestions for fixing it. McCready said, "We can tell a producer, 'These are the elements that seem to be pushing your song into the hit cluster. These are the variables that are pulling your song away from the hit cluster. The problem seems to be in your bass line.' And the producer will make a bunch of mixes, where they do something different with the bass lines--increase the decibel level, or muddy it up. Then they come back to us. And we say, 'Whatever you were doing with mix No. 3, do a little bit more of that and you'll be back inside the hit cluster.'"

  McCready stressed that his system didn't take the art out of hit-making. Someone still had to figure out what to do with mix No. 3, and it was entirely possible that whatever needed to be done to put the song in the hit cluster wouldn't work, because it would make the song sound wrong--and in order to be a hit a song had to sound right. Still, for the first time you wouldn't be guessing about what needed to be done. You would know. And what you needed to know in order to fix the song was much simpler than anyone would have thought. McCready didn't care about who the artist was, or the cleverness of the lyrics. He didn't even have a way of feeding lyrics into his computer. He cared only about a song's underlying mathematical structure. "If you go back to the popular melodies written by Beethoven and Mozart three hundred years ago," he went on, "they conform to the same mathematical patterns that we are looking at today. What sounded like a beautiful melody to them sounds like a beautiful melody to us. What has changed is simply that we have come up with new styles and new instruments. Our brains are wired in a way--we assume--that keeps us coming back, again and again, to the same answers, the same pleasure centers." He had sales data and Top 30 lists and deconvolution software, and it seemed to him that if you put them together you had an objective way of measuring something like beauty. "We think we've figured out how the brain works regarding musical taste," McCready said.

  It requires a very particular kind of person, of course, to see the world as a code waiting to be broken. Hume once called Kames "the most arrogant man in the world," and to take this side of the argument you have to be. Kames was also a brilliant lawyer, and no doubt that matters as well, because to be a good lawyer is to be invested with a reverence for rules. (Hume defied his family's efforts to make him a lawyer.) And to think like Kames you probably have to be an outsider. Kames was born Henry Home, to a farming family, and grew up in the sparsely populated cropping-and-fishing county of Berwickshire; he became Lord Kames late in life, after he was elevated to the bench. (Hume was born and reared in Edinburgh.) His early published work was about law and its history, but he soon wandered into morality, religion, anthropology, soil chemistry, plant nutrition, and the physical sciences, and once asked his friend Benjamin Franklin to explain the movement of smoke in chimneys. Those who believe in the power of broad patterns and rules, rather than the authority of individuals or institutions, are not intimidated by the boundaries and hierarchies of knowledge. They don't defer to the superior expertise of insiders; they set up shop in a small loft somewhere downtown and take on the whole music industry at once. The difference between Hume and Kames is, finally, a difference in kind, not degree. You're either a Kamesian or you're not. And if you were to create an archetypal Kamesian--to combine lawyerliness, outsiderness, and supreme self-confidence in one dapper, Charlie Brown-headed combination? You'd end up with Dick Copaken.

  "I remember when I was a sophomore in high school and I went into the bathroom once to wash my hands," Copaken said. "I noticed the bubbles on the sink, and it fascinated me the way these bubbles would form and move around and float and reform, and I sat there totally transfixed. My father called me, and I didn't hear him. Finally, he comes in. 'Son. What the . . . are you all right?' I said, 'Bubbles, Dad, look what they do.' He said, 'Son, if you're going to waste your time, waste it on something that may have some future consequence.' Well, I kind of rose to the challenge. That summer, I bicycled a couple of miles to a library in Kansas City and I spent every day reading every book and article I could find on bubbles."

  Bubbles looked completely random, but young Copaken wasn't convinced. He built a bubble-making device involving an aerator from a fish tank, and at school he pleaded with the math department to teach him the quadratic equations he needed to show why the bubbles formed the way they did. Then he devised an experiment, and ended up with a bronze medal at the International Science Fair. His interest in bubbles was genuine, but the truth is that almost anything could have caught Copaken's eye: pop songs, movies, the movement of chimney smoke. What drew him was not so much solving this particular problem as the general principle that problems were solvable--that he, little Dick Copaken from Kansas City, could climb on his bicycle and ride to the library and figure out something that his father thought wasn't worth figuring out.

  Copaken has written a memoir of his experience defending the tiny Puerto Rican islands of Culebra and Vieques against the U.S. Navy, which had been using their beaches for target practice. It is a riveting story. Copaken takes on the vast Navy bureaucracy, armed only with arcane provisions of environmental law. He investigates the nesting grounds of the endangered hawksbill turtle, and the mating habits of a tiny yet extremely loud tree frog known as the coqui, and at one point he transports four frozen whale heads from the Bahamas to Harvard Medical School. Copaken wins. The Navy loses.

  The memoir reads like a David-and-Goliath story. It isn't. David changed the rules on Goliath. He brought a slingshot to a sword fight. People like Copaken, though, don't change the rules; they believe in rules. Copaken would have agreed to sword-on-sword combat. But then he would have asked the referee for a stay, deposed Goliath and his team at great length, and papered him with brief after brief until he conceded that his weapon did not qualify as a sword under §48(B)(6)(e) of the Samaria Convention of 321 B.C. (The Philistines would have settled.) And whereas David knew that he couldn't win a conventional fight with Goliath, the conviction that sustained Copaken's long battle with the Navy was, to the contrary, that so long as the battle remained conventional--so long as it followed the familiar pathways of the law and of due process--he really could win. Dick Copaken didn't think he was an underdog at all. If you believe in rules, Goliath is just another Philistine, and the Navy is just another plaintiff. As for the ineffable mystery of the Hollywood blockbuster? Well, Mr. Goldman, you may not know anything. But I do.

  4.

  Dick Copaken has a friend named Nick Meaney. They met on a case years ago. Meaney has thick dark hair. He is younger and much taller than Copaken, and seems to regard his friend with affectionate amusement. Meaney's background is in risk management, and for years he'd been wanting to bring the principles of that world to the movie business. In 2003, Meaney and Copaken were driving through the English countryside to Durham when Meaney told Copaken about a friend of his from college. The friend and his business partner were students of popular narrative: the sort who write essays for obscure journals serving the small band of people who think deeply about, say, the evolution of the pilot episode in transnational TV crime dramas. And, for some time, they had been developing a system for evaluating the commercial potential of stories. The two men, Meaney told Copaken, had broken down the elements of screenplay narrative into multiple categories, and then drawn on their encyclopedic knowledge of television and film to assign scripts a score in each of those categories--creating a giant screenplay report card. The system was extraordinarily elaborate. It was under constant refinement. It was also top secret. Henceforth, Copaken and Meaney would refer to the two men publicly only as "Mr. Pink" and "Mr. Brown," an homage to "Reservoir Dogs."

  "The guy had a big wall, and he started putting up little Post-its covering everything you can think of," Copaken said. It was unclear whether he was talking about Mr. Pink or Mr. Brown or possibly some Obi-Wan Kenobi figure from whom Mr. Pink and Mr. Brown first learned their trade. "You know, the star wears a blue shirt. The star doesn't zip up his pants. Whatever. So he put all these factors up and began moving them around as the scripts were either successful or unsuccessful, and he began grouping them and eventually this evolved to a kind of ad-hoc analytical system. He had no theory as to what would work, he just wanted to know what did work."

  Copaken and Meaney also shared a fascination with a powerful kind of computerized learning system called an artificial neural network. Neural networks are used for data mining--to look for patterns in very large amounts of data. In recent years, they have become a critical tool in many industries, and what Copaken and Meaney realized, when they thought about Mr. Pink and Mr. Brown, was that it might now be possible to bring neural networks to Hollywood. They could treat screenplays as mathematical propositions, using Mr. Pink and Mr. Brown's categories and scores as the motion-picture equivalents of melody, harmony, beat, tempo, rhythm, octave, pitch, chord progression, cadence, sonic brilliance, and frequency.

  Copaken and Meaney brought in a former colleague of Meaney's named Sean Verity, and the three of them signed up Mr. Pink and Mr. Brown. They called their company Epagogix--a reference to Aristotle's discussion of epagogic, or inductive, learning--and they started with a "training set" of screenplays that Mr. Pink and Mr. Brown had graded. Copaken and Meaney won't disclose how many scripts were in the training set. But let's say it was two hundred. Those scores--along with the U.S. box-office receipts for each of the films made from those screenplays--were fed into a neural network built by a computer scientist of Meaney's acquaintance. "I can't tell you his name," Meaney said, "but he's English to his bootstraps." Mr. Bootstraps then went to work, trying to use Mr. Pink and Mr. Brown's scoring data to predict the box-office receipts of every movie in the training set. He started with the first film and had the neural network make a guess: maybe it said that the hero's moral crisis in act one, which rated a 7 on the 10-point moral-crisis scale, was worth $7 million, and having a gorgeous red-headed eighteen-year-old female lead whose characterization came in at 6.5 was worth $3 million and a 9-point bonding moment between the male lead and a four-year-old boy in act three was worth $2 million, and so on, putting a dollar figure on every grade on Mr. Pink and Mr. Brown's report card until the system came up with a prediction. Then it compared its guess with how that movie actually did. Was it close? Of course not. The neural network then went back and tried again. If it had guessed $20 million and the movie actually made $110 million, it would reweight the movie's Pink/Brown scores and run the numbers a second time. And then it would take the formula that worked best on Movie One and apply it to Movie Two, and tweak that until it had a formula that worked on Movies One and Two, and take that formula to Movie Three, and then to four and five, and on through all two hundred movies, whereupon it would go back through all the movies again, through hundreds of thousands of iterations, until it had worked out a formula that did the best possible job of predicting the financial success of every one of the movies in its database.
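
  Epagogix's network, training set, and scoring categories are secret, but the procedure described above is, in outline, ordinary supervised regression: grade each script on a fixed set of narrative categories, then fit a model that maps those grades to U.S. box-office and apply it to new scripts. A toy version, with made-up grades and grosses standing in for Mr. Pink and Mr. Brown's report cards, might look like the following; none of the numbers or category counts are Epagogix's.

```python
# Toy version of the training loop described above: screenplay "report
# card" scores in, U.S. box-office out. The grades, the grosses, and the
# network size are invented; only the shape of the procedure follows the
# article's description.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_scripts, n_categories = 200, 40   # e.g. moral crisis, locale, sidekick...
grades = rng.uniform(0, 10, size=(n_scripts, n_categories))

# Pretend box-office (in $ millions) is some unknown function of the grades.
true_weights = rng.uniform(0, 5, size=n_categories)
box_office = grades @ true_weights + rng.normal(0, 10, size=n_scripts)

scaler = StandardScaler().fit(grades)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0)
model.fit(scaler.transform(grades), box_office)

# Apply the fitted formula to a new, as-yet-unproduced script.
new_script = rng.uniform(0, 10, size=(1, n_categories))
predicted = model.predict(scaler.transform(new_script))[0]
print(f"predicted U.S. box office: ${predicted:.0f} million")
```

  The repeated reweighting Meaney describes--guess, compare with the actual gross, adjust, and run through the whole training set again and again--is what the fitting step does internally, over many iterations, until the formula predicts the known grosses as closely as it can.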

  That formula, the theory goes, can then be applied to new scripts. If you were developing a $75-million buddy picture for Bruce Willis and Colin Farrell, Epagogix says, it can tell you, based on past experience, what that script's particular combination of narrative elements can be expected to make at the box office. If the formula says it's a $50-million script, you pull the plug. "We shoot turkeys," Meaney said. He had seen Mr. Bootstraps and the neural network in action: "It can sometimes go on for hours. If you look at the computer, you see lots of flashing numbers in a gigantic grid. It's like 'The Matrix.' There are a lot of computations. The guy is there, the whole time, looking at it. It eventually stops flashing, and it tells us what it thinks the American box-office will be. A number comes out."

  The way the neural network thinks is not that different from the way a Hollywood executive thinks: if you pitch a movie to a studio, the executive uses an ad-hoc algorithm--perfected through years of trial and error--to put a value on all the components in the story. Neural networks, though, can handle problems that have a great many variables, and they never play favorites--which means (at least in theory) that as long as you can give the neural network the same range of information that a human decision-maker has, it ought to come out ahead. That's what the University of Arizona computer scientist Hsinchun Chen demonstrated ten years ago, when he built a neural network to predict winners at the dog track. Chen used the ten variables that greyhound experts told him they used in making their bets--like fastest time and winning percentage and results for the past seven races--and trained his system with the results of two hundred races. Then he went to the greyhound track in Tucson and challenged three dog-racing handicappers to a contest. Everyone picked winners in a hundred races, at a modest two dollars a bet. The experts lost $71.40, $61.20, and $70.20, respectively. Chen won $124.80. It wasn't close, and one of the main reasons was the special interest the neural network showed in something called "race grade": greyhounds are moved up and down through a number of divisions, according to their ability, and dogs have a big edge when they've just been bumped down a level and a big handicap when they've just been bumped up. "The experts know race grade exists, but they don't weight it sufficiently," Chen said. "They are all looking at win percentage, place percentage, or thinking about the dogs' times."

  Copaken and Meaney figured that Hollywood's experts also had biases and skipped over things that really mattered. If a neural network won at the track, why not Hollywood? "One of the most powerful aspects of what we do is the ruthless objectivity of our system," Copaken said. "It doesn't care about maintaining relationships with stars or agents or getting invited to someone's party. It doesn't care about climbing the corporate ladder. It has one master and one master only: how do you get to bigger box-office? Nobody else in Hollywood is like that."

  In the summer of 2003, Copaken approached Josh Berger, a senior executive at Warner Bros. in Europe. Meaney was opposed to the idea: in his mind, it was too early. "I just screamed at Dick," he said. But Copaken was adamant. He had Mr. Bootstraps, Mr. Pink, and Mr. Brown run sixteen television pilots through the neural network, and try to predict the size of each show's eventual audience. "I told Josh, 'Stick this in a drawer, and I'll come back at the end of the season and we can check to see how we did,' " Copaken said. In January of 2004, Copaken tabulated the results. In six cases, Epagogix guessed the number of American homes that would tune in to a show to within .06 per cent. In thirteen of the sixteen cases, its predictions were within two per cent. Berger was floored. "It was incredible," he recalls. "It was like someone saying to you, 'We're going to show you how to count cards in Vegas.' It had that sort of quality."

  Copaken then approached another Hollywood studio. He was given nine unreleased movies to analyze. Mr. Pink, Mr. Brown, and Mr. Bootstraps worked only from the script--without reference to the stars or the director or the marketing budget or the producer. On three of the films--two of which were low-budget--the Epagogix estimates were way off. On the remaining six--including two of the studio's biggest-budget productions--they correctly identified whether the film would make or lose money. On one film, the studio thought it had a picture that would make a good deal more than $100 million. Epagogix said $49 million. The movie made less than $40 million. On another, a big-budget picture, the team's estimate came within $1.2 million of the final gross. On a number of films, they were surprisingly close. "They were basically within a few million," a senior executive at the studio said. "It was shocking. It was kind of weird." Had the studio used Epagogix on those nine scripts before filming started, it could have saved tens of millions of dollars. "I was impressed by a couple of things," another executive at the same studio said. "I was impressed by the things they thought mattered to a movie. They weren't the things that we typically give credit to. They cared about the venue, and whether it was a love story, and very specific things about the plot that they were convinced determined the outcome more than anything else. It felt very objective. And they could care less about whether the lead was Tom Cruise or Tom Jones."

  The Epagogix team knocked on other doors that weren't quite so welcoming. This was the problem with being a Kamesian. Your belief in a rule-bound universe was what gave you, an outsider, a claim to real expertise. But you were still an outsider. You were still Dick Copaken, the blue-blazered corporate lawyer who majored in bubbles as a little boy in Kansas City, and a couple of guys from the risk-management business, and three men called Pink, Brown, and Bootstraps--and none of you had ever made a movie in your life. And what were you saying? That stars didn't matter, that the director didn't matter, and that all that mattered was story--and, by the way, that you understood story the way the people on the inside, people who had spent a lifetime in the motion-picture business, didn't. "They called, and they said they had a way of predicting box-office success or failure, which is everyone's fantasy," one former studio chief recalled. "I said to them, 'I hope you're right.' " The executive seemed to think of the Epagogix team as a small band of Martians who had somehow slipped their U.F.O. past security. "In reality, there are so many circumstances that can affect a movie's success," the executive went on. "Maybe the actor or actress has an external problem. Or this great actor, for whatever reason, just fails. You have to fire a director. Or September 11th or some other thing happens. There are many people who have come forward saying they have a way of predicting box-office success, but so far nobody has been able to do it. I think we know something. We just don't know enough. I still believe in something called that magical thing--talent, the unexpected. The movie god has to shine on you." You were either a Kamesian or you weren't, and this person wasn't: "My first reaction to those guys? Bullshit."

  5.

  A few months ago, Dick Copaken agreed to lift the cloud of unknowing surrounding Epagogix, at least in part. He laid down three conditions: the meeting was to be in London, Mr. Pink and Mr. Brown would continue to be known only as Mr. Pink and Mr. Brown, and no mention was to be made of the team's current projects. After much discussion, an agreement was reached. Epagogix would analyze the 2005 movie "The Interpreter," which was directed by Sydney Pollack and starred Sean Penn and Nicole Kidman. "The Interpreter" had a complicated history, having gone through countless revisions, and there was a feeling that it could have done much better at the box office. If ever there was an ideal case study for the alleged wizardry of Epagogix, this was it.

  The first draft of the movie was written by Charles Randolph, a philosophy professor turned screenwriter. It opened in the fictional African country of Matobo. Two men in a Land Rover pull up to a soccer stadium. A group of children lead them to a room inside the building. On the ground is a row of corpses.

  Cut to the United Nations, where we meet Silvia Broome, a young woman who works as an interpreter. She goes to the U.N. Security Service and relates a terrifying story. The previous night, while working late in the interpreter's booth, she overheard two people plotting the assassination of Matobo's murderous dictator, Edmund Zuwanie, who is coming to New York to address the General Assembly. She says that the plotters saw her, and that her life may be in danger. The officer assigned to her case, Tobin Keller, is skeptical, particularly when he learns that she, too, is from Matobo, and that her parents were killed in the country's civil war. But after Broome suffers a series of threatening incidents Keller starts to believe her. His job is to protect Zuwanie, but he now feels moved to act as Broome's bodyguard as well. A quiet, slightly ambiguous romantic attraction begins to develop between them. Zuwanie's visit draws closer. Broome's job is to be his interpreter. On the day of the speech, Broome ends up in the greenroom with Zuwanie. Keller suddenly realizes the truth: that she has made up the whole story as a way of bringing Zuwanie to justice. He rushes to the greenroom. Broome, it seems, has poisoned Zuwanie and is withholding the antidote unless he goes onstage and confesses to the murder of his countrymen. He does. Broome escapes. A doctor takes a look at the poison. It's harmless. The doctor turns to the dictator, who has just been tricked into writing his own prison sentence: "You were never in danger, Mr. Zuwanie."

  Randolph says that the film he was thinking of while he was writing "The Interpreter" was Francis Ford Coppola's classic "The Conversation." He wanted to make a spare, stark movie about an isolated figure. "She's a terrorist," Randolph said of Silvia Broome. "She comes to this country to do a very specific task, and when that task is done she's gone again. I wanted to write about this idea of a noble terrorist, who tried to achieve her ends with a character assassination, not a real assassination." Randolph realized that most moviegoers--and most Hollywood executives--prefer characters who have psychological motivations. But he wasn't trying to make "Die Hard." "Look, I'm the son of a preacher," he said. "I believe that ideology motivates people."

  In 2004, Sydney Pollack signed on to direct the project. He loved the idea of an interpreter at the United Nations and the conceit of an overheard conversation. But he wanted to make a commercial movie, and parts of the script didn't feel right to him. He didn't like the twist at the end, for instance. "I felt like I had been tricked, because in fact there was no threat," Pollack said. "As much as I liked the original script, I felt like an audience would somehow, at the end, feel cheated." Pollack also felt that audiences would want much more from Silvia Broome's relationship with Tobin Keller. "I've never been able to do a movie without a love story in it," he said. "For me, the heart of it is always the man and the woman and who they are and what they are going through." Pollack brought Randolph back for rewrites. He then hired Scott Frank and Steven Zaillian, two of the most highly sought-after screenwriters in Hollywood--and after several months the story was turned inside out. Now Broome didn't tell the story of overhearing that conversation. It actually happened. She wasn't a terrorist anymore. She was a victim. She wasn't an isolated figure. She was given a social life. She wasn't manipulating Keller. Their relationship was more prominent. A series of new characters--political allies and opponents of Zuwanie's--were added, as was a scene in Brooklyn where a bus explodes, almost killing Broome. "I remember when I came on 'Minority Report,' and started over," said Frank, who wrote many of the new scenes for "The Interpreter." "There weren't many characters. When I finished, there were two mysteries and a hundred characters. I have diarrhea of the plot. This movie cried out for that. There are never enough suspects and red herrings."

  The lingering problem, though, was the ending. If Broome wasn't after Zuwanie, who was? "We struggled," Pollack said. "It was a long process, to the point where we almost gave up." In the end, Zuwanie was made the engineer of the plot: he fakes the attempt on his life in order to justify his attacks on his enemies back home. Zuwanie hires a man to shoot him, and then another of Zuwanie's men shoots the assassin before he can do the job--and in the chaos Broome ends up with a gun in her hand, training it on Zuwanie. "The end was the hardest part," Frank said. "All these balls were in the air. But I couldn't find a satisfying way to resolve it. We had to put a gun in the hand of a pacifist. I couldn't quite sew it up in the right way. Sydney kept saying, 'You're so close.' But I kept saying, 'Yeah, but I don't believe what I'm writing.' I wonder if I did a disservice to 'The Interpreter.' I don't know that I made it better. I may have just made it different."

  This, then, was the question for Epagogix: If Pollack's goal was to make "The Interpreter" a more commercial movie, how well did he succeed? And could he have done better?

  6.

  The debriefing took place in central London, behind the glass walls of the private dining room of a Mayfair restaurant. The waiters came in waves, murmuring their announcements of the latest arrival from the kitchen. The table was round. Copaken, dapper as always in his navy blazer, sat next to Sean Verity, followed by Meaney, Mr. Brown, and Mr. Pink. Mr. Brown was very tall, and seemed to have a northern English accent. Mr. Pink was slender and graying, and had an air of authority about him. His academic training was in biochemistry. He said he thought that, in the highly emotional business of Hollywood, having a scientific background was quite useful. There was no sign of Mr. Bootstraps.

  Mr. Pink began by explaining the origins of their system. "There were certain historical events that allowed us to go back and test how appealing one film was against another," he said. "The very simple one is that in the English market, in the sixties on Sunday night, religious programming aired on the major networks. Nobody watched it. And, as soon as that finished, movies came on. There were no lead-ins, and only two competing channels. Plus, across the country you had a situation where the commercial sector was playing a whole variety of movies against the standard, the BBC. It might be a John Wayne movie in Yorkshire, and a musical in Somerset, and the BBC would be the same movie everywhere. So you had a control. It was very pure and very simple. That was a unique opportunity to try and make some guesstimates as to why movies were doing what they were doing."

  Brown nodded. "We built a body of evidence until we had something systematic," he said.

  Pink estimated that they had analyzed thousands of movies. "The thing is that not everything comes to you as a script. For a long period, we worked for a broadcaster who used to send us a couple of paragraphs. We made our predictions based on that much. Having the script is actually too much information sometimes. You're trying to replicate what the audience is doing. They're trying to make a choice between three movies, and all they have at that point is whatever they've seen in TV Guide or on any trailer they've seen. We have to take a piece here and a piece here. Take a couple of reference points. When I look at a story, there are certain things I'm looking for--certain themes, and characters you immediately focus on." He thought for a moment. "That's not to deny that it matters whether the lead character wears a hat," he added, in a way that suggested he and Mr. Brown had actually thought long and hard about leads and hats.

  "There's always a pattern," he went on. "There are certain stories that come back, time and time again, and that always work. You know, whenever we go into a market--and we work in fifty markets--the initial thing people say is 'What do you know about our market?' The assumption is that, say, Japan is different from us--that there has to be something else going on there. But, basically, they're just like us. It's the consistency of these reappearing things that I find amazing."

  "Biblical stories are a classic case," Mr. Brown put in. "There is something about what they're telling and the message that's coming out that seems to be so universal. With Mel Gibson's 'The Passion,' people always say, 'Who could have predicted that?' And the answer is, we could have."

  They had looked at "The Interpreter" scripts a few weeks earlier. The process typically takes them a day. They read, they graded, and then they compared notes, because Mr. Pink was the sort who went for "Yojimbo" and Mr. Brown's favorite movie was "Alien" (the first one), so they didn't always agree. Mr. Brown couldn't remember a single script he'd read where he thought there wasn't room for improvement, and Mr. Pink, when asked the same question, could come up with just one: "Lethal Weapon." "A friend of mine gave me the shooting script before it came out, and I remember reading it and thinking, It's all there. It was all on the page." Once Mr. Pink and Mr. Brown had scored "The Interpreter," they gave their analyses to Mr. Bootstraps, who did fifteen runs through the neural network: the original Randolph script, the shooting script, and certain variants of the plot that Epagogix devised. Mr. Bootstraps then passed his results to Copaken, who wrote them up. The Epagogix reports are always written by Copaken, and they are models of lawyerly thoroughness. This one ran to thirty-eight pages. He had finished the final draft the night before, very late. He looked fresh as a daisy.

  Mr. Pink started with the original script. "My pure reaction? I found it very difficult to read. I got confused. I had to reread bits. We do this a lot. If a project takes more than an hour to read, then there's something going on that I'm not terribly keen on."

  "It didn't feel to me like a mass-appeal movie," Mr. Brown added. "It seemed more niche."

  When Mr. Bootstraps ran Randolph's original draft through the neural network, the computer called it a $33-million movie--an "intelligent" thriller, in the same commercial range as "The Constant Gardener" or "Out of Sight." According to the formula, the final shooting script was a $69-million picture (an estimate that came within $4 million of the actual box-office). Mr. Brown wasn't surprised. The shooting script, he said, "felt more like an American movie, where the first one seemed European in style."

  Everyone agreed, though, that Pollack could have done much better. There was, first of all, the matter of the United Nations. "They had a unique opportunity to get inside the building," Mr. Pink said. "But I came away thinking that it could have been set in any boxy office tower in Manhattan. An opportunity was missed. That's when we get irritated--when there are opportunities that could very easily be turned into something that would actually have had an impact."

  "Locale is an extra character," Mr. Brown said. "But in this case it's a very bland character that didn't really help."

  In the Epagogix secret formula, it seemed, locale matters a great deal. "You know, there's a big difference between city and countryside," Mr. Pink said. "It can have a huge effect on a movie's ability to draw in viewers. And writers just do not take advantage of it. We have a certain set of values that we attach to certain places."

  Mr. Pink and Mr. Brown ticked off the movies and television shows that they thought understood the importance of locale: "Crimson Tide," "Lawrence of Arabia," "Lost," "Survivor," "Castaway," "Deliverance." Mr. Pink said, "The desert island is something that we have always recognized as a pungent backdrop, but it's not used that often. In the same way, prisons can be a powerful environment, because they are so well defined." The U.N. could have been like that, but it wasn't. Then there was the problem of starting, as both scripts did, in Africa--and not just Africa but a fictional country in Africa. The whole team found that crazy. "Audiences are pretty parochial, by and large," Mr. Pink said. "If you start off by telling them, 'We're going to begin this movie in Africa,' you're going to lose them. They've bought their tickets. But when they come out they're going to say, 'It was all right. But it was Africa.' " The whole thing seemed to leave Mr. Pink quite distressed. He looked at Mr. Brown beseechingly.

  Mr. Brown changed the subject. "It's amazing how often quite little things, quite small aspects, can spoil everything," he said. "I remember seeing the trailer for 'V for Vendetta' and deciding against it right there, for one very simple reason: there was a ridiculous mask on the main character. If you can't see the face of the character, you can't tell what that person is thinking. You can't tell who they are. With 'Spider-Man' and 'Superman,' though, you do see the face, so you respond to them."

  The team once gave a studio a script analysis in which almost everything they suggested was, in Hollywood terms, small. They wanted the lead to jump off the page a little more. They wanted the lead to have a young sidekick--a relatively minor character--to connect with a younger demographic, and they wanted the city where the film was set to be much more of a presence. The neural network put the potential value of better characterization at an extra $2.46 million in U.S. box-office revenue; the value of locale adjustment at $4.92 million; the value of a sidekick at $12.3 million--and the value of all three together (given the resulting synergies) at $24.6 million. That's another $25 million for a few weeks of rewrites and maybe a day or two of extra filming. Mr. Bootstraps, incidentally, ran the numbers and concluded that the script would make $47 million if the suggested changes were not made. The changes were not made. The movie made $50 million.

  Mr. Pink and Mr. Brown went on to discuss the second "Interpreter" screenplay, the shooting script. They thought the ending was implausible. Charles Randolph had originally suggested that the Tobin Keller character be black, not white, in order to create the frisson of bringing together a white African and a black American. Mr. Pink and Mr. Brown independently came to the same conclusion. Apparently, the neural network ran the numbers on movies that paired black and white leads--"Lethal Weapon," "The Crying Game," "Independence Day," "Men in Black," "Die Another Day," "The Pelican Brief"--and found that the black-white combination could increase box-office revenue. The computer did the same kind of analysis on Scott Frank's "diarrhea of the plot," and found that there were too many villains. And if Silvia Broome was going to be in danger, Mr. Bootstraps made clear, she really had to be in danger.

  "Our feeling--and Dick, you may have to jump in here--is that the notion of a woman in peril is a very powerful narrative element," Mr. Pink said. He glanced apprehensively at Copaken, evidently concerned that what he was about to say might fall in the sensitive category of the proprietary. "How powerful?" He chose his words carefully. "Well above average. And the problem is that we lack a sense of how much danger she is in, so an opportunity is missed. There were times when you were thinking, Is this something she has created herself? Is someone actually after her? You are confused. There is an element of doubt, and that ambiguity makes it possible to doubt the danger of the situation." Of course, all that ambiguity was there because in the Randolph script she was making it all up, and we were supposed to doubt the danger of the situation. But Mr. Pink and Mr. Brown believed that, once you decided you weren't going to make a European-style niche movie, you had to abandon ambiguity altogether.

  "You've got to make the peril real," Mr. Pink said.

  The Epagogix revise of "The Interpreter" starts with an upbeat Silvia Broome walking into the United Nations, flirting with the security guard. The two men plotting the assassination later see her and chase her through the labyrinthine corridors of what could only be the U.N. building. The ambiguous threats to Broome's life are now explicit. At one point in the Epagogix version, a villain pushes Broome's Vespa off one of Manhattan's iconic East River bridges. She hangs on to her motorbike for dear life, as it swings precariously over the edge of the parapet. Tobin Keller, in a police helicopter, swoops into view: "As she clings to Tobin's muscular body while the two of them are hoisted up into the hovering helicopter, we sense that she is feeling more than relief." In the Epagogix ending, Broome stabs one of Zuwanie's security men with a knife. Zuwanie storms off the stage, holds a press conference, and is shot dead by a friend of Broome's brother. Broome cradles the dying man in her arms. He "dies peacefully," with "a smile on his blood-spattered face." Then she gets appointed Matobo's U.N. ambassador. She turns to Keller. "'This time,' she notes with a wry smile . . . 'you will have to protect me.' " Mr. Bootstraps's verdict was that this version would result in a U.S. box-office of $111 million.

  "It's funny," Mr. Pink said. "This past weekend, 'The Bodyguard' was on TV. Remember that piece of"--he winced--"entertainment? Which is about a bodyguard and a woman. The final scene is that they are right back together. It is very clearly and deliberately sown. That is the commercial way, if you want more bodies in the seats."

  "You have to either consummate it or allow for the possibility of that," Copaken agreed.

  They were thinking now of what would happen if they abandoned all fealty to the original, and simply pushed the movie's premise as far as they could possibly go.

  Mr. Pink went on, "If Dick had said, 'You can take this project wherever you want,' we probably would have ended up with something a lot closer to 'The Bodyguard'--where you have a much more romantic film, a much more powerful focus to the two characters--without all the political stuff going on in the background. You go for the emotions on a very basic level. What would be the upper limit on that? You know, the upper limit of anything these days is probably still 'Titanic.' I'm not saying we could do six hundred million dollars. But it could be two hundred million."

  7.

  It was clear that the whole conversation was beginning to make Mr. Pink uncomfortable. He didn't like "The Bodyguard." Even the title made him wince. He was the sort who liked "Yojimbo," after all. The question went around the room: What would you do with "The Interpreter"? Sean Verity wanted to juice up the action-adventure elements and push it to the $150- to $160-million range. Meaney wanted to do without expensive stars: he didn't think they were worth the money. Copaken wanted more violence, and he also favored making Keller black. But he didn't want to go all the way to "The Bodyguard," either. This was a man who loved "Dear Frankie" as much as any film he'd seen in recent memory, and "Dear Frankie" had a domestic box-office gross of $1.3 million. If you followed the rules of Epagogix, there wouldn't be any movies like "Dear Frankie." The neural network had one master, the market, and answered one question: how do you get to bigger box-office? But once a movie had made you vulnerable--once you couldn't even retell the damn story without getting emotional--you couldn't be content with just one master anymore.

  That was the thing about the formula: it didn't make the task of filmmaking easier. It made it harder. So long as nobody knows anything, you've got license to do whatever you want. You can start a movie in Africa. You can have male and female leads not go off together--all in the name of making something new. Once you came to think that you knew something, though, you had to decide just how much money you were willing to risk for your vision. Did the Epagogix team know what the answer to that question was? Of course not. That question required imagination, and they weren't in the imagination business. They were technicians with tools: computer programs and analytical systems and proprietary software that calculated mathematical relationships among a laundry list of structural variables. At Platinum Blue, Mike McCready could tell you that the bass line was pushing your song out of the center of hit cluster 31. But he couldn't tell you exactly how to fix the bass line, and he couldn't guarantee that the redone version would still sound like a hit, and you didn't see him releasing his own album of computer-validated pop music. A Kamesian had only to read Lord Kames to appreciate the distinction. The most arrogant man in the world was a terrible writer: clunky, dense, prolix. He knew the rules of art. But that didn't make him an artist.

  Comment

  In 1925, a young American physicist was doing graduate work at Cambridge University, in England. He was depressed. He was fighting with his mother and had just broken up with his girlfriend. His strength was in theoretical physics, but he was being forced to sit in a laboratory making thin films of beryllium. In the fall of that year, he dosed an apple with noxious chemicals from the lab and put it on the desk of his tutor, Patrick Blackett. Blackett, luckily, didn't eat the apple. But school officials found out what happened, and arrived at a punishment: the student was to be put on probation and ordered to go to London for regular sessions with a psychiatrist.

  Probation? These days, we routinely suspend or expel high-school students for doing infinitely less harmful things, like fighting or drinking or taking drugs--that is, for doing the kinds of things that teen-agers do. This past summer, Rhett Bomar, the starting quarterback for the University of Oklahoma Sooners, was cut from the team when he was found to have been "overpaid" (receiving wages for more hours than he worked, with the apparent complicity of his boss) at his job at a car dealership. Even in Oklahoma, people seemed to think that kicking someone off a football team for having cut a few corners on his job made perfect sense. This is the age of zero tolerance. Rules are rules. Students have to be held accountable for their actions. Institutions must signal their expectations firmly and unambiguously: every school principal and every college president, these days, reads from exactly the same script. What, then, of a student who gives his teacher a poisoned apple? Surely he ought to be expelled from school and sent before a judge.

  Suppose you cared about the student, though, and had some idea of his situation and his potential. Would you feel the same way? You might. Trying to poison your tutor is no small infraction. Then again, you might decide, as the dons at Cambridge clearly did, that what had happened called for a measure of leniency. They knew that the student had never done anything like this before, and that he wasn't well. And they knew that to file charges would almost certainly ruin his career. Cambridge wasn't sure that the benefits of enforcing the law, in this case, were greater than the benefits of allowing the offender an unimpeded future.

  Schools, historically, have been home to this kind of discretionary justice. You let the principal or the teacher decide what to do about cheating because you know that every case of cheating is different--and, more to the point, that every cheater is different. Jimmy is incorrigible, and needs the shock of expulsion. But Bobby just needs a talking to, because he's a decent kid, and Mary and Jane cheated because the teacher foolishly stepped out of the classroom in the middle of the test, and the temptation was simply too much. A Tennessee study found that after zero-tolerance programs were adopted by the state's public schools the frequency of targeted offenses soared: the firm and unambiguous punishments weren't deterring bad behavior at all. Is that really a surprise? If you're a teen-ager, the announcement that an act will be sternly punished doesn't always sink in, and it isn't always obvious when you're doing the thing you aren't supposed to be doing. Why? Because you're a teen-ager.

  Somewhere along the way--perhaps in response to Columbine--we forgot the value of discretion in disciplining the young. "Ultimately, they have to make right decisions," the Oklahoma football coach, Bob Stoops, said of his players, after jettisoning his quarterback. "When they do not, the consequences are serious." Open and shut: he sounded as if he were talking about a senior executive of Enron, rather than a college sophomore whose primary obligation at Oklahoma was to throw a football in the direction of young men in helmets. You might think that if the University of Oklahoma was so touchy about its quarterback being "overpaid" it ought to have kept closer track of his work habits with an on-campus job. But making a fetish of personal accountability conveniently removes the need for institutional accountability. (We court-martial the grunts who abuse prisoners, not the commanding officers who let the abuse happen.) To acknowledge that the causes of our actions are complex and muddy seems permissive, and permissiveness is the hallmark of an ideology now firmly in disgrace. That conservative patron saint Whittaker Chambers once defined liberalism as Christ without the Crucifixion. But punishment without the possibility of redemption is worse: it is the Crucifixion without Christ.

  What's behind Ireland's economic miracle--and G.M.'s financial crisis?

  1.

  The years just after the Second World War were a time of great industrial upheaval in the United States. Strikes were commonplace. Workers moved from one company to another. Runaway inflation was eroding the value of wages. In the uncertain nineteen-forties, in the wake of the Depression and the war, workers wanted security, and in 1949 the head of the Toledo, Ohio, local of the United Auto Workers, Richard Gosser, came up with a proposal. The workers of Toledo needed pensions. But, he said, the pension plan should be regional, spread across the many small auto-parts makers, electrical-appliance manufacturers, and plastics shops in the Toledo area. That way, if workers switched jobs they could take their pension credits with them, and if a company went bankrupt its workers' retirement would be safe. Every company in the area, Gosser proposed, should pay ten cents an hour, per worker, into a centralized fund.

  The business owners of Toledo reacted immediately. "They were terrified," says Jennifer Klein, a labor historian at Yale University, who has written about the Toledo case. "They organized a trade association to stop the plan. In the business press, they actually said, 'This idea might be efficient and rational. But it's too dangerous.' Some of the larger employers stepped forward and said, 'We'll offer you a company pension. Forget about that whole other idea.' They took on the costs of setting up an individual company pension, at great expense, in order to head off what they saw as too much organized power for workers in the region."

  A year later, the same issue came up in Detroit. The president of General Motors at the time was Charles E. Wilson, known as Engine Charlie. Wilson was one of the highest-paid corporate executives in America, earning $586,100 (and paying, incidentally, $430,350 in taxes). He was in contract talks with Walter Reuther, the national president of the U.A.W. The two men had already agreed on a cost-of-living allowance. Now Wilson went one step further, and, for the first time, offered every G.M. employee health-care benefits and a pension.

  Reuther had his doubts. He lived in a northwest Detroit bungalow, and drove a 1940 Chevrolet. His salary was ten thousand dollars a year. He was the son of a Debsian Socialist, worked for the Socialist Party during his college days, and went to the Soviet Union in the nineteen-thirties to teach peasants how to be auto machinists. His inclination was to fight for changes that benefitted every worker, not just those lucky enough to be employed by General Motors. In the nineteen-thirties, unions had launched a number of health-care plans, many of which cut across individual company and industry lines. In the nineteen-forties, they argued for expanding Social Security. In 1945, when President Truman first proposed national health insurance, they cheered. In 1947, when Ford offered its workers a pension, the union voted it down. The labor movement believed that the safest and most efficient way to provide insurance against ill health or old age was to spread the costs and risks of benefits over the biggest and most diverse group possible. Walter Reuther, as Nelson Lichtenstein argues in his definitive biography, believed that risk ought to be broadly collectivized. Charlie Wilson, on the other hand, felt the way the business leaders of Toledo did: that collectivization was a threat to the free market and to the autonomy of business owners. In his view, companies themselves ought to assume the risks of providing insurance.

  America's private pension system is now in crisis. Over the past few years, American taxpayers have been put at risk of assuming tens of billions of dollars of pension liabilities from once profitable companies. Hundreds of thousands of retired steelworkers and airline employees have seen health-care benefits that were promised to them by their employers vanish. General Motors, the country's largest automaker, is between forty and fifty billion dollars behind in the money it needs to fulfill its health-care and pension promises. This crisis is sometimes portrayed as the result of corporate America's excessive generosity in making promises to its workers. But when it comes to retirement, health, disability, and unemployment benefits there is nothing exceptional about the United States: it is average among industrialized countries--more generous than Australia, Canada, Ireland, and Italy, just behind Finland and the United Kingdom, and on a par with the Netherlands and Denmark. The difference is that in most countries the government, or large groups of companies, provides pensions and health insurance. The United States, by contrast, has over the past fifty years followed the lead of Charlie Wilson and the bosses of Toledo and made individual companies responsible for the care of their retirees. It is this fact, as much as any other, that explains the current crisis. In 1950, Charlie Wilson was wrong, and Walter Reuther was right.

  2.

  The key to understanding the pension business is something called the "dependency ratio," and dependency ratios are best understood in the context of countries. In the past two decades, for instance, Ireland has gone from being one of the most economically backward countries in Western Europe to being one of the strongest: its growth rate has been roughly double that of the rest of Europe. There is no shortage of conventional explanations. Ireland joined the European Union. It opened up its markets. It invested well in education and economic infrastructure. It's a politically stable country with a sophisticated, mobile workforce.

  But, as the Harvard economists David Bloom and David Canning suggest in their study of the "Celtic Tiger," of greater importance may have been a singular demographic fact. In 1979, restrictions on contraception that had been in place since Ireland's founding were lifted, and the birth rate began to fall. In 1970, the average Irishwoman had 3.9 children. By the mid-nineteen-nineties, that number was less than two. As a result, when the Irish children born in the nineteen-sixties hit the workforce, there weren't a lot of children in the generation just behind them. Ireland was suddenly free of the enormous social cost of supporting and educating and caring for a large dependent population. It was like a family of four in which, all of a sudden, the elder child is old enough to take care of her little brother and the mother can rejoin the workforce. Overnight, that family doubles its number of breadwinners and becomes much better off.

  This relation between the number of people who aren't of working age and the number of people who are is captured in the dependency ratio. In Ireland during the sixties, when contraception was illegal, there were ten people who were too old or too young to work for every fourteen people in a position to earn a paycheck. That meant that the country was spending a large percentage of its resources on caring for the young and the old. Last year, Ireland's dependency ratio hit an all-time low: for every ten dependents, it had twenty-two people of working age. That change coincides precisely with the country's extraordinary economic surge.
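  The arithmetic behind the ratio is simple enough to make explicit. Here is a minimal sketch, in Python, that computes dependents per person of working age using only the rounded Irish figures quoted above; the function name and the rounding are illustrative, not drawn from the demographers' data.

# Illustrative arithmetic only: a dependency ratio is the number of people too old or
# too young to work for every person of working age. The figures below are the rounded
# Irish numbers quoted in the paragraph above, not census data.

def dependency_ratio(dependents, working_age):
    """Return dependents per one person of working age."""
    return dependents / working_age

ireland_1960s = dependency_ratio(10, 14)   # contraception still illegal
ireland_recent = dependency_ratio(10, 22)  # the all-time low described above

print(f"Ireland in the sixties: {ireland_1960s:.2f} dependents per worker")
print(f"Ireland at its low:     {ireland_recent:.2f} dependents per worker")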

  Demographers estimate that declines in dependency ratios are responsible for about a third of the East Asian economic miracle of the postwar era; this is a part of the world that, in the course of twenty-five years, saw its dependency ratio decline thirty-five per cent. Dependency ratios may also help answer the much-debated question of whether India or China has a brighter economic future. Right now, China is in the midst of what Joseph Chamie, the former director of the United Nations' population division, calls the "sweet spot." In the nineteen-sixties, China brought down its birth rate dramatically; those children are now grown up and in the workforce, and there is no similarly sized class of dependents behind them. India, on the other hand, reduced its birth rate much more slowly and has yet to hit the sweet spot. Its best years are ahead.

  The logic of dependency ratios, of course, works equally powerfully in reverse. If your economy benefits by having a big bulge of working-age people, then your economy will have a harder time of it when that bulge generation retires, and there are relatively few workers to take their place. For China, the next few decades will be more difficult. "China will peak with a 1-to-2.6 dependency ratio between 2010 and 2015," Bloom says. "But then it's back to a little over 1-to-1.5 by 2050. That's a pretty dramatic change. Thirty per cent of the Chinese population will be over sixty by 2050. That's four hundred and thirty-two million people." Demographers sometimes say that China is in a race to get rich before it gets old.

  Economists have long paid attention to population growth, making the argument that the number of people in a country is either a good thing (spurring innovation) or a bad thing (depleting scarce resources). But an analysis of dependency ratios tells us that what's critical is not just the growth of a population but its structure. "The introduction of demographics has reduced the need for the argument that there was something exceptional about East Asia or idiosyncratic to Africa," Bloom and Canning write, in their study of the Irish economic miracle. "Once age-structure dynamics are introduced into an economic growth model, these regions are much closer to obeying common principles of economic growth."

  This is an important point. People have talked endlessly of Africa's political and social and economic shortcomings and simultaneously of some magical cultural ingredient possessed by South Korea and Japan and Taiwan that has brought them success. But the truth is that sub-Saharan Africa has been mired in a debilitating 1-to-1 ratio for decades, and that proportion of dependency would frustrate and complicate economic development anywhere. Asia, meanwhile, has seen its demographic load lighten overwhelmingly in the past thirty years. Getting to a 1-to-2.5 ratio doesn't make economic success inevitable. But, given a reasonably functional economic and political infrastructure, it certainly makes it a lot easier.

  This demographic logic also applies to companies, since any employer that offers pensions and benefits to its employees has to deal with the consequences of its nonworker-to-worker ratio, just as a country does. An employer that promised, back in the nineteen-fifties, to pay for its employees' health care when they were retired didn't set aside the money for that while they were working. It just paid the bills as they came in: money generated by current workers was used to pay for the costs of taking care of past workers. Pensions worked roughly the same way. On the day a company set up a pension plan, it was immediately on the hook for all the years of service accumulated by employees up to that point: the worker who was sixty-four when the pension was started got a pension when he retired at sixty-five, even though he had been in the system only a year. That debt is called a "past service" obligation, and in some cases in the nineteen-forties and fifties the past-service obligations facing employers were huge. At Ford, the amount reportedly came to two hundred million dollars, or just under three thousand dollars per employee. At Bethlehem Steel, it came to four thousand dollars per worker.

  Companies were required to put aside a little extra money every year to make up for that debt, with the hope of someday--twenty or thirty years down the line--becoming fully funded. In practice, though, that was difficult. Suppose that a company agrees to give its workers a pension of fifty dollars a month for every year of service. Several years later, after a round of contract negotiations, that multiple is raised to sixty dollars a month. That increase applies retroactively: now that company has a brand-new past-service obligation equal to another ten dollars a month for every year served by its wage employees. Or suppose the stock market goes into decline or interest rates fall, and the company discovers that its pension plan has less money than it had expected. Now it's behind again: it has to go back to using the money generated by current workers in order to take care of the costs of past workers. "You start off in the hole," says Steven Sass, a pension expert at Boston College. "And the problem in these plans is that it's very difficult to dig your way out."
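  The retroactive math is simple but unforgiving, and a minimal sketch makes it concrete. The figures and the function below are illustrative assumptions that mirror the fifty-to-sixty-dollar example in the paragraph above, not the terms of any actual plan.

# Illustrative only: how a retroactive benefit increase creates a new past-service obligation.
# Assumed formula: a retiree gets a fixed number of dollars a month for every year of service.

def monthly_pension(years_of_service, dollars_per_year_of_service):
    """Monthly benefit owed to one retiree under a dollars-per-year-of-service formula."""
    return years_of_service * dollars_per_year_of_service

years = 30                           # a hypothetical thirty-year employee
before = monthly_pension(years, 50)  # $1,500 a month under the old contract
after = monthly_pension(years, 60)   # $1,800 a month after the retroactive raise

print(f"New past-service obligation: ${after - before:.0f} a month, per thirty-year retiree")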

  Charlie Wilson's promise to his workers, then, contained an audacious assumption about G.M.'s dependency ratio: that the company would always have enough active workers to cover the costs of its retired workers--that it would always be like Ireland, and never like sub-Saharan Africa. Wilson's promise, in other words, was actually a gamble. Is it any wonder that the prospect of private pensions made people like Walter Reuther so nervous?

  The most influential management theorist of the twentieth century was Peter Drucker, who, in 1950, wrote an extraordinarily prescient article for Harper's entitled "The Mirage of Pensions." It ought to be reprinted for every steelworker, airline mechanic, and autoworker who is worried about his retirement. Drucker simply couldn't see how the pension plans on the table at companies like G.M. could ever work. "For such a plan to give real security, the financial strength of the company and its economic success must be reasonably secure for the next forty years," Drucker wrote. "But is there any one company or any one industry whose future can be predicted with certainty for even ten years ahead?" He concluded, "The recent pension plans thus offer no more security against the big bad wolf of old age than the little piggy's house of straw."

  3.

  In the mid-nineteen-fifties, the largest steel mill in the world was at Sparrows Point, just east of Baltimore, on the Chesapeake Bay. It was owned by Bethlehem Steel, one of the nation's grandest industrial enterprises. The steel for the Golden Gate Bridge came from Sparrows Point, as did the cables for the George Washington Bridge, and the materials for countless guns and planes and ships that helped win both world wars. Sparrows Point, a so-called integrated mill, used a method of making steel that dated back to the nineteenth century. Coke and iron, the raw materials, were combined in a blast furnace to make liquid pig iron. The pig iron was poured into a vast oven, known as an open-hearth furnace, to make molten steel. The steel was poured into pots to make ingots. The ingots were cooled, reheated, and fed into a half-mile-long rolling mill and turned into semi-finished shapes, which eventually became girders for the construction industry or wafer-thin sheets for beer cans or galvanized panels for the automobile industry. Open-hearth steelmaking was expensive and time-consuming. It required great amounts of energy, water, and space. Sparrows Point stretched four miles from one end to the other. Most important, it required lots and lots of people. Sparrows Point, at its height, employed tens of thousands of them. As Mark Reutter demonstrates in "Making Steel," his comprehensive history of Sparrows Point, it was not just a steel mill. It was a city.

  In 1956, Eugene Grace, the head of Bethlehem Steel, was the country's best-paid executive. Eleven of the country's eighteen top-earning executives that year, in fact, worked for Bethlehem Steel. In 1955, when the American Iron and Steel Institute had its annual meeting, at the Waldorf-Astoria, in New York, the No. 2 at Bethlehem Steel, Arthur Homer, made a bold forecast: domestic demand for steel, he said, would increase by fifty per cent over the next fifteen years. "As someone has said, the American people are wanters," he told the audience of twelve hundred industry executives. "Their wants are going to require a great deal of steel."

  But Big Steel didn't get bigger. It got smaller. Imports began to take a larger and larger share of the American steel market. The growing use of aluminum, concrete, and plastic cut deeply into the demand for steel. And the steelmaking process changed. Instead of laboriously making steel from scratch, with coke and iron ore, factories increasingly just melted down scrap metal. The open-hearth furnace was replaced with the basic oxygen furnace, which could make the same amount of steel in about a tenth of the time. Steelmakers switched to continuous casting, which meant that you skipped the ingot phase altogether and poured your steel products directly out of the furnace. As a result, steelmakers like Bethlehem were no longer hiring young workers to replace the people who retired. They were laying people off by the thousands. But every time they laid off another employee they turned a money-making steelworker into a money-losing retiree--and their dependency ratio got a little worse. According to Reutter, Bethlehem had a hundred and sixty-four thousand workers in 1957. By the mid-to-late-nineteen-eighties, it was down to thirty-five thousand workers, and employment at Sparrows Point had fallen to seventy-nine hundred. In 2001, Bethlehem, just shy of its hundredth birthday, declared bankruptcy. It had twelve thousand active employees and ninety thousand retirees and their spouses drawing benefits. It had reached what might be a record-setting dependency ratio of 7.5 pensioners for every worker.

  What happened to Bethlehem, of course, is what happened throughout American industry in the postwar period. Technology led to great advances in productivity, so that when the bulge of workers hired in the middle of the century retired and began drawing pensions, there was no one replacing them in the workforce. General Motors today makes more cars and trucks than it did in the early nineteen-sixties, but it does so with about a third of the employees. In 1962, G.M. had four hundred and sixty-four thousand U.S. employees and was paying benefits to forty thousand retirees and their spouses, for a dependency ratio of one pensioner to 11.6 employees. Last year, it had a hundred and forty-one thousand workers and paid benefits to four hundred and fifty-three thousand retirees, for a dependency ratio of 3.2 to 1.

  Looking at General Motors and the old-line steel companies in demographic terms substantially changes the way we understand their problems. It is a commonplace assumption, for instance, that they were undone by overly generous union contracts. But, when dependency ratios start getting up into the 3-to-1 to 7-to-1 range, the issue is not so much what you are paying each dependent as how many dependents you are paying. "There is this notion that there is a Cadillac being provided to all these retirees," Ron Bloom, a senior official at the United Steelworkers, says. "It's not true. The truth is seventy-five-year-old widows living on less than three hundred dollars to four hundred dollars a month. It's just that there's a lot of them."

  A second common assumption is that fading industrial giants like G.M. and Bethlehem are victims of their own managerial incompetence. In various ways, they undoubtedly are. But, with respect to the staggering burden of benefit obligations, what got them in trouble isn't what they did wrong; it is what they did right. They got in trouble in the nineteen-nineties because they were around in the nineteen-fifties--and survived to pay for the retirement of the workers they hired forty years ago. They got in trouble because they innovated, and became more efficient in their use of labor.

  "We are making as much steel as we made thirty years ago with twenty-five per cent of the workforce," Michael Locker, a steel-industry consultant, says. "And it is a much higher quality of steel, too. There is simply no comparison. That change recasts the industry and it recasts the workforce. You get this enormous bulge. It's abnormal. It's not predicted, and it's not funded. Is that the fault of the steelworkers? Is that the fault of the companies?"

  Here, surely, is the absurdity of a system in which individual employers are responsible for providing their own employee benefits. It penalizes companies for doing what they ought to do. General Motors, by American standards, has an old workforce: its average worker is much older than, say, the average worker at Google. That has an immediate effect: health-care costs are a linear function of age. The average cost of health insurance for an employee between the ages of thirty-five and thirty-nine is $3,759 a year, and for someone between the ages of sixty and sixty-four it is $7,622. This goes a long way toward explaining why G.M. has an estimated sixty-two billion dollars in health-care liabilities. The current arrangement discourages employers from hiring or retaining older workers. But don't we want companies to retain older workers--to hire on the basis of ability and not age? In fact, a system in which companies shoulder their own benefits is ultimately a system that penalizes companies for offering any benefits at all. Many employers have simply decided to let their workers fend for themselves. Given what has so publicly and disastrously happened to companies like General Motors, can you blame them?

  Or consider the continuous round of discounts and rebates that General Motors--a company that lost $8.6 billion last year--has been offering to customers. If you bought a Chevy Tahoe this summer, G.M. would give you zero-per-cent financing, or six thousand dollars cash back. Surely, if you are losing money on every car you sell, as G.M. is, cutting car prices still further in order to boost sales doesn't make any sense. It's like the old Borscht Belt joke about the haberdasher who lost money on every hat he made but figured he'd make up the difference on volume. The economically rational thing for G.M. to do would be to restructure, and sell fewer cars at a higher profit margin--and that's what G.M. tried to do this summer, announcing plans to shutter plants and buy out the contracts of thirty-five thousand workers. But buyouts, which turn active workers into pensioners, only worsen the company's dependency ratio. Last year, G.M. covered the costs of its four hundred and fifty-three thousand retirees and their dependents with the revenue from 4.5 million cars and trucks. How is G.M. better off covering the costs of four hundred and eighty-eight thousand dependents with the revenue from, say, 4.2 million cars and trucks? This is the impossible predicament facing the company's C.E.O., Rick Wagoner. Demographic logic requires him to sell more cars and hire more workers; financial logic requires him to sell fewer cars and hire fewer workers.

  Under the circumstances, one of the great mysteries of contemporary American politics is why Wagoner isn't the nation's leading proponent of universal health care and expanded social welfare. That's the only way out of G.M.'s dilemma. But, from Wagoner's reticence on the issue, you'd think that it was still 1950, or that Wagoner believes he's the Prime Minister of Ireland. "One thing I've learned is that corporate America has got much more class solidarity than we do--meaning union people," the U.S.W.'s Ron Bloom says. "They really are afraid of getting thrown out of their country clubs, even though their objective ought to be maximizing value for their shareholders."

  David Bloom, the Harvard economist, once did a calculation in which he combined the dependency ratios of Africa and Western Europe. He found that they fit together almost perfectly; that is, Africa has plenty of young people and not a lot of older people and Western Europe has plenty of old people and not a lot of young people, and if you combine the two you have an even distribution of old and young. "It makes you think that if there is more international migration, that could smooth things out," Bloom said.

  Of course, you can't take the populations of different countries and different cultures and simply merge them, no matter how much demographic sense that might make. But you can do that with companies within an economy. If the retiree obligations of Bethlehem Steel had been pooled with those of the much younger industries that supplanted steel--aluminum, say, or plastic--Bethlehem Steel might have made it. If you combined the obligations of G.M., with its four hundred and fifty-three thousand retirees, and the American manufacturing operations of Toyota, with a mere two hundred and fifty-eight retirees, Toyota could help G.M. shoulder its burden, and thirty or forty years from now--when those G.M. retirees are dead and Toyota's now youthful workforce has turned gray--G.M. could return the favor. For that matter, if you pooled the obligations of every employer in the country, no company would go bankrupt just because it happened to employ older people, or it happened to have been around for a while, or it happened to have made the transformation from open-hearth furnaces and ingot-making to basic oxygen furnaces and continuous casting. This is what Walter Reuther and the other union heads understood more than fifty years ago: that in the free-market system it makes little sense for the burdens of insurance to be borne by one company. If the risks of providing for health care and old-age pensions are shared by all of us, then companies can succeed or fail based on what they do and not on the number of their retirees.

  4.

  When Bethlehem Steel filed for bankruptcy, it owed about four billion dollars to its pension plan, and had another three billion dollars in unmet health-care obligations. Two years later, in 2003, the pension fund was terminated and handed over to the federal government's Pension Benefit Guaranty Corporation. The assets of the company--Sparrows Point and a handful of other steel mills in the Midwest--were sold to the New York-based investor Wilbur Ross.

  Ross acted quickly. He set up a small trust fund to help defray Bethlehem's unmet retiree health-care costs, cut a deal with the union to streamline work rules, put in place a new 401(k) savings plan--and then started over. The new Bethlehem Steel had a dependency ratio of 0 to 1. Within about six months, it was profitable. The main problem with the American steel business wasn't the steel business, Ross showed. It was all the things that had nothing to do with the steel business.

  Not long ago, Ross sat in his sparse midtown office and explained what he had learned from his rescue of Bethlehem. Ross is in his sixties, a Yale- and Harvard-educated patrician with small rectangular glasses and impeccable manners. Outside his office, by the elevator, was a large sculpture of a bull, papered over from head to hoof with stock tables.

  "When we showed up to the Bethlehem board to approve the deal, they had an army of people there," Ross said. "The whole board was there, the whole senior management was there, people from Credit Suisse and Greenhill were there. They must have had about fifty or sixty people there for a deal that was already done. So my partner and I--just the two of us--show up, and they say, 'Well, we should wait for the rest of your team.' And we said, 'There is no rest of the team, there is just the two of us.' It said the whole thing right there."

  Ross isn't a fan of old-style pensions, because they make it impossible to run a company efficiently. "When a company gets in trouble and restructures," he said, those underfunded pension funds "will eat it alive." And how much sense does employer-provided health insurance make? Bethlehem made promises to its employees, years ago, to give them medical insurance in exchange for their labor, and when the company ran into trouble those promises simply evaporated. "Every country against which we compete has universal health care," he said. "That means we probably face a fifteen-per-cent cost disadvantage versus foreigners for no other reason than historical accident. . . . The randomness of our system is just not going to work."

  When it comes to athletic prowess, don't believe your eyes.

  1.

  The first player picked in the 1996 National Basketball Association draft was a slender, six-foot guard from Georgetown University named Allen Iverson. Iverson was thrilling. He was lightning quick, and could stop and start on a dime. He would charge toward the basket, twist and turn and writhe through the arms and legs of much taller and heavier men, and somehow find a way to score. In his first season with the Philadelphia 76ers, Iverson was voted the N.B.A.'s Rookie of the Year. In every year since 2000, he has been named to the N.B.A.'s All-Star team. In the 2000-01 season, he finished first in the league in scoring and steals, led his team to the second-best record in the league, and was named, by the country's sportswriters and broadcasters, basketball's Most Valuable Player. He is currently in the midst of a four-year, seventy-seven-million-dollar contract. Almost everyone who knows basketball and who watches Iverson play thinks that he's one of the best players in the game.

  But how do we know that we're watching a great player? That's an easier question to answer when it comes to, say, golf or tennis, where players compete against one another, under similar circumstances, week after week. Nobody would dispute that Roger Federer is the world's best tennis player. Baseball is a little more complicated, since it's a team sport. Still, because the game consists of a sequence of discrete, ritualized encounters between pitcher and hitter, it lends itself to statistical rankings and analysis. Most tasks that professionals perform, though, are surprisingly hard to evaluate. Suppose that we wanted to measure something in the real world, like the relative skill of New York City's heart surgeons. One obvious way would be to compare the mortality rates of the patients on whom they operate--except that substandard care isn't necessarily fatal, so a more accurate measure might be how quickly patients get better or how few complications they have after surgery. But recovery time is a function as well of how a patient is treated in the intensive-care unit, which reflects the capabilities not just of the doctor but of the nurses in the I.C.U. So now we have to adjust for nurse quality in our assessment of surgeon quality. We'd also better adjust for how sick the patients were in the first place, and since well-regarded surgeons often treat the most difficult cases, the best surgeons might well have the poorest patient recovery rates. In order to measure something you thought was fairly straightforward, you really have to take into account a series of things that aren't so straightforward.

  Basketball presents many of the same kinds of problems. The fact that Allen Iverson has been one of the league's most prolific scorers over the past decade, for instance, could mean that he is a brilliant player. It could mean that he's selfish and takes shots rather than passing the ball to his teammates. It could mean that he plays for a team that races up and down the court and plays so quickly that he has the opportunity to take many more shots than he would on a team that plays more deliberately. Or he might be the equivalent of an average surgeon with a first-rate I.C.U.: maybe his success reflects the fact that everyone else on his team excels at getting rebounds and forcing the other team to turn over the ball. Nor does the number of points that Iverson scores tell us anything about his tendency to do other things that contribute to winning and losing games; it doesn't tell us how often he makes a mistake and loses the ball to the other team, or commits a foul, or blocks a shot, or rebounds the ball. Figuring whether one basketball player is better than another is a challenge similar to figuring out whether one heart surgeon is better than another: you have to find a way to interpret someone's individual statistics in the context of the team that they're on and the task that they are performing.

  In "The Wages of Wins" (Stanford; $29.95), the economists David J. Berri, Martin B. Schmidt, and Stacey L. Brook set out to solve the Iverson problem. Weighing the relative value of fouls, rebounds, shots taken, turnovers, and the like, they've created an algorithm that, they argue, comes closer than any previous statistical measure to capturing the true value of a basketball player. The algorithm yields what they call a Win Score, because it expresses a player's worth as the number of wins that his contributions bring to his team. According to their analysis, Iverson's finest season was in 2004-05, when he was worth ten wins, which made him the thirty-sixth-best player in the league. In the season in which he won the Most Valuable Player award, he was the ninety-first-best player in the league. In his worst season (2003-04), he was the two-hundred-and-twenty-seventh-best player in the league. On average, for his career, he has ranked a hundred and sixteenth. In some years, Iverson has not even been the best player on his own team. Looking at the findings that Berri, Schmidt, and Brook present is enough to make one wonder what exactly basketball experts--coaches, managers, sportswriters--know about basketball.

  2.

  Basketball experts clearly appreciate basketball. They understand the gestalt of the game, in the way that someone who has spent a lifetime thinking about and watching, say, modern dance develops an understanding of that art form. They're able to teach and coach and motivate; to make judgments and predictions about a player's character and resolve and stage of development. But the argument of "The Wages of Wins" is that this kind of expertise has real limitations when it comes to making precise evaluations of individual performance, whether you're interested in the consistency of football quarterbacks or in testing claims that N.B.A. stars "turn it on" during playoffs. The baseball legend Ty Cobb, the authors point out, had a lifetime batting average of .366, almost thirty points higher than the former San Diego Padres outfielder Tony Gwynn, who had a lifetime batting average of .338:

So Cobb hit safely 37 percent of the time while Gwynn hit safely on 34 percent of his at bats. If all you did was watch these players, could you say who was a better hitter? Can one really tell the difference between 37 percent and 34 percent just staring at the players play? To see the problem with the non-numbers approach to player evaluation, consider that out of every 100 at bats, Cobb got three more hits than Gwynn. That's it, three hits.

  Michael Lewis made a similar argument in his 2003 best-seller, "Moneyball," about how the so-called sabermetricians have changed the evaluation of talent in baseball. Baseball is sufficiently transparent, though, that the size of the discrepancies between intuitive and statistically aided judgment tends to be relatively modest. If you mistakenly thought that Gwynn was better than Cobb, you were still backing a terrific hitter. But "The Wages of Wins" suggests that when you move into more complex situations, like basketball, the limitations of "seeing" become enormous. Jermaine O'Neal, a center for the Indiana Pacers, finished third in the Most Valuable Player voting in 2004. His Win Score that year put him forty-fourth in the league. In 2004-05, the forward Antoine Walker made as much money as the point guard Jason Kidd, even though Walker produced 0.6 wins for Atlanta and Boston and Kidd produced nearly twenty wins for New Jersey. The Win Score algorithm suggests that Ray Allen has had nearly as good a career as Kobe Bryant, whom many consider the top player in the game, and that the journeyman forward Jerome Williams was actually among the strongest players of his generation.

  Most egregious is the story of a young guard for the Chicago Bulls named Ben Gordon. Last season, Gordon finished second in the Rookie of the Year voting and was named the league's top "sixth man"--that is, the best non-starter--because he averaged an impressive 15.1 points per game in limited playing time. But Gordon rebounds less than he should, turns over the ball frequently, and makes such a low percentage of his shots that, of the league's top thirty-three scorers--that is, players who score at least one point for every two minutes on the floor--Gordon's Win Score ranked him dead last.

  The problem for basketball experts is that, in a situation with many variables, it's difficult to know how much weight to assign to each variable. Buying a house is agonizing because we look at the size, the location, the back yard, the proximity to local schools, the price, and so on, and we're unsure which of those things matters most. Assessing heart-attack risk is a notoriously difficult task for similar reasons. A doctor can analyze a dozen different factors. But how much weight should be given to a patient's cholesterol level relative to his blood pressure? In the face of such complexity, people construct their own arbitrary algorithms--they assume that every factor is of equal importance, or randomly elevate one or two factors for the sake of simplifying matters--and we make mistakes because those arbitrary algorithms are, well, arbitrary.

  Berri, Schmidt, and Brook argue that the arbitrary algorithms of basketball experts elevate the number of points a player scores above all other considerations. In one clever piece of research, they analyze the relationship between the statistics of rookies and the number of votes they receive in the All-Rookie Team balloting. If a rookie increases his scoring by ten per cent--regardless of how efficiently he scores those points--the number of votes he'll get will increase by twenty-three per cent. If he increases his rebounds by ten per cent, the number of votes he'll get will increase by six per cent. Every other factor, like turnovers, steals, assists, blocked shots, and personal fouls--factors that can have a significant influence on the outcome of a game--seemed to bear no statistical relationship to judgments of merit at all. It's not even the case that high scorers help their team by drawing more fans. As the authors point out, that's only true on the road. At home, attendance is primarily a function of games won. Basketball's decision-makers, it seems, are simply irrational.

  What is the difference between choking and panicking? Why are there dozens of varieties of mustard--but only one variety of ketchup? What do football players teach us about how to hire teachers? What does hair dye tell us about the history of the twentieth century?

  Here is the bittersweet tale of the inventor of the birth control pill, and the dazzling inventions of the pasta sauce pioneer Howard Moscowitz. Gladwell sits with Ron Popeil, the king of the American kitchen, as he sells rotisserie ovens, and divines the secrets of Cesar Millan, the "dog whisperer" who can calm savage animals with the touch of his hand. He explores intelligence tests and ethnic profiling and "hindsight bias" and why it was that everyone in Silicon Valley once tripped over themselves to hire the same college graduate.

  "Good writing," Gladwell says in his preface, "does not succeed or fail on the strength of its ability to persuade. It succeeds or fails on the strength of its ability to engage you, to make you think, to give you a glimpse into someone else's head." What the Dog Saw is yet another example of the buoyant spirit and unflagging curiosity that have made Malcolm Gladwell our most brilliant investigator of the hidden extraordinary.

  A sociologist offers an anatomy of explanations.

  1.

  Little Timothy is playing with his older brother Geoffrey, when he comes running to his mother.

  "Mommy, Mommy, " he starts in. "I was playing with my truck, and then Geoffrey came and he said it was his turn to play with the truck even though it's my truck and then he pushed me."

  "Timothy!" his mother says, silencing him. "Don't be a tattletale."

  Timothy has heard that phrase--"Don't be a tattletale"--countless times, and it always stops him short. He has offered his mother an eyewitness account of a crime. His mother, furthermore, in no way disputes the truth of his story. Yet what does she do? She rejects it in favor of a simplistic social formula: Don't be a tattletale. It makes no sense. Timothy's mother would never use such a formula to trump a story if she were talking to his father. On the contrary, his mother and father tattle to each other about Geoffrey all the time. And, if Timothy were to tattle on Geoffrey to his best friend, Bruce, Bruce wouldn't reject the story in favor of a formula, either. Narratives are the basis of Timothy's friendship with Bruce. They explain not just effects but causes. They matter--except in this instance, of a story told by Timothy to Mommy about Geoffrey, in which Mommy is suddenly indifferent to stories altogether. What is this don't-be-a-tattletale business about?

  In "Why?" (Princeton; $24.95), the Columbia University scholar Charles Tilly sets out to make sense of our reasons for giving reasons. In the tradition of the legendary sociologist Erving Goffman, Tilly seeks to decode the structure of everyday social interaction, and the result is a book that forces readers to reëxamine everything from the way they talk to their children to the way they argue about politics.

  In Tilly's view, we rely on four general categories of reasons. The first is what he calls conventions--conventionally accepted explanations. Tilly would call "Don't be a tattletale" a convention. The second is stories, and what distinguishes a story ("I was playing with my truck, and then Geoffrey came in . . .") is a very specific account of cause and effect. Tilly cites the sociologist Francesca Polletta's interviews with people who were active in the civil-rights sit-ins of the nineteen-sixties. Polletta repeatedly heard stories that stressed the spontaneity of the protests, leaving out the role of civil-rights organizations, teachers, and churches. That's what stories do. As Tilly writes, they circumscribe time and space, limit the number of actors and actions, situate all causes "in the consciousness of the actors," and elevate the personal over the institutional.

  Then there are codes, which are high-level conventions, formulas that invoke sometimes recondite procedural rules and categories. If a loan officer turns you down for a mortgage, the reason he gives has to do with your inability to conform to a prescribed standard of creditworthiness. Finally, there are technical accounts: stories informed by specialized knowledge and authority. An academic history of civil-rights sit-ins wouldn't leave out the role of institutions, and it probably wouldn't focus on a few actors and actions; it would aim at giving patient and expert attention to every sort of nuance and detail.

  Tilly argues that we make two common errors when it comes to understanding reasons. The first is to assume that some kinds of reasons are always better than others--that there is a hierarchy of reasons, with conventions (the least sophisticated) at the bottom and technical accounts at the top. That's wrong, Tilly says: each type of reason has its own role.

  Tilly's second point flows from the first, and it's that the reasons people give aren't a function of their character--that is, there aren't people who always favor technical accounts and people who always favor stories. Rather, reasons arise out of situations and roles. Imagine, he says, the following possible responses to one person's knocking some books off the desk of another:

  1. Sorry, buddy. I'm just plain awkward.
  2. I'm sorry. I didn't see your book.
  3. Nuts! I did it again.
  4. Why did you put that book there?
  5. I told you to stack up your books neatly.

  The lesson is not that the kind of person who uses reason No. 1 or No. 2 is polite and the kind of person who uses reason No. 4 or No. 5 is a jerk. The point is that any of us might use any of those five reasons depending on our relation to the person whose books we knocked over. Reason-giving, Tilly says, reflects, establishes, repairs, and negotiates relationships. The husband who uses a story to explain his unhappiness to his wife--"Ever since I got my new job, I feel like I've just been so busy that I haven't had time for us"--is attempting to salvage the relationship. But when he wants out of the marriage, he'll say, "It's not you--it's me." He switches to a convention. As his wife realizes, it's not the content of what he has said that matters. It's his shift from the kind of reason-giving that signals commitment to the kind that signals disengagement. Marriages thrive on stories. They die on conventions.

  Consider the orgy of reason-giving that followed Vice-President Dick Cheney's quail-hunting accident involving his friend Harry Whittington. Allies of the Vice-President insisted that the media were making way too much of it. "Accidents happen," they said, relying on a convention. Cheney, in a subsequent interview, looked penitently into the camera and said, "The image of him falling is something I'll never be able to get out of my mind. I fired, and there's Harry falling. And it was, I'd have to say, one of the worst days of my life." Cheney told a story. Some of Cheney's critics, meanwhile, focussed on whether he conformed to legal and ethical standards. Did he have a valid license? Was he too slow to notify the White House? They were interested in codes. Then came the response of hunting experts. They retold the narrative of Cheney's accident, using their specialized knowledge of hunting procedure. The Cheney party had three guns, and on a quail shoot, some of them said, you should never have more than two. Why did Whittington retrieve the downed bird? A dog should have done that. Had Cheney's shotgun been aimed more than thirty degrees from the ground, as it should have been? And what were they doing in the bush at five-thirty in the afternoon, when the light isn't nearly good enough for safe hunting? The experts gave a technical account.

  Here are four kinds of reasons, all relational in nature. If you like Cheney and are eager to relieve him of responsibility, you want the disengagement offered by a convention. For a beleaguered P.R. agent, the first line of defense in any burgeoning scandal is, inevitably, There is no story here. When, in Cheney's case, this failed, the Vice-President had to convey his concern and regret while not admitting that he had done anything procedurally wrong. Only a story can accomplish that. Anything else--to shrug and say that accidents happen, for instance--would have been perceived as unpardonably callous. Cheney's critics, for their part, wanted the finality and precision of a code: he acted improperly. And hunting experts wanted to display their authority and educate the public about how to hunt safely, so they retold the story of Cheney's accident with the benefit of their specialized knowledge.

  Effective reason-giving, then, involves matching the kind of reason we give to the particular role that we happen to be playing at the time a reason is necessary. The fact that Timothy's mother accepts tattling from his father but rejects it from Timothy is not evidence of capriciousness; it just means that a husband's relationship to his wife gives him access to a reason-giving category that a son's role does not. The lesson "Don't be a tattletale"--which may well be one of the hardest childhood lessons to learn--is that in the adult world it is sometimes more important to be appropriate than it is to be truthful.

  2.

  Two years ago, a young man named Anthony mugged a woman named Anne on a London street. Anthony was caught and convicted, and a few days before he was sentenced he sat down with Anne for a face-to-face meeting, as an exercise in what is known as "restorative justice." The meeting was videotaped by a criminal-justice research group, and to watch the video is to get an even deeper sense of the usefulness of Tilly's thinking.

  "We're going to talk about what's happened," the policeman moderating the meeting begins. "Who's been affected, and how they've been affected, and see what we can do to make things better."

  Anthony starts. He has a shaved head, a tattoo on his neck, and multiple piercings in his eyebrows and ears. Beside him is his partner, Christy, holding their baby boy. "What happened is I had a bad week. Been out of work for a couple of weeks. Had my kneecap broken. . . . I only had my dad in this country, who I don't get on with. We had no gas in our flat. Me and Christy were arguing all that morning. The baby had been screaming. We were hungry." His story comes out painfully and haltingly. "It was a bit too much. All my friends I was asking to loan me a couple of pounds. They just couldn't afford to give it to me. . . . I don't know what got into me. I just reached over and took your bag. And I'm really sorry for it. And if there is anything I can do to make up for it, I'm willing to do it. I know you probably don't want me anywhere near you."

  Anne has been listening closely, her husband, Terry, next to her. Now she tells her side of the story. She heard a sound like male laughter. She turned, and felt her purse being pulled away. She saw a man pulling up his hood. She ran after him, feeling like a "complete idiot." In the struggle over her bag, her arm was injured. She is a journalist and has since had difficulty typing. "The mugging was very small," she says. "But the effect is not going away as fast as I expected. . . . It makes life one notch less bearable."

  It was Christy's turn. She got the call at home. She didn't know exactly what had happened. She took the baby and walked to the police station, angry and frightened. "We got ourselves in a situation where we were relying on the state, and we just can't live off the money," Christy says. "And that's not your problem." She starts to cry. "He's not a drug addict," she continues, looking at her husband. Anthony takes the baby from her and holds him. "If we go to court on Monday, and he does get three years for what he's done, or six years, that's his problem. He done it. And he's got to pay for what he's done. I wake up and hear him cry"--she looks at the baby--"and it kills me. I'm in a situation where I can't do anything to make this better. . . . I just want you to know. The first thing he said to me when he walked in was 'I apologized.' And I said, 'That makes what difference?' "

  Watching the conference is a strange experience, because it is utterly foreign to the criminal process of which it is ostensibly a part. There is none of the oppressive legalese of the courtroom. Nothing is "alleged"; there are no "perpetrators." The formal back-and-forth between questioner and answerer, the emotionally protective structure of courtroom procedure, is absent. Anne and Terry sit on comfortable chairs facing Christy and Anthony. They have a conversation, not a confrontation. They are telling stories, in Tilly's sense of that word: repairing their relationship by crafting a cause-and-effect account of what happened on the street.

  3.

  Why is such storytelling, in the wake of a crime, so important? Because, Tilly would argue, some social situations don't lend themselves to the easy reconciliation of reason and role. In Jonathan Franzen's novel "The Corrections," for example, one of the characters, Gary, is in the midst of a frosty conversation with his wife, Caroline. Gary had the sense, Franzen writes, "that Caroline was on the verge of accusing him of being 'depressed,' and he was afraid that if the idea that he was depressed gained currency, he would forfeit his right to his opinions. . . . Every word he spoke would become a symptom of disease; he would never again win an argument." Gary was afraid, in other words, that a technical account of his behavior--the explanation that he was clinically depressed--would trump his efforts to use the stories and conventions that permitted him to be human. But what was his wife to do? She wanted him to change.

  When we say that two parties in a conflict are "talking past each other," this is what we mean: that both sides have a legitimate attachment to mutually exclusive reasons. Proponents of abortion often rely on a convention (choice) and a technical account (concerning the viability of a fetus in the first trimester). Opponents of abortion turn the fate of each individual fetus into a story: a life created and then abruptly terminated. Is it any surprise that the issue has proved to be so intractable? If you believe that stories are the most appropriate form of reason-giving, then those who use conventions and technical accounts will seem morally indifferent--regardless of whether you agree with them. And, if you believe that a problem is best adjudicated through conventions or technical accounts, it is hard not to look upon storytellers as sensationalistic and intellectually unserious. By Tilly's logic, abortion proponents who want to engage their critics will have to become better storytellers--and that, according to the relational principles of such reason-giving, may require them to acknowledge an emotional connection between a mother and a fetus. (Ironically, many of the same members of the religious right who have so emphatically demonstrated the emotional superiority of stories when it comes to abortion insist, when it comes to Genesis, on a reading of the Bible as a technical account. Thus do creationists, in the service of reason-giving exigency, force the Holy Scripture to do double duty as a high-school biology textbook.)

  Tilly argues that these conflicts are endemic to the legal system. Laws are established in opposition to stories. In a criminal trial, we take a complicated narrative of cause and effect and match it to a simple, impersonal code: first-degree murder, or second-degree murder, or manslaughter. The impersonality of codes is what makes the law fair. But it is also what can make the legal system so painful for victims, who find no room for their voices and their anger and their experiences. Codes punish, but they cannot heal.

  So what do you do? You put Anne and her husband in a room with Anthony and Christy and their baby boy and you let them talk. In a series of such experiments, conducted in Britain and Australia by the criminologists Lawrence Sherman and Heather Strang, restorative-justice programs have shown encouraging results in reducing recidivism rates among offenders and psychological trauma among victims. If you view the tape of the Anthony-Anne exchange, it's not hard to see why. Sherman said that when the Lord Chief Justice of England and Wales watched it at home one night he wept.

  "If there is anything I can do, please say it," Anthony says.

  "I think most of what you can do is between the two of you, actually," Anne says to Anthony and Christy. "I think if you can put your lives back together again, then that's what needs to be done."

  The moderator tells them all to take a break and help themselves to "Metropolitan Police tea and coffee and chocolate biscuits."

  Anne asks Christy how old the baby is, and where they are living. It turns out that their apartment has been condemned. Terry stands up and offers the baby a chocolate biscuit, and the adults experience the kind of moment that adults have in the company of babies, where nothing matters except the child in front of them.

  Why problems like homelessness may be easier to solve than to manage.

  1.

  Murray Barr was a bear of a man, an ex-marine, six feet tall and heavyset, and when he fell down--which he did nearly every day--it could take two or three grown men to pick him up. He had straight black hair and olive skin. On the street, they called him Smokey. He was missing most of his teeth. He had a wonderful smile. People loved Murray.

  His chosen drink was vodka. Beer he called "horse piss." On the streets of downtown Reno, where he lived, he could buy a two-hundred-and-fifty-millilitre bottle of cheap vodka for a dollar-fifty. If he was flush, he could go for the seven-hundred-and-fifty-millilitre bottle, and if he was broke he could always do what many of the other homeless people of Reno did, which is to walk through the casinos and finish off the half-empty glasses of liquor left at the gaming tables.

  "If he was on a runner, we could pick him up several times a day," Patrick O'Bryan, who is a bicycle cop in downtown Reno, said. "And he's gone on some amazing runners. He would get picked up, get detoxed, then get back out a couple of hours later and start up again. A lot of the guys on the streets who've been drinking, they get so angry. They are so incredibly abrasive, so violent, so abusive. Murray was such a character and had such a great sense of humor that we somehow got past that. Even when he was abusive, we'd say, 'Murray, you know you love us,' and he'd say, 'I know--and go back to swearing at us."

  "I've been a police officer for fifteen years," O'Bryan's partner, Steve Johns, said. "I picked up Murray my whole career. Literally."

  Johns and O'Bryan pleaded with Murray to quit drinking. A few years ago, he was assigned to a treatment program in which he was under the equivalent of house arrest, and he thrived. He got a job and worked hard. But then the program ended. "Once he graduated out, he had no one to report to, and he needed that," O'Bryan said. "I don't know whether it was his military background. I suspect that it was. He was a good cook. One time, he accumulated savings of over six thousand dollars. Showed up for work religiously. Did everything he was supposed to do. They said, 'Congratulations,' and put him back on the street. He spent that six thousand in a week or so."

  Often, he was too intoxicated for the drunk tank at the jail, and he'd get sent to the emergency room at either Saint Mary's or Washoe Medical Center. Marla Johns, who was a social worker in the emergency room at Saint Mary's, saw him several times a week. "The ambulance would bring him in. We would sober him up, so he would be sober enough to go to jail. And we would call the police to pick him up. In fact, that's how I met my husband." Marla Johns is married to Steve Johns.

  "He was like the one constant in an environment that was ever changing," she went on. "In he would come. He would grin that half-toothless grin. He called me 'my angel.' I would walk in the room, and he would smile and say, 'Oh, my angel, I'm so happy to see you.' We would joke back and forth, and I would beg him to quit drinking and he would laugh it off. And when time went by and he didn't come in I would get worried and call the coroner's office. When he was sober, we would find out, oh, he's working someplace, and my husband and I would go and have dinner where he was working. When my husband and I were dating, and we were going to get married, he said, 'Can I come to the wedding?' And I almost felt like he should. My joke was 'If you are sober you can come, because I can't afford your bar bill.' When we started a family, he would lay a hand on my pregnant belly and bless the child. He really was this kind of light."

  In the fall of 2003, the Reno Police Department started an initiative designed to limit panhandling in the downtown core. There were articles in the newspapers, and the police department came under harsh criticism on local talk radio. The crackdown on panhandling amounted to harassment, the critics said. The homeless weren't an imposition on the city; they were just trying to get by. "One morning, I'm listening to one of the talk shows, and they're just trashing the police department and going on about how unfair it is," O'Bryan said. "And I thought, Wow, I've never seen any of these critics in one of the alleyways in the middle of the winter looking for bodies." O'Bryan was angry. In downtown Reno, food for the homeless was plentiful: there was a Gospel kitchen and Catholic Services, and even the local McDonald's fed the hungry. The panhandling was for liquor, and the liquor was anything but harmless. He and Johns spent at least half their time dealing with people like Murray; they were as much caseworkers as police officers. And they knew they weren't the only ones involved. When someone passed out on the street, there was a "One down" call to the paramedics. There were four people in an ambulance, and the patient sometimes stayed at the hospital for days, because living on the streets in a state of almost constant intoxication was a reliable way of getting sick. None of that, surely, could be cheap.

  O'Bryan and Johns called someone they knew at an ambulance service and then contacted the local hospitals. "We came up with three names that were some of our chronic inebriates in the downtown area, that got arrested the most often," O'Bryan said. "We tracked those three individuals through just one of our two hospitals. One of the guys had been in jail previously, so he'd only been on the streets for six months. In those six months, he had accumulated a bill of a hundred thousand dollars--and that's at the smaller of the two hospitals near downtown Reno. It's pretty reasonable to assume that the other hospital had an even larger bill. Another individual came from Portland and had been in Reno for three months. In those three months, he had accumulated a bill for sixty-five thousand dollars. The third individual actually had some periods of being sober, and had accumulated a bill of fifty thousand."

  The first of those people was Murray Barr, and Johns and O'Bryan realized that if you totted up all his hospital bills for the ten years that he had been on the streets--as well as substance-abuse-treatment costs, doctors' fees, and other expenses--Murray Barr probably ran up a medical bill as large as anyone in the state of Nevada.

  "It cost us one million dollars not to do something about Murray," O'Bryan said.

  2.

  Fifteen years ago, after the Rodney King beating, the Los Angeles Police Department was in crisis. It was accused of racial insensitivity and ill discipline and violence, and the assumption was that those problems had spread broadly throughout the rank and file. In the language of statisticians, it was thought that the L.A.P.D.'s troubles had a "normal" distribution--that if you graphed them the result would look like a bell curve, with a small number of officers at one end of the curve, a small number at the other end, and the bulk of the problem situated in the middle. The bell-curve assumption has become so much a part of our mental architecture that we tend to use it to organize experience automatically.

  But when the L.A.P.D. was investigated by a special commission headed by Warren Christopher, a very different picture emerged. Between 1986 and 1990, allegations of excessive force or improper tactics were made against eighteen hundred of the eighty-five hundred officers in the L.A.P.D. The broad middle had scarcely been accused of anything. Furthermore, more than fourteen hundred officers had only one or two allegations made against them--and bear in mind that these were not proven charges, that they happened in a four-year period, and that allegations of excessive force are an inevitable feature of urban police work. (The N.Y.P.D. receives about three thousand such complaints a year.) A hundred and eighty-three officers, however, had four or more complaints against them, forty-four officers had six or more complaints, sixteen had eight or more, and one had sixteen complaints. If you were to graph the troubles of the L.A.P.D., it wouldn't look like a bell curve. It would look more like a hockey stick. It would follow what statisticians call a "power law" distribution--where all the activity is not in the middle but at one extreme.
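
  The contrast between those two pictures can be made concrete with a small simulation. The sketch below is illustrative only--the complaint counts are synthetic, not the Christopher Commission's data--and it simply compares how much of the total trouble the worst forty-four officers would account for in a bell-curve world versus a heavy-tailed, power-law one.

```python
# Illustrative sketch only -- synthetic numbers, not the Christopher
# Commission's data. It contrasts a bell-curve world, where complaints
# are spread across the broad middle, with a power-law world, where a
# handful of officers account for a disproportionate share.
import random

random.seed(0)
OFFICERS = 8500  # roughly the size of the force discussed above

# Bell-curve assumption: everyone draws from the same modest average.
bell_curve = [max(0, round(random.gauss(2, 1))) for _ in range(OFFICERS)]

# Power-law assumption: a heavy-tailed Pareto draw.
power_law = [round(random.paretovariate(1.5)) - 1 for _ in range(OFFICERS)]

def share_of_worst(counts, worst_n=44):
    """Fraction of all complaints attributable to the worst `worst_n` officers."""
    ranked = sorted(counts, reverse=True)
    return sum(ranked[:worst_n]) / max(1, sum(counts))

print(f"Bell curve: worst 44 hold {share_of_worst(bell_curve):.1%} of complaints")
print(f"Power law:  worst 44 hold {share_of_worst(power_law):.1%} of complaints")
```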

  The Christopher Commission's report repeatedly comes back to what it describes as the extreme concentration of problematic officers. One officer had been the subject of thirteen allegations of excessive use of force, five other complaints, twenty-eight "use of force reports" (that is, documented, internal accounts of inappropriate behavior), and one shooting. Another had six excessive-force complaints, nineteen other complaints, ten use-of-force reports, and three shootings. A third had twenty-seven use-of-force reports, and a fourth had thirty-five. Another had a file full of complaints for doing things like "striking an arrestee on the back of the neck with the butt of a shotgun for no apparent reason while the arrestee was kneeling and handcuffed," beating up a thirteen-year-old juvenile, and throwing an arrestee from his chair and kicking him in the back and side of the head while he was handcuffed and lying on his stomach.

  The report gives the strong impression that if you fired those forty-four cops the L.A.P.D. would suddenly become a pretty well-functioning police department. But the report also suggests that the problem is tougher than it seems, because those forty-four bad cops were so bad that the institutional mechanisms in place to get rid of bad apples clearly weren't working. If you made the mistake of assuming that the department's troubles fell into a normal distribution, you'd propose solutions that would raise the performance of the middle--like better training or better hiring--when the middle didn't need help. For those hard-core few who did need help, meanwhile, the medicine that helped the middle wouldn't be nearly strong enough.

  In the nineteen-eighties, when homelessness first surfaced as a national issue, the assumption was that the problem fit a normal distribution: that the vast majority of the homeless were in the same state of semi-permanent distress. It was an assumption that bred despair: if there were so many homeless, with so many problems, what could be done to help them? Then, fifteen years ago, a young Boston College graduate student named Dennis Culhane lived in a shelter in Philadelphia for seven weeks as part of the research for his dissertation. A few months later he went back, and was surprised to discover that he couldn't find any of the people he had recently spent so much time with. "It made me realize that most of these people were getting on with their own lives," he said.

  Culhane then put together a database--the first of its kind--to track who was coming in and out of the shelter system. What he discovered profoundly changed the way homelessness is understood. Homelessness doesn't have a normal distribution, it turned out. It has a power-law distribution. "We found that eighty per cent of the homeless were in and out really quickly," he said. "In Philadelphia, the most common length of time that someone is homeless is one day. And the second most common length is two days. And they never come back. Anyone who ever has to stay in a shelter involuntarily knows that all you think about is how to make sure you never come back."

  The next ten per cent were what Culhane calls episodic users. They would come for three weeks at a time, and return periodically, particularly in the winter. They were quite young, and they were often heavy drug users. It was the last ten per cent--the group at the farthest edge of the curve--that interested Culhane the most. They were the chronically homeless, who lived in the shelters, sometimes for years at a time. They were older. Many were mentally ill or physically disabled, and when we think about homelessness as a social problem--the people sleeping on the sidewalk, aggressively panhandling, lying drunk in doorways, huddled on subway grates and under bridges--it's this group that we have in mind. In the early nineteen-nineties, Culhane's database suggested that New York City had a quarter of a million people who were homeless at some point in the previous half decade--which was a surprisingly high number. But only about twenty-five hundred were chronically homeless.

  It turns out, furthermore, that this group costs the health-care and social-services systems far more than anyone had ever anticipated. Culhane estimates that in New York at least sixty-two million dollars was being spent annually to shelter just those twenty-five hundred hard-core homeless. "It costs twenty-four thousand dollars a year for one of these shelter beds," Culhane said. "We're talking about a cot eighteen inches away from the next cot." Boston Health Care for the Homeless Program, a leading service group for the homeless in Boston, recently tracked the medical expenses of a hundred and nineteen chronically homeless people. In the course of five years, thirty-three people died and seven more were sent to nursing homes, and the group still accounted for 18,834 emergency-room visits--at a minimum cost of a thousand dollars a visit. The University of California, San Diego Medical Center followed fifteen chronically homeless inebriates and found that over eighteen months those fifteen people were treated at the hospital's emergency room four hundred and seventeen times, and ran up bills that averaged a hundred thousand dollars each. One person--San Diego's counterpart to Murray Barr--came to the emergency room eighty-seven times.

  "If it's a medical admission, it's likely to be the guys with the really complex pneumonia," James Dunford, the city of San Diego's emergency medical director and the author of the observational study, said. "They are drunk and they aspirate and get vomit in their lungs and develop a lung abscess, and they get hypothermia on top of that, because they're out in the rain. They end up in the intensive-care unit with these very complicated medical infections. These are the guys who typically get hit by cars and buses and trucks. They often have a neurosurgical catastrophe as well. So they are very prone to just falling down and cracking their head and getting a subdural hematoma, which, if not drained, could kill them, and it's the guy who falls down and hits his head who ends up costing you at least fifty thousand dollars. Meanwhile, they are going through alcoholic withdrawal and have devastating liver disease that only adds to their inability to fight infections. There is no end to the issues. We do this huge drill. We run up big lab fees, and the nurses want to quit, because they see the same guys come in over and over, and all we're doing is making them capable of walking down the block."

  The homelessness problem is like the L.A.P.D.'s bad-cop problem. It's a matter of a few hard cases, and that's good news, because when a problem is that concentrated you can wrap your arms around it and think about solving it. The bad news is that those few hard cases are hard. They are falling-down drunks with liver disease and complex infections and mental illness. They need time and attention and lots of money. But enormous sums of money are already being spent on the chronically homeless, and Culhane saw that the kind of money it would take to solve the homeless problem could well be less than the kind of money it took to ignore it. Murray Barr used more health-care dollars, after all, than almost anyone in the state of Nevada. It would probably have been cheaper to give him a full-time nurse and his own apartment.

  The leading exponent of the power-law theory of homelessness is Philip Mangano, who, since he was appointed by President Bush in 2002, has been the executive director of the U.S. Interagency Council on Homelessness, a group that oversees the programs of twenty federal agencies. Mangano is a slender man, with a mane of white hair and a magnetic presence, who got his start as an advocate for the homeless in Massachusetts. In the past two years, he has crisscrossed the United States, educating local mayors and city councils about the real shape of the homelessness curve. Simply running soup kitchens and shelters, he argues, allows the chronically homeless to remain chronically homeless. You build a shelter and a soup kitchen if you think that homelessness is a problem with a broad and unmanageable middle. But if it's a problem at the fringe it can be solved. So far, Mangano has convinced more than two hundred cities to radically reëvaluate their policy for dealing with the homeless.

  "I was in St. Louis recently," Mangano said, back in June, when he dropped by New York on his way to Boise, Idaho. "I spoke with people doing services there. They had a very difficult group of people they couldn't reach no matter what they offered. So I said, Take some of your money and rent some apartments and go out to those people, and literally go out there with the key and say to them, 'This is the key to an apartment. If you come with me right now I am going to give it to you, and you are going to have that apartment.' And so they did. And one by one those people were coming in. Our intent is to take homeless policy from the old idea of funding programs that serve homeless people endlessly and invest in results that actually end homelessness."

  Mangano is a history buff, a man who sometimes falls asleep listening to old Malcolm X speeches, and who peppers his remarks with references to the civil-rights movement and the Berlin Wall and, most of all, the fight against slavery. "I am an abolitionist," he says. "My office in Boston was opposite the monument to the 54th Regiment on the Boston Common, up the street from the Park Street Church, where William Lloyd Garrison called for immediate abolition, and around the corner from where Frederick Douglass gave that famous speech at the Tremont Temple. It is very much ingrained in me that you do not manage a social wrong. You should be ending it."

  3.

  The old Y.M.C.A. in downtown Denver is on Sixteenth Street, just east of the central business district. The main building is a handsome six-story stone structure that was erected in 1906, and next door is an annex that was added in the nineteen-fifties. On the ground floor there is a gym and exercise rooms. On the upper floors there are several hundred apartments--brightly painted one-bedrooms, efficiencies, and S.R.O.-style rooms with microwaves and refrigerators and central air-conditioning--and for the past several years those apartments have been owned and managed by the Colorado Coalition for the Homeless.

  Even by big-city standards, Denver has a serious homelessness problem. The winters are relatively mild, and the summers aren't nearly as hot as those of neighboring New Mexico or Utah, which has made the city a magnet for the indigent. By the city's estimates, it has roughly a thousand chronically homeless people, of whom three hundred spend their time downtown, along the central Sixteenth Street shopping corridor or in nearby Civic Center Park. Many of the merchants downtown worry that the presence of the homeless is scaring away customers. A few blocks north, near the hospital, a modest, low-slung detox center handles twenty-eight thousand admissions a year, many of them homeless people who have passed out on the streets, either from liquor or--as is increasingly the case--from mouthwash. "Dr. Tich, they call it, is the brand of mouthwash they use," says Roxane White, the manager of the city's social services. "You can imagine what that does to your gut."

  Eighteen months ago, the city signed up with Mangano. With a mixture of federal and local funds, the C.C.H. inaugurated a new program that has so far enrolled a hundred and six people. It is aimed at the Murray Barrs of Denver, the people costing the system the most. C.C.H. went after the people who had been on the streets the longest, who had a criminal record, who had a problem with substance abuse or mental illness. "We have one individual in her early sixties, but looking at her you'd think she's eighty," Rachel Post, the director of substance treatment at the C.C.H., said. (Post changed some details about her clients in order to protect their identity.) "She's a chronic alcoholic. A typical day for her is she gets up and tries to find whatever she's going to drink that day. She falls down a lot. There's another person who came in during the first week. He was on methadone maintenance. He'd had psychiatric treatment. He was incarcerated for eleven years, and lived on the streets for three years after that, and, if that's not enough, he had a hole in his heart."

  The recruitment strategy was as simple as the one that Mangano had laid out in St. Louis: Would you like a free apartment? The enrollees got either an efficiency at the Y.M.C.A. or an apartment rented for them in a building somewhere else in the city, provided they agreed to work within the rules of the program. In the basement of the Y, where the racquetball courts used to be, the coalition built a command center, staffed with ten caseworkers. Five days a week, between eight-thirty and ten in the morning, the caseworkers meet and painstakingly review the status of everyone in the program. On the wall around the conference table are several large white boards, with lists of doctor's appointments and court dates and medication schedules. "We need a staffing ratio of one to ten to make it work," Post said. "You go out there and you find people and assess how they're doing in their residence. Sometimes we're in contact with someone every day. Ideally, we want to be in contact every couple of days. We've got about fifteen people we're really worried about now."

  The cost of services comes to about ten thousand dollars per homeless client per year. An efficiency apartment in Denver averages $376 a month, or just over forty-five hundred a year, which means that you can house and care for a chronically homeless person for at most fifteen thousand dollars, or about a third of what he or she would cost on the street. The idea is that once the people in the program get stabilized they will find jobs, and start to pick up more and more of their own rent, which would bring someone's annual cost to the program closer to six thousand dollars. As of today, seventy-five supportive housing slots have already been added, and the city's homeless plan calls for eight hundred more over the next ten years.
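
  A back-of-the-envelope check of the arithmetic above, written out as a short script. The figures are the ones quoted in the paragraph; the implied annual cost of leaving someone on the street (roughly forty-five thousand dollars) is inferred from the phrase "about a third," not stated directly.

```python
# Back-of-envelope check of the Denver figures quoted above. The street
# cost is inferred from "about a third"; it is not given directly.
services_per_year = 10_000            # caseworker and program services per client
rent_per_month = 376                  # average Denver efficiency apartment
rent_per_year = rent_per_month * 12   # 4,512 -- "just over forty-five hundred"

housed_cost = services_per_year + rent_per_year   # ~14,500, under fifteen thousand
implied_street_cost = housed_cost * 3             # "about a third" of street cost

print(f"Rent per year:            ${rent_per_year:,}")
print(f"Housing plus services:    ${housed_cost:,}")
print(f"Implied cost on streets:  ${implied_street_cost:,}")
```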

  The reality, of course, is hardly that neat and tidy. The idea that the very sickest and most troubled of the homeless can be stabilized and eventually employed is only a hope. Some of them plainly won't be able to get there: these are, after all, hard cases. "We've got one man, he's in his twenties," Post said. "Already, he has cirrhosis of the liver. One time he blew a blood alcohol of .49, which is enough to kill most people. The first place we had he brought over all his friends, and they partied and trashed the place and broke a window. Then we gave him another apartment, and he did the same thing."

  Post said that the man had been sober for several months. But he could relapse at some point and perhaps trash another apartment, and they'd have to figure out what to do with him next. Post had just been on a conference call with some people in New York City who run a similar program, and they talked about whether giving clients so many chances simply encourages them to behave irresponsibly. For some people, it probably does. But what was the alternative? If this young man was put back on the streets, he would cost the system even more money. The current philosophy of welfare holds that government assistance should be temporary and conditional, to avoid creating dependency. But someone who blows .49 on a Breathalyzer and has cirrhosis of the liver at the age of twenty-seven doesn't respond to incentives and sanctions in the usual way. "The most complicated people to work with are those who have been homeless for so long that going back to the streets just isn't scary to them," Post said. "The summer comes along and they say, 'I don't need to follow your rules.' " Power-law homelessness policy has to do the opposite of normal-distribution social policy. It should create dependency: you want people who have been outside the system to come inside and rebuild their lives under the supervision of those ten caseworkers in the basement of the Y.M.C.A.

  That is what is so perplexing about power-law homeless policy. From an economic perspective the approach makes perfect sense. But from a moral perspective it doesn't seem fair. Thousands of people in the Denver area no doubt live day to day, work two or three jobs, and are eminently deserving of a helping hand--and no one offers them the key to a new apartment. Yet that's just what the guy screaming obscenities and swigging Dr. Tich gets. When the welfare mom's time on public assistance runs out, we cut her off. Yet when the homeless man trashes his apartment we give him another. Social benefits are supposed to have some kind of moral justification. We give them to widows and disabled veterans and poor mothers with small children. Giving the homeless guy passed out on the sidewalk an apartment has a different rationale. It's simply about efficiency.

  We also believe that the distribution of social benefits should not be arbitrary. We don't give only to some poor mothers, or to a random handful of disabled veterans. We give to everyone who meets a formal criterion, and the moral credibility of government assistance derives, in part, from this universality. But the Denver homelessness program doesn't help every chronically homeless person in Denver. There is a waiting list of six hundred for the supportive-housing program; it will be years before all those people get apartments, and some may never get one. There isn't enough money to go around, and to try to help everyone a little bit--to observe the principle of universality--isn't as cost-effective as helping a few people a lot. Being fair, in this case, means providing shelters and soup kitchens, and shelters and soup kitchens don't solve the problem of homelessness. Our usual moral intuitions are little use, then, when it comes to a few hard cases. Power-law problems leave us with an unpleasant choice. We can be true to our principles or we can fix the problem. We cannot do both.

  4.

  A few miles northwest of the old Y.M.C.A. in downtown Denver, on the Speer Boulevard off-ramp from I-25, there is a big electronic sign by the side of the road, connected to a device that remotely measures the emissions of the vehicles driving past. When a car with properly functioning pollution-control equipment passes, the sign flashes "Good." When a car passes that is well over the acceptable limits, the sign flashes "Poor." If you stand at the Speer Boulevard exit and watch the sign for any length of time, you'll find that virtually every car scores "Good." An Audi A4--"Good." A Buick Century--"Good." A Toyota Corolla--"Good." A Ford Taurus--"Good." A Saab 9-5--"Good," and on and on, until after twenty minutes or so, some beat-up old Ford Escort or tricked-out Porsche drives by and the sign flashes "Poor." The picture of the smog problem you get from watching the Speer Boulevard sign and the picture of the homelessness problem you get from listening in on the morning staff meetings at the Y.M.C.A. are pretty much the same. Auto emissions follow a power-law distribution, and the air-pollution example offers another look at why we struggle so much with problems centered on a few hard cases.

  Most cars, especially new ones, are extraordinarily clean. A 2004 Subaru in good working order has an exhaust stream that's just .06 per cent carbon monoxide, which is negligible. But on almost any highway, for whatever reason--age, ill repair, deliberate tampering by the owner--a small number of cars can have carbon-monoxide levels in excess of ten per cent, which is almost two hundred times higher. In Denver, five per cent of the vehicles on the road produce fifty-five per cent of the automobile pollution.

  "Let's say a car is fifteen years old," Donald Stedman says. Stedman is a chemist and automobile-emissions specialist at the University of Denver. His laboratory put up the sign on Speer Avenue. "Obviously, the older a car is the more likely it is to become broken. It's the same as human beings. And by broken we mean any number of mechanical malfunctions--the computer's not working anymore, fuel injection is stuck open, the catalyst 's not unusual that these failure modes result in high emissions. We have at least one car in our database which was emitting seventy grams of hydrocarbon per mile, which means that you could almost drive a Honda Civic on the exhaust fumes from that car. It's not just old cars. It's new cars with high mileage, like taxis. One of the most successful and least publicized control measures was done by a district attorney in L.A. back in the nineties. He went to LAX and discovered that all of the Bell Cabs were gross emitters. One of those cabs emitted more than its own weight of pollution every year."

  In Stedman's view, the current system of smog checks makes little sense. A million motorists in Denver have to go to an emissions center every year--take time from work, wait in line, pay fifteen or twenty-five dollars--for a test that more than ninety per cent of them don't need. "Not everybody gets tested for breast cancer," Stedman says. "Not everybody takes an AIDS test." On-site smog checks, furthermore, do a pretty bad job of finding and fixing the few outliers. Car enthusiasts--with high-powered, high-polluting sports cars--have been known to drop a clean engine into their car on the day they get it tested. Others register their car in a faraway town without emissions testing or arrive at the test site "hot"--having just come off hard driving on the freeway--which is a good way to make a dirty engine appear to be clean. Still others randomly pass the test when they shouldn't, because dirty engines are highly variable and sometimes burn cleanly for short durations. There is little evidence, Stedman says, that the city's regime of inspections makes any difference in air quality.

  He proposes mobile testing instead. Twenty years ago, he invented a device the size of a suitcase that uses infrared light to instantly measure and then analyze the emissions of cars as they drive by on the highway. The Speer Boulevard sign is attached to one of Stedman's devices. He says that cities should put half a dozen or so of his devices in vans, park them on freeway off-ramps around the city, and have a police car poised to pull over anyone who fails the test. A half-dozen vans could test thirty thousand cars a day. For the same twenty-five million dollars that Denver's motorists now spend on on-site testing, Stedman estimates, the city could identify and fix twenty-five thousand truly dirty vehicles every year, and within a few years cut automobile emissions in the Denver metropolitan area by somewhere between thirty-five and forty per cent. The city could stop managing its smog problem and start ending it.
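
  Again as a rough sketch, and using only the figures quoted above, here is the cost comparison implied by Stedman's proposal; the per-test and per-vehicle costs below are derived from those figures rather than stated in the article.

```python
# Rough arithmetic behind the mobile-testing proposal, using only the
# figures quoted above; the per-unit costs below are derived, not quoted.
current_budget = 25_000_000      # what Denver motorists now spend on on-site testing
motorists_tested = 1_000_000     # roughly a million on-site tests a year
dirty_vehicles_fixed = 25_000    # Stedman's estimate for mobile testing
vans = 6                         # a half-dozen vans
cars_screened_per_day = 30_000   # combined throughput of those vans

print(f"Current cost per on-site test:  ${current_budget / motorists_tested:,.0f}")
print(f"Cost per dirty vehicle fixed:   ${current_budget / dirty_vehicles_fixed:,.0f}")
print(f"Cars screened per van per day:  {cars_screened_per_day // vans:,}")
```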

  Why don't we all adopt the Stedman method? There's no moral impediment here. We're used to the police pulling people over for having a blown headlight or a broken side mirror, and it wouldn't be difficult to have them add pollution-control devices to their list. Yet it does run counter to an instinctive social preference for thinking of pollution as a problem to which we all contribute equally. We have developed institutions that move reassuringly quickly and forcefully on collective problems. Congress passes a law. The Environmental Protection Agency promulgates a regulation. The auto industry makes its cars a little cleaner, and--presto--the air gets better. But Stedman doesn't much care about what happens in Washington and Detroit. The challenge of controlling air pollution isn't so much about the laws as it is about compliance with them. It's a policing problem, rather than a policy problem, and there is something ultimately unsatisfying about his proposed solution. He wants to end air pollution in Denver with a half-dozen vans outfitted with a contraption about the size of a suitcase. Can such a big problem have such a small-bore solution?

  That's what made the findings of the Christopher Commission so unsatisfying. We put together blue-ribbon panels when we're faced with problems that seem too large for the normal mechanisms of bureaucratic repair. We want sweeping reforms. But what was the commission's most memorable observation? It was the story of an officer with a known history of doing things like beating up handcuffed suspects who nonetheless received a performance review from his superior stating that he "usually conducts himself in a manner that inspires respect for the law and instills public confidence." This is what you say about an officer when you haven't actually read his file, and the implication of the Christopher Commission's report was that the L.A.P.D. might help solve its problem simply by getting its police captains to read the files of their officers. The L.A.P.D.'s problem was a matter not of policy but of compliance. The department needed to adhere to the rules it already had in place, and that's not what a public hungry for institutional transformation wants to hear. Solving problems that have power-law distributions doesn't just violate our moral intuitions; it violates our political intuitions as well. It's hard not to conclude, in the end, that the reason we treated the homeless as one hopeless undifferentiated group for so long is not simply that we didn't know better. It's that we didn't want to know better. It was easier the old way.

  Power-law solutions have little appeal to the right, because they involve special treatment for people who do not deserve special treatment; and they have little appeal to the left, because their emphasis on efficiency over fairness suggests the cold number-crunching of Chicago-school cost-benefit analysis. Even the promise of millions of dollars in savings or cleaner air or better police departments cannot entirely compensate for such discomfort. In Denver, John Hickenlooper, the city's enormously popular mayor, has worked on the homelessness issue tirelessly during the past couple of years. He spent more time on the subject in his annual State of the City address this past summer than on any other topic. He gave the speech, with deliberate symbolism, in the city's downtown Civic Center Park, where homeless people gather every day with their shopping carts and garbage bags. He has gone on local talk radio on many occasions to discuss what the city is doing about the issue. He has commissioned studies to show what a drain on the city's resources the homeless population has become. But, he says, "there are still people who stop me going into the supermarket and say, 'I can't believe you're going to help those homeless people, those bums.'"

  5.

  Early one morning a year ago, Marla Johns got a call from her husband, Steve. He was at work. "He called and woke me up," Johns remembers. "He was choked up and crying on the phone. And I thought that something had happened with another police officer. I said, 'Oh, my gosh, what happened?' He said, 'Murray died last night.' " He died of intestinal bleeding. At the police department that morning, some of the officers gave Murray a moment of silence.

  "There are not many days that go by that I don't have a thought of him," she went on. "Christmas comes-- and I used to buy him a Christmas present. Make sure he had warm gloves and a blanket and a coat. There was this mutual respect. There was a time when another intoxicated patient jumped off the gurney and was coming at me, and Murray jumped off his gurney and shook his fist and said, 'Don't you touch my angel.' You know, when he was monitored by the system he did fabulously. He would be on house arrest and he would get a job and he would save money and go to work every day, and he wouldn't drink. He would do all the things he was supposed to do. There are some people who can be very successful members of society if someone monitors them. Murray needed someone to be in charge of him."

  But, of course, Reno didn't have a place where Murray could be given the structure he needed. Someone must have decided that it cost too much.

  What pit bulls can teach us about profiling.

  1.

  One afternoon last February, Guy Clairoux picked up his two-and-a-half-year-old son, Jayden, from day care and walked him back to their house in the west end of Ottawa, Ontario. They were almost home. Jayden was straggling behind, and, as his father's back was turned, a pit bull jumped over a back-yard fence and lunged at Jayden. "The dog had his head in its mouth and started to do this shake," Clairoux's wife, JoAnn Hartley, said later. As she watched in horror, two more pit bulls jumped over the fence, joining in the assault. She and Clairoux came running, and he punched the first of the dogs in the head, until it dropped Jayden, and then he threw the boy toward his mother. Hartley fell on her son, protecting him with her body. "JoAnn!" Clairoux cried out, as all three dogs descended on his wife. "Cover your neck, cover your neck." A neighbor, sitting by her window, screamed for help. Her partner and a friend, Mario Gauthier, ran outside. A neighborhood boy grabbed his hockey stick and threw it to Gauthier. He began hitting one of the dogs over the head, until the stick broke. "They wouldn't stop," Gauthier said. "As soon as you'd stop, they'd attack again. I've never seen a dog go so crazy. They were like Tasmanian devils." The police came. The dogs were pulled away, and the Clairouxes and one of the rescuers were taken to the hospital. Five days later, the Ontario legislature banned the ownership of pit bulls. "Just as we wouldn't let a great white shark in a swimming pool," the province's attorney general, Michael Bryant, had said, "maybe we shouldn't have these animals on the civilized streets."

  Pit bulls, descendants of the bulldogs used in the nineteenth century for bull baiting and dogfighting, have been bred for "gameness," and thus a lowered inhibition to aggression. Most dogs fight as a last resort, when staring and growling fail. A pit bull is willing to fight with little or no provocation. Pit bulls seem to have a high tolerance for pain, making it possible for them to fight to the point of exhaustion. Whereas guard dogs like German shepherds usually attempt to restrain those they perceive to be threats by biting and holding, pit bulls try to inflict the maximum amount of damage on an opponent. They bite, hold, shake, and tear. They don't growl or assume an aggressive facial expression as warning. They just attack. "They are often insensitive to behaviors that usually stop aggression," one scientific review of the breed states. "For example, dogs not bred for fighting usually display defeat in combat by rolling over and exposing a light underside. On several occasions, pit bulls have been reported to disembowel dogs offering this signal of submission." In epidemiological studies of dog bites, the pit bull is overrepresented among dogs known to have seriously injured or killed human beings, and, as a result, pit bulls have been banned or restricted in several Western European countries, China, and numerous cities and municipalities across North America. Pit bulls are dangerous.

  Of course, not all pit bulls are dangerous. Most don't bite anyone. Meanwhile, Dobermans and Great Danes and German shepherds and Rottweilers are frequent biters as well, and the dog that recently mauled a Frenchwoman so badly that she was given the world's first face transplant was, of all things, a Labrador retriever. When we say that pit bulls are dangerous, we are making a generalization, just as insurance companies use generalizations when they charge young men more for car insurance than the rest of us (even though many young men are perfectly good drivers), and doctors use generalizations when they tell overweight middle-aged men to get their cholesterol checked (even though many overweight middle-aged men won't experience heart trouble). Because we don't know which dog will bite someone or who will have a heart attack or which drivers will get in an accident, we can make predictions only by generalizing. As the legal scholar Frederick Schauer has observed, "painting with a broad brush" is "an often inevitable and frequently desirable dimension of our decision-making lives."

  Another word for generalization, though, is "stereotype," and stereotypes are usually not considered desirable dimensions of our decision-making lives. The process of moving from the specific to the general is both necessary and perilous. A doctor could, with some statistical support, generalize about men of a certain age and weight. But what if generalizing from other traits--such as high blood pressure, family history, and smoking--saved more lives? Behind each generalization is a choice of what factors to leave in and what factors to leave out, and those choices can prove surprisingly complicated. After the attack on Jayden Clairoux, the Ontario government chose to make a generalization about pit bulls. But it could also have chosen to generalize about powerful dogs, or about the kinds of people who own powerful dogs, or about small children, or about back-yard fences--or, indeed, about any number of other things to do with dogs and people and places. How do we know when we've made the right generalization?

  2.

  In July of last year, following the transit bombings in London, the New York City Police Department announced that it would send officers into the subways to conduct random searches of passengers' bags. On the face of it, doing random searches in the hunt for terrorists--as opposed to being guided by generalizations--seems like a silly idea. As a columnist in New York wrote at the time, "Not just 'most' but nearly every jihadi who has attacked a Western European or American target is a young Arab or Pakistani man. In other words, you can predict with a fair degree of certainty what an Al Qaeda terrorist looks like. Just as we have always known what Mafiosi look like--even as we understand that only an infinitesimal fraction of Italian-Americans are members of the mob."

  But wait: do we really know what mafiosi look like? In "The Godfather," where most of us get our knowledge of the Mafia, the male members of the Corleone family were played by Marlon Brando, who was of Irish and French ancestry, James Caan, who is Jewish, and two Italian-Americans, Al Pacino and John Cazale. To go by "The Godfather," mafiosi look like white men of European descent, which, as generalizations go, isn't terribly helpful. Figuring out what an Islamic terrorist looks like isn't any easier. Muslims are not like the Amish: they don't come dressed in identifiable costumes. And they don't look like basketball players; they don't come in predictable shapes and sizes. Islam is a religion that spans the globe.

  "We have a policy against racial profiling," Raymond Kelly, New York City's police commissioner, told me. "I put it in here in March of the first year I was here. It's the wrong thing to do, and it's also ineffective. If you look at the London bombings, you have three British citizens of Pakistani descent. You have Germaine Lindsay, who is Jamaican. You have the next crew, on July 21st, who are East African. You have a Chechen woman in Moscow in early 2004 who blows herself up in the subway station. So whom do you profile? Look at New York City. Forty per cent of New Yorkers are born outside the country. Look at the diversity here. Who am I supposed to profile?"

  Kelly was pointing out what might be called profiling's "category problem." Generalizations involve matching a category of people to a behavior or trait--overweight middle-aged men to heart-attack risk, young men to bad driving. But, for that process to work, you have to be able both to define and to identify the category you are generalizing about. "You think that terrorists aren't aware of how easy it is to be characterized by ethnicity?" Kelly went on. "Look at the 9/11 hijackers. They came here. They shaved. They went to topless bars. They wanted to blend in. They wanted to look like they were part of the American dream. These are not dumb people. Could a terrorist dress up as a Hasidic Jew and walk into the subway, and not be profiled? Yes. I think profiling is just nuts."

  3.

  Pit-bull bans involve a category problem, too, because pit bulls, as it happens, aren't a single breed. The name refers to dogs belonging to a number of related breeds, such as the American Staffordshire terrier, the Staffordshire bull terrier, and the American pit bull terrier--all of which share a square and muscular body, a short snout, and a sleek, short-haired coat. Thus the Ontario ban prohibits not only these three breeds but any "dog that has an appearance and physical characteristics that are substantially similar" to theirs; the term of art is "pit bull-type" dogs. But what does that mean? Is a cross between an American pit bull terrier and a golden retriever a pit bull-type dog or a golden retriever-type dog? If thinking about muscular terriers as pit bulls is a generalization, then thinking about dangerous dogs as anything substantially similar to a pit bull is a generalization about a generalization. "The way a lot of these laws are written, pit bulls are whatever they say they are," Lora Brashears, a kennel manager in Pennsylvania, says. "And for most people it just means big, nasty, scary dog that bites."

  The goal of pit-bull bans, obviously, isn't to prohibit dogs that look like pit bulls. The pit-bull appearance is a proxy for the pit-bull temperament--for some trait that these dogs share. But "pit bullness" turns out to be elusive as well. The supposedly troublesome characteristics of the pit-bull type--its gameness, its determination, its insensitivity to pain--are chiefly directed toward other dogs. Pit bulls were not bred to fight humans. On the contrary: a dog that went after spectators, or its handler, or the trainer, or any of the other people involved in making a dogfighting dog a good dogfighter was usually put down. (The rule in the pit-bull world was "Man-eaters die.")

  A Georgia-based group called the American Temperament Test Society has put twenty-five thousand dogs through a ten-part standardized drill designed to assess a dog's stability, shyness, aggressiveness, and friendliness in the company of people. A handler takes a dog on a six-foot lead and judges its reaction to stimuli such as gunshots, an umbrella opening, and a weirdly dressed stranger approaching in a threatening way. Eighty-four per cent of the pit bulls that have been given the test have passed, which ranks pit bulls ahead of beagles, Airedales, bearded collies, and all but one variety of dachshund. "We have tested somewhere around a thousand pit-bull-type dogs," Carl Herkstroeter, the president of the A.T.T.S., says. "I've tested half of them. And of the number I've tested I have disqualified one pit bull because of aggressive tendencies. They have done extremely well. They have a good temperament. They are very good with children." It can even be argued that the same traits that make the pit bull so aggressive toward other dogs are what make it so nice to humans. "There are a lot of pit bulls these days who are licensed therapy dogs," the writer Vicki Hearne points out. "Their stability and resoluteness make them excellent for work with people who might not like a more bouncy, flibbertigibbet sort of dog. When pit bulls set out to provide comfort, they are as resolute as they are when they fight, but what they are resolute about is being gentle. And, because they are fearless, they can be gentle with anybody."

  Then which are the pit bulls that get into trouble? "The ones that the legislation is geared toward have aggressive tendencies that are either bred in by the breeder, trained in by the trainer, or reinforced in by the owner," Herkstroeter says. A mean pit bull is a dog that has been turned mean, by selective breeding, by being cross-bred with a bigger, human-aggressive breed like German shepherds or Rottweilers, or by being conditioned in such a way that it begins to express hostility to human beings. A pit bull is dangerous to people, then, not to the extent that it expresses its essential pit bullness but to the extent that it deviates from it. A pit-bull ban is a generalization about a generalization about a trait that is not, in fact, general. That's a category problem.

  4.

  One of the puzzling things about New York City is that, after the enormous and well-publicized reductions in crime in the mid-nineteen-nineties, the crime rate has continued to fall. In the past two years, for instance, murder in New York has declined by almost ten per cent, rape by twelve per cent, and burglary by more than eighteen per cent. Just in the last year, auto theft went down 11.8 per cent. On a list of two hundred and forty cities in the United States with a population of a hundred thousand or more, New York City now ranks two hundred and twenty-second in crime, down near the bottom with Fontana, California, and Port St. Lucie, Florida. In the nineteen-nineties, the crime decrease was attributed to big obvious changes in city life and government--the decline of the drug trade, the gentrification of Brooklyn, the successful implementation of "broken windows" policing. But all those big changes happened a decade ago. Why is crime still falling?

  The explanation may have to do with a shift in police tactics. The N.Y.P.D. has a computerized map showing, in real time, precisely where serious crimes are being reported, and at any moment the map typically shows a few dozen constantly shifting high-crime hot spots, some as small as two or three blocks square. What the N.Y.P.D. has done, under Commissioner Kelly, is to use the map to establish "impact zones," and to direct newly graduated officers--who used to be distributed proportionally to precincts across the city--to these zones, in some cases doubling the number of officers in the immediate neighborhood. "We took two-thirds of our graduating class and linked them with experienced officers, and focussed on those areas," Kelly said. "Well, what has happened is that over time we have averaged about a thirty-five-per-cent crime reduction in impact zones."

  For years, experts have maintained that the incidence of violent crime is "inelastic" relative to police presence--that people commit serious crimes because of poverty and psychopathology and cultural dysfunction, along with spontaneous motives and opportunities. The presence of a few extra officers down the block, it was thought, wouldn't make much difference. But the N.Y.P.D. experience suggests otherwise. More police means that some crimes are prevented, others are more easily solved, and still others are displaced--pushed out of the troubled neighborhood--which Kelly says is a good thing, because it disrupts the patterns and practices and social networks that serve as the basis for lawbreaking. In other words, the relation between New York City (a category) and criminality (a trait) is unstable, and this kind of instability is another way in which our generalizations can be derailed.

  Why, for instance, is it a useful rule of thumb that Kenyans are good distance runners? It's not just that it's statistically supportable today. It's that it has been true for almost half a century, and that in Kenya the tradition of distance running is sufficiently rooted that something cataclysmic would have to happen to dislodge it. By contrast, the generalization that New York City is a crime-ridden place was once true and now, manifestly, isn't. People who moved to sunny retirement communities like Port St. Lucie because they thought they were much safer than New York are suddenly in the position of having made the wrong bet.

  The instability issue is a problem for profiling in law enforcement as well. The law professor David Cole once tallied up some of the traits that Drug Enforcement Administration agents have used over the years in making generalizations about suspected smugglers. Here is a sample:

Arrived late at night; arrived early in the morning; arrived in afternoon; one of the first to deplane; one of the last to deplane; deplaned in the middle; purchased ticket at the airport; made reservation on short notice; bought coach ticket; bought first-class ticket; used one-way ticket; used round-trip ticket; paid for ticket with cash; paid for ticket with small denomination currency; paid for ticket with large denomination currency; made local telephone calls after deplaning; made long distance telephone call after deplaning; pretended to make telephone call; traveled from New York to Los Angeles; traveled to Houston; carried no luggage; carried brand-new luggage; carried a small bag; carried a medium-sized bag; carried two bulky garment bags; carried two heavy suitcases; carried four pieces of luggage; overly protective of luggage; disassociated self from luggage; traveled alone; traveled with a companion; acted too nervous; acted too calm; made eye contact with officer; avoided making eye contact with officer; wore expensive clothing and jewelry; dressed casually; went to restroom after deplaning; walked rapidly through airport; walked slowly through airport; walked aimlessly through airport; left airport by taxi; left airport by limousine; left airport by private car; left airport by hotel courtesy van.

  Some of these reasons for suspicion are plainly absurd, suggesting that there's no particular rationale to the generalizations used by D.E.A. agents in stopping suspected drug smugglers. A way of making sense of the list, though, is to think of it as a catalogue of unstable traits. Smugglers may once have tended to buy one-way tickets in cash and carry two bulky suitcases. But they don't have to. They can easily switch to round-trip tickets bought with a credit card, or a single carry-on bag, without losing their capacity to smuggle. There's a second kind of instability here as well. Maybe the reason some of them switched from one-way tickets and two bulky suitcases was that law enforcement got wise to those habits, so the smugglers did the equivalent of what the jihadis seemed to have done in London, when they switched to East Africans because the scrutiny of young Arab and Pakistani men grew too intense. It doesn't work to generalize about a relationship between a category and a trait when that relationship isn't stable--or when the act of generalizing may itself change the basis of the generalization.

  Before Kelly became the New York police commissioner, he served as the head of the U.S. Customs Service, and while he was there he overhauled the criteria that border-control officers use to identify and search suspected smugglers. There had been a list of forty-three suspicious traits. He replaced it with a list of six broad criteria. Is there something suspicious about their physical appearance? Are they nervous? Is there specific intelligence targeting this person? Does the drug-sniffing dog raise an alarm? Is there something amiss in their paperwork or explanations? Has contraband been found that implicates this person?

  You'll find nothing here about race or gender or ethnicity, and nothing here about expensive jewelry or deplaning at the middle or the end, or walking briskly or walking aimlessly. Kelly removed all the unstable generalizations, forcing customs officers to make generalizations about things that don't change from one day or one month to the next. Some percentage of smugglers will always be nervous, will always get their story wrong, and will always be caught by the dogs. That's why those kinds of inferences are more reliable than the ones based on whether smugglers are white or black, or carry one bag or two. After Kelly's reforms, the number of searches conducted by the Customs Service dropped by about seventy-five per cent, but the number of successful seizures improved by twenty-five per cent. The officers went from making fairly lousy decisions about smugglers to making pretty good ones. "We made them more efficient and more effective at what they were doing," Kelly said.

  5.

  Does the notion of a pit-bull menace rest on a stable or an unstable generalization? The best data we have on breed dangerousness are fatal dog bites, which serve as a useful indicator of just how much havoc certain kinds of dogs are causing. Between the late nineteen-seventies and the late nineteen-nineties, more than twenty-five breeds were involved in fatal attacks in the United States. Pit-bull breeds led the pack, but the variability from year to year is considerable. For instance, in the period from 1981 to 1982 fatalities were caused by five pit bulls, three mixed breeds, two St. Bernards, two German-shepherd mixes, a pure-bred German shepherd, a husky type, a Doberman, a Chow Chow, a Great Dane, a wolf-dog hybrid, a husky mix, and a pit-bull mix--but no Rottweilers. In 1995 and 1996, the list included ten Rottweilers, four pit bulls, two German shepherds, two huskies, two Chow Chows, two wolf-dog hybrids, two shepherd mixes, a Rottweiler mix, a mixed breed, a Chow Chow mix, and a Great Dane. The kinds of dogs that kill people change over time, because the popularity of certain breeds changes over time. The one thing that doesn't change is the total number of the people killed by dogs. When we have more problems with pit bulls, it's not necessarily a sign that pit bulls are more dangerous than other dogs. It could just be a sign that pit bulls have become more numerous.

  "I've seen virtually every breed involved in fatalities, including Pomeranians and everything else, except a beagle or a basset hound," Randall Lockwood, a senior vice-president of the A.S.P.C.A. and one of the country's leading dogbite experts, told me. "And there's always one or two deaths attributable to malamutes or huskies, although you never hear people clamoring for a ban on those breeds. When I first started looking at fatal dog attacks, they largely involved dogs like German shepherds and shepherd mixes and St. Bernards--which is probably why Stephen King chose to make Cujo a St. Bernard, not a pit bull. I haven't seen a fatality involving a Doberman for decades, whereas in the nineteen-seventies they were quite common. If you wanted a mean dog, back then, you got a Doberman. I don't think I even saw my first pit-bull case until the middle to late nineteen-eighties, and I didn't start seeing Rottweilers until I'd already looked at a few hundred fatal dog attacks. Now those dogs make up the preponderance of fatalities. The point is that it changes over time. It's a reflection of what the dog of choice is among people who want to own an aggressive dog."

  There is no shortage of more stable generalizations about dangerous dogs, though. A 1991 study in Denver, for example, compared a hundred and seventy-eight dogs with a history of biting people with a random sample of a hundred and seventy-eight dogs with no history of biting. The breeds were scattered: German shepherds, Akitas, and Chow Chows were among those most heavily represented. (There were no pit bulls among the biting dogs in the study, because Denver banned pit bulls in 1989.) But a number of other, more stable factors stand out. The biters were 6.2 times as likely to be male as female, and 2.6 times as likely to be intact as neutered. The Denver study also found that biters were 2.8 times as likely to be chained as unchained. "About twenty per cent of the dogs involved in fatalities were chained at the time, and had a history of long-term chaining," Lockwood said. "Now, are they chained because they are aggressive or aggressive because they are chained? It's a bit of both. These are animals that have not had an opportunity to become socialized to people. They don't necessarily even know that children are small human beings. They tend to see them as prey."

  In many cases, vicious dogs are hungry or in need of medical attention. Often, the dogs had a history of aggressive incidents, and, overwhelmingly, dog-bite victims were children (particularly small boys) who were physically vulnerable to attack and may also have unwittingly done things to provoke the dog, like teasing it, or bothering it while it was eating. The strongest connection of all, though, is between the trait of dog viciousness and certain kinds of dog owners. In about a quarter of fatal dog-bite cases, the dog owners were previously involved in illegal fighting. The dogs that bite people are, in many cases, socially isolated because their owners are socially isolated, and they are vicious because they have owners who want a vicious dog. The junk-yard German shepherd--which looks as if it would rip your throat out--and the German-shepherd guide dog are the same breed. But they are not the same dog, because they have owners with different intentions.

  "A fatal dog attack is not just a dog bite by a big or aggressive dog," Lockwood went on. "It is usually a perfect storm of bad human-canine interactions--the wrong dog, the wrong background, the wrong history in the hands of the wrong person in the wrong environmental situation. I've been involved in many legal cases involving fatal dog attacks, and, certainly, it's my impression that these are generally cases where everyone is to blame. You've got the unsupervised three-year-old child wandering in the neighborhood killed by a starved, abused dog owned by the dogfighting boyfriend of some woman who doesn't know where her child is. It's not old Shep sleeping by the fire who suddenly goes bonkers. Usually there are all kinds of other warning signs."

  6.

  Jayden Clairoux was attacked by Jada, a pit-bull terrier, and her two pit-bull-bullmastiff puppies, Agua and Akasha. The dogs were owned by a twenty-one-year-old man named Shridev Café, who worked in construction and did odd jobs. Five weeks before the Clairoux attack, Café's three dogs got loose and attacked a sixteen-year-old boy and his four-year-old half brother while they were ice skating. The boys beat back the animals with a snow shovel and escaped into a neighbor's house. Café was fined, and he moved the dogs to his seventeen-year-old girlfriend's house. This was not the only time that year that he ran into trouble; a few months later, he was charged with domestic assault and, in another incident involving a street brawl, with aggravated assault. "Shridev has personal issues," Cheryl Smith, a canine-behavior specialist who consulted on the case, says. "He's certainly not a very mature person." Agua and Akasha were now about seven months old. The court order in the wake of the first attack required that they be muzzled when they were outside the home and kept in an enclosed yard. But Café did not muzzle them, because, he said later, he couldn't afford muzzles, and apparently no one from the city ever came by to force him to comply. A few times, he talked about taking his dogs to obedience classes, but never did. The subject of neutering them also came up--particularly Agua, the male--but neutering cost a hundred dollars, which he evidently thought was too much money, and when the city temporarily confiscated his animals after the first attack it did not neuter them, either, because Ottawa does not have a policy of preëmptively neutering dogs that bite people.

  On the day of the second attack, according to some accounts, a visitor came by the house of Café's girlfriend, and the dogs got wound up. They were put outside, where the snowbanks were high enough so that the back-yard fence could be readily jumped. Jayden Clairoux stopped and stared at the dogs, saying, "Puppies, puppies." His mother called out to his father. His father came running, which is the kind of thing that will rile up an aggressive dog. The dogs jumped the fence, and Agua took Jayden's head in his mouth and started to shake. It was a textbook dog-biting case: unneutered, ill-trained, charged-up dogs, with a history of aggression and an irresponsible owner, somehow get loose, and set upon a small child. The dogs had already passed through the animal bureaucracy of Ottawa, and the city could easily have prevented the second attack with the right kind of generalization--a generalization based not on breed but on the known and meaningful connection between dangerous dogs and negligent owners. But that would have required someone to track down Shridev Café, and check to see whether he had bought muzzles, and someone to send the dogs to be neutered after the first attack, and an animal-control law that insured that those whose dogs attack small children forfeit their right to have a dog. It would have required, that is, a more exacting set of generalizations to be more exactingly applied. It's always easier just to ban the breed.

  The social logic of Ivy League admissions.

  1.

  I applied to college one evening, after dinner, in the fall of my senior year in high school. College applicants in Ontario, in those days, were given a single sheet of paper which listed all the universities in the province. It was my job to rank them in order of preference. Then I had to mail the sheet of paper to a central college-admissions office. The whole process probably took ten minutes. My school sent in my grades separately. I vaguely remember filling out a supplementary two-page form listing my interests and activities. There were no S.A.T. scores to worry about, because in Canada we didn't have to take the S.A.T.s. I don't know whether anyone wrote me a recommendation. I certainly never asked anyone to. Why would I? It wasn't as if I were applying to a private club.

  I put the University of Toronto first on my list, the University of Western Ontario second, and Queen's University third. I was working off a set of brochures that I'd sent away for. My parents' contribution consisted of my father's agreeing to drive me one afternoon to the University of Toronto campus, where we visited the residential college I was most interested in. I walked around. My father poked his head into the admissions office, chatted with the admissions director, and--I imagine--either said a few short words about the talents of his son or (knowing my father) remarked on the loveliness of the delphiniums in the college flower beds. Then we had ice cream. I got in.

  Am I a better or more successful person for having been accepted at the University of Toronto, as opposed to my second or third choice? It strikes me as a curious question. In Ontario, there wasn't a strict hierarchy of colleges. There were several good ones and several better ones and a number of programs--like computer science at the University of Waterloo--that were world-class. But since all colleges were part of the same public system and tuition everywhere was the same (about a thousand dollars a year, in those days), and a B average in high school pretty much guaranteed you a spot in college, there wasn't a sense that anything great was at stake in the choice of which college we attended. The issue was whether we attended college, and--most important--how seriously we took the experience once we got there. I thought everyone felt this way. You can imagine my confusion, then, when I first met someone who had gone to Harvard.

  There was, first of all, that strange initial reluctance to talk about the matter of college at all--a glance downward, a shuffling of the feet, a mumbled mention of Cambridge. "Did you go to Harvard?" I would ask. I had just moved to the United States. I didn't know the rules. An uncomfortable nod would follow. Don't define me by my school, they seemed to be saying, which implied that their school actually could define them. And, of course, it did. Wherever there was one Harvard graduate, another lurked not far behind, ready to swap tales of late nights at the Hasty Pudding, or recount the intricacies of the college-application essay, or wonder out loud about the whereabouts of Prince So-and-So, who lived down the hall and whose family had a place in the South of France that you would not believe. In the novels they were writing, the precocious and sensitive protagonist always went to Harvard; if he was troubled, he dropped out of Harvard; in the end, he returned to Harvard to complete his senior thesis. Once, I attended a wedding of a Harvard alum in his fifties, at which the best man spoke of his college days with the groom as if neither could have accomplished anything of greater importance in the intervening thirty years. By the end, I half expected him to take off his shirt and proudly display the large crimson "H" tattooed on his chest. What is this "Harvard" of which you Americans speak so reverently?

  2.

  In 1905, Harvard College adopted the College Entrance Examination Board tests as the principal basis for admission, which meant that virtually any academically gifted high-school senior who could afford a private college had a straightforward shot at attending. By 1908, the freshman class was seven per cent Jewish, nine per cent Catholic, and forty-five per cent from public schools, an astonishing transformation for a school that historically had been the preserve of the New England boarding-school complex known in the admissions world as St. Grottlesex.

  As the sociologist Jerome Karabel writes in "The Chosen" (Houghton Mifflin; $28), his remarkable history of the admissions process at Harvard, Yale, and Princeton, that meritocratic spirit soon led to a crisis. The enrollment of Jews began to rise dramatically. By 1922, they made up more than a fifth of Harvard's freshman class. The administration and alumni were up in arms. Jews were thought to be sickly and grasping, grade-grubbing and insular. They displaced the sons of wealthy Wasp alumni, which did not bode well for fund-raising. A. Lawrence Lowell, Harvard's president in the nineteen-twenties, stated flatly that too many Jews would destroy the school: "The summer hotel that is ruined by admitting Jews meets its fate . . . because they drive away the Gentiles, and then after the Gentiles have left, they leave also."

  The difficult part, however, was coming up with a way of keeping Jews out, because as a group they were academically superior to everyone else. Lowell's first idea--a quota limiting Jews to fifteen per cent of the student body--was roundly criticized. Lowell tried restricting the number of scholarships given to Jewish students, and made an effort to bring in students from public schools in the West, where there were fewer Jews. Neither strategy worked. Finally, Lowell--and his counterparts at Yale and Princeton--realized that if a definition of merit based on academic prowess was leading to the wrong kind of student, the solution was to change the definition of merit. Karabel argues that it was at this moment that the history and nature of the Ivy League took a significant turn.

  The admissions office at Harvard became much more interested in the details of an applicant's personal life. Lowell told his admissions officers to elicit information about the "character" of candidates from "persons who know the applicants well," and so the letter of reference became mandatory. Harvard started asking applicants to provide a photograph. Candidates had to write personal essays, demonstrating their aptitude for leadership, and list their extracurricular activities. "Starting in the fall of 1922," Karabel writes, "applicants were required to answer questions on 'Race and Color,' 'Religious Preference,' 'Maiden Name of Mother,' 'Birthplace of Father,' and 'What change, if any, has been made since birth in your own name or that of your father? (Explain fully).'"

  At Princeton, emissaries were sent to the major boarding schools, with instructions to rate potential candidates on a scale of 1 to 4, where 1 was "very desirable and apparently exceptional material from every point of view" and 4 was "undesirable from the point of view of character, and, therefore, to be excluded no matter what the results of the entrance examinations might be." The personal interview became a key component of admissions in order, Karabel writes, "to ensure that 'undesirables' were identified and to assess important but subtle indicators of background and breeding such as speech, dress, deportment and physical appearance." By 1933, the end of Lowell's term, the percentage of Jews at Harvard was back down to fifteen per cent.

  If this new admissions system seems familiar, that's because it is essentially the same system that the Ivy League uses to this day. According to Karabel, Harvard, Yale, and Princeton didn't abandon the elevation of character once the Jewish crisis passed. They institutionalized it.

  Starting in 1953, Arthur Howe, Jr., spent a decade as the chair of admissions at Yale, and Karabel describes what happened under his guidance:

  The admissions committee viewed evidence of "manliness" with particular enthusiasm. One boy gained admission despite an academic prediction of 70 because "there was apparently something manly and distinctive about him that had won over both his alumni and staff interviewers." Another candidate, admitted despite his schoolwork being "mediocre in comparison with many others," was accepted over an applicant with a much better record and higher exam scores because, as Howe put it, "we just thought he was more of a guy." So preoccupied was Yale with the appearance of its students that the form used by alumni interviewers actually had a physical characteristics checklist through 1965. Each year, Yale carefully measured the height of entering freshmen, noting with pride the proportion of the class at six feet or more.

  At Harvard, the key figure in that same period was Wilbur Bender, who, as the dean of admissions, had a preference for "the boy with some athletic interests and abilities, the boy with physical vigor and coordination and grace." Bender, Karabel tells us, believed that if Harvard continued to suffer on the football field it would contribute to the school's reputation as a place with "no college spirit, few good fellows, and no vigorous, healthy social life," not to mention a "surfeit of 'pansies,' 'decadent esthetes' and 'precious sophisticates.'" Bender concentrated on improving Harvard's techniques for evaluating "intangibles" and, in particular, its "ability to detect homosexual tendencies and serious psychiatric problems."

  By the nineteen-sixties, Harvard's admissions system had evolved into a series of complex algorithms. The school began by lumping all applicants into one of twenty-two dockets, according to their geographical origin. (There was one docket for Exeter and Andover, another for the eight Rocky Mountain states.) Information from interviews, references, and student essays was then used to grade each applicant on a scale of 1 to 6, along four dimensions: personal, academic, extracurricular, and athletic. Competition, critically, was within each docket, not between dockets, so there was no way for, say, the graduates of Bronx Science and Stuyvesant to shut out the graduates of Andover and Exeter. More important, academic achievement was just one of four dimensions, further diluting the value of pure intellectual accomplishment. Athletic ability, rather than falling under "extracurriculars," got a category all to itself, which explains why, even now, recruited athletes have an acceptance rate to the Ivies at well over twice the rate of other students, despite S.A.T. scores that are on average more than a hundred points lower. And the most important category? That mysterious index of "personal" qualities. According to Harvard's own analysis, the personal rating was a better predictor of admission than the academic rating. Those with a rank of 4 or worse on the personal scale had, in the nineteen-sixties, a rejection rate of ninety-eight per cent. Those with a personal rating of 1 had a rejection rate of 2.5 per cent. When the Office of Civil Rights at the federal education department investigated Harvard in the nineteen-eighties, they found handwritten notes scribbled in the margins of various candidates' files. "This young woman could be one of the brightest applicants in the pool but there are several references to shyness," read one. Another comment reads, "Seems a tad frothy." One application--and at this point you can almost hear it going to the bottom of the pile--was notated, "Short with big ears."

  3.

  Social scientists distinguish between what are known as treatment effects and selection effects. The Marine Corps, for instance, is largely a treatment-effect institution. It doesn't have an enormous admissions office grading applicants along four separate dimensions of toughness and intelligence. It's confident that the experience of undergoing Marine Corps basic training will turn you into a formidable soldier. A modelling agency, by contrast, is a selection-effect institution. You don't become beautiful by signing up with an agency. You get signed up by an agency because you're beautiful.

  At the heart of the American obsession with the Ivy League is the belief that schools like Harvard provide the social and intellectual equivalent of Marine Corps basic training--that being taught by all those brilliant professors and meeting all those other motivated students and getting a degree with that powerful name on it will confer advantages that no local state university can provide. Fuelling the treatment-effect idea are studies showing that if you take two students with the same S.A.T. scores and grades, one of whom goes to a school like Harvard and one of whom goes to a less selective college, the Ivy Leaguer will make far more money ten or twenty years down the road.

  The extraordinary emphasis the Ivy League places on admissions policies, though, makes it seem more like a modeling agency than like the Marine Corps, and, sure enough, the studies based on those two apparently equivalent students turn out to be flawed. How do we know that two students who have the same S.A.T. scores and grades really are equivalent? It's quite possible that the student who goes to Harvard is more ambitious and energetic and personable than the student who wasn't let in, and that those same intangibles are what account for his better career success. To assess the effect of the Ivies, it makes more sense to compare the student who got into a top school with the student who got into that same school but chose to go to a less selective one. Three years ago, the economists Alan Krueger and Stacy Dale published just such a study. And they found that when you compare apples and apples the income bonus from selective schools disappears.

  "As a hypothetical example, take the University of Pennsylvania and Penn State, which are two schools a lot of students choose between," Krueger said. "One is Ivy, one is a state school. Penn is much more highly selective. If you compare the students who go to those two schools, the ones who go to Penn have higher incomes. But let's look at those who got into both types of schools, some of whom chose Penn and some of whom chose Penn State. Within that set it doesn't seem to matter whether you go to the more selective school. Now, you would think that the more ambitious student is the one who would choose to go to Penn, and the ones choosing to go to Penn State might be a little less confident in their abilities or have a little lower family income, and both of those factors would point to people doing worse later on. But they don't."

  Krueger says that there is one exception to this. Students from the very lowest economic strata do seem to benefit from going to an Ivy. For most students, though, the general rule seems to be that if you are a hardworking and intelligent person you'll end up doing well regardless of where you went to school. You'll make good contacts at Penn. But Penn State is big enough and diverse enough that you can make good contacts there, too. Having Penn on your résumé opens doors. But if you were good enough to get into Penn you're good enough that those doors will open for you anyway. "I can see why families are really concerned about this," Krueger went on. "The average graduate from a top school is making nearly a hundred and twenty thousand dollars a year, the average graduate from a moderately selective school is making ninety thousand dollars. That's an enormous difference, and I can see why parents would fight to get their kids into the better school. But I think they are just assigning to the school a lot of what the student is bringing with him to the school."

  Bender was succeeded as the dean of admissions at Harvard by Fred Glimp, who, Karabel tells us, had a particular concern with academic underperformers. "Any class, no matter how able, will always have a bottom quarter," Glimp once wrote. "What are the effects of the psychology of feeling average, even in a very able group? Are there identifiable types with the psychological or what-not tolerance to be 'happy' or to make the most of education while in the bottom quarter?" Glimp thought it was critical that the students who populated the lower rungs of every Harvard class weren't so driven and ambitious that they would be disturbed by their status. "Thus the renowned (some would say notorious) Harvard admission practice known as the 'happy-bottom-quarter' policy was born," Karabel writes.

  It's unclear whether or not Glimp found any students who fit that particular description. (He wondered, in a marvellously honest moment, whether the answer was "Harvard sons.") But Glimp had the realism of the modelling scout. Glimp believed implicitly what Krueger and Dale later confirmed: that the character and performance of an academic class is determined, to a significant extent, at the point of admission; that if you want to graduate winners you have to admit winners; that if you want the bottom quarter of your class to succeed you have to find people capable of succeeding in the bottom quarter. Karabel is quite right, then, to see the events of the nineteen-twenties as the defining moment of the modern Ivy League. You are whom you admit in the élite-education business, and when Harvard changed whom it admitted, it changed Harvard. Was that change for the better or for the worse?

  4.

  In the wake of the Jewish crisis, Harvard, Yale, and Princeton chose to adopt what might be called the "best graduates" approach to admissions. France's École Normale Supérieure, Japan's University of Tokyo, and most of the world's other élite schools define their task as looking for the best students--that is, the applicants who will have the greatest academic success during their time in college. The Ivy League schools justified their emphasis on character and personality, however, by arguing that they were searching for the students who would have the greatest success after college. They were looking for leaders, and leadership, the officials of the Ivy League believed, was not a simple matter of academic brilliance. "Should our goal be to select a student body with the highest possible proportions of high-ranking students, or should it be to select, within a reasonably high range of academic ability, a student body with a certain variety of talents, qualities, attitudes, and backgrounds?" Wilbur Bender asked. To him, the answer was obvious. If you let in only the brilliant, then you produced bookworms and bench scientists: you ended up as socially irrelevant as the University of Chicago (an institution Harvard officials looked upon and shuddered). "Above a reasonably good level of mental ability, above that indicated by a 550-600 level of S.A.T. score," Bender went on, "the only thing that matters in terms of future impact on, or contribution to, society is the degree of personal inner force an individual has."

  It's easy to find fault with the best-graduates approach. We tend to think that intellectual achievement is the fairest and highest standard of merit. The Ivy League process, quite apart from its dubious origins, seems subjective and opaque. Why should personality and athletic ability matter so much? The notion that "the ability to throw, kick, or hit a ball is a legitimate criterion in determining who should be admitted to our greatest research universities," Karabel writes, is "a proposition that would be considered laughable in most of the world's countries." At the same time that Harvard was constructing its byzantine admissions system, Hunter College Elementary School, in New York, required simply that applicants take an exam, and if they scored in the top fifty they got in. It's hard to imagine a more objective and transparent procedure.

  But what did Hunter achieve with that best-students model? In the nineteen-eighties, a handful of educational researchers surveyed the students who attended the elementary school between 1948 and 1960. This was a group with an average I.Q. of 157--three and a half standard deviations above the mean--who had been given what, by any measure, was one of the finest classroom experiences in the world. As graduates, though, they weren't nearly as distinguished as they were expected to be. "Although most of our study participants are successful and fairly content with their lives and accomplishments," the authors conclude, "there are no superstars . . . and only one or two familiar names." The researchers spend a great deal of time trying to figure out why Hunter graduates are so disappointing, and end up sounding very much like Wilbur Bender. Being a smart child isn't a terribly good predictor of success in later life, they conclude. "Non-intellective" factors--like motivation and social skills--probably matter more. Perhaps, the study suggests, "after noting the sacrifices involved in trying for national or world-class leadership in a field, H.C.E.S. graduates decided that the intelligent thing to do was to choose relatively happy and successful lives." It is a wonderful thing, of course, for a school to turn out lots of relatively happy and successful graduates. But Harvard didn't want lots of relatively happy and successful graduates. It wanted superstars, and Bender and his colleagues recognized that if this is your goal a best-students model isn't enough.

  Most élite law schools, to cite another example, follow a best-students model. That's why they rely so heavily on the L.S.A.T. Yet there's no reason to believe that a person's L.S.A.T. scores have much relation to how good a lawyer he will be. In a recent research project funded by the Law School Admission Council, the Berkeley researchers Sheldon Zedeck and Marjorie Shultz identified twenty-six "competencies" that they think effective lawyering demands--among them practical judgment, passion and engagement, legal-research skills, questioning and interviewing skills, negotiation skills, stress management, and so on--and the L.S.A.T. picks up only a handful of them. A law school that wants to select the best possible lawyers has to use a very different admissions process from a law school that wants to select the best possible law students. And wouldn't we prefer that at least some law schools try to select good lawyers instead of good law students?

  This search for good lawyers, furthermore, is necessarily going to be subjective, because things like passion and engagement can't be measured as precisely as academic proficiency. Subjectivity in the admissions process is not just an occasion for discrimination; it is also, in better times, the only means available for giving us the social outcome we want. The first black captain of the Yale football team was a man named Levi Jackson, who graduated in 1950. Jackson was a hugely popular figure on campus. He went on to be a top executive at Ford, and is credited with persuading the company to hire thousands of African-Americans after the 1967 riots. When Jackson was tapped for the exclusive secret society Skull and Bones, he joked, "If my name had been reversed, I never would have made it." He had a point. The strategy of discretion that Yale had once used to exclude Jews was soon being used to include people like Levi Jackson.

  In the 2001 book "The Game of Life," James L. Shulman and William Bowen (a former president of Princeton) conducted an enormous statistical analysis on an issue that has become one of the most contentious in admissions: the special preferences given to recruited athletes at selective universities. Athletes, Shulman and Bowen demonstrate, have a large and growing advantage in admission over everyone else. At the same time, they have markedly lower G.P.A.s and S.A.T. scores than their peers. Over the past twenty years, their class rankings have steadily dropped, and they tend to segregate themselves in an "athletic culture" different from the culture of the rest of the college. Shulman and Bowen think the preference given to athletes by the Ivy League is shameful.

  Halfway through the book, however, Shulman and Bowen present a striking finding. Male athletes, despite their lower S.A.T. scores and grades, and despite the fact that many of them are members of minorities and come from lower socioeconomic backgrounds than other students, turn out to earn a lot more than their peers. Apparently, athletes are far more likely to go into the high-paying financial-services sector, where they succeed because of their personality and psychological makeup. In what can only be described as a textbook example of burying the lead, Bowen and Shulman write:

  One of these characteristics can be thought of as drive--a strong desire to succeed and unswerving determination to reach a goal, whether it be winning the next game or closing a sale. Similarly, athletes tend to be more energetic than the average person, which translates into an ability to work hard over long periods of time--to meet, for example, the workload demands placed on young people by an investment bank in the throes of analyzing a transaction. In addition, athletes are more likely than others to be highly competitive, gregarious and confident of their ability to work well in groups (on teams).

  Shulman and Bowen would like to argue that the attitudes of selective colleges toward athletes are a perversion of the ideals of American élite education, but that's because they misrepresent the actual ideals of American élite education. The Ivy League is perfectly happy to accept, among others, the kind of student who makes a lot of money after graduation. As the old saying goes, the definition of a well-rounded Yale graduate is someone who can roll all the way from New Haven to Wall Street.

  5.

  I once had a conversation with someone who worked for an advertising agency that represented one of the big luxury automobile brands. He said that he was worried that his client's new lower-priced line was being bought disproportionately by black women. He insisted that he did not mean this in a racist way. It was just a fact, he said. Black women would destroy the brand's cachet. It was his job to protect his client from the attentions of the socially undesirable.

  This is, in no small part, what Ivy League admissions directors do. They are in the luxury-brand-management business, and "The Chosen," in the end, is a testament to just how well the brand managers in Cambridge, New Haven, and Princeton have done their job in the past seventy-five years. In the nineteen-twenties, when Harvard tried to figure out how many Jews it had on campus, the admissions office scoured student records and assigned each suspected Jew the designation j1 (for someone who was "conclusively Jewish"), j2 (where the "preponderance of evidence" pointed to Jewishness), or j3 (where Jewishness was a "possibility"). In the branding world, this is called customer segmentation. In the Second World War, as Yale faced plummeting enrollment and revenues, it continued to turn down qualified Jewish applicants. As Karabel writes, "In the language of sociology, Yale judged its symbolic capital to be even more precious than its economic capital." No good brand manager would sacrifice reputation for short-term gain. The admissions directors at Harvard have always, similarly, been diligent about rewarding the children of graduates, or, as they are quaintly called, "legacies." In the 1985-92 period, for instance, Harvard admitted children of alumni at a rate more than twice that of non-athlete, non-legacy applicants, despite the fact that, on virtually every one of the school's magical ratings scales, legacies significantly lagged behind their peers. Karabel calls the practice "unmeritocratic at best and profoundly corrupt at worst," but rewarding customer loyalty is what luxury brands do. Harvard wants good graduates, and part of its definition of a good graduate is someone who is a generous and loyal alumnus. And if you want generous and loyal alumni you have to reward them. Aren't the tremendous resources provided to Harvard by its alumni part of the reason so many people want to go to Harvard in the first place? The endless battle over admissions in the United States proceeds on the assumption that some great moral principle is at stake in the matter of whom schools like Harvard choose to let in--that those who are denied admission by the whims of the admissions office have somehow been harmed. If you are sick and a hospital shuts its doors to you, you are harmed. But a selective school is not a hospital, and those it turns away are not sick. Élite schools, like any luxury brand, are an aesthetic experience--an exquisitely constructed fantasy of what it means to belong to an élite--and they have always been mindful of what must be done to maintain that experience.

  In the nineteen-eighties, when Harvard was accused of enforcing a secret quota on Asian admissions, its defense was that once you adjusted for the preferences given to the children of alumni and for the preferences given to athletes, Asians really weren't being discriminated against. But you could sense Harvard's exasperation that the issue was being raised at all. If Harvard had too many Asians, it wouldn't be Harvard, just as Harvard wouldn't be Harvard with too many Jews or pansies or parlor pinks or shy types or short people with big ears.

  How Rick Warren built his ministry.

  1.

  On the occasion of the twenty-fifth anniversary of Saddleback Church, Rick Warren hired the Anaheim Angels' baseball stadium. He wanted to address his entire congregation at once, and there was no way to fit everyone in at Saddleback, where the crowds are spread across services held over the course of an entire weekend. So Warren booked the stadium and printed large, silver-black-and-white tickets, and, on a sunny Sunday morning last April, the tens of thousands of congregants of one of America's largest churches began to file into the stands. They were wearing shorts and T-shirts and buying Cokes and hamburgers from the concession stands, if they had not already tailgated in the parking lot. On the field, a rock band played loudly and enthusiastically. Just after one o'clock, a voice came over the public-address system--"RIIIICK WARRRREN"--and Warren bounded onto the stage, wearing black slacks, a red linen guayabera shirt, and wraparound NASCAR sunglasses. The congregants leaped to their feet. "You know," Warren said, grabbing the microphone, "there are two things I've always wanted to do in a stadium." He turned his body sideways, playing an imaginary guitar, and belted out the first few lines of Jimi Hendrix's "Purple Haze." His image was up on the Jumbotrons in right and left fields, just below the Verizon and Pepsi and Budweiser logos. He stopped and grinned. "The other thing is, I want to do a wave!" He pointed to the bleachers, and then to the right-field seats, and around and around the stadium the congregation rose and fell, in four full circuits. "You are the most amazing church in America!" Warren shouted out, when they had finally finished. "AND I LOVE YOU!"

  2.

  Rick Warren is a large man, with a generous stomach. He has short, spiky hair and a goatee. He looks like an ex-athlete, or someone who might have many tattoos. He is a hugger, enfolding those he meets in his long arms and saying things like "Hey, man." According to Warren, from sixth grade through college there wasn't a day in his life that he wasn't president of something, and that makes sense, because he's always the one at the center of the room talking or laughing, with his head tilted way back, or crying, which he does freely. In the evangelical tradition, preachers are hard or soft. Billy Graham, with his piercing eyes and protruding chin and Bible clenched close to his chest, is hard. So was Martin Luther King, Jr., who overwhelmed his audience with his sonorous, forcefully enunciated cadences. Warren is soft. His sermons are conversational, delivered in a folksy, raspy voice. He talks about how he loves Krispy Kreme doughnuts, drives a four-year-old Ford, and favors loud Hawaiian shirts, even at the pulpit, because, he says, "they do not itch."

  In December of 1979, when Warren was twenty-five years old, he and his wife, Kay, took their four-month-old baby and drove in a U-Haul from Texas to Saddleback Valley, in Orange County, because Warren had read that it was one of the fastest-growing counties in the country. He walked into the first real-estate office he found and introduced himself to the first agent he saw, a man named Don Dale. He was looking for somewhere to live, he said.

  "Do you have any money to rent a house?" Dale asked.
"Not much, but we can borrow some," Warren replied.
"Do you have a job?"
"No. I don't have a job."
"What do you do for a living?"
"I'm a minister."
"So you have a church?"
"Not yet."

  Dale found him an apartment that very day, of course: Warren is one of those people whose lives have an irresistible forward momentum. In the car on the way over, he recruited Dale as the first member of his still nonexistent church. And when he held his first public service, three months later, he stood up in front of two hundred and five people he barely knew in a high-school gymnasium--this shiny-faced preacher fresh out of seminary--and told them that one day soon their new church would number twenty thousand people and occupy a campus of fifty acres. Today, Saddleback Church has twenty thousand members and occupies a campus of a hundred and twenty acres. Once, Warren wanted to increase the number of small groups at Saddleback--the groups of six or seven that meet for prayer and fellowship during the week--by three hundred. He went home and prayed and, as he tells it, God said to him that what he really needed to do was increase the number of small groups by three thousand, which is just what he did. Then, a few years ago, he wrote a book called "The Purpose-Driven Life," the kind of book known in the religious-publishing business as "Christian Living," which typically sells thirty or forty thousand copies a year. Warren's publishers came to see him at Saddleback, and sat on the long leather couch in his office, and talked about their ideas for the book. "You guys don't understand," Warren told them. "This is a hundred-million-copy book." Warren remembers stunned silence: "Their jaws dropped." But now, nearly three years after its publication, "The Purpose-Driven Life" has sold twenty-three million copies. It is among the best-selling nonfiction hardcover books in American history. Neither the New York Times nor the Los Angeles Times nor the Washington Post has reviewed it. Warren's own publisher didn't see it coming. Only Warren had faith. "The best of the evangelical tradition is that you don't plan your way forward--you prophesy your way forward," the theologian Leonard Sweet says. "Rick's prophesying his way forward."

  Not long after the Anaheim service, Warren went back to his office on the Saddleback campus. He put his feet up on the coffee table. On the wall in front of him were framed originals of the sermons of the nineteenth-century preacher Charles Spurgeon, and on the bookshelf next to him was his collection of hot sauces. "I had dinner with Jack Welch last Sunday night," he said. "He came to church, and we had dinner. I've been kind of mentoring him on his spiritual journey. And he said to me, 'Rick, you are the biggest thinker I have ever met in my life. The only other person I know who thinks globally like you is Rupert Murdoch.' And I said, 'That's interesting. I'm Rupert's pastor! Rupert published my book!'" Then he tilted back his head and gave one of those big Rick Warren laughs.

  3.

  Churches, like any large voluntary organization, have at their core a contradiction. In order to attract newcomers, they must have low barriers to entry. They must be unintimidating, friendly, and compatible with the culture they are a part of. In order to retain their membership, however, they need to have an identity distinct from that culture. They need to give their followers a sense of community--and community, exclusivity, a distinct identity are all, inevitably, casualties of growth. As an economist would say, the bigger an organization becomes, the greater a free-rider problem it has. If I go to a church with five hundred members, in a magnificent cathedral, with spectacular services and music, why should I volunteer or donate any substantial share of my money? What kind of peer pressure is there in a congregation that large? If the barriers to entry become too low--and the ties among members become increasingly tenuous--then a church as it grows bigger becomes weaker.

  One solution to the problem is simply not to grow, and, historically, churches have sacrificed size for community. But there is another approach: to create a church out of a network of lots of little church cells--exclusive, tightly knit groups of six or seven who meet in one another's homes during the week to worship and pray. The small group as an instrument of community is initially how Communism spread, and in the postwar years Alcoholics Anonymous and its twelve-step progeny perfected the small-group technique. The small group did not have a designated leader who stood at the front of the room. Members sat in a circle. The focus was on discussion and interaction--not one person teaching and the others listening--and the remarkable thing about these groups was their power. An alcoholic could lose his job and his family, he could be hospitalized, he could be warned by half a dozen doctors--and go on drinking. But put him in a room of his peers once a week--make him share the burdens of others and have his burdens shared by others--and he could do something that once seemed impossible.

  When churches--in particular, the megachurches that became the engine of the evangelical movement, in the nineteen-seventies and eighties--began to adopt the cellular model, they found out the same thing. The small group was an extraordinary vehicle of commitment. It was personal and flexible. It cost nothing. It was convenient, and every worshipper was able to find a small group that precisely matched his or her interests. Today, at least forty million Americans are in a religiously based small group, and the growing ranks of small-group membership have caused a profound shift in the nature of the American religious experience.

  "As I see it, one of the most unfortunate misunderstandings of our time has been to think of small intentional communities as groups 'within' the church," the philosopher Dick Westley writes in one of the many books celebrating the rise of small-group power. "When are we going to have the courage to publicly proclaim what everyone with any experience with small groups has known all along: they are not organizations 'within' the church; they are church."

  Ram Cnaan, a professor of social work at the University of Pennsylvania, recently estimated the replacement value of the charitable work done by the average American church--that is, the amount of money it would take to equal the time, money, and resources donated to the community by a typical congregation--and found that it came to about a hundred and forty thousand dollars a year. In the city of Philadelphia, for example, that works out to an annual total of two hundred and fifty million dollars' worth of community "good"; on a national scale, the contribution of religious groups to the public welfare is, as Cnaan puts it, "staggering." In the past twenty years, as the enthusiasm for publicly supported welfare has waned, churches have quietly and steadily stepped in to fill the gaps. And who are the churchgoers donating all that time and money? People in small groups. Membership in a small group is a better predictor of whether people volunteer or give money than how often they attend church, whether they pray, whether they've had a deep religious experience, or whether they were raised in a Christian home. Social action is not a consequence of belief, in other words. I don't give because I believe in religious charity. I give because I belong to a social structure that enforces an ethic of giving. "Small groups are networks," the Princeton sociologist Robert Wuthnow, who has studied the phenomenon closely, says. "They create bonds among people. Expose people to needs, provide opportunities for volunteering, and put people in harm's way of being asked to volunteer. That's not to say that being there for worship is not important. But, even in earlier research, I was finding that if people say all the right things about being a believer but aren't involved in some kind of physical social setting that generates interaction, they are just not as likely to volunteer."

  Rick Warren came to the Saddleback Valley just as the small-group movement was taking off. He was the son of a preacher--a man who started seven churches in and around Northern California and was enough of a carpenter to have built a few dozen more with his own hands--and he wanted to do what his father had done: start a church from scratch.

  For the first three months, he went from door to door in the neighborhood around his house, asking people why they didn't attend church. Churches were boring and irrelevant to everyday life, he was told. They were unfriendly to visitors. They were too interested in money. They had inadequate children's programs. So Warren decided that in his new church people would play and sing contemporary music, not hymns. (He could find no one, Warren likes to say, who listened to organ music in the car.) He would wear the casual clothes of his community. The sermons would be practical and funny and plainspoken, and he would use video and drama to illustrate his message. And when an actual church was finally built--Saddleback used seventy-nine different locations in its first thirteen years, from high-school auditoriums to movie theatres and then tents before building a permanent home--the church would not look churchy: no pews, or stained glass, or lofty spires. Saddleback looks like a college campus, and the main sanctuary looks like the school gymnasium. Parking is plentiful. The chairs are comfortable. There are loudspeakers and television screens everywhere broadcasting the worship service, and all the doors are open, so anyone can slip in or out, at any time, in the anonymity of the enormous crowds. Saddleback is a church with very low barriers to entry.

  But beneath the surface is a network of thousands of committed small groups. "Orange County is virtually a desert in social-capital terms," the Harvard political scientist Robert Putnam, who has taken a close look at the Saddleback success story, says. "The rate of mobility is really high. It has long and anonymous commutes. It's a very friendless place, and this church offers serious heavy friendship. It's a very interesting experience to talk to some of those groups. There were these eight people and they were all mountain bikers--mountain bikers for God. They go biking together, and they are one another's best friends. If one person's wife gets breast cancer, he can go to the others for support. If someone loses a job, the others are there for him. They are deeply best friends, in a larger social context where it is hard to find a best friend."

  Putnam goes on, "Warren didn't invent the cellular church. But he's brought it to an amazing level of effectiveness. The real job of running Saddleback is the recruitment and training and retention of the thousands of volunteer leaders for all the small groups it has. That's the surprising thing to me--that they are able to manage that. Those small groups are incredibly vulnerable, and complicated to manage. How to keep all those little dinghies moving in the same direction is, organizationally, a major accomplishment."

  At Saddleback, members are expected to tithe, and to volunteer. Sunday-school teachers receive special training and a police background check. Recently, Warren decided that Saddleback would feed every homeless person in Orange County three meals a day for forty days. Ninety-two hundred people volunteered. Two million pounds of food were collected, sorted, and distributed.

  It may be easy to start going to Saddleback. But it is not easy to stay at Saddleback. "Last Sunday, we took a special offering called Extend the Vision, for people to give over and above their normal offering," Warren said. "We decided we would not use any financial consultants, no high-powered gimmicks, no thermometer on the wall. It was just 'Folks, you know you need to give.' Sunday's offering was seven million dollars in cash and fifty-three million dollars in commitments. That's one Sunday. The average commitment was fifteen thousand dollars a family. That's in addition to their tithe. When people say megachurches are shallow, I say you have no idea. These people are committed."

  Warren's great talent is organizational. He's not a theological innovator. When he went from door to door, twenty-five years ago, he wasn't testing variants on the Christian message. As far as he was concerned, the content of his message was non-negotiable. Theologically, Warren is a straight-down-the-middle evangelical. What he wanted to learn was how to construct an effective religious institution. His interest was sociological. Putnam compares Warren to entrepreneurs like Ray Kroc and Sam Walton, pioneers not in what they sold but in how they sold. The contemporary thinker Warren cites most often in conversation is the management guru Peter Drucker, who has been a close friend of his for years. Before Warren wrote "The Purpose-Driven Life," he wrote a book called "The Purpose-Driven Church," which was essentially a how-to guide for church builders. He's run hundreds of training seminars around the world for ministers of small-to-medium-sized churches. At the beginning of the Internet boom, he created a Web site called pastors.com, on which he posted his sermons for sale for four dollars each. There were many pastors in the world, he reasoned, who were part time. They had a second, nine-to-five job and families of their own, and what little free time they had was spent ministering to their congregation. Why not help them out with Sunday morning? The Web site now gets nearly four hundred thousand hits a day.

  "I went to South Africa two years ago," Warren said. "We did the purpose-driven-church training, and we simulcast it to ninety thousand pastors across Africa. After it was over, I said, 'Take me out to a village and show me some churches.'"

  In the first village they went to, the local pastor came out, saw Warren, and said, "I know who you are. You're Pastor Rick."

  "And I said, 'How do you know who I am?' " Warren recalled. "He said, 'I get your sermons every week.' And I said, 'How? You don't even have electricity here.' And he said, 'We're putting the Internet in every post office in South Africa. Once a week, I walk an hour and a half down to the post office. I download it. Then I teach it. You are the only training I have ever received.'"

  A typical evangelist, of course, would tell stories about reaching ordinary people, the unsaved laity. But a typical evangelist is someone who goes from town to town, giving sermons to large crowds, or preaching to a broad audience on television. Warren has never pastored any congregation but Saddleback, and he refuses to preach on television, because that would put him in direct competition with the local pastors he has spent the past twenty years cultivating. In the argot of the New Economy, most evangelists follow a business-to-consumer model: b-to-c. Warren follows a business-to-business model: b-to-b. He reaches the people who reach people. He's a builder of religious networks. "I once heard Drucker say this," Warren said. "'Warren is not building a tent revival ministry, like the old-style evangelists. He's building an army, like the Jesuits.'"

  4.

  To write "The Purpose-Driven Life," Warren holed up in an office in a corner of the Saddleback campus, twelve hours a day for seven months. "I would get up at four-thirty, arrive at my special office at five, and I would write from five to five," he said. "I'm a people person, and it about killed me to be alone by my-self. By eleven-thirty, my A.D.D. would kick in. I would do anything not to be there. It was like birthing a baby." The book didn't tell any stories. It wasn't based on any groundbreaking new research or theory or theological insight. "I'm just not that good a writer," Warren said. "I'm a pastor. There's nothing new in this book. But sometimes as I was writing it I would break down in tears. I would be weeping, and I would feel like God was using me."

  The book begins with an inscription: "This book is dedicated to you. Before you were born, God planned this moment in your life. It is no accident that you are holding this book. God longs for you to discover the life he created you to live--here on earth, and forever in eternity." Five sections follow, each detailing one of God's purposes in our lives--"You Were Planned for God's Pleasure"; "You Were Formed for God's Family"; "You Were Created to Become Like Christ"; "You Were Shaped for Serving God"; "You Were Made for a Mission"--and each of the sections, in turn, is divided into short chapters ("Understanding Your Shape" or "Using What God Gave You" or "How Real Servants Act"). The writing is simple and unadorned. The scriptural interpretation is literal: "Noah had never seen rain, because prior to the Flood, God irrigated the earth from the ground up." The religious vision is uncomplicated and accepting: "God wants to be your best friend." Warren's Christianity, like his church, has low barriers to entry: "Wherever you are reading this, I invite you to bow your head and quietly whisper the prayer that will change your eternity. Jesus, I believe in you and I receive you. Go ahead. If you sincerely meant that prayer, congratulations! Welcome to the family of God! You are now ready to discover and start living God's purpose for your life."

  It is tempting to interpret the book's message as a kind of New Age self-help theology. Warren's God is not awesome or angry and does not stand in judgment of human sin. He's genial and mellow. "Warren's God 'wants to be your best friend,' and this means, in turn, that God's most daunting property, the exercise of eternal judgment, is strategically downsized," the critic Chris Lehmann writes, echoing a common complaint:

  "When Warren turns his utility-minded feel-speak upon the symbolic iconography of the faith, the results are offensively bathetic: "When Jesus stretched his arms wide on the cross, he was saying, 'I love you this much.' " But God needs to be at a greater remove than a group hug."

  The self-help genre, however, is fundamentally inward-focussed. M. Scott Peck's "The Road Less Traveled"--the only spiritual work that, in terms of sales, can even come close to "The Purpose-Driven Life"--begins with the sentence "Life is difficult." That's a self-help book: it focusses the reader on his own experience. Warren's first sentence, by contrast, is "It's not about you," which puts it in the spirit of traditional Christian devotional literature, which focusses the reader outward, toward God. In look and feel, in fact, "The Purpose-Driven Life" is less twenty-first-century Orange County than it is the nineteenth century of Warren's hero, the English evangelist Charles Spurgeon. Spurgeon was the Warren of his day: the pastor of a large church in London, and the author of best-selling devotional books. On Sunday, good Christians could go and hear Spurgeon preach at the Metropolitan Tabernacle. But during the week they needed something to replace the preacher, and so Spurgeon, in one of his best-known books, "Morning and Evening," wrote seven hundred and thirty-two short homilies, to be read in the morning and the evening of each day of the year. The homilies are not complex investigations of theology. They are opportunities for spiritual reflection. (Sample Spurgeonism: "Every child of God is where God has placed him for some purpose, and the practical use of this first point is to lead you to inquire for what practical purpose has God placed each one of you where you now are." Sound familiar?) The Oxford Times described one of Spurgeon's books as "a rich store of topics treated daintily, with broad humour, with quaint good sense, yet always with a subdued tone and high moral aim," and that describes "The Purpose-Driven Life" as well. It's a spiritual companion. And, like "Morning and Evening," it is less a book than a program. It's divided into forty chapters, to be read during "Forty Days of Purpose." The first page of the book is called "My Covenant." It reads, "With God's help, I commit the next 40 days of my life to discovering God's purpose for my life."

  Warren departs from Spurgeon, though, in his emphasis on the purpose-driven life as a collective experience. Below the boxed covenant is a space for not one signature but three: "Your name," "Partner's name," and then Rick Warren's signature, already printed, followed by a quotation from Ecclesiastes 4:9:

  "Two are better off than one, because together they can work more effectively. If one of them falls down, the other can help him up. . . . Two people can resist an attack that would defeat one person alone. A rope made of three cords is hard to break."

  "The Purpose-Driven Life" is meant to be read in groups. If the vision of faith sometimes seems skimpy, that's because the book is supposed to be supplemented by a layer of discussion and reflection and debate. It is a testament to Warren's intuitive understanding of how small groups work that this is precisely how "The Purpose-Driven Life" has been used. It spread along the network that he has spent his career putting together, not from person to person but from group to group. It presold five hundred thousand copies. It averaged more than half a million copies in sales a month in its first two years, which is possible only when a book is being bought in lots of fifty or a hundred or two hundred. Of those who bought the book as individuals, nearly half have bought more than one copy, sixteen per cent have bought four to six copies, and seven per cent have bought ten or more. Twenty-five thousand churches have now participated in the congregation-wide "40 Days of Purpose" campaign, as have hundreds of small groups within companies and organizations, from the N.B.A. to the United States Postal Service.

  "I remember the first time I met Rick," says Scott Bolinder, the head of Zondervan, the Christian publishing division of HarperCollins and the publisher of "The Purpose-Driven Life." "He was telling me about pastors.com. This is during the height of the dot-com boom. I was thinking, What's your angle? He had no angle. He said, 'I love pastors. I know what they go through.' I said, 'What do you put on there?' He said, 'I put my sermons with a little disclaimer on there: "You are welcome to preach it any way you can. I only ask one thing--I ask that you do it better than I did."' So then fast-forward seven years: he's got hundreds of thousands of pastors who come to this Web site. And he goes, 'By the way, my church and I are getting ready to do forty days of purpose. If you want to join us, I'm going to preach through this and put my sermons up. And I've arranged with my publisher that if you do join us with this campaign they will sell the book to you for a low price.' That became the tipping point--being able to launch that book with eleven hundred churches, right from the get-go. They became the evangelists for the book."

  The book's high-water mark came earlier this year, when a fugitive named Brian Nichols, who had shot and killed four people in an Atlanta courthouse, accosted a young single mother, Ashley Smith, outside her apartment, and held her captive in her home for seven hours.

  "I asked him if I could read," Smith said at the press conference after her ordeal was over, and so she went and got her copy of "The Purpose-Driven Life" and turned to the chapter she was reading that day. It was Chapter 33, "How Real Servants Act." It begins:

  "We serve God by serving others.

  The world defines greatness in terms of power, possessions, prestige, and position. If you can demand service from others, you've arrived. In our self-serving culture with its me-first mentality, acting like a servant is not a popular concept.

  Jesus, however, measured greatness in terms of service, not status. God determines your greatness by how many people you serve, not how many people serve you."

  Nichols listened and said, "Stop. Will you read it again?"

  Smith read it to him again. They talked throughout the night. She made him pancakes. "I said, 'Do you believe in miracles? Because if you don't believe in miracles -- you are here for a reason. You're here in my apartment for some reason.' " She might as well have been quoting from "The Purpose-Driven Life." She went on, "You don't think you're supposed to be sitting here right in front of me listening to me tell you, you know, your reason for being here?" When morning came, Nichols let her go.

  Hollywood could not have scripted a better testimonial for "The Purpose-Driven Life." Warren's sales soared further. But the real lesson of that improbable story is that it wasn't improbable at all. What are the odds that a young Christian--a woman who, it turns out, sends her daughter to Hebron Church, in Dacula, Georgia--isn't reading "The Purpose-Driven Life"? And is it surprising that Ashley Smith would feel compelled to read aloud from the book to her captor, and that, in the discussion that followed, Nichols would come to some larger perspective on his situation? She and Nichols were in a small group, and reading aloud from "The Purpose-Driven Life" is what small groups do.

  5.

  Not long ago, the sociologist Christian Smith decided to find out what American evangelicals mean when they say that they believe in a "Christian America." The phrase seems to suggest that evangelicals intend to erode the separation of church and state. But when Smith asked a representative sample of evangelicals to explain the meaning of the phrase, the most frequent explanation was that America was founded by people who sought religious liberty and worked to establish religious freedom. The second most frequent explanation offered was that a majority of Americans of earlier generations were sincere Christians, which, as Smith points out, is empirically true. Others said what they meant by a Christian nation was that the basic laws of American government reflected Christian principles--which sounds potentially theocratic, except that when Smith asked his respondents to specify what they meant by basic laws they came up with representative government and the balance of powers.

  "In other words," Smith writes, "the belief that America was once a Christian nation does not necessarily mean a commitment to making it a 'Christian' nation today, whatever that might mean. Some evangelicals do make this connection explicitly. But many discuss America's Christian heritage as a simple fact of history that they are not particularly interested in or optimistic about reclaiming. Further, some evangelicals think America never was a Christian nation; some think it still is; and others think it should not be a Christian nation, whether or not it was so in the past or is now."

  As Smith explored one issue after another with the evangelicals--gender equality, education, pluralism, and politics--he found the same scattershot pattern. The Republican Party may have been adept at winning the support of evangelical voters, but that affinity appears to be as much cultural as anything; the Party has learned to speak the evangelical language. Scratch the surface, and the appearance of homogeneity and ideological consistency disappears. Evangelicals want children to have the right to pray in school, for example, and they vote for conservative Republicans who support that right. But what do they mean by prayer? The New Testament's most left-liberal text, the Lord's Prayer--which, it should be pointed out, begins with a call for utopian social restructuring ("Thy will be done, On earth as it is in Heaven"), then welfare relief ("Give us this day our daily bread"), and then income redistribution ("Forgive us our debts as we also have forgiven our debtors"). The evangelical movement isn't a movement, if you take movements to be characterized by a coherent philosophy, and that's hardly surprising when you think of the role that small groups have come to play in the evangelical religious experience. The answers that Smith got to his questions are the kind of answers you would expect from people who think most deeply about their faith and its implications on Tuesday night, or Wednesday, with five or six of their closest friends, and not Sunday morning, in the controlling hands of a pastor.

  "Small groups cultivate spirituality, but it is a particular kind of spirituality," Robert Wuthnow writes. "They cannot be expected to nurture faith in the same way that years of theological study, meditation and reflection might." He says, "They provide ways of putting faith in practice. For the most part, their focus is on practical applications, not on abstract knowledge, or even on ideas for the sake of ideas themselves."

  We are so accustomed to judging a social movement by its ideological coherence that the vagueness at the heart of evangelicalism sounds like a shortcoming. Peter Drucker calls Warren's network an army, like the Jesuits. But the Jesuits marched in lockstep and held to an all-encompassing and centrally controlled creed. The members of Warren's network don't all dress the same, and they march to the tune only of their own small group, and they agree, fundamentally, only on who the enemy is. It's not an army. It's an insurgency.

  In the wake of the extraordinary success of "The Purpose-Driven Life," Warren says, he underwent a period of soul-searching. He had suddenly been given enormous wealth and influence and he did not know what he was supposed to do with it. "God led me to Psalm 72, which is Solomon's prayer for more influence," Warren says. "It sounds pretty selfish. Solomon is already the wisest and wealthiest man in the world. He's the King of Israel at the apex of its glory. And in that psalm he says, 'God, I want you to make me more powerful and influential.' It looks selfish until he says, 'So that the King may support the widow and orphan, care for the poor, defend the defenseless, speak up for the immigrant, the foreigner, be a friend to those in prison.' Out of that psalm, God said to me that the purpose of influence is to speak up for those who have no influence. That changed my life. I had to repent. I said, I'm sorry, widows and orphans have not been on my radar. I live in Orange County. I live in the Saddleback Valley, which is all gated communities. There aren't any homeless people around. They are thirteen miles away, in Santa Ana, not here." He gestured toward the rolling green hills outside. "I started reading through Scripture. I said, How did I miss the two thousand verses on the poor in the Bible? So I said, I will use whatever affluence and influence that you give me to help those who are marginalized."

  He and his wife, Kay, decided to reverse tithe, giving away ninety per cent of the tens of millions of dollars they earned from "The Purpose-Driven Life." They sat down with gay community leaders to talk about fighting AIDS. Warren has made repeated trips to Africa. He has sent out volunteers to forty-seven countries around the world, test-piloting experiments in microfinance and H.I.V. prevention and medical education. He decided to take the same networks he had built to train pastors and spread the purpose-driven life and put them to work on social problems.

  "There is only one thing big enough to handle the world's problems, and that is the millions and millions of churches spread out around the world," he says. "I can take you to thousands of villages where they don't have a school. They don't have a grocery store, don't have a fire department. But they have a church. They have a pastor. They have volunteers. The problem today is distribution. In the tsunami, millions of dollars of foodstuffs piled up on the shores and people couldn't get it into the places that needed it, because they didn't have a network. Well, the biggest distribution network in the world is local churches. There are millions of them, far more than all the franchises in the world. Put together, they could be a force for good."

  That is, in one sense, a typical Warren pronouncement--bold to the point of audacity, like telling his publisher that his book will sell a hundred million copies. In another sense, it is profoundly modest. When Warren's nineteenth-century evangelical predecessors took on the fight against slavery, they brought to bear every legal, political, and economic lever they could get their hands on. But that was a different time, and that was a different church. Today's evangelicalism is a network, and networks, for better or worse, are informal and personal.

  At the Anaheim stadium service, Warren laid out his plan for attacking poverty and disease. He didn't talk about governments, though, or the United Nations, or structures, or laws. He talked about the pastors he had met in his travels around the world. He brought out the President of Rwanda, who stood up at the microphone--a short, slender man in an immaculate black suit--and spoke in halting English about how Warren was helping him rebuild his country. When he was finished, the crowd erupted in applause, and Rick Warren walked across the stage and enfolded him in his long arms.

  Project Delta aims to create the perfect cookie.

  1.

  Steve Gundrum launched Project Delta at a small dinner last fall at Il Fornaio, in Burlingame, just down the road from the San Francisco Airport. It wasn't the first time he'd been to Il Fornaio, and he made his selection quickly, with just a glance at the menu; he is the sort of person who might have thought about his choice in advance -- maybe even that morning, while shaving. He would have posed it to himself as a question -- Ravioli alla Lucana?--and turned it over in his mind, assembling and disassembling the dish, ingredient by ingredient, as if it were a model airplane. Did the Pecorino pepato really belong? What if you dropped the basil? What would the ravioli taste like if you froze it, along with the ricotta and the Parmesan, and tried to sell it in the supermarket? And then what would you do about the fennel?

  Gundrum is short and round. He has dark hair and a mustache and speaks with the flattened vowels of the upper Midwest. He is voluble and excitable and doggedly unpretentious, to the point that your best chance of seeing him in a suit is probably Halloween. He runs Mattson, one of the country's foremost food research-and-development firms, which is situated in a low-slung concrete-and-glass building in a nondescript office park in Silicon Valley. Gundrum's office is a spare, windowless room near the rear, and all day long white-coated technicians come to him with prototypes in little bowls, or on skewers, or in Tupperware containers. His job is to taste and advise, and the most common words out of his mouth are "I have an idea." Just that afternoon, Gundrum had ruled on the reformulation of a popular spinach dip (which had an unfortunate tendency to smell like lawn clippings) and examined the latest iteration of a low-carb kettle corn for evidence of rhythmic munching (the metronomic hand-to-mouth cycle that lies at the heart of any successful snack experience). Mattson created the shelf-stable Mrs. Fields Chocolate Chip Cookie, the new Boca Burger products for Kraft Foods, Orville Redenbacher's Butter Toffee Popcorn Clusters, and so many other products that it is impossible to walk down the aisle of a supermarket and not be surrounded by evidence of the company's handiwork.

  That evening, Gundrum had invited two of his senior colleagues at Mattson -- Samson Hsia and Carol Borba -- to dinner, along with Steven Addis, who runs a prominent branding firm in the Bay Area. They sat around an oblong table off to one side of the dining room, with the sun streaming in the window, and Gundrum informed them that he intended to reinvent the cookie, to make something both nutritious and as "indulgent" as the premium cookies on the supermarket shelf. "We want to delight people," he said. "We don't want some ultra-high-nutrition power bar, where you have to rationalize your consumption." He said it again: "We want to delight people."

  As everyone at the table knew, a healthful, good-tasting cookie is something of a contradiction. A cookie represents the combination of three unhealthful ingredients--sugar, white flour, and shortening. The sugar adds sweetness, bulk, and texture: along with baking powder, it produces the tiny cell structures that make baked goods light and fluffy. The fat helps carry the flavor. If you want a big hit of vanilla, or that chocolate taste that really blooms in the nasal cavities, you need fat. It also keeps the strands of gluten in the flour from getting too tightly bound together, so that the cookie stays chewable. The flour, of course, gives the batter its structure, and, with the sugar, provides the base for the browning reaction that occurs during baking. You could replace the standard white flour with wheat flour, which is higher in fibre, but fibre adds grittiness. Over the years, there have been many attempts to resolve these contradictions -- from Snackwells and diet Oreos to the dry, grainy hockey pucks that pass for cookies in health-food stores -- but in every case flavor or fluffiness or tenderness has been compromised. Steve Gundrum was undeterred. He told his colleagues that he wanted Project Delta to create the world's greatest cookie. He wanted to do it in six months. He wanted to enlist the biggest players in the American food industry. And how would he come up with this wonder cookie? The old-fashioned way. He wanted to hold a bakeoff.

  2.

  The standard protocol for inventing something in the food industry is called the matrix model. There is a department for product development, which comes up with a new idea, and a department for process development, which figures out how to realize it, and then, down the line, departments for packing, quality assurance, regulatory affairs, chemistry, microbiology, and so on. In a conventional bakeoff, Gundrum would have pitted three identical matrixes against one another and compared the results. But he wasn't satisfied with the unexamined assumption behind the conventional bakeoff -- that there was just one way of inventing something new.

  Gundrum had a particular interest, as it happened, in software. He had read widely about it, and once, when he ran into Steve Jobs at an Apple store in the Valley, chatted with him for forty-five minutes on technical matters relating to the Apple operating system. He saw little difference between what he did for a living and what the software engineers in the surrounding hills of Silicon Valley did. "Lines of code are no different from a recipe," he explains. "It's the same thing. You add a little salt, and it tastes better. You write a little piece of code, and it makes the software work faster." But in the software world, Gundrum knew, there were ongoing debates about the best way to come up with new code.

  On the one hand, there was the "open source" movement. Its patron saint was Linus Torvalds, the Finnish hacker who decided to build a free version of Unix, the hugely complicated operating system that runs many of the world's large computers. Torvalds created the basic implementation of his version, which he called Linux, posted it online, and invited people to contribute to its development. Over the years, thousands of programmers had helped, and Linux was now considered as good as proprietary versions of Unix. "Given enough eyeballs, all bugs are shallow" was the Linux mantra: a thousand people working for an hour each can do a better job writing and fixing code than a single person working for a thousand hours, because the chances are that among those thousand people you can find precisely the right expert for every problem that comes up.

  On the other hand, there was the "extreme programming" movement, known as XP, which was led by a legendary programmer named Kent Beck. He called for breaking a problem into the smallest possible increments, and proceeding as simply and modestly as possible. He thought that programmers should work in pairs, two to a computer, passing the keyboard back and forth. Between Beck and Torvald were countless other people, arguing for slightly different variations. But everyone in the software world agreed that trying to get people to be as creative as possible was, as often as not, a social problem: it depended not just on who was on the team but on how the team was organized.

  "I remember once I was working with a printing company in Chicago," Beck says. "The people there were having a terrible problem with their technology. I got there, and I saw that the senior people had these corner offices, and they were working separately and doing things separately that they had trouble integrating later on. So I said, 'Find a space where you can work together.' So they found a corner of the machine room. It was a raised floor, ice cold. They just loved it. They would go there five hours a day, making lots of progress. I flew home. They hired me for my technical expertise. And I told them to rearrange the office furniture, and that was the most valuable thing I could offer them."

  It seemed to Gundrum that people in the food world had a great deal to learn from all this. They had become adept at solving what he called "science projects" -- problems that required straightforward, linear applications of expensive German machinery and armies of white-coated people with advanced degrees in engineering. Cool Whip was a good example: a product processed so exquisitely -- with air bubbles of such fantastic uniformity and stability -- that it remains structurally sound for months, at high elevation and at low elevation, frozen and thawed and then refrozen. But coming up with a healthy cookie, which required finessing the inherent contradictions posed by sugar, flour, and shortening, was the kind of problem that the food industry had more trouble with. Gundrum recalled one brainstorming session that a client of his, a major food company, had convened. "This is no joke," he said. "They played a tape where it sounded like the wind was blowing and the birds were chirping. And they posed us out on a dance floor, and we had to hold our arms out like we were trees and close our eyes, and the ideas were supposed to grow like fruits off the limbs of the trees. Next to me was the head of R. & D., and he looked at me and said: 'What the hell are we doing here?'"

  For Project Delta, Gundrum decreed that there would be three teams, each representing a different methodology of invention. He had read Kent Beck's writings, and decided that the first would be the XP team. He enlisted two of Mattson's brightest young associates -- Peter Dea and Dan Howell. Dea is a food scientist, who worked as a confectioner before coming to Mattson. He is tall and spare, with short dark hair. "Peter is really good at hitting the high note," Gundrum said. "If a product needs to have a particular flavor profile, he's really good at getting that one dimension and getting it right." Howell is a culinarian -- goateed and talkative, a man of enthusiasms who uses high-end Mattson equipment to make an exceptional cup of espresso every afternoon. He started his career as a barista at Starbucks, and then realized that his vocation lay elsewhere. "A customer said to me, 'What do you want to be doing? Because you clearly don't want to be here,'" Howell said. "I told him, 'I want to be sitting in a room working on a better non-fat pudding.' "

  The second team was headed by Barb Stuckey, an executive vice-president of marketing at Mattson and one of the firm's stars. She is slender and sleek, with short blond hair. She tends to think out loud, and, because she thinks quickly, she ends up talking quickly, too -- in nervous, brilliant bursts. Stuckey, Gundrum decided, would represent "managed" research and development--a traditional hierarchical team, as opposed to a partnership like Dea and Howell's. She would work with Doug Berg, who runs one of Mattson's product-development teams. Stuckey would draw the big picture. Berg would serve as sounding board and project director. His team would execute their conceptions.

  Then Gundrum was at a technology conference in California and heard the software pioneer Mitch Kapor talking about the open-source revolution. Afterward, Gundrum approached Kapor. "I said to Mitch, 'What do you think? Can I apply this--some of the same principles--outside of software and bring it to the food industry?'" Gundrum recounted. "He stopped and said, 'Why the hell not!'" So Gundrum invited an élite group of food-industry bakers and scientists to collaborate online. They would be the third team. He signed up a senior person from Mars, Inc., someone from R. & D. at Kraft, the marketing manager for Nestlé Toll House refrigerated/frozen cookie dough, a senior director of R. & D. at Birds Eye Foods, the head of the innovation program for Kellogg's Morning Foods, the director of seasoning at McCormick, a cookie maven formerly at Keebler, and six more high-level specialists. Mattson's innovation manager, Carol Borba, who began her career as a line cook at Bouley, in Manhattan, was given the role of project manager. Two Mattson staffers were assigned to carry out the group's recommendations. This was the Dream Team. It is quite possible that this was the most talented group of people ever to work together in the history of the food industry.

  Soon after the launch of Project Delta, Steve Gundrum and his colleague Samson Hsia were standing around, talking about the current products in the supermarket that they particularly admired. "I like the Uncrustable line from Smuckers," Hsia said. "It's a frozen sandwich without any crust. It eats very well. You can put it in a lunchbox frozen, and it will be unfrozen by lunchtime." Hsia is a trim, silver-haired man who is said to know as much about emulsions as anyone in the business. "There's something else," he said, suddenly. "We just saw it last week. It's made by Jennie-O. It's turkey in a bag." This was a turkey that was seasoned, plumped with brine, and sold in a heat-resistant plastic bag: the customer simply has to place it in the oven. Hsia began to stride toward the Mattson kitchens, because he realized they actually had a Jennie-O turkey in the back. Gundrum followed, the two men weaving their way through the maze of corridors that make up the Mattson offices. They came to a large freezer. Gundrum pulled out a bright-colored bag. Inside was a second, clear bag, and inside that bag was a twelve-pound turkey. "This is one of my favorite innovations of the last year," Gundrum said, as Hsia nodded happily. "There is material science involved. There is food science involved. There is positioning involved. You can take this thing, throw it in your oven, and people will be blown away. It's that good. If I was Butterball, I'd be terrified."

  Jennie-O had taken something old and made it new. But where had that idea come from? Was it a team? A committee? A lone turkey genius? Those of us whose only interaction with such innovations is at the point of sale have a naïve faith in human creativity; we suppose that a world capable of coming up with turkey in a bag is capable of coming up with the next big thing as well--a healthy cookie, a faster computer chip, an automobile engine that gets a hundred miles to the gallon. But if you're the one responsible for those bright new ideas there is no such certainty. You come up with one great idea, and the process is so miraculous that all you do is puzzle over how on earth you ever did it, and worry whether you'll ever be able to do it again.

  3.

  The Mattson kitchens are a series of large, connecting rooms, running along the back of the building. There is a pilot plant in one corner -- containing a mini version of the equipment that, say, Heinz would use to make canned soup, a soft-serve ice-cream machine, an industrial-strength pasta-maker, a colloid mill for making oil-and-water emulsions, a flash pasteurizer, and an eighty-five-thousand-dollar Japanese-made coextruder for, among other things, pastry-and-filling combinations. At any given time, the firm may have as many as fifty or sixty projects under way, so the kitchens are a hive of activity, with pressure cookers filled with baked beans bubbling in one corner, and someone rushing from one room to another carrying a tray of pizza slices with experimental toppings.

  Dea and Howell, the XP team, took over part of one of the kitchens, setting up at a long stainless-steel lab bench. The countertop was crowded with tins of flour, a big white plastic container of wheat dextrin, a dozen bottles of liquid sweeteners, two plastic bottles of Kirkland olive oil, and, somewhat puzzlingly, three varieties of single-malt Scotch. The Project Delta brief was simple. All cookies had to have fewer than a hundred and thirty calories per serving. Carbohydrates had to be under 17.5 grams, saturated fat under two grams, fibre more than one gram, protein more than two grams, and so on; in other words, the cookie was to be at least fifteen per cent superior to the supermarket average in the major nutritional categories. To Dea and Howell, that suggested oatmeal, and crispy, as opposed to soft. "I've tried lots of cookies that are sold as soft and I never like them, because they're trying to be something that they're not," Dea explained. "A soft cookie is a fresh cookie, and what you are trying to do with soft is be a fresh cookie that's a month old. And that means you need to fake the freshness, to engineer the cookie."

  The two decided to focus on a kind of oatmeal-chocolate-chip hybrid, with liberal applications of roasted soy nuts, toffee, and caramel. A straight oatmeal-raisin cookie or a straight low-cal chocolate-chip cookie was out of the question. This was a reflection of what might be called the Hidden Valley Ranch principle, in honor of a story that Samson Hsia often told about his years working on salad dressing when he was at Clorox. The couple who owned Hidden Valley Ranch, near Santa Barbara, had come up with a seasoning blend of salt, pepper, onion, garlic, and parsley flakes that was mixed with equal parts mayonnaise and buttermilk to make what was, by all accounts, an extraordinary dressing. Clorox tried to bottle it, but found that the buttermilk could not coexist, over any period of time, with the mayonnaise. The way to fix the problem, and preserve the texture, was to make the combination more acidic. But when you increased the acidity you ruined the flavor. Clorox's food engineers worked on Hidden Valley Ranch dressing for close to a decade. They tried different kinds of processing and stability control and endless cycles of consumer testing before they gave up and simply came out with a high-acid Hidden Valley Ranch dressing -- which promptly became a runaway best-seller. Why? Because consumers had never tasted real Hidden Valley Ranch dressing, and as a result had no way of knowing that what they were eating was inferior to the original. For those in the food business, the lesson was unforgettable: if something was new, it didn't have to be perfect. And, since healthful, indulgent cookies couldn't be perfect, they had to be new: hence oatmeal, chocolate chips, toffee, and caramel.

  Cookie development, at the Mattson level, is a matter of endless iteration, and Dea and Howell began by baking version after version in quick succession -- establishing the cookie size, the optimal baking time, the desired variety of chocolate chips, the cut of oats (bulk oats? rolled oats? groats?), the varieties of flour, and the toffee dosage, while testing a variety of high-tech supplements, notably inulin, a fibre source derived from chicory root. As they worked, they made notes on tablet P.C.s, which gave them a running electronic record of each version. "With food, there's a large circle of pretty good, and we're solidly in pretty good," Dea announced, after several intensive days of baking. A tray of cookies was cooling in front of him on the counter. "Typically, that's when you take it to the customers."

  In this case, the customer was Gundrum, and the next week Howell marched over to Gundrum's office with two Ziploc bags of cookies in his hand. There was a package of Chips Ahoy! on the table, and Howell took one out. "We've been eating these versus Chips Ahoy!," he said.

  The two cookies looked remarkably alike. Gundrum tried one of each. "The Chips Ahoy!, it's tasty," he said. "When you eat it, the starch hydrates in your mouth. The XP doesn't have that same granulated-sugar kind of mouth feel."

  "It's got more fat than us, though, and subsequently it's shorter in texture," Howell said. "And so, when you break it, it breaks more nicely. Ours is a little harder to break."

  By "shorter in texture," he meant that the cookie "popped" when you bit into it. Saturated fats are solid fats, and give a cookie crispness. Parmesan cheese is short-textured. Brie is long. A shortbread like a Lorna Doone is a classic short-textured cookie. But the XP cookie had, for health reasons, substituted unsaturated fats for saturated fats, and unsaturated fats are liquid. They make the dough stickier, and inevitably compromise a little of that satisfying pop.

  "The whole-wheat flour makes us a little grittier, too," Howell went on. "It has larger particulates." He broke open one of the Chips Ahoy!. "See how fine the grain is? Now look at one of our cookies. The particulates are larger. It is part of what we lose by going with a healthy profile. If it was just sugar and ¦our, for instance, the carbohydrate chains are going to be shorter, and so they will dissolve more quickly in your mouth. Whereas with more fibre you get longer carbohydrate chains and they don't dissolve as quickly, and you get that slightly tooth-packing feel."

  "It looks very wholesome, like something you would want to feed your kids," Gundrum said, finally. They were still only in the realm of pretty good.

  4.

  Team Stuckey, meanwhile, was having problems of its own. Barb Stuckey's first thought had been a tea cookie, or, more specifically, a chai cookie -- something with cardamom and cinnamon and vanilla and cloves and a soft dairy note. Doug Berg was dispatched to run the experiment. He and his team did three or four rounds of prototypes. The result was a cookie that tasted, astonishingly, like a cup of chai, which was, of course, its problem. Who wanted a cookie that tasted like a cup of chai? Stuckey called a meeting in the Mattson trophy room, where samples of every Mattson product that has made it to market are displayed. After everyone was done tasting the cookies, a bag of them sat in the middle of the table for forty-five minutes--and no one reached to take a second bite. It was a bad sign.

  "You know, before the election Good Housekeeping had this cookie bakeoff," Stuckey said, as the meeting ended. "Laura Bush's entry was full of chocolate chips and had familiar ingredients. And Teresa Heinz went with pumpkin-spice cookies. I remember thinking, That's just like the Democrats! So not mainstream! I wanted her to win. But she's chosen this cookie that's funky and weird and out of the box. And I kind of feel the same way about the tea cookie. It's too far out, and will lose to something that's more comfortable for consumers."

  Stuckey's next thought involved strawberries and a shortbread base. But shortbread was virtually impossible under the nutritional guidelines: there was no way to get that smooth butter-flour-sugar combination. So Team Stuckey switched to something closer to a strawberry-cobbler cookie, which had the Hidden Valley Ranch advantage that no one knew what a strawberry-cobbler cookie was supposed to taste like. Getting the carbohydrates down to the required 17.5 grams, though, was a struggle, because of how much flour and fruit cobbler requires. The obvious choice to replace the flour was almonds. But nuts have high levels of both saturated and unsaturated fat. "It became a balancing act," Anne Cristofano, who was doing the bench work for Team Stuckey, said. She baked batch after batch, playing the carbohydrates (first the flour, and then granulated sugar, and finally various kinds of what are called sugar alcohols, low-calorie sweeteners derived from hydrogenating starch) against the almonds. Cristofano took a version to Stuckey. It didn't go well.

  "We're not getting enough strawberry impact from the fruit alone," Stuckey said. "We have to find some way to boost the strawberry." She nibbled some more. "And, because of the low fat and all that stuff, I don't feel like we're getting that pop."

  The Dream Team, by any measure, was the overwhelming Project Delta favorite. This was, after all, the Dream Team, and if any idea is ingrained in our thinking it is that the best way to solve a difficult problem is to bring the maximum amount of expertise to bear on it. Sure enough, in the early going the Dream Team was on fire. The members of the Dream Team did not doggedly fix on a single idea, like Dea and Howell, or move in fits and starts from chai sugar cookies to strawberry shortbread to strawberry cobbler, like Team Stuckey. It came up with thirty-four ideas, representing an astonishing range of cookie philosophies: a chocolate cookie with gourmet cocoa, high-end chocolate chips, pecans, raisins, Irish steel-cut oats, and the new Ultragrain White Whole Wheat flour; a bite-size oatmeal cookie with a Ceylon cinnamon filling, or chili and tamarind, or pieces of dried peaches with a cinnamon-and-ginger dusting; the classic seven-layer bar with oatmeal instead of graham crackers, coated in chocolate with a choice of coffee flavors; a "wellness" cookie, with an oatmeal base, soy and whey proteins, inulin and oat beta glucan and a combination of erythritol and sugar and sterol esters--and so on.

  In the course of spewing out all those new ideas, however, the Dream Team took a difficult turn. A man named J. Hugh McEvoy (a.k.a. Chef J.), out of Chicago, tried to take control of the discussion. He wanted something exotic -- not a health-food version of something already out there. But in the e-mail discussions with others on the team his sense of what constituted exotic began to get really exotic -- "Chinese star anise plus fennel plus Pastis plus dark chocolate." Others, emboldened by his example, began talking about a possible role for zucchini or wasabi peas. Meanwhile, a more conservative faction, mindful of the Project Delta mandate to appeal to the whole family, started talking up peanut butter. Within a few days, the tensions were obvious:

  From: Chef J.

  Subject: <no subject>

  Please keep in mind that less than 10 years ago, espresso, latte and dulce de leche were EXOTIC flavors / products that were considered unsuitable for the mainstream. And let's not even mention CHIPOTLE.

  From: Andy Smith

  Subject: Bought any Ben and Jerry's recently?

  While we may not want to invent another Oreo or Chips Ahoy!, last I looked, World's Best Vanilla was B&J's # 2 selling flavor and Haagen Dazs' Vanilla (their top seller) outsold Dulce 3 to 1.

  From: Chef J.

  Subject: <no subject>

  Yes. Gourmet Vanilla does outsell any new flavor. But we must remember that DIET vanilla does not and never has. It is the high end, gourmet segment of ice cream that is growing. Diet Oreos were vastly outsold by new entries like Snackwells. Diet Snickers were vastly outsold by new entries like balance bars. New Coke failed miserably, while Red Bull is still growing.

  What flavor IS Red Bull, anyway?

  Eventually, Carol Borba, the Dream Team project leader, asked Gundrum whether she should try to calm things down. He told her no; the group had to find its "own kind of natural rhythm." He wanted to know what fifteen high-powered bakers thrown together on a project felt like, and the answer was that they felt like chaos. They took twice as long as the XP team. They created ten times the headache.

  Worse, no one in the open-source group seemed to be having any fun. "Quite honestly, I was expecting a bit more involvement in this," Howard Plein, of Edlong Dairy Flavors, confessed afterward. "They said, expect to spend half an hour a day. But without doing actual bench work -- all we were asked to do was to come up with ideas." He wanted to bake: he didn't enjoy being one of fifteen cogs in a machine. To Dan Fletcher, of Kellogg's, "the whole thing spun in place for a long time. I got frustrated with that. The number of people involved seemed unwieldy. You want some diversity of youth and experience, but you want to keep it close-knit as well. You get some depth in the process versus breadth. We were a mile wide and an inch deep." Chef J., meanwhile, felt thwarted by Carol Borba; he felt that she was pushing her favorite, a caramel turtle, to the detriment of better ideas. "We had the best people in the country involved," he says. "We were irrelevant. That's the weakness of it. Fifteen is too many. How much true input can any one person have when you are lost in the crowd?" In the end, the Dream Team whittled down its thirty-four possibilities to one: a chewy oatmeal cookie, with a pecan "thumbprint" in the middle, and ribbons of caramel-and-chocolate glaze. When Gundrum tasted it, he had nothing but praise for its "cookie hedonics." But a number of the team members were plainly unhappy with the choice. "It is not bad," Chef J. said. "But not bad doesn't win in the food business. There was nothing there that you couldn't walk into a supermarket and see on the shelf. Any Pepperidge Farm product is better than that. Any one."

  It may have been a fine cookie. But, since no single person played a central role in its creation, it didn't seem to anyone to be a fine cookie.

  The strength of the Dream Team -- the fact that it had so many smart people on it -- was also its weakness: it had too many smart people on it. Size provides expertise. But it also creates friction, and one of the truths Project Delta exposed is that we tend to overestimate the importance of expertise and underestimate the problem of friction. Gary Klein, a decision-making consultant, once examined this issue in depth at a nuclear power plant in North Carolina. In the nineteen-nineties, the power supply used to keep the reactor cool malfunctioned. The plant had to shut down in a hurry, and the shutdown went badly. So the managers brought in Klein's consulting group to observe as they ran through one of the crisis rehearsals mandated by federal regulators. "The drill lasted four hours," David Klinger, the lead consultant on the project, recalled. "It was in this big operations room, and there were between eighty and eighty-five people involved. We roamed around, and we set up a video camera, because we wanted to make sense of what was happening."

  When the consultants asked people what was going on, though, they couldn't get any satisfactory answers. "Each person only knew a little piece of the puzzle, like the radiation person knew where the radiation was, or the maintenance person would say, 'I'm trying to get this valve closed,' " Klinger said. "No one had the big picture. We started to ask questions. We said, 'What is your mission?' And if the person didn't have one, we said, 'Get out.' There were just too many people. We ended up getting that team down from eighty-five to thirty-five people, and the first thing that happened was that the noise in the room was dramatically reduced." The room was quiet and calm enough so that people could easily find those they needed to talk to. "At the very end, they had a big drill that the N.R.C. was going to regulate. The regulators said it was one of their hardest drills. And you know what? They aced it." Was the plant's management team smarter with thirty-five people on it than it was with eighty-five? Of course not, but the expertise of those additional fifty people was more than cancelled out by the extra confusion and noise they created.

  The open-source movement has had the same problem. The number of people involved can result in enormous friction. The software theorist Joel Spolsky points out that open-source software tends to have user interfaces that are difficult for ordinary people to use: "With Microsoft Windows, you right-click on a folder, and you're given the option to share that folder over the Web. To do the same thing with Apache, the open-source Web server, you've got to track down a file that has a different name and is stored in a different place on every system. Then you have to edit it, and it has its own syntax and its own little programming language, and there are lots of different comments, and you edit it the first time and it doesn't work and then you edit it the second time and it doesn't work."

  Because there are so many individual voices involved in an open-source project, no one can agree on the right way to do things. And, because no one can agree, every possible option is built into the software, thereby frustrating the central goal of good design, which is, after all, to understand what to leave out. Spolsky notes that almost all the successful open-source products have been attempts to clone some preexisting software program, like Microsoft's Internet Explorer, or Unix. "One of the reasons open source works well for Linux is that there isn't any real design work to be undertaken," he says. "They were doing what we would call chasing tail-lights." Open source was great for a science project, in which the goals were clearly defined and the technical hurdles easily identifiable. Had Project Delta been a Cool Whip bakeoff, an exercise in chasing tail-lights, the Dream Team would easily have won. But if you want to design a truly innovative software program -- or a truly innovative cookie -- the costs of bigness can become overwhelming.

  In the frantic final weeks before the bakeoff, while the Dream Team was trying to fix a problem with crumbling, and hit on the idea of glazing the pecan on the face of the cookie, Dea and Howell continued to make steady, incremental improvements.

  "These cookies were baked five days ago," Howell told Gundrum, as he handed him a Ziploc bag. Dea was off somewhere in the Midwest, meeting with clients, and Howell looked apprehensive, stroking his goatee nervously as he stood by Gundrum's desk. "We used wheat dextrin, which I think gives us some crispiness advantages and some shelf-stability advantages. We have a little more vanilla in this round, which gives you that brown, rounding background note."

  Gundrum nodded. "The vanilla is almost like a surrogate for sugar," he said. "It potentiates the sweetness."

  "Last time, the leavening system was baking soda and baking powder," Howell went on. "I switched that to baking soda and monocalcium phosphate. That helps them rise a little bit better. And we baked them at a slightly higher temperature for slightly longer, so that we drove off a little bit more moisture."

  "How close are you?" Gundrum asked.

  "Very close," Howell replied.

  Gundrum was lost in thought for a moment. "It looks very wholesome. It looks like something you'd want to feed your kids. It has very good aroma. I really like the texture. My guess is that it eats very well with milk." He turned back to Howell, suddenly solicitous. "Do you want some milk?"

  Meanwhile, Barb Stuckey had a revelation. She was working on a tortilla-chip project, and had bags of tortilla chips all over her desk. "You have no idea how much engineering goes into those things," she said, holding up a tortilla chip. "It's greater than what it takes to build a bridge. It's crazy." And one of the clever things about cheese tortilla chips--particularly the low-fat versions--is how they go about distracting the palate. "You know how you put a chip in your mouth and the minute it hits your tongue it explodes with flavor?" Stuckey said. "It's because it's got this topical seasoning. It's got dried cheese powders and sugar and probably M.S.G. and all that other stuff on the outside of the chip."

  Her idea was to apply that technique to strawberry cobbler--to take large crystals of sugar, plate them with citric acid, and dust the cookies with them. "The minute they reach your tongue, you get this sweet-and-sour hit, and then you crunch into the cookie and get the rest--the strawberry and the oats," she said. The crystals threw off your taste buds. You weren't focussed on the fact that there was half as much fat in the cookie as there should be. Plus, the citric acid brought a tangy flavor to the dried strawberries: suddenly they felt fresh.

  Batches of the new strawberry-cobbler prototype were ordered up, with different formulations of the citric acid and the crystals. A meeting was called in the trophy room. Anne Cristofano brought two plastic bags filled with cookies. Stuckey was there, as was a senior Mattson food technologist named Karen Smithson, an outsider brought to the meeting in an advisory role. Smithson, a former pastry chef, was a little older than Stuckey and Cristofano, with an air of self-possession. She broke the seal on the first bag, and took a bite with her eyes half closed. The other two watched intently.

  "Umm," Smithson said, after the briefest of pauses. "That is pretty darn good. And this is one of the healthy cookies? I would not say, 'This is healthy.' I can't taste the trade-off." She looked up at Stuckey. "How old are they?"

  "Today," Stuckey replied.

  "O.K. . . ." This was a complicating fact. Any cookie tastes good on the day it's baked. The question was how it tasted after baking and packaging and shipping and sitting in a warehouse and on a supermarket shelf and finally in someone's cupboard.

  "What we're trying to do here is a shelf-stable cookie that will last six months," Stuckey said. "I think we're better off if we can make it crispy."

  Smithson thought for a moment. "You can have either a crispy, low-moisture cookie or a soft and chewy cookie," she said. "But you can't get the outside crisp and the inside chewy. We know that. The moisture will migrate. It will equilibrate over time, so you end up with a cookie that's consistent all the way through. Remember we did all that work on Mrs. Fields? That's what we learned."

  They talked for a bit, in technical terms, about various kinds of sugars and starches. Smithson didn't think that the stability issue was going to be a problem.

  "Isn't it compelling, visually?" Stuckey blurted out, after a lull in the conversation. And it was: the dried-strawberry chunks broke though the surface of the cookie, and the tiny citric-sugar crystals glinted in the light. "I just think you get so much more bang for the buck when you put the seasoning on the outside."

  "Yet it's not weird," Smithson said, nodding. She picked up another cookie. "The mouth feel is a combination of chewy and crunchy. With the flavors, you have the caramelized sugar, the brown-sugar notes. You have a little bit of a chew from the oats. You have a flavor from the strawberry, and it helps to have a combination of the sugar alcohol and the brown sugar. You know, sugars have different deliveries, and sometimes you get some of the sweetness right off and some of it continues on. You notice that a lot with the artificial sweeteners. You get the sweetness that doesn't go away, long after the other flavors are gone. With this one, the sweetness is nice. The flavors come together at the same time and fade at the same time, and then you have the little bright after-hits from the fruit and the citric crunchies, which are" -- she paused, looking for the right word -- "brilliant."

  5.

  The bakeoff took place in April. Mattson selected a representative sample of nearly three hundred households from around the country. Each was mailed bubble-wrapped packages containing all three entrants. The vote was close but unequivocal. Fourteen per cent of the households voted for the XP oatmeal-chocolate-chip cookie. Forty-one per cent voted for the Dream Team's oatmeal-caramel cookie. Forty-four per cent voted for Team Stuckey's strawberry cobbler.

  The Project Delta postmortem was held at Chaya Brasserie, a French-Asian fusion restaurant on the Embarcadero, in San Francisco. It was just Gundrum and Steven Addis, from the first Project Delta dinner, and their wives. Dan Howell was immersed in a confidential project for a big food conglomerate back East. Peter Dea was working with Cargill on a wellness product. Carol Borba was in Chicago, at a meeting of the Food Marketing Institute. Barb Stuckey was helping Ringling Brothers rethink the food at its concessions. "We've learned a lot about the circus," Gundrum said. Meanwhile, Addis's firm had created a logo and a brand name for Project Delta. Mattson has offered to license the winning cookie at no cost, as long as a percentage of its sales goes to a charitable foundation that Mattson has set up to feed the hungry. Someday soon, you should be able to go into a supermarket and buy Team Stuckey's strawberry-cobbler cookie.

  "Which one would you have voted for?" Addis asked Gundrum.

  "I have to say, they were all good in their own way," Gundrum replied. It was like asking a mother which of her children she liked best. "I thought Barb's cookie was a little too sweet, and I wish the open-source cookie was a little tighter, less crumbly. With XP, I think we would have done better, but we had a wardrobe malfunction. They used too much batter, overbaked it, and the cookie came out too hard and thick.

  "In the end, it was not so much which cookie won that interested him. It was who won--and why. Three people from his own shop had beaten a Dream Team, and the decisive edge had come not from the collective wisdom of a large group but from one person's ability to make a lateral connection between two previously unconnected objects -- a tortilla chip and a cookie. Was that just Barb being Barb? In large part, yes. But it was hard to believe that one of the Dream Team members would not have made the same kind of leap had they been in an environment quiet enough to allow them to think.

  "Do you know what else we learned?" Gundrum said. He was talking about a questionnaire given to the voters. "We were looking at the open-ended questions -- where all the families who voted could tell us what they were thinking. They all said the same thing -- all of them." His eyes grew wide. "They wanted better granola bars and breakfast bars. I would not have expected that." He fell silent for a moment, turning a granola bar over and around in his mind, assembling and disassembling it piece by piece, as if it were a model airplane. "I thought that they were pretty good," he said. "I mean, there are so many of them out there. But apparently people want them better."

  The bad idea behind our failed health-care system.

  1.

  Tooth decay begins, typically, when debris becomes trapped between the teeth and along the ridges and in the grooves of the molars. The food rots. It becomes colonized with bacteria. The bacteria feeds off sugars in the mouth and forms an acid that begins to eat away at the enamel of the teeth. Slowly, the bacteria works its way through to the dentin, the inner structure, and from there the cavity begins to blossom three-dimensionally, spreading inward and sideways. When the decay reaches the pulp tissue, the blood vessels, and the nerves that serve the tooth, the pain starts--an insistent throbbing. The tooth turns brown. It begins to lose its hard structure, to the point where a dentist can reach into a cavity with a hand instrument and scoop out the decay. At the base of the tooth, the bacteria mineralizes into tartar, which begins to irritate the gums. They become puffy and bright red and start to recede, leaving more and more of the tooth's root exposed. When the infection works its way down to the bone, the structure holding the tooth in begins to collapse altogether.

  Several years ago, two Harvard researchers, Susan Starr Sered and Rushika Fernandopulle, set out to interview people without health-care coverage for a book they were writing, "Uninsured in America." They talked to as many kinds of people as they could find, collecting stories of untreated depression and struggling single mothers and chronically injured laborers--and the most common complaint they heard was about teeth. Gina, a hairdresser in Idaho, whose husband worked as a freight manager at a chain store, had "a peculiar mannerism of keeping her mouth closed even when speaking." It turned out that she hadn't been able to afford dental care for three years, and one of her front teeth was rotting. Daniel, a construction worker, pulled out his bad teeth with pliers. Then, there was Loretta, who worked nights at a university research center in Mississippi, and was missing most of her teeth. "They'll break off after a while, and then you just grab a hold of them, and they work their way out," she explained to Sered and Fernandopulle. "It hurts so bad, because the tooth aches. Then it's a relief just to get it out of there. The hole closes up itself anyway. So it's so much better."

  People without health insurance have bad teeth because, if you're paying for everything out of your own pocket, going to the dentist for a checkup seems like a luxury. It isn't, of course. The loss of teeth makes eating fresh fruits and vegetables difficult, and a diet heavy in soft, processed foods exacerbates more serious health problems, like diabetes. The pain of tooth decay leads many people to use alcohol as a salve. And those struggling to get ahead in the job market quickly find that the unsightliness of bad teeth, and the self-consciousness that results, can become a major barrier. If your teeth are bad, you're not going to get a job as a receptionist, say, or a cashier. You're going to be put in the back somewhere, far from the public eye. What Loretta, Gina, and Daniel understand, the two authors tell us, is that bad teeth have come to be seen as a marker of "poor parenting, low educational achievement and slow or faulty intellectual development." They are an outward marker of caste. "Almost every time we asked interviewees what their first priority would be if the president established universal health coverage tomorrow," Sered and Fernandopulle write, "the immediate answer was 'my teeth.' "

  The U.S. health-care system, according to "Uninsured in America," has created a group of people who increasingly look different from others and suffer in ways that others do not. The leading cause of personal bankruptcy in the United States is unpaid medical bills. Half of the uninsured owe money to hospitals, and a third are being pursued by collection agencies. Children without health insurance are less likely to receive medical attention for serious injuries, for recurrent ear infections, or for asthma. Lung-cancer patients without insurance are less likely to receive surgery, chemotherapy, or radiation treatment. Heart-attack victims without health insurance are less likely to receive angioplasty. People with pneumonia who don't have health insurance are less likely to receive X rays or consultations. The death rate in any given year for someone without health insurance is twenty-five per cent higher than for someone with insurance. Because the uninsured are sicker than the rest of us, they can't get better jobs, and because they can't get better jobs they can't afford health insurance, and because they can't afford health insurance they get even sicker. John, the manager of a bar in Idaho, tells Sered and Fernandopulle that as a result of various workplace injuries over the years he takes eight ibuprofen, waits two hours, then takes eight more--and tries to cadge as much prescription pain medication as he can from friends. "There are times when I should've gone to the doctor, but I couldn't afford to go because I don't have insurance," he says. "Like when my back messed up, I should've gone. If I had insurance, I would've went, because I know I could get treatment, but when you can't afford it you don't go. Because the harder the hole you get into in terms of bills, then you'll never get out. So you just say, 'I can deal with the pain.' "

  2.

  One of the great mysteries of political life in the United States is why Americans are so devoted to their health-care system. Six times in the past century--during the First World War, during the Depression, during the Truman and Johnson Administrations, in the Senate in the nineteen-seventies, and during the Clinton years--efforts have been made to introduce some kind of universal health insurance, and each time the efforts have been rejected. Instead, the United States has opted for a makeshift system of increasing complexity and dysfunction. Americans spend $5,267 per capita on health care every year, almost two and a half times the industrialized world's median of $2,193; the extra spending comes to hundreds of billions of dollars a year. What does that extra spending buy us? Americans have fewer doctors per capita than most Western countries. We go to the doctor less than people in other Western countries. We get admitted to the hospital less frequently than people in other Western countries. We are less satisfied with our health care than our counterparts in other countries. American life expectancy is lower than the Western average. Childhood-immunization rates in the United States are lower than average. Infant-mortality rates are in the nineteenth percentile of industrialized nations. Doctors here perform more high-end medical procedures, such as coronary angioplasties, than in other countries, but most of the wealthier Western countries have more CT scanners than the United States does, and Switzerland, Japan, Austria, and Finland all have more MRI machines per capita. Nor is our system more efficient. The United States spends more than a thousand dollars per capita per year--or close to four hundred billion dollars--on health-care-related paperwork and administration, whereas Canada, for example, spends only about three hundred dollars per capita. And, of course, every other country in the industrialized world insures all its citizens; despite those extra hundreds of billions of dollars we spend each year, we leave forty-five million people without any insurance. A country that displays an almost ruthless commitment to efficiency and performance in every aspect of its economy--a country that switched to Japanese cars the moment they were more reliable, and to Chinese T-shirts the moment they were five cents cheaper--has loyally stuck with a health-care system that leaves its citizenry pulling out their teeth with pliers.

  America's health-care mess is, in part, simply an accident of history. The fact that there have been six attempts at universal health coverage in the last century suggests that there has long been support for the idea. But politics has always got in the way. In both Europe and the United States, for example, the push for health insurance was led, in large part, by organized labor. But in Europe the unions worked through the political system, fighting for coverage for all citizens. From the start, health insurance in Europe was public and universal, and that created powerful political support for any attempt to expand benefits. In the United States, by contrast, the unions worked through the collective-bargaining system and, as a result, could win health benefits only for their own members. Health insurance here has always been private and selective, and every attempt to expand benefits has resulted in a paralyzing political battle over who would be added to insurance rolls and who ought to pay for those additions.

  Policy is driven by more than politics, however. It is equally driven by ideas, and in the past few decades a particular idea has taken hold among prominent American economists which has also been a powerful impediment to the expansion of health insurance. The idea is known as "moral hazard." Health economists in other Western nations do not share this obsession. Nor do most Americans. But moral hazard has profoundly shaped the way think tanks formulate policy and the way experts argue and the way health insurers structure their plans and the way legislation and regulations have been written. The health-care mess isn't merely the unintentional result of political dysfunction, in other words. It is also the deliberate consequence of the way in which American policymakers have come to think about insurance.

  "Moral hazard" is the term economists use to describe the fact that insurance can change the behavior of the person being insured.=If your office gives you and your co-workers all the free Pepsi you want--if your employer, in effect, offers universal Pepsi insurance--you'll drink more Pepsi than you would have otherwise.=If you have a no-deductible fire-insurance policy, you may be a little less diligent in clearing the brush away from your house.=The savings-and-loan crisis of the nineteen-eighties was created, in large part, by the fact that the federal government insured savings deposits of up to a hundred thousand dollars, and so the newly deregulated S. & L.s made far riskier investments than they would have otherwise.=Insurance can have the paradoxical effect of producing risky and wasteful behavior.=Economists spend a great deal of time thinking about such moral hazard for good reason.=Insurance is an attempt to make human life safer and more secure.=But, if those efforts can backfire and produce riskier behavior, providing insurance becomes a much more complicated and problematic endeavor.

  In 1968, the economist Mark Pauly argued that moral hazard played an enormous role in medicine, and, as John Nyman writes in his book "The Theory of the Demand for Health Insurance," Pauly's paper has become the "single most influential article in the health economics literature." Nyman, an economist at the University of Minnesota, says that the fear of moral hazard lies behind the thicket of co-payments and deductibles and utilization reviews which characterizes the American health-insurance system. Fear of moral hazard, Nyman writes, also explains "the general lack of enthusiasm by U.S. health economists for the expansion of health insurance coverage (for example, national health insurance or expanded Medicare benefits) in the U.S."

  What Nyman is saying is that when your insurance company requires that you make a twenty-dollar co-payment for a visit to the doctor, or when your plan includes an annual five-hundred-dollar or thousand-dollar deductible, it's not simply an attempt to get you to pick up a larger share of your health costs. It is an attempt to make your use of the health-care system more efficient. Making you responsible for a share of the costs, the argument runs, will reduce moral hazard: you'll no longer grab one of those free Pepsis when you aren't really thirsty. That's also why Nyman says that the notion of moral hazard is behind the "lack of enthusiasm" for expansion of health insurance. If you think of insurance as producing wasteful consumption of medical services, then the fact that there are forty-five million Americans without health insurance is no longer an immediate cause for alarm. After all, it's not as if the uninsured never go to the doctor. They spend, on average, $934 a year on medical care. A moral-hazard theorist would say that they go to the doctor when they really have to. Those of us with private insurance, by contrast, consume $2,347 worth of health care a year. If a lot of that extra $1,413 is waste, then maybe the uninsured person is the truly efficient consumer of health care.

  The moral-hazard argument makes sense, however, only if we consume health care in the same way that we consume other consumer goods, and to economists like Nyman this assumption is plainly absurd. We go to the doctor grudgingly, only because we're sick. "Moral hazard is overblown," the Princeton economist Uwe Reinhardt says. "You always hear that the demand for health care is unlimited. This is just not true. People who are very well insured, who are very rich, do you see them check into the hospital because it's free? Do people really like to go to the doctor? Do they check into the hospital instead of playing golf?"

  For that matter, when you have to pay for your own health care, does your consumption really become more efficient? In the late nineteen-seventies, the RAND Corporation did an extensive study on the question, randomly assigning families to health plans with co-payment levels at zero per cent, twenty-five per cent, fifty per cent, or ninety-five per cent, up to six thousand dollars. As you might expect, the more that people were asked to chip in for their health care the less care they used. The problem was that they cut back equally on both frivolous care and useful care. Poor people in the high-deductible group with hypertension, for instance, didn't do nearly as good a job of controlling their blood pressure as those in other groups, resulting in a ten-per-cent increase in the likelihood of death. As a recent Commonwealth Fund study concluded, cost sharing is "a blunt instrument." Of course it is: how should the average consumer be expected to know beforehand what care is frivolous and what care is useful? I just went to the dermatologist to get moles checked for skin cancer. If I had had to pay a hundred per cent, or even fifty per cent, of the cost of the visit, I might not have gone. Would that have been a wise decision? I have no idea. But if one of those moles really is cancerous, that simple, inexpensive visit could save the health-care system tens of thousands of dollars (not to mention saving me a great deal of heartbreak). The focus on moral hazard suggests that the changes we make in our behavior when we have insurance are nearly always wasteful. Yet, when it comes to health care, many of the things we do only because we have insurance--like getting our moles checked, or getting our teeth cleaned regularly, or getting a mammogram or engaging in other routine preventive care--are anything but wasteful and inefficient. In fact, they are behaviors that could end up saving the health-care system a good deal of money.

  Sered and Fernandopulle tell the story of Steve, a factory worker from northern Idaho, with a "grotesque-looking left hand--what looks like a bone sticks out the side." When he was younger, he broke his hand. "The doctor wanted to operate on it," he recalls. "And because I didn't have insurance, well, I was like 'I ain't gonna have it operated on.' The doctor said, 'Well, I can wrap it for you with an Ace bandage.' I said, 'Ahh, let's do that, then.' " Steve uses less health care than he would if he had insurance, but that's not because he has defeated the scourge of moral hazard. It's because instead of getting a broken bone fixed he put a bandage on it.

  3.

  At the center of the Bush Administration's plan to address the health-insurance mess are Health Savings Accounts, and Health Savings Accounts are exactly what you would come up with if you were concerned, above all else, with minimizing moral hazard. The logic behind them was laid out in the 2004 Economic Report of the President. Americans, the report argues, have too much health insurance: typical plans cover things that they shouldn't, creating the problem of overconsumption. Several paragraphs are then devoted to explaining the theory of moral hazard. The report turns to the subject of the uninsured, concluding that they fall into several groups. Some are foreigners who may be covered by their countries of origin. Some are people who could be covered by Medicaid but aren't, or aren't admitting that they are. Finally, a large number "remain uninsured as a matter of choice." The report continues, "Researchers believe that as many as one-quarter of those without health insurance had coverage available through an employer but declined the coverage. . . . Still others may remain uninsured because they are young and healthy and do not see the need for insurance." In other words, those with health insurance are overinsured and their behavior is distorted by moral hazard. Those without health insurance use their own money to make decisions about insurance based on an assessment of their needs. The insured are wasteful. The uninsured are prudent. So what's the solution? Make the insured a little bit more like the uninsured.

  Under the Health Savings Accounts system, consumers are asked to pay for routine health care with their own money--several thousand dollars of which can be put into a tax-free account. To handle their catastrophic expenses, they then purchase a basic health-insurance package with, say, a thousand-dollar annual deductible. As President Bush explained recently, "Health Savings Accounts all aim at empowering people to make decisions for themselves, owning their own health-care plan, and at the same time bringing some demand control into the cost of health care."

  The country described in the President's report is a very different place from the country described in "Uninsured in America." Sered and Fernandopulle look at the billions we spend on medical care and wonder why Americans have so little insurance. The President's report considers the same situation and worries that we have too much. Sered and Fernandopulle see the lack of insurance as a problem of poverty; a third of the uninsured, after all, have incomes below the federal poverty line. In the section on the uninsured in the President's report, the word "poverty" is never used. In the Administration's view, people are offered insurance but "decline the coverage" as "a matter of choice." The uninsured in Sered and Fernandopulle's book decline coverage, but only because they can't afford it. Gina, for instance, works for a beauty salon that offers her a bare-bones health-insurance plan with a thousand-dollar deductible for two hundred dollars a month. What's her total income? Nine hundred dollars a month. She could "choose" to accept health insurance, but only if she chose to stop buying food or paying the rent.

  The biggest difference between the two accounts, though, has to do with how each views the function of insurance. Gina, Steve, and Loretta are ill, and need insurance to cover the costs of getting better. In their eyes, insurance is meant to help equalize financial risk between the healthy and the sick. In the insurance business, this model of coverage is known as "social insurance," and historically it was the way health coverage was conceived. If you were sixty and had heart disease and diabetes, you didn't pay substantially more for coverage than a perfectly healthy twenty-five-year-old. Under social insurance, the twenty-five-year-old agrees to pay thousands of dollars in premiums even though he didn't go to the doctor at all in the previous year, because he wants to make sure that someone else will subsidize his health care if he ever comes down with heart disease or diabetes. Canada and Germany and Japan and all the other industrialized nations with universal health care follow the social-insurance model. Medicare, too, is based on the social-insurance model, and, when Americans with Medicare report themselves to be happier with virtually every aspect of their insurance coverage than people with private insurance (as they do, repeatedly and overwhelmingly), they are referring to the social aspect of their insurance. They aren't getting better care. But they are getting something just as valuable: the security of being insulated against the financial shock of serious illness.

  There is another way to organize insurance, however, and that is to make it actuarial. Car insurance, for instance, is actuarial. How much you pay is in large part a function of your individual situation and history: someone who drives a sports car and has received twenty speeding tickets in the past two years pays a much higher annual premium than a soccer mom with a minivan. In recent years, the private insurance industry in the United States has been moving toward the actuarial model, with profound consequences. The triumph of the actuarial model over the social-insurance model is the reason that companies unlucky enough to employ older, high-cost employees--like United Airlines--have run into such financial difficulty. It's the reason that automakers are increasingly moving their operations to Canada. It's the reason that small businesses that have one or two employees with serious illnesses suddenly face unmanageably high health-insurance premiums, and it's the reason that, in many states, people suffering from a potentially high-cost medical condition can't get anyone to insure them at all.

  Health Savings Accounts represent the final, irrevocable step in the actuarial direction. If you are preoccupied with moral hazard, then you want people to pay for care with their own money, and, when you do that, the sick inevitably end up paying more than the healthy. And when you make people choose an insurance plan that fits their individual needs, those with significant medical problems will choose expensive health plans that cover lots of things, while those with few health problems will choose cheaper, bare-bones plans. The more expensive the comprehensive plans become, and the less expensive the bare-bones plans become, the more the very sick will cluster together at one end of the insurance spectrum, and the more the well will cluster together at the low-cost end. The days when the healthy twenty-five-year-old subsidizes the sixty-year-old with heart disease or diabetes are coming to an end. "The main effect of putting more of it on the consumer is to reduce the social redistributive element of insurance," the Stanford economist Victor Fuchs says. Health Savings Accounts are not a variant of universal health care. In their governing assumptions, they are the antithesis of universal health care.

  The issue about what to do with the health-care system is sometimes presented as a technical argument about the merits of one kind of coverage over another or as an ideological argument about socialized versus private medicine. It is, instead, about a few very simple questions. Do you think that this kind of redistribution of risk is a good idea? Do you think that people whose genes predispose them to depression or cancer, or whose poverty complicates asthma or diabetes, or who get hit by a drunk driver, or who have to keep their mouths closed because their teeth are rotting ought to bear a greater share of the costs of their health care than those of us who are lucky enough to escape such misfortunes? In the rest of the industrialized world, it is assumed that the more equally and widely the burdens of illness are shared, the better off the population as a whole is likely to be. The reason the United States has forty-five million people without coverage is that its health-care policy is in the hands of people who disagree, and who regard health insurance not as the solution but as the problem.

  1.

  As a writer I wear two hats. I am a staff writer for the New Yorker magazine, where I have been under contract more or less continuously since 1996. I also do public speaking, based on my second career as the author of two books--The Tipping Point and Blink. Over the past four or five years, I have given talks to corporations, trade associations, conventions of one sort or another, colleges, think tanks, charitable groups, public lecture series and, on one occasion (arranged by my mother), my old high school. For some of those engagements, I have been paid. For those given to academic and charitable organizations, I generally have not.

  Most of the time, these two hats complement each other. It was because of my work as a New Yorker writer that I was able to get a contract to write my books. It is because of my books that I have gotten speaking engagements, and it is, in part, because of my books and my speaking that some people have discovered me in the New Yorker. But I recognize that there is also the possibility that these two roles can come into conflict, as is always the case when someone has to serve two different constituencies. This note is an attempt to talk about that possibility, and to think through its implications. I will warn you that what follows is quite long. It is long because the question of potential conflicts of interest is a fraught and difficult subject for journalists, and I think it is worth taking seriously.

  2.

  The majority of my time and effort is given to my job at the New Yorker. I have a contract with the New Yorker to produce a certain number of words every year (usually 40,000 to 50,000). I am forbidden, with certain small exceptions, to write for other magazines. The choice of what I write about is largely up to me, although I often take suggestions from editors at the magazine. I do not cover a specific subject area or beat, and so far the New Yorker has placed no restrictions on what I might write about.

  Twice, since joining the New Yorker, I have taken an unpaid book leave. Over the past five or six years, I have also occasionally given paid speeches. This is not unusual. Writers of business books--or, indeed, general interest books of many kinds--have become, in the past few years, regulars on the speaking circuit. A number of New Yorker writers are represented by speaking agents, and it is a safe bet that virtually every author on the New York Times non-fiction bestseller list also does paid lectures. The increasing overlap between the speaking and writing worlds is a function of two things. First, speaking has come to be seen as an integral part of promoting a book. Second, the tremendous growth in conventions in recent years has created a greater demand for public speakers. I am represented in my paid speaking by the Leigh Bureau (www.leighbureau.com). They handle the negotiations over my fee and the terms of the engagement--expenses, schedule, and so forth.

  Sometimes my speeches are drawn from my books, The Tipping Point and Blink. On other occasions, I talk about some idea or article that I'm working on, or come up with entirely new material for the occasion. I accept speaking engagements because I enjoy public speaking and traveling, because it is financially rewarding, because I realize that it's an important way of promoting my books, and because, most of all, I believe there is real value in the public presentation of ideas. For example, all three of the long case studies at the heart of Blink were previewed, while I was in the midst of writing my book, before various academic audiences. In every case, the comments of audience members helped me re-shape and sharpen my arguments. The written word, to my mind, improves to the extent that it comes to resemble the spoken word--and the discipline of having to present my work orally has helped to make me a clearer and more succinct writer.

  The list of clients who have engaged my services is eclectic. To name just a few, in the past few months I have spoken to a hospital group in Las Vegas, a conference of arts and civic organizations in Michigan, a computer storage company convention in Texas, a group of entrepreneurs and local business leaders in Silicon Valley, and a national meeting on homelessness in New York. The first three of those were paid. The last two were not.

  3.

  I realize that at some media organizations what I do as a speaker would be prohibited. I used to work at the Washington Post, and there was a strict set of ethical guidelines governing potential conflicts of interest. We certainly couldn't accept paid speaking engagements. Nor were we allowed to accept gifts of any sort. (When the inevitable bottles and baskets of fruit and chocolates came from public relations agencies every Christmas, we would hand the goodies back to the startled--and delighted--messenger who brought them.) When we took sources out to lunch, we had to pay. The editor of the Washington Post, Leonard Downie, even goes so far as to decline to vote, because he thinks that casting a ballot will compromise his objectivity. This is what might be called the "hard" position on conflicts of interest. Once, when I was still at the Post, Downie admonished the staff against attending a big pro-choice rally that was being staged one weekend in Washington, D.C. His position was that abortion was a highly divisive issue; the Post was a paper committed to covering the news fairly and objectively; and that if Post reporters were known to be marching in an abortion rally it would undermine that mission. (Right around that time, I attended a seminar at a journalism school and was asked about the Downie rule. I said--in what I thought was an obvious joke--that the reason for the policy was practical, not ideological; that so many Washington Post reporters were pro-choice that if they didn't prohibit people from going to the rally there would be no one around to put out the next day's paper. I still think that's funny. My editors did not.)

  I have a great deal of respect for the hard position. I think it makes a difference, in a town as ideologically divided as Washington, that the head of the dominant paper sits publicly and firmly on the fence. That said, many reporters at the Post--including me, at the time--did not agree with Downie's abortion-rally ban, nor did they (or I) follow his lead on not voting. A writer is a professional, and part of what it means to be a professional is that one is assumed to be capable of separating one's personal from one's professional life. A doctor who misdiagnoses you is not allowed to say, in his defense, that he had a fight with his wife that morning; nor can a doctor who has a personal aversion to homosexuality refuse to treat a gay patient. We rely, as patients, on the fact that a doctor can create a reasonably clear boundary between the various domains of his life. I think we can reasonably have the same faith in journalists.

  I'm also not sure how much sense the hard position makes for a writer in my situation. One of the big reasons the Post has such strict rules on accepting gifts and paying for meals is that reporters cover beats. They have a specific, narrow field that they follow closely, and the intimacy that arises between reporter and source inevitably creates the potential for a problem. When I covered the health bureaucracy for the Washington Post, there were senior officials in the government that I talked to nearly every day. I talked to some sources more than I talked to my family. That's why newspapers are understandably so nervous about the potential for bias, because friendship can easily muddy the waters of objectivity. At the New Yorker, however, I don't have a beat. I write about everything under the sun. Most of the time, when I interview someone for a story, I never talk to them again. More to the point, the rules at the Washington Post were intended to protect the paper's status as an objective source of news. The perception--and fact--of objectivity is central to the mission of a modern newspaper. At the New Yorker, I'm under no obligation to be objective. I'm only under the obligation to be fair--and the difference between fairness and objectivity is considerable. The test of a newspaper article is that when a reader finishes reading it, he or she has no idea where the writer stands on the issues under discussion. That's objectivity. With fairness, the bar is a little lower. It is perfectly permissible--even advisable--that a reader of a New Yorker article know where the writer stands on the issue under discussion. It is important only that we be fair: that we accurately and appropriately represent the ideas at hand. During the election, many of the New Yorker's political editorials were written by Hendrik Hertzberg. No reader could have any doubts about where Hertzberg's sympathies lay. He is a Democrat. He is friends with many Democrats. He was a speechwriter for Jimmy Carter. He consistently and powerfully argued against the Republican orthodoxy. Hertzberg could never cover the White House beat for the Washington Post. But he is a brilliant political writer for the New Yorker because he manages--even when his sympathies lie with one side of the argument--to be scrupulously fair. He does not misrepresent his intellectual opponents. He meets--and confronts--them on their own terms. That's the truest test of a polemicist.

  There's another complication with the "hard position", and that has to do with the difference between working for a media organization and writing a book. At the Washington Post, I was not responsible for the economic success of the newspaper. I was completely free to write whatever I chose--so long as it was fairly reported--even if that article resulted in, for example, lost advertising revenue. The same is true, in this context, of the New Yorker. The New Yorker has made a great deal of money running ads from automobile companies, in particular ads for SUVs. In January of last year, I wrote an article that was highly critical of SUVs--and of people who bought them--and no one from the business side of the magazine so much as said a word to me about what impact that might have on the magazine's bottom line. That is as it should be. We have a wall between the business and editorial sides of the magazine.

  Writing a book, though, is a little more complicated, because that wall doesn't exist. A writer is required by his publisher not just to write the book but also to promote it, to wear both a business hat and an editorial hat, and that changes the writer's role substantially. In my upcoming book tour for Blink, I will do dozens of talks and readings at bookstores and companies and lecture series and civic groups, and doing that kind of promotion is considered part of the job. As a Washington Post reporter, if a company like Microsoft came to me and asked me to speak to them, Leonard Downie would have, quite properly, required me to say no. After all, as a reporter for the Post, I wrote about technology. That would create the appearance and perhaps even potential for a conflict. But what is one of the stops on my book tour? The Microsoft campus, because they have a series in which they bring in authors they think their employees might find interesting. And if I said no to that, my publisher would quite properly be annoyed. The only reason they published my book, after all, is that they want to sell lots and lots of copies of it, and they wouldn't be terribly impressed by any ethical qualms I might have about who I should and shouldn't promote my book to. The Downie principle is a rule that works for newspapers. It doesn't necessarily work for book publishing.

  4.

  There is a second, very different, approach to the conflict of interest question, and it has been best articulated by Michael Kinsley, the editorial page editor of the Los Angeles Times, who formerly edited Harper's, the New Republic, and the online journal Slate. While at Slate, Kinsley inaugurated a practice, in sharp contrast to Downie, of having all his writers publicly declare their political preferences. Here is Kinsley's memo on the subject. It dates from the 2000 U.S. presidential election. It's worth reading in full:

  Like many lunatic ideas, Leonard Downie's has a certain inner logic: If opinions are corrupting, just don't have opinions. Downie, executive editor of the Washington Post, is well known for believing that--in the service of objectivity--a journalist in his position should not vote. Writing on the Post op-ed page a couple weeks ago, Downie went even further. He said he does not even allow himself the luxury of deciding whom he would vote for if he was into that sort of thing.

  Many journalists (including me) find this excessive. Journalists are still citizens, with the rights and duties of citizenship. Journalists are also people, for the most part, and people naturally develop opinions about the world around them. This is not a capacity you can turn on and off like a switch. The critical faculties that make for a good journalist probably make purging yourself of all relevant opinion even less plausible. Downie is certainly right that there is no point in not voting officially if you're voting mentally. But in concluding that he therefore shouldn't even vote mentally, he is buying into the fallacy that having an opinion is the same as having a bias.

  What's the difference? Bias is a failure to suppress your opinions or (if opinion is in your job description) to state and defend your opinions openly. Like avoiding opinions, avoiding all bias is probably impossible. Among other difficulties, objectivity is not a huge safety zone. It is a narrow path between bias on one side and bottomless relativism on the other. Journalists are not supposed to be neutral between fact and falsehood or about certain basic shared values. We may state baldly that two plus two is four and may assume without supporting evidence that democracy is a good thing. But beyond that, the fog of disagreement sets in.

  So perfect objectivity is not just unachievable but indefinable. That doesn't make it a false ideal. Avoiding bias is a more reasonable aspiration than avoiding opinion itself. If you reject the Downie Solution, though, you'd better have an alternative way to demonstrate your lack of bias. Fortunately, the burgeoning field of journalistic ethics has an all-purpose alternative solution for almost all ethical dilemmas. It is disclosure. Let your readers know that your great-aunt's ex-husband owns 10 shares of AT&T, and they can decide for themselves whether this biases your coverage of the telecommunications industry.

  Why shouldn't the same logic apply to politics? If you're not going to refrain from voting, why not let your readers know how you voted so they can judge your objectivity for themselves? If you're asking them to trust you despite your political opinions, shouldn't they know what those opinions are? If you believe you do an adequate job of preventing your opinions from curdling into bias, what are you afraid of?

  In this spirit (and to fill in the eerie voting-day news gap between the end of the campaign and the beginning of the election returns), Slate invited its entire staff to declare how they plan to vote and to briefly explain why. The exercise was voluntary, but most Slatesters joined in. You can read the results here. And Press Box launches a crusade here to get other journalists to follow our example.

  One result is no surprise: Slate's staff is voting overwhelmingly for Al Gore. Fear of confirming conservative suspicions about the liberal predisposition of the media is probably the main reason other journalists will resist following our lead.

  No doubt it is true that most journalists vote Democratic, just as most business executives (including most media owners) vote Republican, though neither tendency is as pronounced as their respective critics believe. This is a natural result of the sort of people who are attracted to various careers. It is not the product of any conspiracy. There is no Liberal Central Committee drafting young liberals into journalism against their will or blackballing young conservatives. And there is nothing that can be done to change this disparity, unless conservative press critics would like to see the media institute a political quota system, favoring conservatives over better-qualified liberals (affirmative action for opponents of affirmative action).

  But--for the millionth time!--an opinion is not a bias! The fact that reporters tend to be liberal says nothing one way or another about their tendency to be biased. It does suggest that when political bias does creep in, it is more likely to tilt liberal than conservative. But there are so many other pressures and prejudices built into the news--including occasional overcompensation for fear of appearing biased--that raw political bias plays a fairly small role. And any liberal bias in reporting is more than counterbalanced by the conservative tilt of the commentariat. Or so I believe.

  Of course it is not easy to persuade folks of this, and many will never believe it. No doubt it is easier just to keep your political opinions secret and imply that you don't have any. But that absurdity or dishonesty itself undermines your credibility. Or it ought to.

  There are two powerful ideas here. The first is about the virtues of transparency. Kinsley believes that it's more harmful to pretend you don't have a particular opinion or conflict--and leave outsiders guessing and speculating about where your "true" allegiances lie--than it is to simply put your potential conflicts on the table. The second of Kinsley's notions, though--the idea that there is a real distinction between a bias and an opinion--is a bit more complex, and I think it's worth digging into in more detail. On many occasions, I think, we use the words "bias" and "opinion" interchangeably. But Kinsley is right. They are very different.

  Bias, for example, is global. Opinion is particular. If I'm a film critic, and I hated the latest Werner Herzog movie, that's an opinion. That's why we read film critics. If I'm a film critic, though, and I hate all German movies, that's a bias. A bias is different in degree from an opinion. In the work of a critic who holds opinions, furthermore, there may be no discernible pattern: she may like this Western and dislike that Western, or like "Taxi Driver" and "Raging Bull," but dislike "Gangs of New York" and "Casino." Biases have a consistency that opinions do not. Thirdly, biases are predictive while opinions are reactive. The problem with the critic who hates all German films is that she's made up her mind about a class of movies before she's seen them--the effect of her bias is to close her mind. The critic who merely holds opinions, by contrast, is at least making an attempt to go into the latest German film with an open mind. When the great New Yorker film critic Pauline Kael was asked why she retired, she said that it was so she would never have to watch another Oliver Stone film. That was a quip--but it had a ring of truth. Kael recognized that a critic who rules out an entire category of film, a priori, can no longer be an effective critic.

  I think we can all agree that biases are a problem, particularly for a journalist. Writers with biases are predictable in the worst way and, more than that, they are dishonest. They pretend to have given thought to a subject, when all they've done is apply a fairly rigid set of preconceptions. For a writer to have an opinion, on the other hand, is a wonderful thing. The ability to form opinions is a sign of engagement with the world. And, like Michael Kinsley, I believe that the process of journalism is immeasurably improved when writers are open about their opinions. So let me start by making a few disclosures. You should know that in my early twenties, I was very conservative politically. I had a picture of Ronald Reagan on my wall in college. (Yes, I was that much of a geek). Today, if I could vote (and I can't, because I'm Canadian) I would vote Democrat. I am pro-choice and in favor of gay marriage. I believe in God. I think the war in Iraq is a terrible mistake. I am a big believer in free trade. I think, on balance, taxes in America--particularly for rich people--ought to be higher, not lower. I think smoking is a terrible problem and that cigarette manufacturers ought to be subjected to every possible social and political sanction. But I think that filing product liability lawsuits against cigarette manufacturers is absurd. I am opposed to the Death Penalty. I hate SUVs. I think many CEOs are overpaid. I think there is too much sex and violence on television. Of perhaps most relevance to my writing--because I write so much about health care--I have recently come to think that the United States needs to adopt a Canadian-style single-payer, government funded, universal health care system.

  Do any of these opinions rise to the level of bias? I don't think so. They don't cohere in a single identifiable ideology. And they aren't predictive, in the sense of leading me inexorably toward writing in a pro-God, pro-Democratic, pro-choice, pro-gay marriage, pro-free trade, pro-higher taxes, anti-Iraq war kind of way. If you look at my articles, in fact, you'll see that I rarely even write about the subject areas where I have the most strongly held opinions. What's more, when I interview or profile people I don't ask them for their opinions on these same subjects--so there's very little chance for any conflict or agreement in our attitudes to become an issue. I should also say that, by the time you read this, any number of the opinions I've stated above may well have changed. That's another important difference between biases and opinions. Biases are pretty stable. Opinions come and go.

  5.

  Let's talk, then, about my speeches, since that's clearly the area where the question of biases and opinions is most fraught. Does the fact that I am periodically paid by company X or organization Y to deliver a speech about the Tipping Point or Blink affect, in some way, the content or the direction of my writing? I think the answer to that has to be yes. For example, I have in my book Blink an entire chapter devoted to an attack on market research--in particular the heavy reliance in corporate America on focus groups. I got interested in that question because I once spoke about the Tipping Point at a music industry meeting, and at a cocktail party afterwards got into a long conversation with an industry executive, who told me how he thought pop music was being ruined by radio's adherence to market research. Would I have developed that view of market research had I never gone to that conference? I don't know. But probably not. When you give a speech to a particular group, you expose yourself to the opinions of that group, and that exposure cannot help but affect the way you think. Similarly, I once gave a talk at the Yale University psychology department, for which I was paid a (small) honorarium. Afterwards, I spent the better part of a day chatting with various psychologists on the Yale faculty. Out of those conversations have come a number of ideas and thoughts that have made it into my writing. Here I feel quite confident in saying that if I had never gone to Yale, I would not have been exposed to those ideas. Yale paid me to come to New Haven, and I left New Haven excited about the psychological research being conducted at Yale. I think, in fact, that the people who invited me to Yale would readily admit that this was their intention. Why else invite a journalist to campus, and set up a day of interviews with him, if not to influence him in some way?

  I'm not sure there's anything wrong with this kind of influence, though. I didn't develop a pro-Yale bias--an enduring, predictive, monolithic predisposition towards Yale. I developed a set of opinions shaped by my Yale experiences. As it happens, those opinions were favorable to Yale. But opinions are fickle things, and the opposite could just as easily be true. I could have been so unimpressed by the people I met at Yale that I could have been indifferent or even hostile to their work. For instance, I've given lots of paid talks, over the years, to medical groups of one sort or another--groups of doctors or insurers or hospitals. Virtually all of those groups--at least on an institutional basis--are firmly opposed to Canadian-style universal health insurance. Yet it is over that same period that I have changed my mind and become a strong advocate of the Canadian system. The effect of meeting and talking to all those people in the U.S. health care world has been to convince me of the intellectual bankruptcy of their position. Like I said, the effect of influences on opinions is not a one-way street.

  Now, you might say that that's a rare example, and in most cases the fact of being paid by someone predisposes you to be sympathetic to their cause. This assumption, in fact, is one of the cornerstones of the Downie school: money is the fastest route to journalistic corruption. But is it really? In my experience, personal friendships are a far more serious source of bias. For example, the CEO of Research in Motion, the Canadian firm that makes the Blackberry, is a man named Jim Balsillie, who was a friend of mine in college. Furthermore, RIM is closely linked to the University of Waterloo, an institution where my father taught for 30 years and to which I have profound emotional ties. Have I ever received a dime from RIM? No. Can I be objective on the subject of the Blackberry? I really doubt it. Yet I'm quite sure that if I wrote an article on hand-held communication devices and came out heavily in favor of the Blackberry, I'd get in far more trouble if I failed to disclose any financial dealings I'd had with the company than if I failed to disclose my sentimental attachments to RIM and Waterloo. At a place like the Washington Post, in fact, having given a speech to RIM would disqualify me from covering the telecommunications industry. But growing up in Waterloo would not. Isn't that a bit strange?

  Let me go further. I don't think that the health insurance example--where I ended up taking an opinion opposite to the interests of the people I got paid to speak to--is all that rare. In fact, it has happened on several occasions. On behalf of the business side of the New Yorker, I have repeatedly given talks or presentations to representatives of companies that advertise with the magazine. For some of those presentations, I have been paid. And on a number of occasions, those groups have included people from the U.S. automobile industry. Has that biased me in favor of the Big Three? Well, no. As I've stated, last January I wrote an article bitterly attacking the SUV, which has been the cornerstone of the financial success of Ford, General Motors and Chrysler over the past ten years. Giving a speech does not buy my allegiance to the interests of my audience. Why? Because giving a paid speech to a group for an hour is simply not enough to create a bias in that group's favor. It's a very different sort of transaction. I'm not invited to speak to those medical groups because I promise to agree with their position on health care, and I'm not invited to speak to groups from Detroit because I promise to agree with their position on SUVs. In fact, my position on health insurance or SUVs never comes up. I'm invited because those audiences want to hear about my work. Financial ties are in danger of being corrupting when they really are ties--when they are, in some way, permanent, and when resources and influence and information move equally in both directions. There is no question, for instance, that I am hopelessly biased in favor of The New Yorker. I've been writing for them for ten years. They've been paying my salary for ten years. The people who work on the magazine have become some of my closest friends. If there's some deep dark scandal buried beneath the surface of the New Yorker, it's probably not a good idea to rely on me to ferret it out.

  6.

  Let's look at another example. This is a harder case. And I bring it up because it was recently written about publicly (www.slate.com/id/2110572). The article was called "High Prices." You can find it on the archive page of this website. It made the following argument:

  1. The problem of rising prescription drug costs is being driven not by rising drug prices, but by the fact that consumers are using greater volumes of drugs.
  2. That means the solution to soaring drug costs lies in the hands of those who use drugs, not the companies that make and market them.
  3. There are clear strategies that drug buyers and users can adopt to lower their drug bills. They can substitute cheaper generics for more expensive brand-name drugs. They can also be far more aggressive in controlling when and how drugs are prescribed--understanding, for instance, which kinds of patients don't need the most expensive state-of-the-art therapy and can be usefully treated with older, cheaper regimens. One example I gave was the anti-inflammatory drugs Vioxx and Celebrex. There is no evidence that those very expensive drugs are any better at reducing inflammation, in the vast majority of patients, than ibuprofen, which costs pennies a pill.
  4. The real problem is not that drugs are expensive. It's that many Americans don't have the means or health insurance to pay for drugs at any price. If we really cared about that problem, we would enact national health insurance.
  5. Doctors--and in particular the editors of medical journals--have not been diligent enough in directing patients to the most cost-effective remedies.

  Have I given paid speeches to companies or industries mentioned or affected by that article? Yes, I have. As I stated earlier, I have given my Tipping Point talk to groups of doctors, hospitals, and insurers, as well as to pharmacy benefit managers and groups funded by the National Institutes of Health. More specifically, I have on several occasions over the past four years given paid speeches on the Tipping Point to pharmaceutical companies. So did that create a bias in favor of the pharmaceutical industry? Well, the evidence is for all to see in the article itself, and I think the answer is pretty clear. For instance, I said that consumers would be better off ignoring Astra Zeneca's marketing campaign for its heartburn drug Nexium and just buying cheaper generic or over-the-counter versions of that drug, or similar drugs in that category. I also suggested that most consumers would be better off skipping Pfizer's heavily marketed and highly expensive drug Celebrex, and sticking to ibuprofen. Pfizer's Celebrex and Astra Zeneca's Nexium are two of the most profitable drugs in the world. "High Prices" advocated systematic generic and therapeutic substitution--a policy that would cut the sales of Pfizer and Astra Zeneca, and everyone else in the brand-name pharmaceutical business, by tens of billions of dollars. It advocated the use of third parties, like pharmacy benefit managers, to more rigorously control physician prescribing practices. Finally, I said I was in favor of a Canadian-style single-payer system. And if we had that kind of system, of course, the U.S. government could do what the Canadian government does now, which is use the enormous bargaining power of a universal health system to demand steep discounts from the industry. You can see why I wasn't terribly concerned about the possibility that someone would think, as a result of my speaking, that I had a pro-drug company bias. (I should point out that, to my great puzzlement, I ended up being accused by some readers of exactly that.)

  But that doesn't entirely resolve the issue. Suppose I had wanted to write something overwhelmingly positive about the brand-name prescription drug industry? What if I learned, while at some speaking engagement, say, that a company was on the verge of curing cancer? Now things start to get more complicated. Clearly as a journalist I'd want to write about that fact, and you as a reader would want to read it. And the fact that I got however many thousand dollars to talk about the Tipping Point would be dwarfed in importance by what I'd learned. Still, if I wanted to say something nice about that firm, I'd have to make my previous arrangement with it plain. I would have to prove that my encounter with the company created only an opinion about what to write about--not a bias. I would have to make very sure that my discussion of the issue at hand was fair, that I accurately represented both sides of the issue even if I favored one side over the other.

  All of that creates a fairly substantial burden. In a case other than a cure for cancer--where what I learned was less consequential--I might prefer to take the easy way out and not write the article at all. And then where would we be? Well, we'd have a case where the hat I wear as Tipping Point and Blink author and speaker would affect the direction and content of my writing for the New Yorker. Faced with the mere suspicion of a conflict of interest, I would be censoring myself.

  I should stress that this is a hypothetical example. I don't think this kind of situation is likely to come up very often. In most of the articles I write, there is no real potential for a perception of bias. Nonetheless, I'm not willing to dismiss this final consideration either. It's a real issue, and one of the reasons I wanted to work through this subject at such length (and I applaud those who are still with me) is that I wanted to put this small--but genuine--problem out in the open.

  So what to do? I'm open to suggestions. But let me start by enumerating a series of principles, to which I pledge to adhere, that I think make the possibility of opinion turning into bias and fairness into unfairness less likely.

  1. Any speaking I do should be just that--speaking. The possibility for trouble is much greater when a writer steps outside the role of giving a set speech and becomes a consultant or advisor, or in some way develops a continuing financial arrangement with a specific company or organization. Once, I was asked to be a consultant to a marketing firm. I said yes--briefly--then immediately resigned when I realized the implications of that act. I will not do that again. Anything outside of simple speaking is inappropriate.
  2. Any speaking I do should be eclectic. One of the reasons that conflict-of-interest questions are so fraught in the world of medicine, for instance, is that doctors who do research on new drugs tend to have strong financial ties to a single industry, the drug industry. It's not just the fact of an outside source of income that's a problem. It's the singularity of that source of income. If every speech I gave were to doctors' groups, and I became dependent on doctors for my speaking career, I would be far more susceptible to the possibility of bias than if my speaking were distributed among many groups with widely divergent interests.
  3. My editors need to be fully aware of the scope of my speaking, and alerted to cases where there is overlap between those groups that I have spoken to and those groups that I have written about. As I've stated, I don't find that overlap necessarily problematic. But it certainly should be brought to light, and discussed, before any article is run.
  4. Finally, and perhaps most importantly, people who read my books and articles have an obligation as well. Journalism is not like the business world, where the mechanics of decisions and procedures take place behind closed doors. It is, rather, like science, where the fruits of all endeavor are put on public display. In the world of science, that transparency allows the profession to be self-policing. It is very hard to commit scientific fraud because all significant findings are published, scrutinized by other members of the scientific community, and--if they are sufficiently controversial--independently tested. Journalism is the same way. If my writing is biased or unfair, then the evidence of that bias and unfairness is a matter of public record, and any reader has the freedom--and the responsibility--to hold me accountable.

  I don't claim that these four principles entirely resolve the questions raised by the two hats that I wear. But I think that they are a start.

  --Malcolm Gladwell
December 10, 2004

  Is pop culture dumbing us down or smartening us up?

1.

  Twenty years ago, a political philosopher named James Flynn uncovered a curious fact. Americans--at least, as measured by I.Q. tests--were getting smarter. This fact had been obscured for years, because the people who give I.Q. tests continually recalibrate the scoring system to keep the average at 100. But if you took out the recalibration, Flynn found, I.Q. scores showed a steady upward trajectory, rising by about three points per decade, which means that a person whose I.Q. placed him in the top ten per cent of the American population in 1920 would today fall in the bottom third. Some of that effect, no doubt, is a simple by-product of economic progress: in the surge of prosperity during the middle part of the last century, people in the West became better fed, better educated, and more familiar with things like I.Q. tests. But, even as that wave of change has subsided, test scores have continued to rise--not just in America but all over the developed world. What's more, the increases have not been confined to children who go to enriched day-care centers and private schools. The middle part of the curve--the people who have supposedly been suffering from a deteriorating public-school system and a steady diet of lowest-common-denominator television and mindless pop music--has increased just as much. What on earth is happening? In the wonderfully entertaining "Everything Bad Is Good for You" (Riverhead; $23.95), Steven Johnson proposes that what is making us smarter is precisely what we thought was making us dumber: popular culture.
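
  To see roughly how a steady three-point-per-decade gain produces that kind of reshuffling, here is a back-of-the-envelope sketch--not Flynn's own calculation--assuming the conventional I.Q. scale (mean 100, standard deviation 15, approximately normal) and taking the three-point rate and the dates as the rough figures they are:

\[
\begin{aligned}
\text{Top decile in 1920:}\quad & z = 1.28 \;\Rightarrow\; \text{I.Q.} \approx 100 + 1.28 \times 15 \approx 119 \\
\text{Norm inflation, 1920 to today:}\quad & \approx 8.5 \text{ decades} \times 3 \text{ points} \approx 25 \text{ points} \\
\text{Same performance on today's norms:}\quad & 119 - 25 \approx 94 \;\Rightarrow\; z \approx -0.4, \text{ roughly the 34th percentile}
\end{aligned}
\]

  On those assumptions, a score at the 1920 ninetieth-percentile threshold lands, by today's norms, in the bottom third--which is the reshuffling the Flynn effect describes.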

  Johnson is the former editor of the online magazine Feed and the author of a number of books on science and technology. There is a pleasing eclecticism to his thinking. He is as happy analyzing "Finding Nemo" as he is dissecting the intricacies of a piece of software, and he's perfectly capable of using Nietzsche's notion of eternal recurrence to discuss the new creative rules of television shows. Johnson wants to understand popular culture--not in the postmodern, academic sense of wondering what "The Dukes of Hazzard" tells us about Southern male alienation but in the very practical sense of wondering what watching something like "The Dukes of Hazzard" does to the way our minds work.

  As Johnson points out, television is very different now from what it was thirty years ago. It's harder. A typical episode of "Starsky and Hutch," in the nineteen-seventies, followed an essentially linear path: two characters, engaged in a single story line, moving toward a decisive conclusion. To watch an episode of "Dallas" today is to be stunned by its glacial pace--by the arduous attempts to establish social relationships, by the excruciating simplicity of the plotline, by how obvious it was. A single episode of "The Sopranos," by contrast, might follow five narrative threads, involving a dozen characters who weave in and out of the plot. Modern television also requires the viewer to do a lot of what Johnson calls "filling in," as in a "Seinfeld" episode that subtly parodies the Kennedy assassination conspiracists, or a typical "Simpsons" episode, which may contain numerous allusions to politics or cinema or pop culture. The extraordinary amount of money now being made in the television aftermarket--DVD sales and syndication--means that the creators of television shows now have an incentive to make programming that can sustain two or three or four viewings. Even reality shows like "Survivor," Johnson argues, engage the viewer in a way that television rarely has in the past:

  When we watch these shows, the part of our brain that monitors the emotional lives of the people around us--the part that tracks subtle shifts in intonation and gesture and facial expression--scrutinizes the action on the screen, looking for clues. . . . The phrase "Monday-morning quarterbacking" was coined to describe the engaged feeling spectators have in relation to games as opposed to stories. We absorb stories, but we second-guess games. Reality programming has brought that second-guessing to prime time, only the game in question revolves around social dexterity rather than the physical kind.

  How can the greater cognitive demands that television makes on us now, he wonders, not matter?

  Johnson develops the same argument about video games. Most of the people who denounce video games, he says, haven't actually played them--at least, not recently. Twenty years ago, games like Tetris or Pac-Man were simple exercises in motor coördination and pattern recognition. Today's games belong to another realm. Johnson points out that one of the "walk-throughs" for "Grand Theft Auto III"--that is, the informal guides that break down the games and help players navigate their complexities--is fifty-three thousand words long, about the length of his book. The contemporary video game involves a fully realized imaginary world, dense with detail and levels of complexity.

  Indeed, video games are not games in the sense of those pastimes--like Monopoly or gin rummy or chess--which most of us grew up with. They don't have a set of unambiguous rules that have to be learned and then followed during the course of play. This is why many of us find modern video games baffling: we're not used to being in a situation where we have to figure out what to do. We think we only have to learn how to press the buttons faster. But these games withhold critical information from the player. Players have to explore and sort through hypotheses in order to make sense of the game's environment, which is why a modern video game can take forty hours to complete. Far from being engines of instant gratification, as they are often described, video games are actually, Johnson writes, "all about delayed gratification--sometimes so long delayed that you wonder if the gratification is ever going to show."

  At the same time, players are required to manage a dizzying array of information and options. The game presents the player with a series of puzzles, and you can't succeed at the game simply by solving the puzzles one at a time. You have to craft a longer-term strategy, in order to juggle and coördinate competing interests. In denigrating the video game, Johnson argues, we have confused it with other phenomena in teen-age life, like multitasking--simultaneously e-mailing and listening to music and talking on the telephone and surfing the Internet. Playing a video game is, in fact, an exercise in "constructing the proper hierarchy of tasks and moving through the tasks in the correct sequence," he writes. "It's about finding order and meaning in the world, and making decisions that help create that order."

  2.

  It doesn't seem right, of course, that watching "24" or playing a video game could be as important cognitively as reading a book. Isn't the extraordinary success of the "Harry Potter" novels better news for the culture than the equivalent success of "Grand Theft Auto III"? Johnson's response is to imagine what cultural critics might have said had video games been invented hundreds of years ago, and only recently had something called the book been marketed aggressively to children:

Reading books chronically understimulates the senses. Unlike the longstanding tradition of game playing--which engages the child in a vivid, three-dimensional world filled with moving images and musical sound-scapes, navigated and controlled with complex muscular movements--books are simply a barren string of words on the page. . . .
Books are also tragically isolating. While games have for many years engaged the young in complex social relationships with their peers, building and exploring worlds together, books force the child to sequester him or herself in a quiet space, shut off from interaction with other children. . . .
But perhaps the most dangerous property of these books is the fact that they follow a fixed linear path. You can't control their narratives in any fashion--you simply sit back and have the story dictated to you. . . . This risks instilling a general passivity in our children, making them feel as though they're powerless to change their circumstances. Reading is not an active, participatory process; it's a submissive one.

  He's joking, of course, but only in part. The point is that books and video games represent two very different kinds of learning. When you read a biology textbook, the content of what you read is what matters. Reading is a form of explicit learning. When you play a video game, the value is in how it makes you think. Video games are an example of collateral learning, which is no less important.

  Being "smart" involves facility in both kinds of thinking--the kind of fluid problem solving that matters in things like video games and I.Q. tests, but also the kind of crystallized knowledge that comes from explicit learning. If Johnson's book has a flaw, it is that he sometimes speaks of our culture being "smarter" when he's really referring just to that fluid problem-solving facility. When it comes to the other kind of intelligence, it is not clear at all what kind of progress we are making, as anyone who has read, say, the Gettysburg Address alongside any Presidential speech from the past twenty years can attest. The real question is what the right balance of these two forms of intelligence might look like. "Everything Bad Is Good for You" doesn't answer that question. But Johnson does something nearly as important, which is to remind us that we shouldn't fall into the trap of thinking that explicit learning is the only kind of learning that matters.

  In recent years, for example, a number of elementary schools have phased out or reduced recess and replaced it with extra math or English instruction. This is the triumph of the explicit over the collateral. After all, recess is "play" for a ten-year-old in precisely the sense that Johnson describes video games as play for an adolescent: an unstructured environment that requires the child actively to intervene, to look for the hidden logic, to find order and meaning in chaos.

  One of the ongoing debates in the educational community, similarly, is over the value of homework. Meta-analysis of hundreds of studies done on the effects of homework shows that the evidence supporting the practice is, at best, modest. Homework seems to be most useful in high school and for subjects like math. At the elementary-school level, homework seems to be of marginal or no academic value. Its effect on discipline and personal responsibility is unproved. And the causal relation between high-school homework and achievement is unclear: it hasn't been firmly established whether spending more time on homework in high school makes you a better student or whether better students, finding homework more pleasurable, spend more time doing it. So why, as a society, are we so enamored of homework? Perhaps because we have so little faith in the value of the things that children would otherwise be doing with their time. They could go out for a walk, and get some exercise; they could spend time with their peers, and reap the rewards of friendship. Or, Johnson suggests, they could be playing a video game, and giving their minds a rigorous workout.

  In "Collapse, " Jared Diamond shows how societies destroy themselves.

  1.

  A thousand years ago, a group of Vikings led by Erik the Red set sail from Norway for the vast Arctic landmass west of Scandinavia which came to be known as Greenland. It was largely uninhabitable--a forbidding expanse of snow and ice. But along the southwestern coast there were two deep fjords protected from the harsh winds and saltwater spray of the North Atlantic Ocean, and as the Norse sailed upriver they saw grassy slopes flowering with buttercups, dandelions, and bluebells, and thick forests of willow and birch and alder. Two colonies were formed, three hundred miles apart, known as the Eastern and Western Settlements. The Norse raised sheep, goats, and cattle. They turned the grassy slopes into pastureland. They hunted seal and caribou. They built a string of parish churches and a magnificent cathedral, the remains of which are still standing. They traded actively with mainland Europe, and tithed regularly to the Roman Catholic Church. The Norse colonies in Greenland were law-abiding, economically viable, fully integrated communities, numbering at their peak five thousand people. They lasted for four hundred and fifty years--and then they vanished.

  The story of the Eastern and Western Settlements of Greenland is told in Jared Diamond's "Collapse: How Societies Choose to Fail or Succeed" (Viking; $29.95). Diamond teaches geography at U.C.L.A. and is well known for his best-seller "Guns, Germs, and Steel," which won a Pulitzer Prize. In "Guns, Germs, and Steel," Diamond looked at environmental and structural factors to explain why Western societies came to dominate the world. In "Collapse," he continues that approach, only this time he looks at history's losers--like the Easter Islanders, the Anasazi of the American Southwest, the Mayans, and the modern-day Rwandans. We live in an era preoccupied with the way that ideology and culture and politics and economics help shape the course of history. But Diamond isn't particularly interested in any of those things--or, at least, he's interested in them only insofar as they bear on what to him is the far more important question, which is a society's relationship to its climate and geography and resources and neighbors. "Collapse" is a book about the most prosaic elements of the earth's ecosystem--soil, trees, and water--because societies fail, in Diamond's view, when they mismanage those environmental factors.

  There was nothing wrong with the social organization of the Greenland settlements. The Norse built a functioning reproduction of the predominant northern-European civic model of the time--devout, structured, and reasonably orderly. In 1408, right before the end, records from the Eastern Settlement dutifully report that Thorstein Olafsson married Sigrid Bjornsdotter in Hvalsey Church on September 14th of that year, with Brand Halldorstson, Thord Jorundarson, Thorbjorn Bardarson, and Jon Jonsson as witnesses, following the proclamation of the wedding banns on three consecutive Sundays.

  The problem with the settlements, Diamond argues, was that the Norse thought that Greenland really was green; they treated it as if it were the verdant farmland of southern Norway. They cleared the land to create meadows for their cows, and to grow hay to feed their livestock through the long winter. They chopped down the forests for fuel, and for the construction of wooden objects. To make houses warm enough for the winter, they built their homes out of six-foot-thick slabs of turf, which meant that a typical home consumed about ten acres of grassland.

  But Greenland's ecosystem was too fragile to withstand that kind of pressure. The short, cool growing season meant that plants developed slowly, which in turn meant that topsoil layers were shallow and lacking in soil constituents, like organic humus and clay, that hold moisture and keep soil resilient in the face of strong winds. "The sequence of soil erosion in Greenland begins with cutting or burning the cover of trees and shrubs, which are more effective at holding soil than is grass," he writes. "With the trees and shrubs gone, livestock, especially sheep and goats, graze down the grass, which regenerates only slowly in Greenland's climate. Once the grass cover is broken and the soil is exposed, soil is carried away especially by the strong winds, and also by pounding from occasionally heavy rains, to the point where the topsoil can be removed for a distance of miles from an entire valley." Without adequate pastureland, the summer hay yields shrank; without adequate supplies of hay, keeping livestock through the long winter got harder. And, without adequate supplies of wood, getting fuel for the winter became increasingly difficult.

  The Norse needed to reduce their reliance on livestock--particularly cows, which consumed an enormous amount of agricultural resources. But cows were a sign of high status; to northern Europeans, beef was a prized food. They needed to copy the Inuit practice of burning seal blubber for heat and light in the winter, and to learn from the Inuit the difficult art of hunting ringed seals, which were the most reliably plentiful source of food available in the winter. But the Norse had contempt for the Inuit--they called them skraelings, "wretches"--and preferred to practice their own brand of European agriculture. In the summer, when the Norse should have been sending ships on lumber-gathering missions to Labrador, in order to relieve the pressure on their own forestlands, they instead sent boats and men to the coast to hunt for walrus. Walrus tusks, after all, had great trade value. In return for those tusks, the Norse were able to acquire, among other things, church bells, stained-glass windows, bronze candlesticks, Communion wine, linen, silk, silver, churchmen's robes, and jewelry to adorn their massive cathedral at Gardar, with its three-ton sandstone building blocks and eighty-foot bell tower. In the end, the Norse starved to death.

  2.

  Diamond's argument stands in sharp contrast to the conventional explanations for a society's collapse. Usually, we look for some kind of cataclysmic event. The aboriginal civilization of the Americas was decimated by the sudden arrival of smallpox. European Jewry was destroyed by Nazism. Similarly, the disappearance of the Norse settlements is usually blamed on the Little Ice Age, which descended on Greenland in the early fourteen-hundreds, ending several centuries of relative warmth. (One archeologist refers to this as the "It got too cold, and they died" argument.) What all these explanations have in common is the idea that civilizations are destroyed by forces outside their control, by acts of God.

  But look, Diamond says, at Easter Island. Once, it was home to a thriving culture that produced the enormous stone statues that continue to inspire awe. It was home to dozens of species of trees, which created and protected an ecosystem fertile enough to support as many as thirty thousand people. Today, it's a barren and largely empty outcropping of volcanic rock. What happened? Did a rare plant virus wipe out the island's forest cover? Not at all. The Easter Islanders chopped their trees down, one by one, until they were all gone. "I have often asked myself, 'What did the Easter Islander who cut down the last palm tree say while he was doing it?'" Diamond writes, and that, of course, is what is so troubling about the conclusions of "Collapse." Those trees were felled by rational actors--who must have suspected that the destruction of this resource would result in the destruction of their civilization. The lesson of "Collapse" is that societies, as often as not, aren't murdered. They commit suicide: they slit their wrists and then, in the course of many decades, stand by passively and watch themselves bleed to death.

  This doesn't mean that acts of God don't play a role. It did get colder in Greenland in the early fourteen-hundreds. But it didn't get so cold that the island became uninhabitable. The Inuit survived long after the Norse died out, and the Norse had all kinds of advantages, including a more diverse food supply, iron tools, and ready access to Europe. The problem was that the Norse simply couldn't adapt to the country's changing environmental conditions. Diamond writes, for instance, of the fact that nobody can find fish remains in Norse archeological sites. One scientist sifted through tons of debris from the Vatnahverfi farm and found only three fish bones; another researcher analyzed thirty-five thousand bones from the garbage of another Norse farm and found two fish bones. How can this be? Greenland is a fisherman's dream: Diamond describes running into a Danish tourist in Greenland who had just caught two Arctic char in a shallow pool with her bare hands. "Every archaeologist who comes to excavate in Greenland . . . starts out with his or her own idea about where all those missing fish bones might be hiding," he writes. "Could the Norse have strictly confined their munching on fish to within a few feet of the shoreline, at sites now underwater because of land subsidence? Could they have faithfully saved all their fish bones for fertilizer, fuel, or feeding to cows?" It seems unlikely. There are no fish bones in Norse archeological remains, Diamond concludes, for the simple reason that the Norse didn't eat fish. For one reason or another, they had a cultural taboo against it.

  Given the difficulty that the Norse had in putting food on the table, this was insane. Eating fish would have substantially reduced the ecological demands of the Norse settlements. The Norse would have needed fewer livestock and less pastureland. Fishing is not nearly as labor-intensive as raising cattle or hunting caribou, so eating fish would have freed time and energy for other activities. It would have diversified their diet.

  Why did the Norse choose not to eat fish? Because they weren't thinking about their biological survival. They were thinking about their cultural survival. Food taboos are one of the idiosyncrasies that define a community. Not eating fish served the same function as building lavish churches, and doggedly replicating the untenable agricultural practices of their land of origin. It was part of what it meant to be Norse, and if you are going to establish a community in a harsh and forbidding environment all those little idiosyncrasies which define and cement a culture are of paramount importance. "The Norse were undone by the same social glue that had enabled them to master Greenland's difficulties," Diamond writes. "The values to which people cling most stubbornly under inappropriate conditions are those values that were previously the source of their greatest triumphs over adversity." He goes on:

  To us in our secular modern society, the predicament in which the Greenlanders found themselves is difficult to fathom. To them, however, concerned with their social survival as much as their biological survival, it was out of the question to invest less in churches, to imitate or intermarry with the Inuit, and thereby to face an eternity in Hell just in order to survive another winter on Earth.

  Diamond's distinction between social and biological survival is a critical one, because too often we blur the two, or assume that biological survival is contingent on the strength of our civilizational values. That was the lesson taken from the two world wars and the nuclear age that followed: we would survive as a species only if we learned to get along and resolve our disputes peacefully. The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.

  Diamond points out that the Easter Islanders did not practice, so far as we know, a uniquely pathological version of South Pacific culture. Other societies, on other islands in the Pacific, chopped down trees and farmed and raised livestock just as the Easter Islanders did. What doomed the Easter Islanders was the interaction between what they did and where they were. Diamond and a colleague, Barry Rollet, identified nine physical factors that contributed to the likelihood of deforestation--including latitude, average rainfall, aerial-ash fallout, proximity to Central Asia's dust plume, size, and so on--and Easter Island ranked at the high-risk end of nearly every variable. "The reason for Easter's unusually severe degree of deforestation isn't that those seemingly nice people really were unusually bad or improvident," he concludes. "Instead, they had the misfortune to be living in one of the most fragile environments, at the highest risk for deforestation, of any Pacific people." The problem wasn't the Easter Islanders. It was Easter Island.

  In the second half of "Collapse," Diamond turns his attention to modern examples, and one of his case studies is the recent genocide in Rwanda. What happened in Rwanda is commonly described as an ethnic struggle between the majority Hutu and the historically dominant, wealthier Tutsi, and it is understood in those terms because that is how we have come to explain much of modern conflict: Serb and Croat, Jew and Arab, Muslim and Christian. The world is a cauldron of cultural antagonism. It's an explanation that clearly exasperates Diamond. The Hutu didn't just kill the Tutsi, he points out. The Hutu also killed other Hutu. Why? Look at the land: steep hills farmed right up to the crests, without any protective terracing; rivers thick with mud from erosion; extreme deforestation leading to irregular rainfall and famine; staggeringly high population densities; the exhaustion of the topsoil; falling per-capita food production. This was a society on the brink of ecological disaster, and if there is anything that is clear from the study of such societies it is that they inevitably descend into genocidal chaos. In "Collapse," Diamond quite convincingly defends himself against the charge of environmental determinism. His discussions are always nuanced, and he gives political and ideological factors their due. The real issue is how, in coming to terms with the uncertainties and hostilities of the world, the rest of us have turned ourselves into cultural determinists.

  3.

  For the past thirty years, Oregon has had one of the strictest sets of land-use regulations in the nation, requiring new development to be clustered in and around existing urban development. Those laws have meant that Oregon has done perhaps the best job in the nation of limiting suburban sprawl and protecting coastal lands and estuaries. But this November Oregon's voters passed a ballot referendum, known as Measure 37, that rolled back many of those protections. Specifically, Measure 37 said that anyone who could show that the value of his land was affected by regulations implemented since its purchase was entitled to compensation from the state. If the state declined to pay, the property owner would be exempted from the regulations.

  To call Measure 37--and similar referendums that have been passed recently in other states--intellectually incoherent is to put it mildly. It might be that the reason your hundred-acre farm on a pristine hillside is worth millions to a developer is that it's on a pristine hillside: if everyone on that hillside could subdivide, and sell out to Target and Wal-Mart, then nobody's plot would be worth millions anymore. Will the voters of Oregon then pass Measure 38, allowing them to sue the state for compensation over damage to property values caused by Measure 37?

  It is hard to read "Collapse," though, and not have an additional reaction to Measure 37. Supporters of the law spoke entirely in the language of political ideology. To them, the measure was a defense of property rights, preventing the state from unconstitutional "takings." If you replaced the term "property rights" with "First Amendment rights," this would have been indistinguishable from an argument over, say, whether charitable groups ought to be able to canvass in malls, or whether cities can control the advertising they sell on the sides of public buses. As a society, we do a very good job with these kinds of debates: we give everyone a hearing, and pass laws, and make compromises, and square our conclusions with our constitutional heritage--and in the Oregon debate the quality of the theoretical argument was impressively high.

  The thing that got lost in the debate, however, was the land. In a rapidly growing state like Oregon, what, precisely, are the state's ecological strengths and vulnerabilities? What impact will changed land-use priorities have on water and soil and cropland and forest? One can imagine Diamond writing about the Measure 37 debate, and he wouldn't be very impressed by how seriously Oregonians wrestled with the problem of squaring their land-use rules with their values, because to him a society's environmental birthright is not best discussed in those terms. Rivers and streams and forests and soil are a biological resource. They are a tangible, finite thing, and societies collapse when they get so consumed with addressing the fine points of their history and culture and deeply held beliefs--with making sure that Thorstein Olafsson and Sigrid Bjornsdotter are married before the right number of witnesses following the announcement of wedding banns on the right number of Sundays--that they forget that the pastureland is shrinking and the forest cover is gone.

  When archeologists looked through the ruins of the Western Settlement, they found plenty of the big wooden objects that were so valuable in Greenland--crucifixes, bowls, furniture, doors, roof timbers--which meant that the end came too quickly for anyone to do any scavenging. And, when the archeologists looked at the animal bones left in the debris, they found the bones of newborn calves, meaning that the Norse, in that final winter, had given up on the future. They found toe bones from cows, equal to the number of cow spaces in the barn, meaning that the Norse ate their cattle down to the hoofs, and they found the bones of dogs covered with knife marks, meaning that, in the end, they had to eat their pets. But not fish bones, of course. Right up until they starved to death, the Norse never lost sight of what they stood for.

  Mammography, air power, and the limits of looking.

  1.

  At the beginning of the first Gulf War, the United States Air Force dispatched two squadrons of F-15E Strike Eagle fighter jets to find and destroy the Scud missiles that Iraq was firing at Israel. The rockets were being launched, mostly at night, from the backs of modified flatbed tractor-trailers, moving stealthily around a four-hundred-square-mile "Scud box" in the western desert. The plan was for the fighter jets to patrol the box from sunset to sunrise. When a Scud was launched, it would light up the night sky. An F-15E pilot would fly toward the launch point, follow the roads that crisscrossed the desert, and then locate the target using a state-of-the-art, $4.6-million device called a LANTIRN navigation and targeting pod, capable of taking a high-resolution infrared photograph of a four-and-a-half-mile swath below the plane. How hard could it be to pick up a hulking tractor-trailer in the middle of an empty desert?

  Almost immediately, reports of Scud kills began to come back from the field. The Desert Storm commanders were elated. "I remember going out to Nellis Air Force Base after the war," Barry Watts, a former Air Force colonel, says. "They did a big static display, and they had all the Air Force jets that flew in Desert Storm, and they had little placards in front of them, with a box score, explaining what this plane did and that plane did in the war. And, when you added up how many Scud launchers they claimed each got, the total was about a hundred." Air Force officials were not guessing at the number of Scud launchers hit; as far as they were concerned, they knew. They had a four-million-dollar camera, which took a nearly perfect picture, and there are few cultural reflexes more deeply ingrained than the idea that a picture has the weight of truth. "That photography not only does not, but cannot lie, is a matter of belief, an article of faith," Charles Rosen and Henri Zerner have written. "We tend to trust the camera more than our own eyes." Thus was victory declared in the Scud hunt - until hostilities ended and the Air Force appointed a team to determine the effectiveness of the air campaigns in Desert Storm. The actual number of definite Scud kills, the team said, was zero.

  The problem was that the pilots were operating at night, when depth perception is impaired. LANTIRN could see in the dark, but the camera worked only when it was pointed in the right place, and the right place wasn't obvious. Meanwhile, the pilot had only about five minutes to find his quarry, because after launch the Iraqis would immediately hide in one of the many culverts underneath the highway between Baghdad and Jordan, and the screen the pilot was using to scan all that desert measured just six inches by six inches. "It was like driving down an interstate looking through a soda straw," Major General Mike DeCuir, who flew numerous Scud-hunt missions throughout the war, recalled. Nor was it clear what a Scud launcher looked like on that screen. "We had an intelligence photo of one on the ground. But you had to imagine what it would look like on a black-and-white screen from twenty thousand feet up and five or more miles away," DeCuir went on. "With the resolution we had at the time, you could tell something was a big truck and that it had wheels, but at that altitude it was hard to tell much more than that." The postwar analysis indicated that a number of the targets the pilots had hit were actually decoys, constructed by the Iraqis from old trucks and spare missile parts. Others were tanker trucks transporting oil on the highway to Jordan. A tanker truck, after all, is a tractor-trailer hauling a long, shiny cylindrical object, and, from twenty thousand feet up at four hundred miles an hour on a six-by-six-inch screen, a long, shiny cylindrical object can look a lot like a missile. "It's a problem we've always had," Watts, who served on the team that did the Gulf War analysis, said. "It's night out. You think you've got something on the sensor. You roll out your weapons. Bombs go off. It's really hard to tell what you did."

  You can build a high-tech camera, capable of taking pictures in the middle of the night, in other words, but the system works only if the camera is pointed in the right place, and even then the pictures are not self-explanatory. They need to be interpreted, and the human task of interpretation is often a bigger obstacle than the technical task of picture-taking. This was the lesson of the Scud hunt: pictures promise to clarify but often confuse. The Zapruder film intensified rather than dispelled the controversy surrounding John F. Kennedy's assassination. The videotape of the beating of Rodney King led to widespread uproar about police brutality; it also served as the basis for a jury's decision to acquit the officers charged with the assault. Perhaps nowhere have these issues been so apparent, however, as in the arena of mammography. Radiologists developed state-of-the-art X-ray cameras and used them to scan women's breasts for tumors, reasoning that, if you can take a nearly perfect picture, you can find and destroy tumors before they go on to do serious damage. Yet there remains a great deal of confusion about the benefits of mammography. Is it possible that we place too much faith in pictures?

  2.

  The head of breast imaging at Memorial Sloan-Kettering Cancer Center, in New York City, is a physician named David Dershaw, a youthful man in his fifties, who bears a striking resemblance to the actor Kevin Spacey. One morning not long ago, he sat down in his office at the back of the Sloan-Kettering Building and tried to explain how to read a mammogram.

  Dershaw began by putting an X-ray on a light box behind his desk. "Cancer shows up as one of two patterns," he said. "You look for lumps and bumps, and you look for calcium. And, if you find it, you have to make a determination: is it acceptable, or is it a pattern that might be due to cancer?" He pointed at the X-ray. "This woman has cancer. She has these tiny little calcifications. Can you see them? Can you see how small they are?" He took out a magnifying glass and placed it over a series of white flecks; as a cancer grows, it produces calcium deposits. "That's the stuff we are looking for," he said.

  Then Dershaw added a series of slides to the light box and began to explain all the varieties that those white flecks came in. Some calcium deposits are oval and lucent. "They're called eggshell calcifications," Dershaw said. "And they're basically benign." Another kind of calcium runs like a railway track on either side of the breast's many blood vessels - that's benign, too. "Then there's calcium that's thick and heavy and looks like popcorn," Dershaw went on. "That's just dead tissue. That's benign. There's another calcification that's little sacs of calcium floating in liquid. It's called 'milk of calcium.' That's another kind of calcium that's always benign." He put a new set of slides against the light. "Then we have calcium that looks like this - irregular. All of these are of different density and different sizes and different configurations. Those are usually benign, but sometimes they are due to cancer. Remember you saw those railway tracks? This is calcium laid down inside a tube as well, but you can see that the outside of the tube is irregular. That's cancer." Dershaw's explanations were beginning to be confusing. "There are certain calcifications in benign tissues that are always benign," he said. "There are certain kinds that are always associated with cancer. But those are the ends of the spectrum, and the vast amount of calcium is somewhere in the middle. And making that differentiation, between whether the calcium is acceptable or not, is not clear-cut."

  The same is true of lumps. Some lumps are simply benign clumps of cells. You can tell they are benign because the walls of the mass look round and smooth; in a cancer, cells proliferate so wildly that the walls of the tumor tend to be ragged and to intrude into the surrounding tissue. But sometimes benign lumps resemble tumors, and sometimes tumors look a lot like benign lumps. And sometimes you have lots of masses that, taken individually, would be suspicious but are so pervasive that the reasonable conclusion is that this is just how the woman's breast looks. "If you have a CAT scan of the chest, the heart always looks like the heart, the aorta always looks like the aorta," Dershaw said. "So when there's a lump in the middle of that, it's clearly abnormal. Looking at a mammogram is conceptually different from looking at images elsewhere in the body. Everything else has anatomy - anatomy that essentially looks the same from one person to the next. But we don't have that kind of standardized information on the breast. The most difficult decision I think anybody needs to make when we're confronted with a patient is: Is this person normal? And we have to decide that without a pattern that is reasonably stable from individual to individual, and sometimes even without a pattern that is the same from the left side to the right."

  Dershaw was saying that mammography doesn't fit our normal expectations of pictures. In the days before the invention of photography, for instance, a horse in motion was represented in drawings and paintings according to the convention of ventre à terre, or "belly to the ground." Horses were drawn with their front legs extended beyond their heads, and their hind legs stretched straight back, because that was the way, in the blur of movement, a horse seemed to gallop. Then, in the eighteen-seventies, came Eadweard Muybridge, with his famous sequential photographs of a galloping horse, and that was the end of ventre à terre. Now we knew how a horse galloped. The photograph promised that we would now be able to capture reality itself.

  The situation with mammography is different. The way in which we ordinarily speak about calcium and lumps is clear and unambiguous. But the picture demonstrates how blurry those seemingly distinct categories actually are. Joann Elmore, a physician and epidemiologist at the University of Washington Harborview Medical Center, once asked ten board-certified radiologists to look at a hundred and fifty mammograms - of which twenty-seven had come from women who developed breast cancer, and a hundred and twenty-three from women who were known to have been healthy. One radiologist caught eighty-five per cent of the cancers the first time around. Another caught only thirty-seven per cent. One looked at the same X-rays and saw suspicious masses in seventy-eight per cent of the cases. Another doctor saw "focal asymmetric density" in half of the cancer cases; yet another saw no "focal asymmetric density" at all. There was one particularly perplexing mammogram that three radiologists thought was normal, two thought was abnormal but probably benign, four couldn't make up their minds about, and one was convinced was cancer. (The patient was fine.) Some of these differences are a matter of skill, and there is good evidence that with more rigorous training and experience radiologists can become better at reading breast X-rays. But so much of what can be seen on an X-ray falls into a gray area that interpreting a mammogram is also, in part, a matter of temperament. Some radiologists see something ambiguous and are comfortable calling it normal. Others see something ambiguous and get suspicious.

  Does that mean radiologists ought to be as suspicious as possible? You might think so, but caution simply creates another kind of problem. The radiologist in the Elmore study who caught the most cancers also recommended immediate workups - a biopsy, an ultrasound, or additional X-rays - on sixty-four per cent of the women who didn't have cancer. In the real world, a radiologist who needlessly subjected such an extraordinary percentage of healthy patients to the time, expense, anxiety, and discomfort of biopsies and further testing would find himself seriously out of step with his profession. Mammography is not a form of medical treatment, where doctors are justified in going to heroic lengths on behalf of their patients. Mammography is a form of medical screening: it is supposed to exclude the healthy, so that more time and attention can be given to the sick. If screening doesn't screen, it ceases to be useful.

  Gilbert Welch, a medical-outcomes expert at Dartmouth, has pointed out that, given current breast-cancer mortality rates, nine out of every thousand sixty-year-old women will die of breast cancer in the next ten years. If every one of those women had a mammogram every year, that number would fall to six. The radiologist seeing those thousand women, in other words, would read ten thousand X-rays over a decade in order to save three lives - and that's using the most generous possible estimate of mammography's effectiveness. The reason a radiologist is required to assume that the overwhelming number of ambiguous things are normal, in other words, is that the overwhelming number of ambiguous things really are normal. Radiologists are, in this sense, a lot like baggage screeners at airports. The chances are that the dark mass in the middle of the suitcase isn't a bomb, because you've seen a thousand dark masses like it in suitcases before, and none of those were bombs - and if you flagged every suitcase with something ambiguous in it no one would ever make his flight. But that doesn't mean, of course, that it isn't a bomb. All you have to go on is what it looks like on the X-ray screen - and the screen seldom gives you quite enough information.
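
  To make Welch's arithmetic explicit--this is simply a restatement of the figures above, not an independent clinical estimate:

\[
\begin{aligned}
\text{Deaths without screening:}\quad & 9 \text{ per } 1{,}000 \text{ women over ten years} \\
\text{Deaths with annual mammography:}\quad & 6 \text{ per } 1{,}000 \\
\text{Lives saved:}\quad & 9 - 6 = 3 \\
\text{X-rays read:}\quad & 1{,}000 \text{ women} \times 10 \text{ years} = 10{,}000 \\
\text{X-rays per life saved:}\quad & 10{,}000 / 3 \approx 3{,}300
\end{aligned}
\]

  Roughly one mammogram in every few thousand, in other words, makes the difference--which is why a radiologist must treat the overwhelming majority of ambiguous shadows as normal.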

  3.

  Dershaw picked up a new X-ray and put it on the light box. It belonged to a forty-eight-year-old woman. Mammograms indicate density in the breast: the denser the tissue is, the more the X-rays are absorbed, creating the variations in black and white that make up the picture. Fat hardly absorbs the beam at all, so it shows up as black. Breast tissue, particularly the thick breast tissue of younger women, shows up on an X-ray as shades of light gray or white. This woman's breasts consisted of fat at the back of the breast and more dense, glandular tissue toward the front, so the X-ray was mostly black, with what looked like a large white, dense cloud behind the nipple. Clearly visible, in the black, fatty portion of the left breast, was a white spot. "Now, that looks like a cancer, that little smudgy, irregular, infiltrative thing," Dershaw said. "It's about five millimetres across." He looked at the X-ray for a moment. This was mammography at its best: a clear picture of a problem that needed to be fixed. Then he took a pen and pointed to the thick cloud just to the right of the tumor. The cloud and the tumor were exactly the same color. "That cancer only shows up because it's in the fatty part of the breast," he said. "If you take that cancer and put it in the dense part of the breast, you'd never see it, because the whiteness of the mass is the same as the whiteness of normal tissue. If the tumor was over there, it could be four times as big and we still wouldn't see it."

  What's more, mammography is especially likely to miss the tumors that do the most harm. A team led by the research pathologist Peggy Porter analyzed four hundred and twenty-nine breast cancers that had been diagnosed over five years at the Group Health Cooperative of Puget Sound. Of those, two hundred and seventy-nine were picked up by mammography, and the bulk of them were detected very early, at what is called Stage One. (Cancer is classified into four stages, according to how far the tumor has spread from its original position.) Most of the tumors were small, less than two centimetres. Pathologists grade a tumor's aggression according to such measures as the "mitotic count" - the rate at which the cells are dividing - and the screen-detected tumors were graded "low" in almost seventy per cent of the cases. These were the kinds of cancers that could probably be treated successfully. "Most tumors develop very, very slowly, and those tend to lay down calcium deposits - and what mammograms are doing is picking up those calcifications," Leslie Laufman, a hematologist-oncologist in Ohio, who served on a recent National Institutes of Health breast-cancer advisory panel, said. "Almost by definition, mammograms are picking up slow-growing tumors."

  A hundred and fifty cancers in Porter's study, however, were missed by mammography. Some of these were tumors the mammogram couldn't see - that were, for instance, hiding in the dense part of the breast. The majority, though, simply didn't exist at the time of the mammogram. These cancers were found in women who had had regular mammograms, and who were legitimately told that they showed no sign of cancer on their last visit. In the interval between X-rays, however, either they or their doctor had manually discovered a lump in their breast, and these "interval" cancers were twice as likely to be in Stage Three and three times as likely to have high mitotic counts; twenty-eight per cent had spread to the lymph nodes, as opposed to eighteen per cent of the screen-detected cancers. These tumors were so aggressive that they had gone from undetectable to detectable in the interval between two mammograms.

  The problem of interval tumors explains why the overwhelming majority of breast-cancer experts insist that women in the critical fifty-to-sixty-nine age group get regular mammograms. In Porter's study, the women were X-rayed at intervals as great as every three years, and that created a window large enough for interval cancers to emerge. Interval cancers also explain why many breast-cancer experts believe that mammograms must be supplemented by regular and thorough clinical breast exams. ("Thorough" is defined as palpation of the area from the collarbone to the bottom of the rib cage, one dime-size area at a time, at three levels of pressure - just below the skin, the mid-breast, and up against the chest wall - by a specially trained practitioner for a period not less than five minutes per breast.) In a major study of mammography's effectiveness - one of a pair of Canadian trials conducted in the nineteen-eighties - women who were given regular, thorough breast exams but no mammograms were compared with those who had thorough breast exams and regular mammograms, and no difference was found in the death rates from breast cancer between the two groups. The Canadian studies are controversial, and some breast-cancer experts are convinced that they may have understated the benefits of mammography. But there is no denying the basic lessons of the Canadian trials: that a skilled pair of fingertips can find out an extraordinary amount about the health of a breast, and that we should not automatically value what we see in a picture over what we learn from our other senses.

  "The finger has hundreds of sensors per square centimetre," says Mark Goldstein, a sensory psychophysicist who co-founded MammaCare, a company devoted to training nurses and physicians in the art of the clinical exam. "There is nothing in science or technology that has even come close to the sensitivity of the human finger with respect to the range of stimuli it can pick up. It's a brilliant instrument. But we simply don't trust our tactile sense as much as our visual sense."

  4.

  On August 17, 1943, two hundred B-17 bombers from the United States Eighth Air Force set out from England for the German city of Schweinfurt. Two months later, two hundred and twenty-eight B-17s set out to strike Schweinfurt a second time. The two daylight raids were among the heaviest bombing missions of the war, and the Allied experience at Schweinfurt is an example of a more subtle - but in some cases more serious - problem with the picture paradigm.

  The Schweinfurt raids grew out of the United States military's commitment to bombing accuracy. As Stephen Budiansky writes in his wonderful recent book "Air Power," the chief lesson of aerial bombardment in the First World War was that hitting a target from eight or ten thousand feet was a prohibitively difficult task. In the thick of battle, the bombardier had to adjust for the speed of the plane, the speed and direction of the prevailing winds, and the pitching and rolling of the plane, all while keeping the bombsight level with the ground. It was an impossible task, requiring complex trigonometric calculations. For a variety of reasons, including the technical challenges, the British simply abandoned the quest for precision: in both the First World War and the Second, the British military pursued a strategy of "morale" or "area" bombing, in which bombs were simply dropped, indiscriminately, on urban areas, with the intention of killing, dispossessing, and dispiriting the German civilian population.

  But the American military believed that the problem of bombing accuracy was solvable, and a big part of the solution was something called the Norden bombsight. This breakthrough was the work of a solitary, cantankerous genius named Carl Norden, who operated out of a factory in New York City. Norden built a fifty-pound mechanical computer called the Mark XV, which used gears and wheels and gyroscopes to calculate airspeed, altitude, and crosswinds in order to determine the correct bomb-release point. The Mark XV, Norden's business partner boasted, could put a bomb in a pickle barrel from twenty thousand feet. The United States spent $1.5 billion developing it, which, as Budiansky points out, was more than half the amount that was spent building the atomic bomb. "At air bases, the Nordens were kept under lock and key in secure vaults, escorted to their planes by armed guards, and shrouded in a canvas cover until after takeoff," Budiansky recounts. The American military, convinced that its bombers could now hit whatever they could see, developed a strategic approach to bombing, identifying and selectively destroying targets that were critical to the Nazi war effort. In early 1943, General Henry (Hap) Arnold - the head of the Army Air Forces - assembled a group of prominent civilians to analyze the German economy and recommend critical targets. The Advisory Committee on Bombardment, as it was called, determined that the United States should target Germany's ball-bearing factories, since ball bearings were critical to the manufacture of airplanes. And the center of the German ball-bearing industry was Schweinfurt. Allied losses from the two raids were staggering. Thirty-six B-17s were shot down in the August attack, sixty-two bombers were shot down in the October raid, and between the two operations a further hundred and thirty-eight planes were badly damaged. Yet, with the war in the balance, this was considered worth the price. When the damage reports came in, Arnold exulted, "Now we have got Schweinfurt!" He was wrong.

  The problem was not, as in the case of the Scud hunt, that the target could not be found, or that what was thought to be the target was actually something else. The B-17s, aided by their Norden Mark XVs, hit the ball-bearing factories hard. The problem was that the picture Air Force officers had of their target didn't tell them what they really needed to know. The Germans, it emerged, had ample stockpiles of ball bearings. They also had no difficulty increasing their imports from Sweden and Switzerland, and, through a few simple design changes, they were able to greatly reduce their need for ball bearings in aircraft production. What's more, although the factory buildings were badly damaged by the bombing, the machinery inside wasn't. Ball-bearing equipment turned out to be surprisingly hardy. "As it was, not a tank, plane, or other piece of weaponry failed to be produced because of lack of ball bearings," Albert Speer, the Nazi production chief, wrote after the war. Seeing a problem and understanding it, then, are two different things.

  In recent years, with the rise of highly accurate long-distance weaponry, the Schweinfurt problem has become even more acute. If you can aim at and hit the kitchen at the back of a house, after all, you don't have to bomb the whole building. So your bomb can be two hundred pounds rather than a thousand. That means, in turn, that you can fit five times as many bombs on a single plane and hit five times as many targets in a single sortie, which sounds good - except that now you need to get intelligence on five times as many targets. And that intelligence has to be five times more specific, because if the target is in the bedroom and not the kitchen, you've missed him.

  This is the issue that the United States command faced in the most recent Iraq war. Early in the campaign, the military mounted a series of air strikes against specific targets, where Saddam Hussein or other senior Baathist officials were thought to be hiding. There were fifty of these so-called "decapitation" attempts, each taking advantage of the fact that modern-day G.P.S.-guided bombs can be delivered from a fighter to within thirteen metres of their intended target. The strikes were dazzling in their precision. In one case, a restaurant was levelled. In another, a bomb burrowed down into a basement. But, in the end, every single strike failed. "The issue isn't accuracy," Watts, who has written extensively on the limitations of high-tech weaponry, says. "The issue is the quality of targeting information. The amount of information we need has gone up an order of magnitude or two in the last decade."

  5.

  Mammography has a Schweinfurt problem as well. Nowhere is that more evident than in the case of the breast lesion known as ductal carcinoma in situ, or DCIS, which shows up as a cluster of calcifications inside the ducts that carry milk to the nipple. It's a tumor that hasn't spread beyond those ducts, and it is so tiny that without mammography few women with DCIS would ever know they had it. In the past couple of decades, as more and more people have received regular breast X-rays and the resolution of mammography has increased, diagnoses of DCIS have soared. About fifty thousand new cases are now found every year in the United States, and virtually every DCIS lesion detected by mammography is promptly removed. But what has the targeting and destruction of DCIS meant for the battle against breast cancer? You'd expect that if we've been catching fifty thousand early-stage cancers every year, we should be seeing a corresponding decrease in the number of late-stage invasive cancers. It's not clear whether we have. During the past twenty years, the incidence of invasive breast cancer has continued to rise by the same small, steady increment every year.

  In 1987, pathologists in Denmark performed a series of autopsies of women in their forties who had not been known to have breast cancer when they died of other causes. The pathologists looked at an average of two hundred and seventy-five samples of breast tissue in each case, and found some evidence of cancer - usually DCIS - in nearly forty per cent of the women. Since breast cancer accounts for less than four per cent of female deaths, clearly the overwhelming majority of these women, had they lived longer, would never have died of breast cancer. "To me, that indicates that these kinds of genetic changes happen really frequently, and that they can happen without having an impact on women's health," Karla Kerlikowske, a breast-cancer expert at the University of California at San Francisco, says. "The body has this whole mechanism to repair things, and maybe that's what happened with these tumors." Gilbert Welch, the medical-outcomes expert, thinks that we fail to understand the hit-or-miss nature of cancerous growth, and assume it to be a process that, in the absence of intervention, will eventually kill us. "A pathologist from the International Agency for Research on Cancer once told me that the biggest mistake we ever made was attaching the word 'carcinoma' to DCIS," Welch says. "The minute carcinoma got linked to it, it all of a sudden drove doctors to recommend therapy, because what was implied was that this was a lesion that would inexorably progress to invasive cancer. But we know that that's not always the case."

  In some percentage of cases, however, DCIS does progress to something more serious. Some studies suggest that this happens very infrequently. Others suggest that it happens frequently enough to be of major concern. There is no definitive answer, and it's all but impossible to tell, simply by looking at a mammogram, whether a given DCIS tumor is among those lesions which will grow out from the duct or part of the majority that will never amount to anything. That's why some doctors feel that we have no choice but to treat every DCIS as life-threatening, and in thirty per cent of cases that means a mastectomy, and in another thirty-five per cent it means a lumpectomy and radiation. Would taking a better picture solve the problem? Not really, because the problem is that you don't know for sure what you're seeing, and as pictures have become better we have put ourselves in a position where we see more and more things that we don't know how to interpret. When it comes to DCIS, the mammogram delivers information without true understanding. "Almost half a million women have been diagnosed and treated for DCIS since the early nineteen-eighties - a diagnosis virtually unknown before then," Welch writes in his new book, "Should I Be Tested for Cancer?," a brilliant account of the statistical and medical uncertainties surrounding cancer screening. "This increase is the direct result of looking harder - in this case with 'better' mammography equipment. But I think you can see why it is a diagnosis that some women might reasonably prefer not to know about."

  6.

  The disturbing thing about DCIS, of course, is that our approach to this tumor seems like a textbook example of how the battle against cancer is supposed to work. Use a powerful camera. Take a detailed picture. Spot the tumor as early as possible. Treat it immediately and aggressively. The campaign to promote regular mammograms has used this early-detection argument with great success, because it makes intuitive sense. The danger posed by a tumor is represented visually. Large is bad; small is better - less likely to have metastasized. But here, too, tumors defy our visual intuitions.

  According to Donald Berry, who is the chairman of the Department of Biostatistics and Applied Mathematics at M. D. Anderson Cancer Center, in Houston, a woman's risk of death increases only by about ten per cent for every additional centimetre in tumor length. "Suppose there is a tumor size above which the tumor is lethal, and below which it's not," Berry says. "The problem is that the threshold varies. When we find a tumor, we don't know whether it has metastasized already. And we don't know whether it's tumor size that drives the metastatic process or whether all you need is a few million cells to start sloughing off to other parts of the body. We do observe that it's worse to have a bigger tumor. But not amazingly worse. The relationship is not as great as you'd think."

  In a recent genetic analysis of breast-cancer tumors, scientists selected women with breast cancer who had been followed for many years, and divided them into two groups - those whose cancer had gone into remission, and those whose cancer spread to the rest of their body. Then the scientists went back to the earliest moment that each cancer became apparent, and analyzed thousands of genes in order to determine whether it was possible to predict, at that moment, who was going to do well and who wasn't. Early detection presumes that it isn't possible to make that prediction: a tumor is removed before it becomes truly dangerous. But scientists discovered that even with tumors in the one-centimetre range - the range in which cancer is first picked up by a mammogram - the fate of the cancer seems already to have been set. "What we found is that there is biology that you can glean from the tumor, at the time you take it out, that is strongly predictive of whether or not it will go on to metastasize," Stephen Friend, a member of the gene-expression team at Merck, says. "We like to think of a small tumor as an innocent. The reality is that in that innocent lump are a lot of behaviors that spell a potential poor or good prognosis."

  The good news here is that it might eventually be possible to screen breast cancers on a genetic level, using other kinds of tests - even blood tests - to look for the biological traces of those genes. This might also help with the chronic problem of overtreatment in breast cancer. If we can single out that small percentage of women whose tumors will metastasize, we can spare the rest the usual regimen of surgery, radiation, and chemotherapy. Gene-signature research is one of a number of reasons that many scientists are optimistic about the fight against breast cancer. But it is an advance that has nothing to do with taking more pictures, or taking better pictures. It has to do with going beyond the picture.

  Under the circumstances, it is not hard to understand why mammography draws so much controversy. The picture promises certainty, and it cannot deliver on that promise. Even after forty years of research, there remains widespread disagreement over how much benefit women in the critical fifty-to-sixty-nine age bracket receive from breast X-rays, and further disagreement about whether there is enough evidence to justify regular mammography in women under fifty and over seventy. Is there any way to resolve the disagreement? Donald Berry says that there probably isn't - that a clinical trial that could definitively answer the question of mammography's precise benefits would have to be so large (involving more than five hundred thousand women) and so expensive (costing billions of dollars) as to be impractical. The resulting confusion has turned radiologists who do mammograms into one of the chief targets of malpractice litigation. "The problem is that mammographers - radiology groups - do hundreds of thousands of these mammograms, giving women the illusion that these things work and they are good, and if a lump is found and in most cases if it is found early, they tell women they have the probability of a higher survival rate," says E. Clay Parker, a Florida plaintiff's attorney, who recently won a $5.1 million judgment against an Orlando radiologist. "But then, when it comes to defending themselves, they tell you that the reality is that it doesn't make a difference when you find it. So you scratch your head and say, 'Well, why do you do mammography, then?'"

  The answer is that mammograms do not have to be infallible to save lives. A modest estimate of mammography's benefit is that it reduces the risk of dying from breast cancer by about ten per cent - which works out, for the average woman in her fifties, to be about three extra days of life, or, to put it another way, a health benefit on a par with wearing a helmet on a ten-hour bicycle trip. That is not a trivial benefit. Multiplied across the millions of adult women in the United States, it amounts to thousands of lives saved every year, and, in combination with a medical regimen that includes radiation, surgery, and new and promising drugs, it has helped brighten the prognosis for women with breast cancer. Mammography isn't as good as we'd like it to be. But we are still better off than we would be without it.

  "There is increasingly an understanding among those of us who do this a lot that our efforts to sell mammography may have been over-vigorous," Dershaw said, "and that although we didn't intend to, the perception may have been that mammography accomplishes even more than it does." He was looking, as he spoke, at the mammogram of the woman whose tumor would have been invisible had it been a few centimetres to the right. Did looking at an X-ray like that make him nervous? Dershaw shook his head. "You have to respect the limitations of the technology," he said. "My job with the mammogram isn't to find what I can't find with a mammogram. It's to find what I can find with a mammogram. If I'm not going to accept that, then I shouldn't be reading mammograms."

  7.

  In February of last year, just before the start of the Iraq war, Secretary of State Colin Powell went before the United Nations to declare that Iraq was in defiance of international law. He presented transcripts of telephone conversations between senior Iraqi military officials, purportedly discussing attempts to conceal weapons of mass destruction. He told of eyewitness accounts of mobile biological-weapons facilities. And, most persuasively, he presented a series of images - carefully annotated, high-resolution satellite photographs of what he said was the Iraqi chemical-munitions facility at Taji.

  "Let me say a word about satellite images before I show a couple," Powell began. "The photos that I am about to show you are sometimes hard for the average person to interpret, hard for me. The painstaking work of photo analysis takes experts with years and years of experience, poring for hours and hours over light tables. But as I show you these images, I will try to capture and explain what they mean, what they indicate, to our imagery specialists." The first photograph was dated November 10, 2002, just three months earlier, and years after the Iraqis were supposed to have rid themselves of all weapons of mass destruction. "Let me give you a closer look," Powell said as he flipped to a closeup of the first photograph. It showed a rectangular building, with a vehicle parked next to it. "Look at the image on the left. On the left is a closeup of one of the four chemical bunkers. The two arrows indicate the presence of sure signs that the bunkers are storing chemical munitions. The arrow at the top that says 'Security' points to a facility that is a signature item for this kind of bunker. Inside that facility are special guards and special equipment to monitor any leakage that might come out of the bunker." Then he moved to the vehicle next to the building. It was, he said, another signature item. "It's a decontamination vehicle in case something goes wrong. . . . It is moving around those four and it moves as needed to move as people are working in the different bunkers."

  Powell's analysis assumed, of course, that you could tell from the picture what kind of truck it was. But pictures of trucks, taken from above, are not always as clear as we would like; sometimes trucks hauling oil tanks look just like trucks hauling Scud launchers, and, while a picture is a good start, if you really want to know what you're looking at you probably need more than a picture. I looked at the photographs with Patrick Eddington, who for many years was an imagery analyst with the C.I.A. Eddington examined them closely. "They're trying to say that those are decontamination vehicles," he told me. He had a photo up on his laptop, and he peered closer to get a better look. "But the resolution is sufficient for me to say that I don't think it is - and I don't see any other decontamination vehicles down there that I would recognize." The standard decontamination vehicle was a Soviet-made box-body van, Eddington said. This truck was too long. For a second opinion, Eddington recommended Ray McGovern, a twenty-seven-year C.I.A. analyst, who had been one of George H. W. Bush's personal intelligence briefers when he was Vice-President. "If you're an expert, you can tell one hell of a lot from pictures like this," McGovern said. He'd heard another interpretation. "I think," he said, "that it's a fire truck."

  Should a charge of plagiarism ruin your life?

  1.

  One day this spring, a psychiatrist named Dorothy Lewis got a call from her friend Betty, who works in New York City. Betty had just seen a Broadway play called "Frozen," written by the British playwright Bryony Lavery. "She said, 'Somehow it reminded me of you. You really ought to see it,'" Lewis recalled. Lewis asked Betty what the play was about, and Betty said that one of the characters was a psychiatrist who studied serial killers. "And I told her, 'I need to see that as much as I need to go to the moon.'"

  Lewis has studied serial killers for the past twenty-five years. With her collaborator, the neurologist Jonathan Pincus, she has published a great many research papers, showing that serial killers tend to suffer from predictable patterns of psychological, physical, and neurological dysfunction: that they were almost all the victims of harrowing physical and sexual abuse as children, and that almost all of them have suffered some kind of brain injury or mental illness. In 1998, she published a memoir of her life and work entitled "Guilty by Reason of Insanity." She was the last person to visit Ted Bundy before he went to the electric chair. Few people in the world have spent as much time thinking about serial killers as Dorothy Lewis, so when her friend Betty told her that she needed to see "Frozen" it struck her as a busman's holiday.

  But the calls kept coming. "Frozen" was winning raves on Broadway, and it had been nominated for a Tony. Whenever someone who knew Dorothy Lewis saw it, they would tell her that she really ought to see it, too. In June, she got a call from a woman at the theatre where "Frozen" was playing. "She said she'd heard that I work in this field, and that I see murderers, and she was wondering if I would do a talk-back after the show," Lewis said. "I had done that once before, and it was a delight, so I said sure. And I said, would you please send me the script, because I wanted to read the play."

  The script came, and Lewis sat down to read it. Early in the play, something caught her eye, a phrase: "it was one of those days." One of the murderers Lewis had written about in her book had used that same expression. But she thought it was just a coincidence. "Then, there's a scene of a woman on an airplane, typing away to her friend. Her name is Agnetha Gottmundsdottir. I read that she's writing to her colleague, a neurologist called David Nabkus. And with that I realized that more was going on, and I realized as well why all these people had been telling me to see the play."

  Lewis began underlining line after line. She had worked at New York University School of Medicine. The psychiatrist in "Frozen" worked at New York School of Medicine. Lewis and Pincus did a study of brain injuries among fifteen death-row inmates. Gottmundsdottir and Nabkus did a study of brain injuries among fifteen death-row inmates. Once, while Lewis was examining the serial killer Joseph Franklin, he sniffed her, in a grotesque, sexual way. Gottmundsdottir is sniffed by the play's serial killer, Ralph. Once, while Lewis was examining Ted Bundy, she kissed him on the cheek. Gottmundsdottir, in some productions of "Frozen," kisses Ralph. "The whole thing was right there," Lewis went on. "I was sitting at home reading the play, and I realized that it was I. I felt robbed and violated in some peculiar way. It was as if someone had stolen--I don't believe in the soul, but, if there was such a thing, it was as if someone had stolen my essence."

  Lewis never did the talk-back. She hired a lawyer. And she came down from New Haven to see "Frozen." "In my book," she said, "I talk about where I rush out of the house with my black carry-on, and I have two black pocketbooks, and the play opens with her"--Agnetha--"with one big black bag and a carry-on, rushing out to do a lecture." Lewis had written about biting her sister on the stomach as a child. Onstage, Agnetha fantasized out loud about attacking a stewardess on an airplane and "biting out her throat." After the play was over, the cast came onstage and took questions from the audience. "Somebody in the audience said, 'Where did Bryony Lavery get the idea for the psychiatrist?'" Lewis recounted. "And one of the cast members, the male lead, said, 'Oh, she said that she read it in an English medical magazine.'" Lewis is a tiny woman, with enormous, childlike eyes, and they were wide open now with the memory. "I wouldn't have cared if she did a play about a shrink who's interested in the frontal lobe and the limbic system. That's out there to do. I see things week after week on television, on 'Law & Order' or 'C.S.I.,' and I see that they are using material that Jonathan and I brought to light. And it's wonderful. That would have been acceptable. But she did more than that. She took things about my own life, and that is the part that made me feel violated."

  At the request of her lawyer, Lewis sat down and made up a chart detailing what she felt were the questionable parts of Lavery's play. The chart was fifteen pages long. The first part was devoted to thematic similarities between "Frozen" and Lewis's book "Guilty by Reason of Insanity." The other, more damning section listed twelve instances of almost verbatim similarities--totalling perhaps six hundred and seventy-five words--between passages from "Frozen" and passages from a 1997 magazine profile of Lewis. The profile was called "Damaged." It appeared in the February 24, 1997, issue of The New Yorker. It was written by me.

  2.

  Words belong to the person who wrote them. There are few simpler ethical notions than this one, particularly as society directs more and more energy and resources toward the creation of intellectual property. In the past thirty years, copyright laws have been strengthened. Courts have become more willing to grant intellectual-property protections. Fighting piracy has become an obsession with Hollywood and the recording industry, and, in the worlds of academia and publishing, plagiarism has gone from being bad literary manners to something much closer to a crime. When, two years ago, Doris Kearns Goodwin was found to have lifted passages from several other historians, she was asked to resign from the board of the Pulitzer Prize committee. And why not? If she had robbed a bank, she would have been fired the next day.

  I'd worked on "Damaged" through the fall of 1996. I would visit Dorothy Lewis in her office at Bellevue Hospital, and watch the videotapes of her interviews with serial killers. At one point, I met up with her in Missouri. Lewis was testifying at the trial of Joseph Franklin, who claims responsibility for shooting, among others, the civil-rights leader Vernon Jordan and the pornographer Larry Flynt. In the trial, a videotape was shown of an interview that Franklin once gave to a television station. He was asked whether he felt any remorse. I wrote:

"I can't say that I do," he said. He paused again, then added, "The only thing I'm sorry about is that it's not legal."
"What's not legal?"
Franklin answered as if he'd been asked the time of day: "Killing Jews."

  That exchange, almost to the word, was reproduced in "Frozen."

  Lewis, the article continued, didn't feel that Franklin was fully responsible for his actions. She viewed him as a victim of neurological dysfunction and childhood physical abuse. "The difference between a crime of evil and a crime of illness," I wrote, "is the difference between a sin and a symptom." That line was in "Frozen," too--not once but twice. I faxed Bryony Lavery a letter:

I am happy to be the source of inspiration for other writers, and had you asked for my permission to quote--even liberally--from my piece, I would have been delighted to oblige. But to lift material, without my approval, is theft.

  Almost as soon as I'd sent the letter, though, I began to have second thoughts. The truth was that, although I said I'd been robbed, I didn't feel that way. Nor did I feel particularly angry. One of the first things I had said to a friend after hearing about the echoes of my article in "Frozen" was that this was the only way I was ever going to get to Broadway--and I was only half joking. On some level, I considered Lavery's borrowing to be a compliment. A savvier writer would have changed all those references to Lewis, and rewritten the quotes from me, so that their origin was no longer recognizable. But how would I have been better off if Lavery had disguised the source of her inspiration?

  Dorothy Lewis, for her part, was understandably upset. She was considering a lawsuit. And, to increase her odds of success, she asked me to assign her the copyright to my article. I agreed, but then I changed my mind. Lewis had told me that she "wanted her life back." Yet in order to get her life back, it appeared, she first had to acquire it from me. That seemed a little strange.

  Then I got a copy of the script for "Frozen." I found it breathtaking. I realize that this isn't supposed to be a relevant consideration. And yet it was: instead of feeling that my words had been taken from me, I felt that they had become part of some grander cause. In late September, the story broke. The Times, the Observer in England, and the Associated Press all ran stories about Lavery's alleged plagiarism, and the articles were picked up by newspapers around the world. Bryony Lavery had seen one of my articles, responded to what she read, and used it as she constructed a work of art. And now her reputation was in tatters. Something about that didn't seem right.

  3.

  In 1992, the Beastie Boys released a song called "Pass the Mic," which begins with a six-second sample taken from the 1976 composition "Choir," by the jazz flutist James Newton. The sample was an exercise in what is called multiphonics, where the flutist "overblows" into the instrument while simultaneously singing in a falsetto. In the case of "Choir," Newton played a C on the flute, then sang C, D-flat, C--and the distortion of the overblown C, combined with his vocalizing, created a surprisingly complex and haunting sound. In "Pass the Mic," the Beastie Boys repeated the Newton sample more than forty times. The effect was riveting.

  In the world of music, copyrighted works fall into two categories--the recorded performance and the composition underlying that performance. If you write a rap song, and want to sample the chorus from Billy Joel's "Piano Man," you first have to get permission from the record label to use the "Piano Man" recording, and then get permission from Billy Joel (or whoever owns his music) to use the underlying composition. In the case of "Pass the Mic," the Beastie Boys got the first kind of permission--the rights to use the recording of "Choir"--but not the second. Newton sued, and he lost--and the reason he lost serves as a useful introduction to how to think about intellectual property.

  At issue in the case wasn't the distinctiveness of Newton's performance. The Beastie Boys, everyone agreed, had properly licensed Newton's performance when they paid the copyright recording fee. And there was no question that they had copied the music underlying the sample. At issue was simply whether the Beastie Boys were required to ask for that secondary permission: was the composition underneath those six seconds so distinctive and original that Newton could be said to own it? The court said that it wasn't.

  The chief expert witness for the Beastie Boys in the "Choir" case was Lawrence Ferrara, who is a professor of music at New York University, and when I asked him to explain the court's ruling he walked over to the piano in the corner of his office and played those three notes: C, D-flat, C. "That's it!" he shouted. "There ain't nothing else! That's what was used. You know what this is? It's no more than a mordent, a turn. It's been done thousands upon thousands of times. No one can say they own that."

  Ferrara then played the most famous four-note sequence in classical music, the opening of Beethoven's Fifth: G, G, G, E-flat. This was unmistakably Beethoven. But was it original? "That's a harder case," Ferrara said. "Actually, though, other composers wrote that. Beethoven himself wrote that in a piano sonata, and you can find figures like that in composers who predate Beethoven. It's one thing if you're talking about da-da-da dummm, da-da-da dummm--those notes, with those durations. But just the four pitches, G, G, G, E-flat? Nobody owns those."

  Ferrara once served as an expert witness for Andrew Lloyd Webber, who was being sued by Ray Repp, a composer of Catholic folk music. Repp said that the opening few bars of Lloyd Webber's 1984 "Phantom Song," from "The Phantom of the Opera," bore an overwhelming resemblance to his composition "Till You," written six years earlier, in 1978. As Ferrara told the story, he sat down at the piano again and played the beginning of both songs, one after the other; sure enough, they sounded strikingly similar. "Here's Lloyd Webber," he said, calling out each note as he played it. "Here's Repp. Same sequence. The only difference is that Andrew writes a perfect fourth and Repp writes a sixth."

  But Ferrara wasn't quite finished. "I said, let me have everything Andrew Lloyd Webber wrote prior to 1978--'Jesus Christ Superstar,' 'Joseph,' 'Evita.'" He combed through every score, and in "Joseph and the Amazing Technicolor Dreamcoat" he found what he was looking for. "It's the song 'Benjamin Calypso.'" Ferrara started playing it. It was immediately familiar. "It's the first phrase of 'Phantom Song.' It's even using the same notes. But wait--it gets better. Here's 'Close Every Door,' from a 1969 concert performance of 'Joseph.'" Ferrara is a dapper, animated man, with a thin, well-manicured mustache, and thinking about the Lloyd Webber case was almost enough to make him jump up and down. He began to play again. It was the second phrase of "Phantom Song." "The first half of 'Phantom' is in 'Benjamin Calypso.' The second half is in 'Close Every Door.' They are identical. On the button. In the case of the first theme, in fact, 'Benjamin Calypso' is closer to the first half of the theme at issue than the plaintiff's song. Lloyd Webber writes something in 1984, and he borrows from himself."

  In the "Choir" case, the Beastie Boys' copying didn't amount to theft because it was too trivial. In the "Phantom" case, what Lloyd Webber was alleged to have copied didn't amount to theft because the material in question wasn't original to his accuser. Under copyright law, what matters is not that you copied someone else's work. What matters is what you copied, and how much you copied. Intellectual-property doctrine isn't a straightforward application of the ethical principle "Thou shalt not steal." At its core is the notion that there are certain situations where you can steal. The protections of copyright, for instance, are time-limited; once something passes into the public domain, anyone can copy it without restriction. Or suppose that you invented a cure for breast cancer in your basement lab. Any patent you received would protect your intellectual property for twenty years, but after that anyone could take your invention. You get an initial monopoly on your creation because we want to provide economic incentives for people to invent things like cancer drugs. But everyone gets to steal your breast-cancer cure--after a decent interval--because it is also in society's interest to let as many people as possible copy your invention; only then can others learn from it, and build on it, and come up with better and cheaper alternatives. This balance between the protecting and the limiting of intellectual property is, in fact, enshrined in the Constitution: "Congress shall have the power to promote the Progress of Science and useful Arts, by securing for limited"--note that specification, limited--"Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."

  4.

  So is it true that words belong to the person who wrote them, just as other kinds of property belong to their owners? Actually, no. As the Stanford law professor Lawrence Lessig argues in his new book "Free Culture":

In ordinary language, to call a copyright a "property" right is a bit misleading, for the property of copyright is an odd kind of property. . . . I understand what I am taking when I take the picnic table you put in your backyard. I am taking a thing, the picnic table, and after I take it, you don't have it. But what am I taking when I take the good idea you had to put a picnic table in the backyard--by, for example, going to Sears, buying a table, and putting it in my backyard? What is the thing that I am taking then?
The point is not just about the thingness of picnic tables versus ideas, though that is an important difference. The point instead is that in the ordinary case--indeed, in practically every case except for a narrow range of exceptions--ideas released to the world are free. I don't take anything from you when I copy the way you dress--though I might seem weird if I do it every day. . . . Instead, as Thomas Jefferson said (and this is especially true when I copy the way someone dresses), "He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me."

  Lessig argues that, when it comes to drawing this line between private interests and public interests in intellectual property, the courts and Congress have, in recent years, swung much too far in the direction of private interests. He writes, for instance, about the fight by some developing countries to get access to inexpensive versions of Western drugs through what is called "parallel importation"--buying drugs from another developing country that has been licensed to produce patented medicines. The move would save countless lives. But it has been opposed by the United States not on the ground that it would cut into the profits of Western pharmaceutical companies (they don't sell that many patented drugs in developing countries anyway) but on the ground that it violates the sanctity of intellectual property. "We as a culture have lost this sense of balance," Lessig writes. "A certain property fundamentalism, having no connection to our tradition, now reigns in this culture."

  Even what Lessig decries as intellectual-property extremism, however, acknowledges that intellectual property has its limits. The United States didn't say that developing countries could never get access to cheap versions of American drugs. It said only that they would have to wait until the patents on those drugs expired. The arguments that Lessig has with the hard-core proponents of intellectual property are almost all arguments about where and when the line should be drawn between the right to copy and the right to protection from copying, not whether a line should be drawn.

  But plagiarism is different, and that's what's so strange about it. The ethical rules that govern when it's acceptable for one writer to copy another are even more extreme than the most extreme position of the intellectual-property crowd: when it comes to literature, we have somehow decided that copying is never acceptable. Not long ago, the Harvard law professor Laurence Tribe was accused of lifting material from the historian Henry Abraham for his 1985 book, "God Save This Honorable Court." What did the charge amount to? In an exposé that appeared in the conservative publication The Weekly Standard, Joseph Bottum produced a number of examples of close paraphrasing, but his smoking gun was this one borrowed sentence: "Taft publicly pronounced Pitney to be a 'weak member' of the Court to whom he could not assign cases." That's it. Nineteen words.

  Not long after I learned about "Frozen," I went to see a friend of mine who works in the music industry. We sat in his living room on the Upper East Side, facing each other in easy chairs, as he worked his way through a mountain of CDs. He played "Angel," by the reggae singer Shaggy, and then "The Joker," by the Steve Miller Band, and told me to listen very carefully to the similarity in bass lines. He played Led Zeppelin's "Whole Lotta Love" and then Muddy Waters's "You Need Love," to show the extent to which Led Zeppelin had mined the blues for inspiration. He played "Twice My Age," by Shabba Ranks and Krystal, and then the saccharine seventies pop standard "Seasons in the Sun," until I could hear the echoes of the second song in the first. He played "Last Christmas," by Wham!, followed by Barry Manilow's "Can't Smile Without You" to explain why Manilow might have been startled when he first heard that song, and then "Joanna," by Kool and the Gang, because, in a different way, "Last Christmas" was an homage to Kool and the Gang as well. "That sound you hear in Nirvana," my friend said at one point, "that soft and then loud, kind of exploding thing, a lot of that was inspired by the Pixies. Yet Kurt Cobain"--Nirvana's lead singer and songwriter--"was such a genius that he managed to make it his own. And 'Smells Like Teen Spirit'?"--here he was referring to perhaps the best-known Nirvana song. "That's Boston's 'More Than a Feeling.'" He began to hum the riff of the Boston hit, and said, "The first time I heard 'Teen Spirit,' I said, 'That guitar lick is from "More Than a Feeling."' But it was different--it was urgent and brilliant and new."

  He played another CD. It was Rod Stewart's "Do Ya Think I'm Sexy," a huge hit from the nineteen-seventies. The chorus has a distinctive, catchy hook--the kind of tune that millions of Americans probably hummed in the shower the year it came out. Then he put on "Taj Mahal," by the Brazilian artist Jorge Ben Jor, which was recorded several years before the Rod Stewart song. In his twenties, my friend was a d.j. at various downtown clubs, and at some point he'd become interested in world music. "I caught it back then," he said. A small, sly smile spread across his face. The opening bars of "Taj Mahal" were very South American, a world away from what we had just listened to. And then I heard it. It was so obvious and unambiguous that I laughed out loud; virtually note for note, it was the hook from "Do Ya Think I'm Sexy." It was possible that Rod Stewart had independently come up with that riff, because resemblance is not proof of influence. It was also possible that he'd been in Brazil, listened to some local music, and liked what he heard.

  My friend had hundreds of these examples. We could have sat in his living room playing at musical genealogy for hours. Did the examples upset him? Of course not, because he knew enough about music to know that these patterns of influence--cribbing, tweaking, transforming--were at the very heart of the creative process. True, copying could go too far. There were times when one artist was simply replicating the work of another, and to let that pass inhibited true creativity. But it was equally dangerous to be overly vigilant in policing creative expression, because if Led Zeppelin hadn't been free to mine the blues for inspiration we wouldn't have got "Whole Lotta Love," and if Kurt Cobain couldn't listen to "More Than a Feeling" and pick out and transform the part he really liked we wouldn't have "Smells Like Teen Spirit"--and, in the evolution of rock, "Smells Like Teen Spirit" was a real step forward from "More Than a Feeling." A successful music executive has to understand the distinction between borrowing that is transformative and borrowing that is merely derivative, and that distinction, I realized, was what was missing from the discussion of Bryony Lavery's borrowings. Yes, she had copied my work. But no one was asking why she had copied it, or what she had copied, or whether her copying served some larger purpose.

  5.

  Bryony Lavery came to see me in early October. It was a beautiful Saturday afternoon, and we met at my apartment. She is in her fifties, with short tousled blond hair and pale-blue eyes, and was wearing jeans and a loose green shirt and clogs. There was something rugged and raw about her. In the Times the previous day, the theatre critic Ben Brantley had not been kind to her new play, "Last Easter." This was supposed to be her moment of triumph. "Frozen" had been nominated for a Tony. "Last Easter" had opened Off Broadway. And now? She sat down heavily at my kitchen table. "I've had the absolute gamut of emotions," she said, playing nervously with her hands as she spoke, as if she needed a cigarette. "I think when one's working, one works between absolute confidence and absolute doubt, and I got a huge dollop of each. I was terribly confident that I could write well after 'Frozen,' and then this opened a chasm of doubt." She looked up at me. "I'm terribly sorry," she said.

  Lavery began to explain: "What happens when I write is that I find that I'm somehow zoning on a number of things. I find that I've cut things out of newspapers because the story or something in them is interesting to me, and seems to me to have a place onstage. Then it starts coagulating. It's like the soup starts thickening. And then a story, which is also a structure, starts emerging. I'd been reading thrillers like 'The Silence of the Lambs,' about fiendishly clever serial killers. I'd also seen a documentary of the victims of the Yorkshire killers, Myra Hindley and Ian Brady, who were called the Moors Murderers. They spirited away several children. It seemed to me that killing somehow wasn't fiendishly clever. It was the opposite of clever. It was as banal and stupid and destructive as it could be. There are these interviews with the survivors, and what struck me was that they appeared to be frozen in time. And one of them said, 'If that man was out now, I'm a forgiving man but I couldn't forgive him. I'd kill him.' That's in 'Frozen.' I was thinking about that. Then my mother went into hospital for a very simple operation, and the surgeon punctured her womb, and therefore her intestine, and she got peritonitis and died."

  When Lavery started talking about her mother, she stopped, and had to collect herself. "She was seventy-four, and what occurred to me is that I utterly forgave him. I thought it was an honest mistake. I'm very sorry it happened to my mother, but it's an honest mistake." Lavery's feelings confused her, though, because she could think of people in her own life whom she had held grudges against for years, for the most trivial of reasons. "In a lot of ways, 'Frozen' was an attempt to understand the nature of forgiveness," she said.

  Lavery settled, in the end, on a play with three characters. The first is a serial killer named Ralph, who kidnaps and murders a young girl. The second is the murdered girl's mother, Nancy. The third is a psychiatrist from New York, Agnetha, who goes to England to examine Ralph. In the course of the play, the three lives slowly intersect--and the characters gradually change and become "unfrozen" as they come to terms with the idea of forgiveness. For the character of Ralph, Lavery says that she drew on a book about a serial killer titled "The Murder of Childhood," by Ray Wyre and Tim Tate. For the character of Nancy, she drew on an article written in the Guardian by a woman named Marian Partington, whose sister had been murdered by the serial killers Frederick and Rosemary West. And, for the character of Agnetha, Lavery drew on a reprint of my article that she had read in a British publication. "I wanted a scientist who would understand," Lavery said--a scientist who could explain how it was possible to forgive a man who had killed your daughter, who could explain that a serial killing was not a crime of evil but a crime of illness. "I wanted it to be accurate," she added.

  So why didn't she credit me and Lewis? How could she have been so meticulous about accuracy but not about attribution? Lavery didn't have an answer. "I thought it was O.K. to use it," she said with an embarrassed shrug. "It never occurred to me to ask you. I thought it was news."

  She was aware of how hopelessly inadequate that sounded, and when she went on to say that my article had been in a big folder of source material that she had used in the writing of the play, and that the folder had got lost during the play's initial run, in Birmingham, she was aware of how inadequate that sounded, too.

  But then Lavery began to talk about Marian Partington, her other important inspiration, and her story became more complicated. While she was writing "Frozen," Lavery said, she wrote to Partington to inform her of how much she was relying on Partington's experiences. And when "Frozen" opened in London she and Partington met and talked. In reading through articles on Lavery in the British press, I found this, from the Guardian two years ago, long before the accusations of plagiarism surfaced:

Lavery is aware of the debt she owes to Partington's writing and is eager to acknowledge it.
"I always mention it, because I am aware of the enormous debt that I owe to the generosity of Marian Partington's piece . . . . You have to be hugely careful when writing something like this, because it touches on people's shattered lives and you wouldn't want them to come across it unawares."

  Lavery wasn't indifferent to other people's intellectual property, then; she was just indifferent to my intellectual property. That's because, in her eyes, what she took from me was different. It was, as she put it, "news." She copied my description of Dorothy Lewis's collaborator, Jonathan Pincus, conducting a neurological examination. She copied the description of the disruptive neurological effects of prolonged periods of high stress. She copied my transcription of the television interview with Franklin. She reproduced a quote that I had taken from a study of abused children, and she copied a quotation from Lewis on the nature of evil. She didn't copy my musings, or conclusions, or structure. She lifted sentences like "It is the function of the cortex--and, in particular, those parts of the cortex beneath the forehead, known as the frontal lobes--to modify the impulses that surge up from within the brain, to provide judgment, to organize behavior and decision-making, to learn and adhere to rules of everyday life." It is difficult to have pride of authorship in a sentence like that. My guess is that it's a reworked version of something I read in a textbook. Lavery knew that failing to credit Partington would have been wrong. Borrowing the personal story of a woman whose sister was murdered by a serial killer matters because that story has real emotional value to its owner. As Lavery put it, it touches on someone's shattered life. Are boilerplate descriptions of physiological functions in the same league?

  It also matters how Lavery chose to use my words. Borrowing crosses the line when it is used for a derivative work. It's one thing if you're writing a history of the Kennedys, like Doris Kearns Goodwin, and borrow, without attribution, from another history of the Kennedys. But Lavery wasn't writing another profile of Dorothy Lewis. She was writing a play about something entirely new--about what would happen if a mother met the man who killed her daughter. And she used my descriptions of Lewis's work and the outline of Lewis's life as a building block in making that confrontation plausible. Isn't that the way creativity is supposed to work? Old words in the service of a new idea aren't the problem. What inhibits creativity is new words in the service of an old idea.

  And this is the second problem with plagiarism. It is not merely extremist. It has also become disconnected from the broader question of what does and does not inhibit creativity. We accept the right of one writer to engage in a full-scale knockoff of another--think how many serial-killer novels have been cloned from "The Silence of the Lambs." Yet, when Kathy Acker incorporated parts of a Harold Robbins sex scene verbatim in a satiric novel, she was denounced as a plagiarist (and threatened with a lawsuit). When I worked at a newspaper, we were routinely dispatched to "match" a story from the Times: to do a new version of someone else's idea. But had we "matched" any of the Times' words--even the most banal of phrases--it could have been a firing offense. The ethics of plagiarism have turned into the narcissism of small differences: because journalism cannot own up to its heavily derivative nature, it must enforce originality on the level of the sentence.

  Dorothy Lewis says that one of the things that hurt her most about "Frozen" was that Agnetha turns out to have had an affair with her collaborator, David Nabkus. Lewis feared that people would think she had had an affair with her collaborator, Jonathan Pincus. "That's slander," Lewis told me. "I'm recognizable in that. Enough people have called me and said, 'Dorothy, it's about you,' and if everything up to that point is true, then the affair becomes true in the mind. So that is another reason that I feel violated. If you are going to take the life of somebody, and make them absolutely identifiable, you don't create an affair, and you certainly don't have that as a climax of the play."

  It is easy to understand how shocking it must have been for Lewis to sit in the audience and see her "character" admit to that indiscretion. But the truth is that Lavery has every right to create an affair for Agnetha, because Agnetha is not Dorothy Lewis. She is a fictional character, drawn from Lewis's life but endowed with a completely imaginary set of circumstances and actions. In real life, Lewis kissed Ted Bundy on the cheek, and in some versions of "Frozen" Agnetha kisses Ralph. But Lewis kissed Bundy only because he kissed her first, and there's a big difference between responding to a kiss from a killer and initiating one. When we first see Agnetha, she's rushing out of the house and thinking murderous thoughts on the airplane. Dorothy Lewis also charges out of her house and thinks murderous thoughts. But the dramatic function of that scene is to make us think, in that moment, that Agnetha is crazy. And the one inescapable fact about Lewis is that she is not crazy: she has helped get people to rethink their notions of criminality because of her unshakable command of herself and her work. Lewis is upset not just about how Lavery copied her life story, in other words, but about how Lavery changed her life story. She's not merely upset about plagiarism. She's upset about art--about the use of old words in the service of a new idea--and her feelings are perfectly understandable, because the alterations of art can be every bit as unsettling and hurtful as the thievery of plagiarism. It's just that art is not a breach of ethics.

  When I read the original reviews of "Frozen," I noticed that time and again critics would use, without attribution, some version of the sentence "The difference between a crime of evil and a crime of illness is the difference between a sin and a symptom." That's my phrase, of course. I wrote it. Lavery borrowed it from me, and now the critics were borrowing it from her. The plagiarist was being plagiarized. In this case, there is no "art" defense: nothing new was being done with that line. And this was not "news." Yet do I really own "sins and symptoms"? There is a quote by Gandhi, it turns out, using the same two words, and I'm sure that if I were to plow through the body of English literature I would find the path littered with crimes of evil and crimes of illness. The central fact about the "Phantom" case is that Ray Repp, if he was borrowing from Andrew Lloyd Webber, certainly didn't realize it, and Andrew Lloyd Webber didn't realize that he was borrowing from himself. Creative property, Lessig reminds us, has many lives--the newspaper arrives at our door, it becomes part of the archive of human knowledge, then it wraps fish. And, by the time ideas pass into their third and fourth lives, we lose track of where they came from, and we lose control of where they are going. The final dishonesty of the plagiarism fundamentalists is to encourage us to pretend that these chains of influence and evolution do not exist, and that a writer's words have a virgin birth and an eternal life. I suppose that I could get upset about what happened to my words. I could also simply acknowledge that I had a good, long ride with that line--and let it go.

  "It's been absolutely bloody, really, because it attacks my own notion of my character," Lavery said, sitting at my kitchen table. A bouquet of flowers she had brought were on the counter behind her. "It feels absolutely terrible. I've had to go through the pain for being careless. I'd like to repair what happened, and I don't know how to do that. I just didn't think I was doing the wrong thing . . . and then the article comes out in the New York Times and every continent in the world." There was a long silence. She was heartbroken. But, more than that, she was confused, because she didn't understand how six hundred and seventy-five rather ordinary words could bring the walls tumbling down. "It's been horrible and bloody." She began to cry. "I'm still composting what happened. It will be for a purpose . . . whatever that purpose is.

  The Man in the Gray Flannel Suit put the war behind him. Why can't we?

  1.

  When Tom Rath, the hero of Sloan Wilson's 1955 novel "The Man in the Gray Flannel Suit," comes home to Connecticut each day from his job in Manhattan, his wife mixes him a Martini. If he misses the train, he'll duck into the bar at Grand Central Terminal and have a highball, or perhaps a Scotch. On Sunday mornings, Rath and his wife lie around drinking Martinis. Once, Rath takes a tumbler of Martinis to bed, and after finishing it drifts off to sleep. Then his wife wakes him up in the middle of the night, wanting to talk. "I will if you get me a drink," he says. She comes back with a glass half full of ice and gin. "On Greentree Avenue cocktail parties started at seven-thirty, when the men came home from New York, and they usually continued without any dinner until three or four o'clock in the morning," Wilson writes of the tidy neighborhood in Westport where Rath and countless other young, middle-class families live. "Somewhere around nine-thirty in the evening, Martinis and Manhattans would give way to highballs, but the formality of eating anything but hors d'oeuvres in-between had been entirely omitted."

  "The Man in the Gray Flannel Suit" is about a public-relations specialist who lives in the suburbs, works for a media company in midtown, and worries about money, job security, and educating his children. It was an enormous best-seller. Gregory Peck played Tom Rath in the Hollywood version, and today, on the eve of the fiftieth anniversary of the book's publication, many of the themes the novel addresses seem strikingly contemporary. But in other ways "The Man in the Gray Flannel Suit" is utterly dated. The details are all wrong. Tom Rath, despite an introspective streak, is supposed to be a figure of middle-class normalcy. But by our standards he and almost everyone else in the novel look like alcoholics. The book is supposed to be an argument for the importance of family over career. But Rath's three children--the objects of his sacrifice--are so absent from the narrative and from Rath's consciousness that these days he'd be called an absentee father.

  The most discordant note, though, is struck by the account of Rath's experience in the Second World War. He had, it becomes clear, a terrible war. As a paratrooper in Europe, he and his close friend Hank Mahoney find themselves trapped--starving and freezing--behind enemy lines, and end up killing two German sentries in order to take their sheepskin coats. But Rath doesn't quite kill one of them, and Mahoney urges him to finish the job:

  Tom had knelt beside the sentry. He had not thought it would be difficult, but the tendons of the boy's neck had proved tough, and suddenly the sentry had started to sit up. In a rage Tom had plunged the knife repeatedly into his throat, ramming it home with all his strength until he had almost severed the head from the body.

  At the end of the war, Rath and Mahoney are transferred to the Pacific theatre for the invasion of the island of Karkow. There Rath throws a hand grenade and inadvertently kills his friend. He crawls over to Hank's body, calling out his name. "Tom had put his hand under Mahoney's arm and turned him over," Wilson writes. "Mahoney's entire chest had been torn away, leaving the naked lungs and splintered ribs exposed."

  Rath picks up the body and runs back toward his own men, dodging enemy fire. Coming upon a group of Japanese firing from a cave, he props the body up, crawls within fifteen feet of the machine gun, tosses in two grenades, and then finishes off the lone survivor with a knife. He takes Hank's body into a bombed-out pillbox and tries to resuscitate his friend's corpse. The medics tell him that Hank has been dead for hours. He won't listen. In a daze, he runs with the body toward the sea.

  Wilson's description of Mahoney's death is as brutal and moving a description of the madness of combat as can be found in postwar fiction. But what happens to Rath as a result of that day in Karkow? Not much. It does not destroy him, or leave him permanently traumatized. The part of Rath's war experience that leaves him truly guilt-ridden is the adulterous affair that he has with a woman named Maria while waiting for redeployment orders in Rome. In the elevator of his midtown office, he runs into a friend who knew Maria, and learns that he fathered a son. He obsessively goes over and over the affair in his mind, trying to square his feeling toward Maria with his love for his wife, and his marriage is fully restored only when he confesses to the existence of his Italian child. Killing his best friend, by contrast, is something that comes up and then gets tucked away. As Rath sat on the beach, and Mahoney's body was finally taken away, Wilson writes:

A major, coming to squat beside him, said, "Some of these goddamn sailors got heads. They went ashore and got Jap heads, and they tried to boil them in the galley to get the skulls for souvenirs."
Tom had shrugged and said nothing. The fact that he had been too quick to throw a hand grenade and had killed Mahoney, the fact that some young sailors had wanted skulls for souvenirs, and the fact that a few hundred men had lost their lives to take the island of Karkow--all these facts were simply incomprehensible and had to be forgotten. That, he had decided, was the final truth of the war, and he had greeted it with relief, greeted it eagerly, the simple fact that it was incomprehensible and had to be forgotten. Things just happen, he had decided; they happen and they happen again, and anybody who tries to make sense out of it goes out of his mind.

  You couldn't write that scene today, at least not without irony. No soldier, according to our contemporary understanding, could ever shrug off an experience like that. Today, it is Rath's affair with Maria that would be rationalized and explained away. He was a soldier, after all, in the midst of war. Who knew if he would ever see his wife again? Tim O'Brien's best-selling 1994 novel "In the Lake of the Woods" has a narrative structure almost identical to that of "The Man in the Gray Flannel Suit." O'Brien's hero, John Wade, is present at a massacre of civilians in the Vietnamese village of Thuan Yen. He kills a fellow-soldier--a man he loved like a brother. And, just like Rath, Wade sits down at the end of the long afternoon of the worst day of his war and tries to wish the memory away:

  And then later still, snagged in the sunlight, he gave himself over to forgetfulness. "Go away," he murmured. He waited a moment, then said it again, firmly, much louder, and the little village began to vanish inside its own rosy glow. Here, he reasoned, was the most majestic trick of all. In the months and years ahead, John Wade would remember Thuan Yen the way chemical nightmares are remembered, impossible combinations, impossible events, and over time the impossibility itself would become the richest and deepest and most profound memory.
This could not have happened. Therefore it did not.
Already he felt better.

  But John Wade cannot forget. That's the point of O'Brien's book. "The Man in the Gray Flannel Suit" ends with Tom Rath stronger, and his marriage renewed. Wade falls apart, and when he returns home to the woman he left behind he wakes up screaming in his sleep. By the end of the novel, the past has come back and destroyed Wade, and one reason for the book's power is the inevitability of that disaster. This is the difference between a novel written in the middle of the last century and a novel written at the end of the century. Somehow in the intervening decades our understanding of what it means to experience a traumatic event has changed. We believe in John Wade now, not Tom Rath, and half a century after the publication of "The Man in the Gray Flannel Suit" it's worth wondering whether we've got it right.

  2.

  Several years ago, three psychologists--Bruce Rind, Robert Bauserman, and Philip Tromovitch--published an article on childhood sexual abuse in Psychological Bulletin, one of academic psychology's most prestigious journals. It was what psychologists call a meta-analysis. The three researchers collected fifty-nine studies that had been conducted over the years on the long-term psychological effects of childhood sexual abuse (C.S.A.), and combined the data, in order to get the most definitive and statistically powerful result possible.

  What most studies of sexual abuse show is that if you gauge the psychological health of young adults--typically college students--using various measures of mental health (alcohol problems, depression, anxiety, eating disorders, obsessive-compulsive symptoms, social adjustment, sleeping problems, suicidal thoughts and behavior, and so on), those with a history of childhood sexual abuse will have more problems across the board than those who weren't abused. That makes intuitive sense. But Rind and his colleagues wanted to answer that question more specifically: how much worse off were the sexually abused? The fifty-nine studies were run through a series of sophisticated statistical tests. Studies from different times and places were put on the same scale. The results were surprising. The difference between the psychological health of those who had been abused and those who hadn't, they found, was marginal. It was two-tenths of a standard deviation. "That's like the difference between someone with an I.Q. of 100 and someone with an I.Q. of 97," Rind says. "Ninety-seven is statistically different from 100. But it's a trivial difference."
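
  Rind's analogy can be checked with simple arithmetic: the I.Q. scale is defined with a mean of 100 and a standard deviation of 15, so a gap of two-tenths of a standard deviation works out to about three points. The short sketch below is purely illustrative--it is not from the Rind paper--and simply does the conversion:

# Illustrative arithmetic only: converting an effect size expressed in
# standard deviations into points on the I.Q. scale. The 0.2 figure is the
# gap Rind reports; the mean of 100 and standard deviation of 15 are simply
# how the I.Q. scale is defined.
effect_size_in_sd = 0.2
iq_mean, iq_sd = 100, 15

gap_in_iq_points = effect_size_in_sd * iq_sd
print(gap_in_iq_points)              # 3.0 points
print(iq_mean - gap_in_iq_points)    # 97.0 -- "like 100 versus 97"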

  Then Rind and his colleagues went one step further. A significant percentage of people who were sexually abused as children grew up in families with a host of other problems, like violence, neglect, and verbal abuse. So, to the extent that the sexually abused were damaged, what caused the damage--the sexual abuse, or the violence and neglect that so often accompanied the abuse? The data suggested that it was the latter, and, if you account for such factors, that two-tenths of a standard deviation shrinks even more. "The real gap is probably smaller than 100 and 97," Rind says. "It might be 98, or maybe it's 99." The studies analyzed by Rind and his colleagues show that some victims of sexual abuse don't even regard themselves, in retrospect, as victims. Among the male college students surveyed, for instance, Rind and his colleagues found that "37 percent viewed their C.S.A. experiences as positive at the time they occurred," while forty-two per cent viewed them as positive when reflecting back on them.

  The Rind article was published in the summer of 1998, and almost immediately it was denounced by conservative groups and lambasted in the media. Laura Schlessinger--a popular radio talk-show host known as Dr. Laura--called it "junk science." In Washington, Representative Matt Salmon called it "the Emancipation Proclamation for pedophiles," while Representative Tom DeLay accused it of "normalizing pedophilia." They held a press conference at which they demanded that the American Psychological Association censure the paper. In July of 1999, a year after its publication, both the House and the Senate overwhelmingly passed resolutions condemning the analysis. Few articles in the history of academic psychology have created such a stir.

  But why? It's not as if the authors said that C.S.A. was a good thing. They just suggested that it didn't cause as many problems as we'd thought--and the question of whether C.S.A. is morally wrong doesn't hinge on its long-term consequences. Nor did the study say that sexual abuse was harmless. On average, the researchers concluded, the long-term damage is small. But that average is made up of cases where the damage is hard to find (like C.S.A. involving adolescent boys) and cases where the damage is quite significant (like father-daughter incest). Rind was trying to help psychologists focus on what was truly harmful. And, when it came to the effects of things like physical abuse and neglect, he and his colleagues sounded the alarm. "What happens in physical abuse is that it doesn't happen once," Rind says. "It happens time and time again. And, when it comes to neglect, the research shows that is the most noxious factor of all--worse than physical abuse. Why? Because it's not practiced for one week. It's a persistent thing. It's a permanent feature of the parent-child relationship. These are the kinds of things that cause problems in adulthood."

  All Rind and his colleagues were saying is that sexual abuse is often something that people eventually can get over, and one of the reasons that the Rind study was so unacceptable is that we no longer think that traumatic experiences are things we can get over. We believe that the child who is molested by an uncle or a priest, on two or three furtive occasions, has to be permanently scarred by the experience--just as the soldier who accidentally kills his best friend must do more than sit down on the beach and decide that sometimes things just "happen."

  In a recent history of the Rind controversy, the psychologist Scott Lilienfeld pointed out that when we find out that something we thought was very dangerous actually isn't that dangerous after all we usually regard what we've learned as good news. To him, the controversy was a paradox, and he is quite right. This attachment we have to John Wade over Tom Rath is not merely a preference for one kind of war narrative over another. It is a shift in perception so profound that the United States Congress could be presented with evidence of the unexpected strength and resilience of the human spirit and reject it without a single dissenting vote.

  3.

  In "The Man in the Gray Flannel Suit," Tom Rath works for Ralph Hopkins, who is the president of the United Broadcasting Company. Hopkins has decided that he wants to play a civic role in the issue of mental health, and Rath's job is to write his speeches and handle public relations connected to the project. "It all started when a group of doctors called on me a few months ago," Hopkins tells Rath, when he hires him for the job. "They apparently felt that there is too little public understanding of the whole question of mental illness, and that a campaign like the fight against cancer or polio is needed." Again and again, in the novel, the topic of mental health surfaces. Rath's father, we learn, suffered a nervous breakdown after serving in the trenches of the First World War, and died in what may well have been a suicide. His grandmother, whose death sets the book's plot in motion, wanders in and out of lucidity at the end of her life. Hopkins, in a hilarious scene, recalls his unsatisfactory experience with a psychiatrist. To Wilson's readers, this preoccupation would not have seemed out of place. In 1955, the population of New York State's twenty-seven psychiatric hospitals was nearly ninety-four thousand. (Today, largely because of anti-psychotic drugs, it is less than six thousand.) It was impossible to drive any distance from Manhattan and not be confronted with ominous, hulking reminders of psychiatric distress: the enormous complex across the Triborough Bridge, on Wards Island; Sagamore and Pilgrim Hospitals, on Long Island; Creedmoor, in Queens. Mental health mattered to the reader of the nineteen-fifties, in a way that, say, aids mattered in the novels of the late nineteen-eighties.

  But Wilson draws a very clear line between the struggles of the Raths and the plight of those suffering from actual mental illness. At one point, for example, Rath's wife, Betsy, wonders why nothing is fun anymore:

It probably would take a psychiatrist to answer that. Maybe Tom and I both ought to visit one, she thought. What's the matter? the psychiatrist would say, and I would reply, I don't know--nothing seems to be much fun any more. All of a sudden the music stopped, and it didn't start again. Is that strange, or does it happen to everyone about the time when youth starts to go?
The psychiatrist would have an explanation, Betsy thought, but I don't want to hear it. People rely too much on explanations these days, and not enough on courage and action. . . . Tom has a good job, and he'll get his enthusiasm back, be a success at it. Everything's going to be fine. It does no good to wallow in night thoughts. In God we trust, and that's that.

  This is not denial, much as it may sound like it. Betsy Rath is not saying that her husband doesn't have problems. She's just saying that, in all likelihood, Tom will get over his problems. This is precisely the idea that lies at the heart of the Rind meta-analysis. Once you've separated out the small number of seriously damaged people--the victims of father-daughter incest, or of prolonged neglect and physical abuse--the balance of C.S.A. survivors are pretty much going to be fine. The same is true, it turns out, of other kinds of trauma. The Columbia University psychologist George Bonanno, for instance, followed a large number of men and women who had recently lost a spouse. "In the bereavement area, the assumption has been that when people lose a loved one there is a kind of unitary process that everybody must go through," Bonanno says. "That process has been called grief work. The grief must be processed. It must be examined. It must be fully understood, then finished. It was the same kind of assumption that dominated the trauma world. The idea was that everybody exposed to these kinds of events will have to go through the same kind of process if they are to recover. And if you don't do this, if you have somehow inhibited or buried the experience, the assumption was that you would pay in the long run."

  Instead, Bonanno found a wide range of responses. Some people went through a long and painful grieving process; others a period of debilitating depression. But by far the most common response was resilience: the majority of those who had just suffered from one of the most painful experiences of their lives never lapsed into serious depression, experienced a relatively brief period of grief symptoms, and soon returned to normal functioning. These people were not necessarily the hardiest or the healthiest. They just managed, by one means or another, to muddle through.

  "Most people just plain cope well," Bonanno says. "The vast majority of people get over traumatic events, and get over them remarkably well. Only a small subset--five to fifteen per cent--struggle in a way that says they need help."

  What these patterns of resilience suggest is that human beings are naturally endowed with a kind of psychological immune system, which keeps us in balance and overcomes wild swings to either end of the emotional spectrum. Most of us aren't resilient just in the wake of bad experiences, after all. We're also resilient in the wake of wonderful experiences; the joy of a really good meal, or winning a tennis match, or getting praised by a boss doesn't last that long, either. "One function of emotions is to signal to people quickly which things in their environments are dangerous and should be avoided and which are positive and should be approached," Timothy Wilson, a psychologist at the University of Virginia, has said. "People have very fast emotional reactions to events that serve as signals, informing them what to do. A problem with prolonged emotional reactions to past events is that it might be more difficult for these signals to get through. If people are still in a state of bliss over yesterday's success, today's dangers and hazards might be more difficult to recognize." (Wilson, incidentally, is Sloan Wilson's nephew.)

  Wilson and his longtime collaborator, Daniel T. Gilbert, argue that a distinctive feature of this resilience is that people don't realize that they possess it. People are bad at forecasting their emotions--at appreciating how well, under most circumstances, they will recover. Not long ago, for instance, Gilbert, Wilson, and two other researchers--Carey Morewedge and Jane Risen--asked passengers at a subway station in Cambridge, Massachusetts, how much regret they thought they would feel if they arrived on the platform just as a train was pulling away. Then they approached passengers who really had arrived just as their train was leaving, and asked them how they felt. They found that the predictions of how bad it would feel to have just barely missed a train were on average greater than reports of how it actually felt to watch the train pull away. We suffer from what Wilson and Gilbert call an impact bias: we always assume that our emotional states will last much longer than they do. We forget that other experiences will compete for our attention and emotions. We forget that our psychological immune system will kick in and take away the sting of adversity. "When I talk about our research, I say to people, 'I'm not telling you that bad things don't hurt,'" Gilbert says. "Of course they do. It would be perverse to say that having a child or a spouse die is not a big deal. All I'm saying is that the reality doesn't meet the expectation."

  This is the difference between our own era and the one of half a century ago--between "The Man in the Gray Flannel Suit" and "In the Lake of the Woods." Sloan Wilson's book came from a time and a culture that had the confidence and wisdom to understand this truth. "I love you more than I can tell," Rath says to his wife at the end of the novel. It's an ending that no one would write today, but only because we have become blind to the fact that the past--in all but the worst of cases--sooner or later fades away. Betsy turns back to her husband:

"I want you to be able to talk to me about the war. It might help us to understand each other. Did you really kill seventeen men?"
"Yes."
"Do you want to talk about it now?"
"No. It's not that I want to and can't--it's just that I'd rather think about the future. About getting a new car and driving up to Vermont with you tomorrow."
"That will be fun. It's not an insane world. At least, our part of it doesn't have to be."
GO TO TOP MENU

  How to think about prescription drugs.

  1.

  Ten years ago, the multinational pharmaceutical company AstraZeneca launched what was known inside the company as the Shark Fin Project. The team for the project was composed of lawyers, marketers, and scientists, and its focus was a prescription drug known as Prilosec, a heartburn medication that, in one five-year stretch of its extraordinary history, earned AstraZeneca twenty-six billion dollars. The patent on the drug was due to expire in April of 2001. The name Shark Fin was a reference to what Prilosec sales--and AstraZeneca's profits--would look like if nothing was done to fend off the ensuing low-priced generic competition.

  The Shark Fin team drew up a list of fifty options. One idea was to devise a Prilosec 2.0--a version that worked faster or longer, or was more effective. Another idea was to combine it with a different heartburn remedy, or to change the formulation, so that it came in a liquid gel or in an extended-release form. In the end, AstraZeneca decided on a subtle piece of chemical reëngineering. Prilosec, like many drugs, is composed of two "isomers"--a left-hand and a right-hand version of the molecule. In some cases, removing one of the isomers can reduce side effects or make a drug work a little bit better, and in all cases the Patent Office recognizes something with one isomer as a separate invention from something with two. So AstraZeneca cut Prilosec in half.

  AstraZeneca then had to prove that the single-isomer version of the drug was better than regular Prilosec. It chose as its target something called erosive esophagitis, a condition in which stomach acid begins to bubble up and harm the lining of the esophagus. In one study, half the patients took Prilosec, and half took Son of Prilosec. After one month, the two drugs were dead even. But after two months, to the delight of the Shark Fin team, the single-isomer version edged ahead--with a ninety-per-cent healing rate versus Prilosec's eighty-seven per cent. The new drug was called Nexium. A patent was filed, the F.D.A. gave its blessing, and, in March of 2001, Nexium hit the pharmacy shelves priced at a hundred and twenty dollars for a month's worth of pills. To keep cheaper generics at bay, and persuade patients and doctors to think of Nexium as state of the art, AstraZeneca spent half a billion dollars in marketing and advertising in the year following the launch. It is now one of the half-dozen top-selling drugs in America.

  In the political uproar over prescription-drug costs, Nexium has become a symbol of everything that is wrong with the pharmaceutical industry. The big drug companies justify the high prices they charge--and the extraordinary profits they enjoy--by arguing that the search for innovative, life-saving medicines is risky and expensive. But Nexium is little more than a repackaged version of an old medicine. And the hundred and twenty dollars a month that AstraZeneca charges isn't to recoup the costs of risky research and development; the costs were for a series of clinical trials that told us nothing we needed to know, and a half-billion-dollar marketing campaign selling the solution to a problem we'd already solved. "The Prilosec pattern, repeated across the pharmaceutical industry, goes a long way to explain why the nation's prescription drug bill is rising an estimated 17% a year even as general inflation is quiescent," the Wall Street Journal concluded, in a front-page article that first revealed the Shark Fin Project.

  In "The Truth About the Drug Companies: How They Deceive Us and What to Do About It" (Random House; $24.95), Marcia Angell offers an even harsher assessment. Angell used to be the editor-in-chief of The New England Journal of Medicine, which is among the most powerful positions in American medicine, and in her view drug companies are troubled and corrupt. She thinks that they charge too much, engage in deceptive research, produce inferior products, borrow their best ideas from government-funded scientists, and buy the affections of physicians with trips and gifts. To her, the story of Nexium and drugs like it is proof that the pharmaceutical industry is "now primarily a marketing machine to sell drugs of dubious benefit."

  Of course, it is also the case that Nexium is a prescription drug: every person who takes Nexium was given the drug with the approval of a doctor--and doctors are professionals who ought to know that there are many cheaper ways to treat heartburn. If the patient was coming in for the first time, the doctor could have prescribed what's known as an H2 antagonist, such as a generic version of Tagamet (cimetidine), which works perfectly well for many people and costs only about twenty-eight dollars a month. If the patient wasn't responding to Tagamet, the doctor could have put him on the cheaper, generic form of Prilosec, omeprazole.

  The patient's insurance company could easily have stepped in as well. It could have picked up the tab for Nexium only if the patient had first tried generic Tagamet. Or it could have discouraged Nexium use, by requiring anyone who wanted the drug to pay the difference between it and generic omeprazole. Both the physician and the insurance company, meanwhile, could have sent the patient to any drugstore in America, where he or she would have found, next to the Maalox and the Pepcid, a package of over-the-counter Prilosec. O.T.C. Prilosec is identical to prescription Prilosec and effectively equivalent to prescription Nexium, and it costs only twenty dollars a month.

  Throughout the current debate over prescription-drug costs--as seniors have gone on drug-buying bus trips to Canada, as state Medicaid programs and employers have become increasingly angry over rising health-care costs, and as John Kerry has made reining in the pharmaceutical industry a central theme of his Presidential campaign--the common assumption has been that the rise of drugs like Nexium is entirely the fault of the pharmaceutical industry. Is it? If doctors routinely prescribe drugs like Nexium and insurers routinely pay for them, after all, there is surely more than one culprit in the prescription-drug mess.

  2.

  The problem with the way we think about prescription drugs begins with a basic misunderstanding about drug prices. The editorial board of the Times has pronounced them much too high; Marcia Angell calls them "intolerable." The perception that the drug industry is profiteering at the expense of the American consumer has given pharmaceutical firms a reputation on a par with that of cigarette manufacturers.

  In fact, the complaint is only half true. The "intolerable" prices that Angell writes about are confined to the brand-name sector of the American drug marketplace. As the economists Patricia Danzon and Michael Furukawa recently pointed out in the journal Health Affairs, drugs still under patent protection are anywhere from twenty-five to forty per cent more expensive in the United States than in places like England, France, and Canada. Generic drugs are another story. Because there are so many companies in the United States that step in to make drugs once their patents expire, and because the price competition among those firms is so fierce, generic drugs here are among the cheapest in the world. And, according to Danzon and Furukawa's analysis, when prescription drugs are converted to over-the-counter status no other country even comes close to having prices as low as the United States.

  It is not accurate to say, then, that the United States has higher prescription-drug prices than other countries. It is accurate to say only that the United States has a different pricing system from that of other countries. Americans pay more for drugs when they first come out and less as the drugs get older, while the rest of the world pays less in the beginning and more later. Whose pricing system is cheaper? It depends. If you are taking Mevacor for your cholesterol, the 20-mg. pill is two-twenty-five in America and less than two dollars if you buy it in Canada. But generic Mevacor (lovastatin) is about a dollar a pill in Canada and as low as sixty-five cents a pill in the United States. Of course, not every drug comes in a generic version. But so many important drugs have gone off-patent recently that the rate of increase in drug spending in the United States has fallen sharply for the past four years. And so many other drugs are going to go off-patent in the next few years--including the top-selling drug in this country, the anti-cholesterol medication Lipitor--that many Americans who now pay more for their drugs than their counterparts in other Western countries could soon be paying less.

  The second misconception about prices has to do with their importance in driving up over-all drug costs. In one three-year period in the mid-nineteen-nineties, for example, the amount of money spent in the United States on asthma medication increased by almost a hundred per cent. But none of that was due to an increase in the price of asthma drugs. It was largely the result of an increase in the prevalence of usage--that is, in the number of people who were given a diagnosis of the disease and who then bought drugs to treat it. Part of that hundred-per-cent increase was also the result of a change in what's known as the intensity of drug use: in the mid-nineties, doctors were becoming far more aggressive in their attempts to prevent asthma attacks, and in those three years people with asthma went from filling about nine prescriptions a year to filling fourteen prescriptions a year. Last year, asthma costs jumped again, by twenty-six per cent, and price inflation played a role. But, once again, the big factor was prevalence. And this time around there was also a change in what's called the therapeutic mix; in an attempt to fight the disease more effectively, physicians are switching many of their patients to newer, better, and more expensive drugs, like Merck's Singulair.

  Asthma is not an isolated case. In 2003, the amount that Americans spent on cholesterol-lowering drugs rose 23.8 per cent, and similar increases are forecast for the next few years. Why the increase? Well, the baby boomers are aging, and so are at greater risk for heart attacks. The incidence of obesity is increasing. In 2002, the National Institutes of Health lowered the thresholds for when people with high cholesterol ought to start taking drugs like Lipitor and Mevacor. In combination, those factors are having an enormous impact on both the prevalence and the intensity of cholesterol treatment. All told, prescription-drug spending in the United States rose 9.1 per cent last year. Only three of those percentage points were due to price increases, however, which means that inflation was about the same in the drug sector as it was in the over-all economy. Angell's book and almost every other account of the prescription-drug crisis take it for granted that cost increases are evidence of how we've been cheated by the industry. In fact, drug expenditures are rising rapidly in the United States not so much because we're being charged more for prescription drugs but because more people are taking more medications in more expensive combinations. It's not price that matters; it's volume.
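
  The arithmetic behind that claim is worth spelling out. Spending is price times volume, so a year's growth in spending can be split into a price component and a volume-and-mix component. The sketch below is a hypothetical illustration built on the figures cited above, not data from any particular study:

# Illustrative decomposition of drug-spending growth into price and volume.
# The 9.1 per cent total and the roughly 3 points attributed to price are the
# figures cited above; the rest is arithmetic.
total_growth = 0.091    # overall rise in U.S. prescription-drug spending
price_growth = 0.030    # portion attributable to higher prices

# Spending = price x volume, so (1 + total) = (1 + price) x (1 + volume).
volume_growth = (1 + total_growth) / (1 + price_growth) - 1
print(round(volume_growth, 3))   # ~0.059: roughly six points of the increase
                                 # come from prevalence, intensity, and mix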

  3.

  This is a critical fact, and it ought to fundamentally change the way we think about the problem of drug costs. Last year, hospital expenditures rose by the same amount as drug expenditures--nine per cent. Yet almost all of that (eight percentage points) was due to inflation. That's something to be upset about: when it comes to hospital services, we're spending more and getting less. When it comes to drugs, though, we're spending more and we're getting more, and that makes the question of how we ought to respond to rising drug costs a little more ambiguous.

  Take CareSource, a nonprofit group that administers Medicaid for close to four hundred thousand patients in Ohio and Michigan. CareSource runs a tightly managed pharmacy program and substitutes generics for brand-name drugs whenever possible. Nonetheless, the group's pharmacy managers are forecasting at least ten-per-cent increases in their prescription-drug spending in the upcoming year. The voters of Ohio and Michigan can hardly be happy with that news. Then again, it's not as if that money were being wasted.

  The drug that CareSource spends more money on than any other is Singulair, Merck's new asthma pill. That's because Medicaid covers a lot of young, lower-income families, where asthma is epidemic and Singulair is a highly effective drug. Isn't the point of having a Medicaid program to give the poor and the ailing a chance to live a healthy life? This year, too, the number of patients covered by CareSource who are either blind or disabled or have received a diagnosis of AIDS grew from fifteen to eighteen per cent. The treatment of AIDS is one of the pharmaceutical industry's great success stories: drugs are now available that can turn what was once a death sentence into a manageable chronic disease. The evidence suggests, furthermore, that aggressively treating diseases like AIDS and asthma saves money in the long term by preventing far more expensive hospital visits. But there is no way to treat these diseases in the short term--and make sick people healthy--without spending more on drugs.

  The economist J. D. Kleinke points out that if all physicians followed the treatment guidelines laid down by the National Institutes of Health the number of Americans being treated for hypertension would rise from twenty million to forty-three million, the use of asthma medication would increase somewhere between twofold and tenfold, and the number of Americans on one of the so-called "statin" class of cholesterol-lowering medications would increase by at least a factor of ten. By these measures, it doesn't seem that we are spending too much on prescription drugs. If the federal government's own medical researchers are to be believed, we're spending too little.

  4.

  The fact that volume matters more than price also means that the emphasis of the prescription-drug debate is all wrong. We've been focussed on the drug manufacturers. But decisions about prevalence, therapeutic mix, and intensity aren't made by the producers of drugs. They're made by the consumers of drugs.

  This is why increasing numbers of employers have in recent years made use of what are known as Pharmacy Benefit Managers, or P.B.M.s. The P.B.M.s draw up drug formularies--lists of preferred medications. They analyze clinical-trials data to find out which drugs are the most cost-effective. In a category in which there are many equivalent options, they bargain with drug firms, offering to deliver all their business to one company in exchange for a discount. They build incentives into prescription-drug plans to encourage intelligent patient behavior. If someone wants to take a brand-name oral contraceptive and there is a generic equivalent available, for example, a P.B.M. might require her to pay the price difference. In the case of something like heartburn, the P.B.M. might require patients to follow what's called step therapy--to try the cheaper H2 antagonists first, and only if that fails to move to a proton-pump inhibitor like omeprazole. Employers who used two or more of these strategies last year saw a decrease of almost five per cent in their pharmacy spending.

  There is no mention of these successes in "The Truth About the Drug Companies." Though much of the book is concerned with the problem of such costs, P.B.M.s, the principal tool that private health-care plans use to control rising drug costs, are dismissed in a few paragraphs. Angell's focus, instead, is on the behavior of the pharmaceutical industry. An entire chapter, for instance, centers on the fact that the majority of drugs produced by the pharmaceutical industry are either minor variations or duplicates of drugs already on the market. Merck pioneered the statin category with Mevacor. Now we have Pfizer's Lipitor, Bristol-Myers Squibb's Pravachol, Novartis's Lescol, AstraZeneca's Crestor, and Merck's second entrant, Zocor--all of which do pretty much the same thing. Angell thinks that these "me-too" drugs are a waste of time and money, and that the industry should devote its resources to the development of truly innovative drugs instead. In one sense, she's right: we need a cure for Alzheimer's much more than we need a fourth or fifth statin. Yet me-too drugs are what drive prices down. The presence of more than one drug in a given category gives P.B.M.s their leverage when it comes time to bargain with pharmaceutical companies.

  With the passage of the Medicare prescription-drug-insurance legislation, late last year, the competition created by me-toos has become even more important. The bill gives responsibility for managing the drug benefit to P.B.M.s. In each therapeutic category, Medicare will set guidelines for how many and what kinds of drugs the P.B.M.s will have to include, and then the P.B.M.s will negotiate directly with drug companies for lower prices. Some analysts predict that, as long as Medicare is smart about how it defines the terms of the benefit, the discounts--particularly in crowded therapeutic categories like the statins--could be considerable. Angell appears to understand none of this. "Medicare will have to pay whatever drug companies charge," she writes, bafflingly, "and it will have to cover expensive me-too drugs as well as more cost-effective ones."

  5.

  The core problem in bringing drug spending under control, in other words, is persuading the users and buyers and prescribers of drugs to behave rationally, and the reason we're in the mess we're in is that, so far, we simply haven't done a very good job of that. "The sensitivity on the part of employers is turned up pretty high on this," Robert Nease, who heads applied decision analysis for one of the nation's largest P.B.M.s, the St. Louis-based Express Scripts, says. "This is not an issue about how to cut costs without affecting quality. We know how to do that. We know that generics work as well as brands. We know that there are proven step therapies. The problem is that we haven't communicated to members that we aren't cheating them."

  Among the costliest drug categories, for instance, is the new class of anti-inflammatory drugs known as COX-2 inhibitors. The leading brand, Celebrex, has been heavily advertised, and many patients suffering from arthritis or similar conditions ask for Celebrex when they see their physician, believing that a COX-2 inhibitor is a superior alternative to the previous generation of nonsteroidal anti-inflammatories (known as NSAIDs), such as ibuprofen. (The second leading COX-2 inhibitor, Merck's Vioxx, has just been taken off the market because of links to an elevated risk of heart attacks and strokes.) The clinical evidence, however, suggests that the COX-2s aren't any better at relieving pain than the NSAIDs. It's just that in a very select group of patients they have a lower risk of side effects like ulcers or bleeding.

  "There are patients at high risk--people who have or have had an ulcer in the past, who are on blood-thinning medication, or who are of an advanced age," Nease says. "That specific group you would likely start immediately on a cox-2." Anyone else, he says, should really be started on a generic nsaid first. "The savings here are enormous," he went on. "The cox-2s are between a hundred and two hundred dollars a month, and the generic nsaids are pennies a day--and these are drugs that people take day in, day out, for years and years." But that kind of change can't be implemented unilaterally: the health plan and the employer have to explain to employees that in their case a brand-new, hundreddollar drug may not be any better than an old, one-dollar drug.

  Similarly, a P.B.M. might choose to favor one of the six available statins on its formulary--say, AstraZeneca's Crestor--because AstraZeneca gave it the biggest discount. But that requires, once again, a conversation between the health plan and the employee: the person who has happily been taking Pfizer's anti-cholesterol drug Lipitor for several years has to be convinced that Crestor is just as good, and the plan has to be very sure that Crestor is just as good.

  The same debates are going on right now in Washington, as the Medicare program decides how to implement the new drug benefit. In practice, the P.B.M.s will be required to carry a choice of drugs in every therapeutic category. But how do you define a therapeutic category? Are drugs like Nexium and Prilosec and Prevacid--all technically known as proton-pump inhibitors--in one category, and the H2 antagonists in another? Or are they all in one big category? The first approach maximizes the choices available. The second approach maximizes the bargaining power of P.B.M.s. Deciding which option to take will have a big impact on how much we end up paying for prescription drugs--and it's a decision that has nothing to do with the drug companies. It's up to us; it requires physicians, insurers, patients, and government officials to reach some kind of consensus about what we want from our medical system, and how much we are willing to pay for it. AstraZeneca was able to do some chemical sleight of hand, spend half a billion on advertising, and get away with the "reinvention" of its heartburn drug only because that consensus hasn't yet been reached. For sellers to behave responsibly, buyers must first behave intelligently. And if we want to create a system where millions of working and elderly Americans don't have to struggle to pay for prescription drugs that's also up to us. We could find it in our hearts to provide all Americans with adequate health insurance. It is only by the most spectacular feat of cynicism that our political system's moral negligence has become the fault of the pharmaceutical industry.

  There is a second book out this fall on the prescription-drug crisis, called "Overdosed America" (HarperCollins; $24.95), by John Abramson, who teaches at Harvard Medical School. At one point, Abramson discusses a study that he found in a medical journal concluding that the statin Pravachol lowered the risk of stroke in patients with coronary heart disease by nineteen per cent. That sounds like a significant finding, but, as Abramson shows, it isn't. In the six years of the study, 4.5 per cent of those taking a placebo had a stroke versus 3.7 per cent of those on Pravachol. In the real world, that means that for every hundred and twenty-five people you put on Pravachol over those six years you prevent one stroke--which, given how much the drug costs, comes to at least $1.2 million per stroke prevented. On top of that, the study's participants had an average age of sixty-two and most of them were men. Stroke victims, however, are more likely to be female, and, on average, much older--and the patients older than seventy in the study who were taking Pravachol had more strokes than those who were on a placebo.
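
  The gap between the headline number and the real-world benefit is easier to see when the arithmetic is laid out. The sketch below uses the stroke rates quoted above; the price of Pravachol is an assumed round figure, included only to show how the cost per stroke prevented gets into the million-dollar range:

# Illustrative arithmetic only. The stroke rates are the ones quoted above;
# the annual drug price is an assumption for the sake of the example.
placebo_rate = 0.045      # strokes over six years, placebo group
pravachol_rate = 0.037    # strokes over six years, Pravachol group

relative_reduction = (placebo_rate - pravachol_rate) / placebo_rate
absolute_reduction = placebo_rate - pravachol_rate
number_needed_to_treat = 1 / absolute_reduction

assumed_annual_cost = 1500    # dollars per patient per year (assumed)
cost_per_stroke_prevented = number_needed_to_treat * 6 * assumed_annual_cost

print(round(relative_reduction, 2))      # ~0.18, close to the "nineteen per cent" headline
print(round(number_needed_to_treat))     # 125 people treated for six years per stroke avoided
print(round(cost_per_stroke_prevented))  # 1125000 -- on the order of a million dollars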

  Here is a classic case of the kind of thing that bedevils the American health system--dubious findings that, without careful evaluation, have the potential to drive up costs. But whose fault is it? It's hard to blame Pravachol's manufacturer, Bristol-Myers Squibb. The study's principal objective was to look at Pravachol's effectiveness in fighting heart attacks; the company was simply using that patient population to make a secondary observation about strokes. In any case, Bristol-Myers didn't write up the results. A group of cardiologists from New Zealand and Australia did, and they hardly tried to hide Pravachol's shortcomings in women and older people. All those data are presented in a large chart on the study's third page. What's wrong is the context in which the study's findings are presented. The abstract at the beginning ought to have been rewritten. The conclusion needs a much clearer explanation of how the findings add to our understanding of stroke prevention. There is no accompanying commentary that points out the extreme cost-ineffectiveness of Pravachol as a stroke medication--and all those are faults of the medical journal's editorial staff. In the end, the fight to keep drug spending under control is principally a matter of information, of proper communication among everyone who prescribes and pays for and ultimately uses drugs about what works and what doesn't, and what makes economic sense and what doesn't--and medical journals play a critical role in this process. As Abramson writes:

When I finished analyzing the article and understood that the title didn't tell the whole story, that the findings were not statistically significant, and that Pravachol appeared to cause more strokes in the population at greater risk, it felt like a violation of the trust that doctors (including me) place in the research published in respected medical journals.

  The journal in which the Pravachol article appeared, incidentally, was the New England Journal of Medicine. And its editor at the time the paper was accepted for publication? Dr. Marcia Angell. Physician, heal thyself.
GO TO TOP MENU

  Employers love personality tests. But what do they really reveal?

  1.

  When Alexander (Sandy) Nininger was twenty-three, and newly commissioned as a lieutenant in the United States Army, he was sent to the South Pacific to serve with the 57th Infantry of the Philippine Scouts. It was January, 1942. The Japanese had just seized Philippine ports at Vigan, Legazpi, Lamon Bay, and Lingayen, and forced the American and Philippine forces to retreat into Bataan, a rugged peninsula on the South China Sea. There, besieged and outnumbered, the Americans set to work building a defensive line, digging foxholes and constructing dikes and clearing underbrush to provide unobstructed sight lines for rifles and machine guns. Nininger's men were on the line's right flank. They labored day and night. The heat and the mosquitoes were nearly unbearable.

  Quiet by nature, Nininger was tall and slender, with wavy blond hair. As Franklin M. Reck recounts in "Beyond the Call of Duty," Nininger had graduated near the top of his class at West Point, where he chaired the lecture-and-entertainment committee. He had spent many hours with a friend, discussing everything from history to the theory of relativity. He loved the theatre. In the evenings, he could often be found sitting by the fireplace in the living room of his commanding officer, sipping tea and listening to Tchaikovsky. As a boy, he once saw his father kill a hawk and had been repulsed. When he went into active service, he wrote a friend to say that he had no feelings of hate, and did not think he could ever kill anyone out of hatred. He had none of the swagger of the natural warrior. He worked hard and had a strong sense of duty.

  In the second week of January, the Japanese attacked, slipping hundreds of snipers through the American lines, climbing into trees, turning the battlefield into what Reck calls a "gigantic possum hunt." On the morning of January 12th, Nininger went to his commanding officer. He wanted, he said, to be assigned to another company, one that was in the thick of the action, so he could go hunting for Japanese snipers.

  He took several grenades and ammunition belts, slung a Garand rifle over his shoulder, and grabbed a submachine gun. Starting at the point where the fighting was heaviest--near the position of the battalion's K Company--he crawled through the jungle and shot a Japanese soldier out of a tree. He shot and killed snipers. He threw grenades into enemy positions. He was wounded in the leg, but he kept going, clearing out Japanese positions for the other members of K Company, behind him. He soon ran out of grenades and switched to his rifle, and then, when he ran out of ammunition, used only his bayonet. He was wounded a second time, but when a medic crawled toward him to help bring him back behind the lines Nininger waved him off. He saw a Japanese bunker up ahead. As he leaped out of a shell hole, he was spun around by a bullet to the shoulder, but he kept charging at the bunker, where a Japanese officer and two enlisted men were dug in. He dispatched one soldier with a double thrust of his bayonet, clubbed down the other, and bayonetted the officer. Then, with outstretched arms, he collapsed face down. For his heroism, Nininger was posthumously awarded the Medal of Honor, the first American soldier so decorated in the Second World War.

  2.

  Suppose that you were a senior Army officer in the early days of the Second World War and were trying to put together a crack team of fearless and ferocious fighters. Sandy Nininger, it now appears, had exactly the right kind of personality for that assignment, but is there any way you could have known this beforehand? It clearly wouldn't have helped to ask Nininger if he was fearless and ferocious, because he didn't know that he was fearless and ferocious. Nor would it have worked to talk to people who spent time with him. His friend would have told you only that Nininger was quiet and thoughtful and loved the theatre, and his commanding officer would have talked about the evenings of tea and Tchaikovsky. With the exception, perhaps, of the Scarlet Pimpernel, a love of music, theatre, and long afternoons in front of a teapot is not a known predictor of great valor. What you need is some kind of sophisticated psychological instrument, capable of getting to the heart of his personality.

  Over the course of the past century, psychology has been consumed with the search for this kind of magical instrument. Hermann Rorschach proposed that great meaning lay in the way that people described inkblots. The creators of the Minnesota Multiphasic Personality Inventory believed in the revelatory power of true-false items such as "I have never had any black, tarry-looking bowel movements" or "If the money were right, I would like to work for a circus or a carnival." Today, Annie Murphy Paul tells us in her fascinating new book, "Cult of Personality," that there are twenty-five hundred kinds of personality tests. Testing is a four-hundred-million-dollar-a-year industry. A hefty percentage of American corporations use personality tests as part of the hiring and promotion process. The tests figure in custody battles and in sentencing and parole decisions. "Yet despite their prevalence--and the importance of the matters they are called upon to decide--personality tests have received surprisingly little scrutiny," Paul writes. We can call in the psychologists. We can give Sandy Nininger a battery of tests. But will any of it help?

  One of the most popular personality tests in the world is the Myers-Briggs Type Indicator (M.B.T.I.), a psychological-assessment system based on Carl Jung's notion that people make sense of the world through a series of psychological frames. Some people are extroverts, some are introverts. Some process information through logical thought. Some are directed by their feelings. Some make sense of the world through intuitive leaps. Others collect data through their senses. To these three categories--(I)ntroversion/(E)xtroversion, i(N)tuition/(S)ensing, (T)hinking/(F)eeling--the Myers-Briggs test adds a fourth: (J)udging/(P)erceiving. Judgers "like to live in a planned, orderly way, seeking to regulate and manage their lives," according to an M.B.T.I. guide, whereas Perceivers "like to live in a flexible, spontaneous way, seeking to experience and understand life, rather than control it." The M.B.T.I. asks the test-taker to answer a series of "forced-choice" questions, where one choice identifies you as belonging to one of these paired traits. The basic test takes twenty minutes, and at the end you are presented with a precise, multidimensional summary of your personality--your type might be INTJ or ESFP, or some other combination. Two and a half million Americans a year take the Myers-Briggs. Eighty-nine companies out of the Fortune 100 make use of it, for things like hiring or training sessions to help employees "understand" themselves or their colleagues. Annie Murphy Paul says that at the eminent consulting firm McKinsey, "'associates' often know their colleagues' four-letter M.B.T.I. types by heart," the way they might know their own weight or (this being McKinsey) their S.A.T. scores.

  It is tempting to think, then, that we could figure out the Myers-Briggs type that corresponds best to commando work, and then test to see whether Sandy Nininger fits the profile. Unfortunately, the notion of personality type is not nearly as straightforward as it appears. For example, the Myers-Briggs poses a series of items grouped around the issue of whether you--the test-taker--are someone who likes to plan your day or evening beforehand or someone who prefers to be spontaneous. The idea is obviously to determine whether you belong to the Judger or Perceiver camp, but the basic question here is surprisingly hard to answer. I think I'm someone who likes to be spontaneous. On the other hand, I have embarked on too many spontaneous evenings that ended up with my friends and me standing on the sidewalk, looking at each other and wondering what to do next. So I guess I'm a spontaneous person who recognizes that life usually goes more smoothly if I plan first, or, rather, I'm a person who prefers to be spontaneous only if there's someone around me who isn't. Does that make me spontaneous or not? I'm not sure. I suppose it means that I'm somewhere in the middle.

  This is the first problem with the Myers-Briggs. It assumes that we are either one thing or another--Intuitive or Sensing, Introverted or Extroverted. But personality doesn't fit into neat binary categories: we fall somewhere along a continuum.

  Here's another question: Would you rather work under a boss (or a teacher) who is good-natured but often inconsistent, or sharp-tongued but always logical?

  On the Myers-Briggs, this is one of a series of questions intended to establish whether you are a Thinker or a Feeler. But I'm not sure I know how to answer this one, either. I once had a good-natured boss whose inconsistency bothered me, because he exerted a great deal of day-to-day control over my work. Then I had a boss who was quite consistent and very sharp-tongued--but at that point I was in a job where day-to-day dealings with my boss were minimal, so his sharp tongue didn't matter that much. So what do I want in a boss? As far as I can tell, the only plausible answer is: It depends. The Myers-Briggs assumes that who we are is consistent from one situation to another. But surely what we want in a boss, and how we behave toward our boss, is affected by what kind of job we have.

  This is the gist of the now famous critique that the psychologist Walter Mischel has made of personality testing. One of Mischel's studies involved watching children interact with one another at a summer camp. Aggressiveness was among the traits that he was interested in, so he watched the children in five different situations: how they behaved when approached by a peer, when teased by a peer, when praised by an adult, when punished by an adult, and when warned by an adult. He found that how aggressively a child responded in one of those situations wasn't a good predictor of how that same child responded in another situation. Just because a boy was aggressive in the face of being teased by another boy didn't mean that he would be aggressive in the face of being warned by an adult. On the other hand, if a child responded aggressively to being teased by a peer one day, it was a pretty good indicator that he'd respond aggressively to being teased by a peer the next day. We have a personality in the sense that we have a consistent pattern of behavior. But that pattern is complex and that personality is contingent: it represents an interaction between our internal disposition and tendencies and the situations that we find ourselves in.

  It's not surprising, then, that the Myers-Briggs has a large problem with consistency: according to some studies, more than half of those who take the test a second time end up with a different score than when they took it the first time. Since personality is continuous, not dichotomous, clearly some people who are borderline Introverts or Feelers one week slide over to Extroversion or Thinking the next week. And since personality is contingent, not stable, how we answer is affected by which circumstances are foremost in our minds when we take the test. If I happen to remember my first boss, then I come out as a Thinker. If my mind is on my second boss, I come out as a Feeler. When I took the Myers-Briggs, I scored as an INTJ. But, if odds are that I'm going to be something else if I take the test again, what good is it?

  Once, for fun, a friend and I devised our own personality test. Like the M.B.T.I., it has four dimensions. The first is Canine/Feline. In romantic relationships, are you the pursuer, who runs happily to the door, tail wagging? Or are you the pursued? The second is More/Different. Is it your intellectual style to gather and master as much information as you can or to make imaginative use of a discrete amount of information? The third is Insider/Outsider. Do you get along with your parents or do you define yourself outside your relationship with your mother and father? And, finally, there is Nibbler/Gobbler. Do you work steadily, in small increments, or do everything at once, in a big gulp? I'm quite pleased with the personality inventory we devised. It directly touches on four aspects of life and temperament--romance, cognition, family, and work style--that are only hinted at by Myers-Briggs. And it can be completed in under a minute, nineteen minutes faster than Myers-Briggs, an advantage not to be dismissed in today's fast-paced business environment. Of course, the four traits it measures are utterly arbitrary, based on what my friend and I came up with over the course of a phone call. But then again surely all universal dichotomous typing systems are arbitrary.

  Where did the Myers-Briggs come from, after all? As Paul tells us, it began with a housewife from Washington, D.C., named Katharine Briggs, at the turn of the last century. Briggs had a daughter, Isabel, an only child for whom (as one relative put it) she did "everything but breathe." When Isabel was still in her teens, Katharine wrote a book-length manuscript about her daughter's remarkable childhood, calling her a "genius" and "a little Shakespeare." When Isabel went off to Swarthmore College, in 1915, the two exchanged letters nearly every day. Then, one day, Isabel brought home her college boyfriend and announced that they were to be married. His name was Clarence (Chief) Myers. He was tall and handsome and studying to be a lawyer, and he could not have been more different from the Briggs women. Katharine and Isabel were bold and imaginative and intuitive. Myers was practical and logical and detail-oriented. Katharine could not understand her future son-in-law. "When the blissful young couple returned to Swarthmore," Paul writes, "Katharine retreated to her study, intent on 'figuring out Chief.'" She began to read widely in psychology and philosophy. Then, in 1923, she came across the first English translation of Carl Jung's "Psychological Types." "This is it!" Katharine told her daughter. Paul recounts, "In a dramatic display of conviction she burned all her own research and adopted Jung's book as her 'Bible,' as she gushed in a letter to the man himself. His system explained it all: Lyman [Katharine's husband], Katharine, Isabel, and Chief were introverts; the two men were thinkers, while the women were feelers; and of course the Briggses were intuitives, while Chief was a senser." Encouraged by her mother, Isabel--who was living in Swarthmore and writing mystery novels--devised a paper-and-pencil test to help people identify which of the Jungian categories they belonged to, and then spent the rest of her life tirelessly and brilliantly promoting her creation.

  The problem, as Paul points out, is that Myers and her mother did not actually understand Jung at all. Jung didn't believe that types were easily identifiable, and he didn't believe that people could be permanently slotted into one category or another. "Every individual is an exception to the rule," he wrote; to "stick labels on people at first sight," in his view, was "nothing but a childish parlor game." Why is a parlor game based on my desire to entertain my friends any less valid than a parlor game based on Katharine Briggs's obsession with her son-in-law?

  3.

  The problems with the Myers-Briggs suggest that we need a test that is responsive to the complexity and variability of the human personality. And that is why, not long ago, I found myself in the office of a psychologist from New Jersey named Lon Gieser. He is among the country's leading experts on what is called the Thematic Apperception Test (T.A.T.), an assessment tool developed in the nineteen-thirties by Henry Murray, one of the most influential psychologists of the twentieth century.

  I sat in a chair facing Gieser, as if I were his patient. He had in his hand two dozen or so pictures--mostly black-and-white drawings--on legal-sized cards, all of which had been chosen by Murray years before. "These pictures present a series of scenes," Gieser said to me. "What I want you to do with each scene is tell a story with a beginning, a middle, and an end." He handed me the first card. It was of a young boy looking at a violin. I had imagined, as Gieser was describing the test to me, that it would be hard to come up with stories to match the pictures. As I quickly discovered, though, the exercise was relatively effortless: the stories just tumbled out.

  "This is a young boy," I began. "His parents want him to take up the violin, and they've been encouraging him. I think he is uncertain whether he wants to be a violin player, and maybe even resents the imposition of having to play this instrument, which doesn't seem to have any appeal for him. He's not excited or thrilled about this. He'd rather be somewhere else. He's just sitting there looking at it, and dreading having to fulfill this parental obligation."

  I continued in that vein for a few more minutes. Gieser gave me another card, this one of a muscular man clinging to a rope and looking off into the distance. "He's climbing up, not climbing down," I said, and went on:

  It's out in public. It's some kind of big square, in Europe, and there is some kind of spectacle going on. It's the seventeenth or eighteenth century. The King is coming by in a carriage, and this man is shimmying up, so he can see over everyone else and get a better view of the King. I don't get the sense that he's any kind of highborn person. I think he aspires to be more than he is. And he's kind of getting a glimpse of the King as a way of giving himself a sense of what he could be, or what his own future could be like.

  We went on like this for the better part of an hour, as I responded to twelve cards--each of people in various kinds of ambiguous situations. One picture showed a woman slumped on the ground, with some small object next to her; another showed an attractive couple in a kind of angry embrace, apparently having an argument. (I said that the fight they were having was staged, that each was simply playing a role.) As I talked, Gieser took notes. Later, he called me and gave me his impressions. "What came out was the way you deal with emotion," he said. "Even when you recognized the emotion, you distanced yourself from it. The underlying motive is this desire to avoid conflict. The other thing is that when there are opportunities to go to someone else and work stuff out, your character is always going off alone. There is a real avoidance of emotion and dealing with other people, and everyone goes to their own corners and works things out on their own."

  How could Gieser make such a confident reading of my personality after listening to me for such a short time? I was baffled by this, at first, because I felt that I had told a series of random and idiosyncratic stories. When I listened to the tape I had made of the session, though, I saw what Gieser had picked up on: my stories were exceedingly repetitive in just the way that he had identified. The final card that Gieser gave me was blank, and he asked me to imagine my own picture and tell a story about it. For some reason, what came to mind was Andrew Wyeth's famous painting "Christina's World," of a woman alone in a field, her hair being blown by the wind. She was from the city, I said, and had come home to see her family in the country: "I think she is taking a walk. She is pondering some piece of important news. She has gone off from the rest of the people to think about it." Only later did I realize that in the actual painting the woman is not strolling through the field. She is crawling, desperately, on her hands and knees. How obvious could my aversion to strong emotion be?

  The T.A.T. has a number of cards that are used to assess achievement--that is, how interested someone is in getting ahead and succeeding in life. One is the card of the man on the rope; another is the boy looking at his violin. Gieser, in listening to my stories, concluded that I was very low in achievement:

  Some people say this kid is dreaming about being a great violinist, and he's going to make it. With you, it wasn't what he wanted to do at all. His parents were making him do it. With the rope climbing, some people do this Tarzan thing. They climb the pole and get to the top and feel this great achievement. You have him going up the rope--and why is he feeling the pleasure? Because he's seeing the King. He's still a nobody in the public square, looking at the King.

  Now, this is a little strange. I consider myself quite ambitious. On a questionnaire, if you asked me to rank how important getting ahead and being successful was to me, I'd check the "very important" box. But Gieser is suggesting that the T.A.T. allowed him to glimpse another dimension of my personality.

  This idea--that our personality can hold contradictory elements--is at the heart of "Strangers to Ourselves," by the social psychologist Timothy D. Wilson. He is one of the discipline's most prominent researchers, and his book is what popular psychology ought to be (and rarely is): thoughtful, beautifully written, and full of unexpected insights. Wilson's interest is in what he calls the "adaptive unconscious" (not to be confused with the Freudian unconscious). The adaptive unconscious, in Wilson's description, is a big computer in our brain which sits below the surface and evaluates, filters, and looks for patterns in the mountain of data that come in through our senses. That system, Wilson argues, has a personality: it has a set of patterns and responses and tendencies that are laid down by our genes and our early-childhood experiences. These patterns are stable and hard to change, and we are only dimly aware of them. On top of that, in his schema we have another personality: it's the conscious identity that we create for ourselves with the choices we make, the stories we tell about ourselves, and the formal reasons we come up with to explain our motives and feelings. Yet this "constructed self" has no particular connection with the personality of our adaptive unconscious. In fact, they could easily be at odds. Wilson writes:

  The adaptive unconscious is more likely to influence people's uncontrolled, implicit responses, whereas the constructed self is more likely to influence people's deliberative, explicit responses. For example, the quick, spontaneous decision of whether to argue with a co-worker is likely to be under the control of one's nonconscious needs for power and affiliation. A more thoughtful decision about whether to invite a co-worker over for dinner is more likely to be under the control of one's conscious, self-attributed motives.

  When Gieser said that he thought I was low in achievement, then, he presumably saw in my stories an unconscious ambivalence toward success. The T.A.T., he believes, allowed him to go beyond the way I viewed myself and arrive at a reading with greater depth and nuance.

  Even if he's right, though, does this help us pick commandos? I'm not so sure. Clearly, underneath Sandy Nininger's peaceful façade there was another Nininger capable of great bravery and ferocity, and a T.A.T. of Nininger might have given us a glimpse of that part of who he was. But let's not forget that he volunteered for the front lines: he made a conscious decision to put himself in the heat of the action. What we really need is an understanding of how those two sides of his personality interact in critical situations. When is Sandy Nininger's commitment to peacefulness more, or less, important than some unconscious ferocity? The other problem with the T.A.T., of course, is that it's a subjective instrument. You could say that my story about the man climbing the rope is evidence that I'm low in achievement or you could say that it shows a strong desire for social mobility. The climber wants to look down--not up--at the King in order to get a sense "of what he could be." You could say that my interpretation that the couple's fighting was staged was evidence of my aversion to strong emotion. Or you could say that it was evidence of my delight in deception and role-playing. This isn't to question Gieser's skill or experience as a diagnostician. The T.A.T. is supposed to do no more than identify themes and problem areas, and I'm sure Gieser would be happy to put me on the couch for a year to explore those themes and see which of his initial hypotheses had any validity. But the reason employers want a magical instrument for measuring personality is that they don't have a year to work through the ambiguities. They need an answer now.

  4.

  A larger limitation of both Myers-Briggs and the T.A.T. is that they are indirect. Tests of this kind require us first to identify a personality trait that corresponds to the behavior we're interested in, and then to figure out how to measure that trait--but by then we're two steps removed from what we're after. And each of those steps represents an opportunity for error and distortion. Shouldn't we try, instead, to test directly for the behavior we're interested in? This is the idea that lies behind what's known as the Assessment Center, and the leading practitioner of this approach is a company called Development Dimensions International, or D.D.I.

  Companies trying to evaluate job applicants send them to D.D.I.'s headquarters, outside Pittsburgh, where they spend the day role-playing as business executives. When I contacted D.D.I., I was told that I was going to be Terry Turner, the head of the robotics division of a company called Global Solutions.

  I arrived early in the morning, and was led to an office. On the desk were a computer, a phone, and a tape recorder. In the corner of the room was a video camera, and on my desk was an agenda for the day. I had a long telephone conversation with a business partner from France. There were labor difficulties at an overseas plant. A new product--a robot for the home--had run into a series of technical glitches. I answered e-mails. I prepared and recorded a talk for a product-launch meeting. I gave a live interview to a local television reporter. In the afternoon, I met with another senior Global Solutions manager, and presented a strategic plan for the future of the robotics division. It was a long, demanding day at the office, and when I left, a team of D.D.I. specialists combed through copies of my e-mails, the audiotapes of my phone calls and my speech, and the videotapes of my interviews, and analyzed me across four dimensions: interpersonal skills, leadership skills, business-management skills, and personal attributes. A few weeks later, I was given my report. Some of it was positive: I was a quick learner. I had good ideas. I expressed myself well, and--I was relieved to hear--wrote clearly. But, as the assessment of my performance made plain, I was something less than top management material:

  Although you did a remarkable job addressing matters, you tended to handle issues from a fairly lofty perch, pitching good ideas somewhat unilaterally while lobbing supporting rationale down to the team below. . . . Had you brought your team closer to decisions by vesting them with greater accountability, responsibility and decision-making authority, they would have undoubtedly felt more engaged, satisfied and valued. . . . In a somewhat similar vein, but on a slightly more interpersonal level, while you seemed to recognize the value of collaboration and building positive working relationships with people, you tended to take a purely businesslike approach to forging partnerships. You spoke of win/win solutions from a business perspective and your rationale for partnering and collaboration seemed to be based solely on business logic. Additionally, at times you did not respond to some of the softer, subtler cues that spoke to people's real frustrations, more personal feelings, or true point of view.

  Ouch! Of course, when the D.D.I. analysts said that I did not respond to "some of the softer, subtler cues that spoke to people's real frustrations, more personal feelings, or true point of view," they didn't mean that I was an insensitive person. They meant that I was insensitive in the role of manager. The T.A.T. and M.B.T.I. aimed to make global assessments of the different aspects of my personality. My day as Terry Turner was meant to find out only what I'm like when I'm the head of the robotics division of Global Solutions. That's an important difference. It respects the role of situation and contingency in personality. It sidesteps the difficulty of integrating my unconscious self with my constructed self by looking at the way that my various selves interact in the real world. Most important, it offers the hope that with experience and attention I can construct a more appropriate executive "self." The Assessment Center is probably the best method that employers have for evaluating personality.

  But could an Assessment Center help us identify the Sandy Niningers of the world? The center makes a behavioral prediction, and, as solid and specific as that prediction is, people are least predictable at those critical moments when prediction would be most valuable. The answer to the question of whether my Terry Turner would be a good executive is, once again: It depends. It depends on what kind of company Global Solutions is, and on what kind of respect my co-workers have for me, and on how quickly I manage to correct my shortcomings, and on all kinds of other things that cannot be anticipated. The quality of being a good manager is, in the end, as irreducible as the quality of being a good friend. We think that a friend has to be loyal and nice and interesting--and that's certainly a good start. But people whom we don't find loyal, nice, or interesting have friends, too, because loyalty, niceness, and interestingness are emergent traits. They arise out of the interaction of two people, and all we really mean when we say that someone is interesting or nice is that they are interesting or nice to us.

  All these difficulties do not mean that we should give up on the task of trying to understand and categorize one another. We could certainly send Sandy Nininger to an Assessment Center, and find out whether, in a make-believe battle, he plays the role of commando with verve and discipline. We could talk to his friends and discover his love of music and theatre. We could find out how he responded to the picture of the man on a rope. We could sit him down and have him do the Myers-Briggs and dutifully note that he is an Introverted, Intuitive, Thinking Judger, and, for good measure, take an extra minute to run him through my own favorite personality inventory and type him as a Canine, Different, Insider Gobbler. We will know all kinds of things about him then. His personnel file will be as thick as a phone book, and we can consult our findings whenever we make decisions about his future. We just have to acknowledge that his file will tell us little about the thing we're most interested in. For that, we have to join him in the jungles of Bataan.

  Mustard now comes in dozens of varieties. Why has ketchup stayed the same?

  1.

  Many years ago, one mustard dominated the supermarket shelves: French's. It came in a plastic bottle. People used it on hot dogs and bologna. It was a yellow mustard, made from ground white mustard seed with turmeric and vinegar, which gave it a mild, slightly metallic taste. If you looked hard in the grocery store, you might find something in the specialty-foods section called Grey Poupon, which was Dijon mustard, made from the more pungent brown mustard seed. In the early seventies, Grey Poupon was no more than a hundred-thousand-dollar-a-year business. Few people knew what it was or how it tasted, or had any particular desire for an alternative to French's or the runner-up, Gulden's. Then one day the Heublein Company, which owned Grey Poupon, discovered something remarkable: if you gave people a mustard taste test, a significant number had only to try Grey Poupon once to switch from yellow mustard. In the food world that almost never happens; even among the most successful food brands, only about one in a hundred have that kind of conversion rate. Grey Poupon was magic.

  So Heublein put Grey Poupon in a bigger glass jar, with an enamelled label and enough of a whiff of Frenchness to make it seem as if it were still being made in Europe (it was made in Hartford, Connecticut, from Canadian mustard seed and white wine). The company ran tasteful print ads in upscale food magazines. They put the mustard in little foil packets and distributed them with airplane meals--which was a brand-new idea at the time. Then they hired the Manhattan ad agency Lowe Marschalk to do something, on a modest budget, for television. The agency came back with an idea: A Rolls-Royce is driving down a country road. There's a man in the back seat in a suit with a plate of beef on a silver tray. He nods to the chauffeur, who opens the glove compartment. Then comes what is known in the business as the "reveal." The chauffeur hands back a jar of Grey Poupon. Another Rolls-Royce pulls up alongside. A man leans his head out the window. "Pardon me. Would you have any Grey Poupon?"

  In the cities where the ads ran, sales of Grey Poupon leaped forty to fifty per cent, and whenever Heublein bought airtime in new cities sales jumped by forty to fifty per cent again. Grocery stores put Grey Poupon next to French's and Gulden's. By the end of the nineteen-eighties Grey Poupon was the most powerful brand in mustard. "The tagline in the commercial was that this was one of life's finer pleasures," Larry Elegant, who wrote the original Grey Poupon spot, says, "and that, along with the Rolls-Royce, seemed to impart to people's minds that this was something truly different and superior."

  The rise of Grey Poupon proved that the American supermarket shopper was willing to pay more--in this case, $3.99 instead of $1.49 for eight ounces--as long as what they were buying carried with it an air of sophistication and complex aromatics. Its success showed, furthermore, that the boundaries of taste and custom were not fixed: that just because mustard had always been yellow didn't mean that consumers would use only yellow mustard. It is because of Grey Poupon that the standard American supermarket today has an entire mustard section. And it is because of Grey Poupon that a man named Jim Wigon decided, four years ago, to enter the ketchup business. Isn't the ketchup business today exactly where mustard was thirty years ago? There is Heinz and, far behind, Hunt's and Del Monte and a handful of private-label brands. Jim Wigon wanted to create the Grey Poupon of ketchup.

  Wigon is from Boston. He's a thickset man in his early fifties, with a full salt-and-pepper beard. He runs his ketchup business--under the brand World's Best Ketchup--out of the catering business of his partner, Nick Schiarizzi, in Norwood, Massachusetts, just off Route 1, in a low-slung building behind an industrial-equipment-rental shop. He starts with red peppers, Spanish onions, garlic, and a high-end tomato paste. Basil is chopped by hand, because the buffalo chopper bruises the leaves. He uses maple syrup, not corn syrup, which gives him a quarter of the sugar of Heinz. He pours his ketchup into a clear glass ten-ounce jar, and sells it for three times the price of Heinz, and for the past few years he has crisscrossed the country, peddling World's Best in six flavors--regular, sweet, dill, garlic, caramelized onion, and basil--to specialty grocery stores and supermarkets. If you were in Zabar's on Manhattan's Upper West Side a few months ago, you would have seen him at the front of the store, in a spot between the sushi and the gefilte fish. He was wearing a World's Best baseball cap, a white shirt, and a red-stained apron. In front of him, on a small table, was a silver tureen filled with miniature chicken and beef meatballs, a box of toothpicks, and a dozen or so open jars of his ketchup. "Try my ketchup!" Wigon said, over and over, to anyone who passed. "If you don't try it, you're doomed to eat Heinz the rest of your life."

  In the same aisle at Zabar's that day two other demonstrations were going on, so that people were starting at one end with free chicken sausage, sampling a slice of prosciutto, and then pausing at the World's Best stand before heading for the cash register. They would look down at the array of open jars, and Wigon would impale a meatball on a toothpick, dip it in one of his ketchups, and hand it to them with a flourish. The ratio of tomato solids to liquid in World's Best is much higher than in Heinz, and the maple syrup gives it an unmistakable sweet kick. Invariably, people would close their eyes, just for a moment, and do a subtle double take. Some of them would look slightly perplexed and walk away, and others would nod and pick up a jar. "You know why you like it so much?" he would say, in his broad Boston accent, to the customers who seemed most impressed. "Because you've been eating bad ketchup all your life!"

  Jim Wigon had a simple vision: build a better ketchup--the way Grey Poupon built a better mustard--and the world will beat a path to your door. If only it were that easy.

  2.

  The story of World's Best Ketchup cannot properly be told without a man from White Plains, New York, named Howard Moskowitz. Moskowitz is sixty, short and round, with graying hair and huge gold-rimmed glasses. When he talks, he favors the Socratic monologue--a series of questions that he poses to himself, then answers, punctuated by "ahhh" and much vigorous nodding. He is a lineal descendant of the legendary eighteenth-century Hasidic rabbi known as the Seer of Lublin. He keeps a parrot. At Harvard, he wrote his doctoral dissertation on psychophysics, and all the rooms on the ground floor of his food-testing and market-research business are named after famous psychophysicists. ("Have you ever heard of the name Rose Marie Pangborn? Ahhh. She was a professor at Davis. Very famous. This is the Pangborn kitchen.") Moskowitz is a man of uncommon exuberance and persuasiveness: if he had been your freshman statistics professor, you would today be a statistician. "My favorite writer? Gibbon," he burst out, when we met not long ago. He had just been holding forth on the subject of sodium solutions. "Right now I'm working my way through the Hales history of the Byzantine Empire. Holy shit! Everything is easy until you get to the Byzantine Empire. It's impossible. One emperor is always killing the others, and everyone has five wives or three husbands. It's very Byzantine."

  Moskowitz set up shop in the seventies, and one of his first clients was Pepsi. The artificial sweetener aspartame had just become available, and Pepsi wanted Moskowitz to figure out the perfect amount of sweetener for a can of Diet Pepsi. Pepsi knew that anything below eight per cent sweetness was not sweet enough and anything over twelve per cent was too sweet. So Moskowitz did the logical thing. He made up experimental batches of Diet Pepsi with every conceivable degree of sweetness--8 per cent, 8.25 per cent, 8.5, and on and on up to 12--gave them to hundreds of people, and looked for the concentration that people liked the most. But the data were a mess--there wasn't a pattern--and one day, sitting in a diner, Moskowitz realized why. They had been asking the wrong question. There was no such thing as the perfect Diet Pepsi. They should have been looking for the perfect Diet Pepsis.
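
  To see why the averaged data could look like "a mess," consider a toy example; the numbers in the sketch below are invented, not Pepsi's. If there are two hidden camps of tasters, one that likes its cola at around 8.5-per-cent sweetness and one that likes it at around 11.5, averaging their scores produces a nearly flat curve with no clear peak, even though each camp has a sharp favorite of its own.

```python
# A minimal sketch with hypothetical numbers: averaging the liking scores of
# two hidden groups of tasters hides the fact that there are two favorites.

sweetness_levels = [8.0, 8.5, 9.0, 9.5, 10.0, 10.5, 11.0, 11.5, 12.0]

def liking(ideal, level):
    """Liking falls off with distance from a taster's ideal sweetness."""
    return max(0.0, 100.0 - 40.0 * abs(level - ideal))

group_a = [liking(8.5, s) for s in sweetness_levels]   # low-sweetness lovers
group_b = [liking(11.5, s) for s in sweetness_levels]  # high-sweetness lovers
averaged = [(a + b) / 2 for a, b in zip(group_a, group_b)]

for s, a, b, avg in zip(sweetness_levels, group_a, group_b, averaged):
    print(f"{s:>5.1f}%  group A: {a:5.1f}  group B: {b:5.1f}  average: {avg:5.1f}")

# The averaged column barely moves across the whole range -- "a mess," with no
# clear winner -- while each group, looked at on its own, has a sharp peak.
```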

  It took a long time for the food world to catch up with Howard Moskowitz. He knocked on doors and tried to explain his idea about the plural nature of perfection, and no one answered. He spoke at food-industry conferences, and audiences shrugged. But he could think of nothing else. "It's like that Yiddish expression," he says. "Do you know it? To a worm in horseradish, the world is horseradish!" Then, in 1986, he got a call from the Campbell's Soup Company. They were in the spaghetti-sauce business, going up against Ragú with their Prego brand. Prego was a little thicker than Ragú, with diced tomatoes as opposed to Ragú's purée, and, Campbell's thought, had better pasta adherence. But, for all that, Prego was in a slump, and Campbell's was desperate for new ideas.

  Standard practice in the food industry would have been to convene a focus group and ask spaghetti eaters what they wanted. But Moskowitz does not believe that consumers--even spaghetti lovers--know what they desire if what they desire does not yet exist. "The mind," as Moskowitz is fond of saying, "knows not what the tongue wants." Instead, working with the Campbell's kitchens, he came up with forty-five varieties of spaghetti sauce. These were designed to differ in every conceivable way: spiciness, sweetness, tartness, saltiness, thickness, aroma, mouth feel, cost of ingredients, and so forth. He had a trained panel of food tasters analyze each of those varieties in depth. Then he took the prototypes on the road--to New York, Chicago, Los Angeles, and Jacksonville--and asked people in groups of twenty-five to eat between eight and ten small bowls of different spaghetti sauces over two hours and rate them on a scale of one to a hundred. When Moskowitz charted the results, he saw that everyone had a slightly different definition of what a perfect spaghetti sauce tasted like. If you sifted carefully through the data, though, you could find patterns, and Moskowitz learned that most people's preferences fell into one of three broad groups: plain, spicy, and extra-chunky, and of those three the last was the most important. Why? Because at the time there was no extra-chunky spaghetti sauce in the supermarket. Over the next decade, that new category proved to be worth hundreds of millions of dollars to Prego. "We all said, 'Wow!' " Monica Wood, who was then the head of market research for Campbell's, recalls. "Here there was this third segment--people who liked their spaghetti sauce with lots of stuff in it--and it was completely untapped. So in about 1989-90 we launched Prego extra-chunky. It was extraordinarily successful."
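
  That sifting can be thought of as a clustering problem: group the tasters whose rating profiles look alike, then see what each group's favorite looks like. The sketch below runs ordinary k-means on invented ratings--it is not Moskowitz's actual method, and the prototypes and numbers are hypothetical--but it shows how distinct preference segments can fall out of nothing more than a table of scores.

```python
# A minimal sketch, assuming made-up ratings: cluster tasters by how they score
# three contrasting prototypes (plain, spicy, extra-chunky) on a 1-100 scale.
import random

random.seed(0)

# Three hidden groups of tasters are baked into the data; the clustering has
# to rediscover them from the ratings alone.
tasters = (
    [[random.gauss(80, 5), random.gauss(50, 5), random.gauss(40, 5)] for _ in range(20)]
    + [[random.gauss(45, 5), random.gauss(85, 5), random.gauss(50, 5)] for _ in range(20)]
    + [[random.gauss(50, 5), random.gauss(45, 5), random.gauss(90, 5)] for _ in range(20)]
)

def kmeans(points, k, iterations=20):
    """Plain k-means: assign each point to its nearest center, then recompute centers."""
    # For a stable demo, spread the initial centers through the data; real
    # k-means would typically use random restarts or k-means++.
    centers = [points[i] for i in range(0, len(points), len(points) // k)][:k]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[nearest].append(p)
        centers = [[sum(dim) / len(c) for dim in zip(*c)] if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans(tasters, k=3)
for center, cluster in zip(centers, clusters):
    profile = ", ".join(f"{x:5.1f}" for x in center)
    print(f"{len(cluster):2d} tasters -- average ratings (plain, spicy, chunky): {profile}")
```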

  It may be hard today, fifteen years later--when every brand seems to come in multiple varieties--to appreciate how much of a breakthrough this was. In those years, people in the food industry carried around in their heads the notion of a platonic dish--the version of a dish that looked and tasted absolutely right. At Ragú and Prego, they had been striving for the platonic spaghetti sauce, and the platonic spaghetti sauce was thin and blended because that's the way they thought it was done in Italy. Cooking, on the industrial level, was consumed with the search for human universals. Once you start looking for the sources of human variability, though, the old orthodoxy goes out the window. Howard Moskowitz stood up to the Platonists and said there are no universals.

  Moskowitz still has a version of the computer model he used for Prego fifteen years ago. It has all the coded results from the consumer taste tests and the expert tastings, split into the three categories (plain, spicy, and extra-chunky) and linked up with the actual ingredients list on a spreadsheet. "You know how they have a computer model for building an aircraft," Moskowitz said as he pulled up the program on his computer. "This is a model for building spaghetti sauce. Look, every variable is here." He pointed at column after column of ratings. "So here are the ingredients. I'm a brand manager for Prego. I want to optimize one of the segments. Let's start with Segment 1." In Moskowitz's program, the three spaghetti-sauce groups were labelled Segment 1, Segment 2, and Segment 3. He typed in a few commands, instructing the computer to give him the formulation that would score the highest with those people in Segment 1. The answer appeared almost immediately: a specific recipe that, according to Moskowitz's data, produced a score of 78 from the people in Segment 1. But that same formulation didn't do nearly as well with those in Segment 2 and Segment 3. They scored it 67 and 57, respectively. Moskowitz started again, this time asking the computer to optimize for Segment 2. This time the ratings came in at 82, but now Segment 1 had fallen ten points, to 68. "See what happens?" he said. "If I make one group happier, I piss off another group. We did this for coffee with General Foods, and we found that if you create only one product the best you can get across all the segments is a 60--if you're lucky. That's if you were to treat everybody as one big happy family. But if I do the sensory segmentation, I can get 70, 71, 72. Is that big? Ahhh. It's a very big difference. In coffee, a 71 is something you'll die for."
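
  At this point the model is, at heart, a lookup-and-maximize exercise, and the trade-off Moskowitz describes can be sketched in a few lines. The numbers below are invented (they only loosely echo the scores he quotes), and his real model works over ingredient formulations rather than a handful of named recipes, but the logic is the same: the recipe that wins for one segment is not the recipe that wins on average, and a single compromise product leaves at least some segments well below what a product tuned to them could score.

```python
# A minimal sketch with hypothetical segment-by-recipe scores.
recipes = {
    "recipe A": {"segment 1": 78, "segment 2": 67, "segment 3": 57},
    "recipe B": {"segment 1": 68, "segment 2": 82, "segment 3": 60},
    "recipe C": {"segment 1": 62, "segment 2": 64, "segment 3": 79},
}
segments = ["segment 1", "segment 2", "segment 3"]

def average_score(recipe):
    """Score a recipe as if everyone were 'one big happy family'."""
    return sum(recipes[recipe][s] for s in segments) / len(segments)

# Strategy 1: one product for everybody -- pick the best average.
best_single = max(recipes, key=average_score)
print(f"one product for everyone: {best_single}, average score {average_score(best_single):.1f}")

# Strategy 2: one product per segment -- each group gets its own winner.
for s in segments:
    winner = max(recipes, key=lambda r: recipes[r][s])
    print(f"{s}: {winner}, score {recipes[winner][s]}")
```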

  When Jim Wigon set up shop that day in Zabar's, then, his operating assumption was that there ought to be some segment of the population that preferred a ketchup made with Stanislaus tomato paste and hand-chopped basil and maple syrup. That's the Moskowitz theory. But there is theory and there is practice. By the end of that long day, Wigon had sold ninety jars. But he'd also got two parking tickets and had to pay for a hotel room, so he wasn't going home with money in his pocket. For the year, Wigon estimates, he'll sell fifty thousand jars--which, in the universe of condiments, is no more than a blip. "I haven't drawn a paycheck in five years," Wigon said as he impaled another meatball on a toothpick. "My wife is killing me." And it isn't just World's Best that is struggling. In the gourmet-ketchup world, there is River Run and Uncle Dave's, from Vermont, and Muir Glen Organic and Mrs. Tomato Head Roasted Garlic Peppercorn Catsup, in California, and dozens of others--and every year Heinz's overwhelming share of the ketchup market just grows.

  It is possible, of course, that ketchup is waiting for its own version of that Rolls-Royce commercial, or the discovery of the ketchup equivalent of extra-chunky--the magic formula that will satisfy an unmet need. It is also possible, however, that the rules of Howard Moskowitz, which apply to Grey Poupon and Prego spaghetti sauce and to olive oil and salad dressing and virtually everything else in the supermarket, don't apply to ketchup.

  3.

  Tomato ketchup is a nineteenth-century creation--the union of the English tradition of fruit and vegetable sauces and the growing American infatuation with the tomato. But what we know today as ketchup emerged out of a debate that raged in the first years of the last century over benzoate, a preservative widely used in late-nineteenth-century condiments. Harvey Washington Wiley, the chief of the Bureau of Chemistry in the Department of Agriculture from 1883 to 1912, came to believe that benzoates were not safe, and the result was an argument that split the ketchup world in half. On one side was the ketchup establishment, which believed that it was impossible to make ketchup without benzoate and that benzoate was not harmful in the amounts used. On the other side was a renegade band of ketchup manufacturers, who believed that the preservative puzzle could be solved with the application of culinary science. The dominant nineteenth-century ketchups were thin and watery, in part because they were made from unripe tomatoes, which are low in the complex carbohydrates known as pectin, which add body to a sauce. But what if you made ketchup from ripe tomatoes, giving it the density it needed to resist degradation? Nineteenth-century ketchups had a strong tomato taste, with just a light vinegar touch. The renegades argued that by greatly increasing the amount of vinegar, in effect protecting the tomatoes by pickling them, they were making a superior ketchup: safer, purer, and better tasting. They offered a money-back guarantee in the event of spoilage. They charged more for their product, convinced that the public would pay more for a better ketchup, and they were right. The benzoate ketchups disappeared. The leader of the renegade band was an entrepreneur out of Pittsburgh named Henry J. Heinz.

  The world's leading expert on ketchup's early years is Andrew F. Smith, a substantial man, well over six feet, with a graying mustache and short wavy black hair. Smith is a scholar, trained as a political scientist, intent on bringing rigor to the world of food. When we met for lunch not long ago at the restaurant Savoy in SoHo (chosen because of the excellence of its hamburger and French fries, and because Savoy makes its own ketchup--a dark, peppery, and viscous variety served in a white porcelain saucer), Smith was in the throes of examining the origins of the croissant for the upcoming "Oxford Encyclopedia of Food and Drink in America," of which he is the editor-in-chief. Was the croissant invented in 1683, by the Viennese, in celebration of their defeat of the invading Turks? Or in 1686, by the residents of Budapest, to celebrate their defeat of the Turks? Either story would account for its distinctive crescent shape--since it would make a certain cultural sense (particularly for the Viennese) to consecrate their battlefield triumphs in the form of pastry. But the only reference Smith could find to either story was in the Larousse Gastronomique of 1938. "It just doesn't check out," he said, shaking his head wearily.

  Smith's specialty is the tomato, however, and over the course of many scholarly articles and books--"The History of Home-Made Anglo-American Tomato Ketchup," for Petits Propos Culinaires, for example, and "The Great Tomato Pill War of the 1830's," for The Connecticut Historical Society Bulletin--Smith has argued that some critical portion of the history of culinary civilization could be told through this fruit. Cortez brought tomatoes to Europe from the New World, and they inexorably insinuated themselves into the world's cuisines. The Italians substituted the tomato for eggplant. In northern India, it went into curries and chutneys. "The biggest tomato producer in the world today?" Smith paused, for dramatic effect. "China. You don't think of tomato being a part of Chinese cuisine, and it wasn't ten years ago. But it is now." Smith dipped one of my French fries into the homemade sauce. "It has that raw taste," he said, with a look of intense concentration. "It's fresh ketchup. You can taste the tomato." Ketchup was, to his mind, the most nearly perfect of all the tomato's manifestations. It was inexpensive, which meant that it had a firm lock on the mass market, and it was a condiment, not an ingredient, which meant that it could be applied at the discretion of the food eater, not the food preparer. "There's a quote from Elizabeth Rozin I've always loved," he said. Rozin is the food theorist who wrote the essay "Ketchup and the Collective Unconscious," and Smith used her conclusion as the epigraph of his ketchup book: ketchup may well be "the only true culinary expression of the melting pot, and . . . its special and unprecedented ability to provide something for everyone makes it the Esperanto of cuisine." Here is where Henry Heinz and the benzoate battle were so important: in defeating the condiment Old Guard, he was the one who changed the flavor of ketchup in a way that made it universal.

  4.

  There are five known fundamental tastes in the human palate: salty, sweet, sour, bitter, and umami. Umami is the proteiny, full-bodied taste of chicken soup, or cured meat, or fish stock, or aged cheese, or mother's milk, or soy sauce, or mushrooms, or seaweed, or cooked tomato. "Umami adds body," Gary Beauchamp, who heads the Monell Chemical Senses Center, in Philadelphia, says. "If you add it to a soup, it makes the soup seem like it's thicker--it gives it sensory heft. It turns a soup from salt water into a food." When Heinz moved to ripe tomatoes and increased the percentage of tomato solids, he made ketchup, first and foremost, a potent source of umami. Then he dramatically increased the concentration of vinegar, so that his ketchup had twice the acidity of most other ketchups; now ketchup was sour, another of the fundamental tastes. The post-benzoate ketchups also doubled the concentration of sugar--so now ketchup was also sweet--and all along ketchup had been salty and bitter. These are not trivial issues. Give a baby soup, and then soup with MSG (an amino-acid salt that is pure umami), and the baby will go back for the MSG soup every time, the same way a baby will always prefer water with sugar to water alone. Salt and sugar and umami are primal signals about the food we are eating--about how dense it is in calories, for example, or, in the case of umami, about the presence of proteins and amino acids. What Heinz had done was come up with a condiment that pushed all five of these primal buttons. The taste of Heinz's ketchup began at the tip of the tongue, where our receptors for sweet and salty first appear, moved along the sides, where sour notes seem the strongest, then hit the back of the tongue, for umami and bitter, in one long crescendo. How many things in the supermarket run the sensory spectrum like this?

  A number of years ago, the H. J. Heinz Company did an extensive market-research project in which researchers went into people's homes and watched the way they used ketchup. "I remember sitting in one of those households," Casey Keller, who was until recently the chief growth officer for Heinz, says. "There was a three-year-old and a six-year-old, and what happened was that the kids asked for ketchup and Mom brought it out. It was a forty-ounce bottle. And the three-year-old went to grab it himself, and Mom intercepted the bottle and said, 'No, you're not going to do that.' She physically took the bottle away and doled out a little dollop. You could see that the whole thing was a bummer." For Heinz, Keller says, that moment was an epiphany. A typical five-year-old consumes about sixty per cent more ketchup than a typical forty-year-old, and the company realized that it needed to put ketchup in a bottle that a toddler could control. "If you are four--and I have a four-year-old--he doesn't get to choose what he eats for dinner, in most cases," Keller says. "But the one thing he can control is ketchup. It's the one part of the food experience that he can customize and personalize." As a result, Heinz came out with the so-called EZ Squirt bottle, made out of soft plastic with a conical nozzle. In homes where the EZ Squirt is used, ketchup consumption has grown by as much as twelve per cent.

  There is another lesson in that household scene, though. Small children tend to be neophobic: once they hit two or three, they shrink from new tastes. That makes sense, evolutionarily, because through much of human history that is the age at which children would have first begun to gather and forage for themselves, and those who strayed from what was known and trusted would never have survived. There the three-year-old was, confronted with something strange on his plate--tuna fish, perhaps, or Brussels sprouts--and he wanted to alter his food in some way that made the unfamiliar familiar. He wanted to subdue the contents of his plate. And so he turned to ketchup, because, alone among the condiments on the table, ketchup could deliver sweet and sour and salty and bitter and umami, all at once.

  5.

  Last February, Edgar Chambers IV, who runs the sensory-analysis center at Kansas State University, conducted a joint assessment of World's Best and Heinz. He has seventeen trained tasters on his staff, and they work for academia and industry, answering the often difficult question of what a given substance tastes like. It is demanding work. Immediately after conducting the ketchup study, Chambers dispatched a team to Bangkok to do an analysis of fruit--bananas, mangoes, rose apples, and sweet tamarind. Others were detailed to soy and kimchi in South Korea, and Chambers's wife led a delegation to Italy to analyze ice cream.

  The ketchup tasting took place over four hours, on two consecutive mornings. Six tasters sat around a large, round table with a lazy Susan in the middle. In front of each panelist were two one-ounce cups, one filled with Heinz ketchup and one filled with World's Best. They would work along fourteen dimensions of flavor and texture, in accordance with the standard fifteen-point scale used by the food world. The flavor components would be divided two ways: elements picked up by the tongue and elements picked up by the nose. A very ripe peach, for example, tastes sweet but it also smells sweet--which is a very different aspect of sweetness. Vinegar has a sour taste but also a pungency, a vapor that rises up the back of the nose and fills the mouth when you breathe out. To aid in the rating process, the tasters surrounded themselves with little bowls of sweet and sour and salty solutions, and portions of Contadina tomato paste, Hunt's tomato sauce, and Campbell's tomato juice, all of which represent different concentrations of tomato-ness.

  After breaking the ketchup down into its component parts, the testers assessed the critical dimension of "amplitude," the word sensory experts use to describe flavors that are well blended and balanced, that "bloom" in the mouth. "The difference between high and low amplitude is the difference between my son and a great pianist playing 'Ode to Joy' on the piano," Chambers says. "They are playing the same notes, but they blend better with the great pianist." Pepperidge Farm shortbread cookies are considered to have high amplitude. So are Hellmann's mayonnaise and Sara Lee poundcake. When something is high in amplitude, all its constituent elements converge into a single gestalt. You can't isolate the elements of an iconic, high-amplitude flavor like Coca-Cola or Pepsi. But you can with one of those private-label colas that you get in the supermarket. "The thing about Coke and Pepsi is that they are absolutely gorgeous," Judy Heylmun, a vice-president of Sensory Spectrum, Inc., in Chatham, New Jersey, says. "They have beautiful notes--all flavors are in balance. It's very hard to do that well. Usually, when you taste a store cola it's"--and here she made a series of pik! pik! pik! sounds--"all the notes are kind of spiky, and usually the citrus is the first thing to spike out. And then the cinnamon. Citrus and brown spice notes are top notes and very volatile, as opposed to vanilla, which is very dark and deep. A really cheap store brand will have a big, fat cinnamon note sitting on top of everything."

  Some of the cheaper ketchups are the same way. Ketchup aficionados say that there's a disquieting unevenness to the tomato notes in Del Monte ketchup: tomatoes vary, in acidity and sweetness and the ratio of solids to liquid, according to the seed variety used, the time of year they are harvested, the soil in which they are grown, and the weather during the growing season. Unless all those variables are tightly controlled, one batch of ketchup can end up too watery and another can be too strong. Or try one of the numerous private-label brands that make up the bottom of the ketchup market and pay attention to the spice mix; you may well find yourself conscious of the clove note or overwhelmed by a hit of garlic. Generic colas and ketchups have what Moskowitz calls a hook--a sensory attribute that you can single out, and ultimately tire of.

  The tasting began with a plastic spoon. Upon consideration, it was decided that the analysis would be helped if the ketchups were tasted on French fries, so a batch of fries was cooked up, and distributed around the table. Each tester, according to protocol, took the fries one by one, dipped them into the cup--all the way, right to the bottom--bit off the portion covered in ketchup, and then contemplated the evidence of their senses. For Heinz, the critical flavor components--vinegar, salt, tomato I.D. (over-all tomato-ness), sweet, and bitter--were judged to be present in roughly equal concentrations, and those elements, in turn, were judged to be well blended. The World's Best, though, "had a completely different view, a different profile, from the Heinz," Chambers said. It had a much stronger hit of sweet aromatics--4.0 to 2.5--and outstripped Heinz on tomato I.D. by a resounding 9 to 5.5. But there was less salt, and no discernible vinegar. "The other comment from the panel was that these elements were really not blended at all," Chambers went on. "The World's Best product had really low amplitude." According to Joyce Buchholz, one of the panelists, when the group judged aftertaste, "it seemed like a certain flavor would hang over longer in the case of World's Best--that cooked-tomatoey flavor."

  But what was Jim Wigon to do? To compete against Heinz, he had to try something dramatic, like substituting maple syrup for corn syrup, ramping up the tomato solids. That made for an unusual and daring flavor. World's Best Dill ketchup on fried catfish, for instance, is a marvellous thing. But it also meant that his ketchup wasn't as sensorily complete as Heinz, and he was paying a heavy price in amplitude. "Our conclusion was mainly this," Buchholz said. "We felt that World's Best seemed to be more like a sauce." She was trying to be helpful.

  There is an exception, then, to the Moskowitz rule. Today there are thirty-six varieties of Ragú spaghetti sauce, under six rubrics--Old World Style, Chunky Garden Style, Robusto, Light, Cheese Creations, and Rich & Meaty--which means that there is very nearly an optimal spaghetti sauce for every man, woman, and child in America. Measured against the monotony that confronted Howard Moskowitz twenty years ago, this is progress. Happiness, in one sense, is a function of how closely our world conforms to the infinite variety of human preference. But that makes it easy to forget that sometimes happiness can be found in having what we've always had and everyone else is having. "Back in the seventies, someone else--I think it was Ragú--tried to do an 'Italian'-style ketchup," Moskowitz said. "They failed miserably." It was a conundrum: what was true about a yellow condiment that went on hot dogs was not true about a tomato condiment that went on hamburgers, and what was true about tomato sauce when you added visible solids and put it in a jar was somehow not true about tomato sauce when you added vinegar and sugar and put it in a bottle. Moskowitz shrugged. "I guess ketchup is ketchup."

  Fifty years ago, the mall was born. America would never be the same.

  1.

  Victor Gruen was short, stout, and unstoppable, with a wild head of hair and eyebrows like unpruned hedgerows. According to a profile in Fortune (and people loved to profile Victor Gruen), he was a "torrential talker with eyes as bright as mica and a mind as fast as mercury." In the office, he was famous for keeping two or three secretaries working full time, as he moved from one to the next, dictating non-stop in his thick Viennese accent. He grew up in the well-to-do world of prewar Jewish Vienna, studying architecture at the Vienna Academy of Fine Arts--the same school that, a few years previously, had turned down a fledgling artist named Adolf Hitler. At night, he performed satirical cabaret theatre in smoke-filled cafés. He emigrated in 1938, the same week as Freud, when one of his theatre friends dressed up as a Nazi Storm Trooper and drove him and his wife to the airport. They took the first plane they could catch to Zurich, made their way to England, and then boarded the S.S. Statendam for New York, landing, as Gruen later remembered, "with an architect's degree, eight dollars, and no English." On the voyage over, he was told by an American to set his sights high--"don't try to wash dishes or be a waiter, we have millions of them"--but Gruen scarcely needed the advice. He got together with some other German émigrés and formed the Refugee Artists Group. George S. Kaufman's wife was their biggest fan. Richard Rodgers and Al Jolson gave them money. Irving Berlin helped them with their music. Gruen got on the train to Princeton and came back with a letter of recommendation from Albert Einstein. By the summer of 1939, the group was on Broadway, playing eleven weeks at the Music Box. Then, as M. Jeffrey Hardwick recounts in "Mall Maker," his new biography of Gruen, one day he went for a walk in midtown and ran into an old friend from Vienna, Ludwig Lederer, who wanted to open a leather-goods boutique on Fifth Avenue. Victor agreed to design it, and the result was a revolutionary storefront, with a kind of mini-arcade in the entranceway, roughly seventeen by fifteen feet: six exquisite glass cases, spotlights, and faux marble, with green corrugated glass on the ceiling. It was a "customer trap." This was a brand-new idea in American retail design, particularly on Fifth Avenue, where all the carriage-trade storefronts were flush with the street. The critics raved. Gruen designed Ciro's on Fifth Avenue, Steckler's on Broadway, Paris Decorators on the Bronx Concourse, and eleven branches of the California clothing chain Grayson's. In the early fifties, he designed an outdoor shopping center called Northland outside Detroit for J. L. Hudson's. It covered a hundred and sixty-three acres and had nearly ten thousand parking spaces. This was little more than a decade and a half since he stepped off the boat, and when Gruen watched the bulldozers break ground he turned to his partner and said, "My God but we've got a lot of nerve."

  But Gruen's most famous creation was his next project, in the town of Edina, just outside Minneapolis. He began work on it almost exactly fifty years ago. It was called Southdale. It cost twenty million dollars, and had seventy-two stores and two anchor department-store tenants, Donaldson's and Dayton's. Until then, most shopping centers had been what architects like to call "extroverted," meaning that store windows and entrances faced both the parking area and the interior pedestrian walkways. Southdale was introverted: the exterior walls were blank, and all the activity was focussed on the inside. Suburban shopping centers had always been in the open, with stores connected by outdoor passageways. Gruen had the idea of putting the whole complex under one roof, with air-conditioning for the summer and heat for the winter. Almost every other major shopping center had been built on a single level, which made for punishingly long walks. Gruen put stores on two levels, connected by escalators and fed by two-tiered parking. In the middle he put a kind of town square, a "garden court" under a skylight, with a fishpond, enormous sculpted trees, a twenty-one-foot cage filled with bright-colored birds, balconies with hanging plants, and a café. The result, Hardwick writes, was a sensation:

  Journalists from all of the country's top magazines came for the Minneapolis shopping center's opening. Life, Fortune, Time, Women's Wear Daily, the New York Times, Business Week and Newsweek all covered the event. The national and local press wore out superlatives attempting to capture the feeling of Southdale. "The Splashiest Center in the U.S.," Life sang. The glossy weekly praised the incongruous combination of a "goldfish pond, birds, art and 10 acres of stores all . . . under one Minnesota roof." A "pleasure-dome-with-parking," Time cheered. One journalist announced that overnight Southdale had become an integral "part of the American Way."

  Southdale Mall still exists. It is situated off I-494, south of downtown Minneapolis and west of the airport--a big concrete box in a sea of parking. The anchor tenants are now J. C. Penney and Marshall Field's, and there is an Ann Taylor and a Sunglass Hut and a Foot Locker and just about every other chain store that you've ever seen in a mall. It does not seem like a historic building, which is precisely why it is one. Fifty years ago, Victor Gruen designed a fully enclosed, introverted, multitiered, double-anchor-tenant shopping complex with a garden court under a skylight--and today virtually every regional shopping center in America is a fully enclosed, introverted, multitiered, double-anchor-tenant complex with a garden court under a skylight. Victor Gruen didn't design a building; he designed an archetype. For a decade, he gave speeches about it and wrote books and met with one developer after another and waved his hands in the air excitedly, and over the past half century that archetype has been reproduced so faithfully on so many thousands of occasions that today virtually every suburban American goes shopping or wanders around or hangs out in a Southdale facsimile at least once or twice a month. Victor Gruen may well have been the most influential architect of the twentieth century. He invented the mall.

  2.

  One of Gruen's contemporaries in the early days of the mall was a man named A. Alfred Taubman, who also started out as a store designer. In 1950, when Taubman was still in his twenties, he borrowed five thousand dollars, founded his own development firm, and, three years later, put up a twenty-six-store open-air shopping center in Flint, Michigan. A few years after that, inspired by Gruen, he matched Southdale with an enclosed mall of his own in Hayward, California, and over the next half century Taubman put together what is widely considered one of the finest collections of shopping malls in the world. The average American mall has annual sales of around three hundred and forty dollars per square foot. Taubman's malls average sales close to five hundred dollars per square foot. If Victor Gruen invented the mall, Alfred Taubman perfected it. One day not long ago, I asked Taubman to take me to one of his shopping centers and explain whatever it was that first drew people like him and Victor Gruen to the enclosed mall fifty years ago.

  Taubman, who just turned eighty, is an imposing man with a wry sense of humor who wears bespoke three-piece suits and peers down at the world through half-closed eyes. He is the sort of old-fashioned man who refers to merchandise as "goods" and apparel as "soft goods" and who can glance at a couture gown from halfway across the room and come within a few dollars of its price. Recently, Taubman's fortunes took a turn for the worse when Sotheby's, which he bought in 1983, ran afoul of antitrust laws and he ended up serving a year-long prison sentence on price-fixing charges. Then his company had to fend off a hostile takeover bid led by Taubman's archrival, the Indianapolis-based Simon Property Group. But, on a recent trip from his Manhattan offices to the Mall at Short Hills, a half hour's drive away in New Jersey, Taubman was in high spirits. Short Hills holds a special place in his heart. "When I bought that property in 1980, there were only seven stores that were still in business," Taubman said, sitting in the back of his limousine. "It was a disaster. It was done by a large commercial architect who didn't understand what he was doing." Turning it around took four renovations. Bonwit Teller and B. Altman--two of the original anchor tenants--were replaced by Neiman Marcus, Saks, Nordstrom, and Macy's. Today, Short Hills has average sales of nearly eight hundred dollars per square foot; according to the Greenberg Group, it is the third-most-successful covered mall in the country. When Taubman and I approached the mall, the first thing he did was peer out at the parking garage. It was just before noon on a rainy Thursday. The garage was almost full. "Look at all the cars!" he said, happily.

  Taubman directed the driver to stop in front of Bloomingdale's, on the mall's north side. He walked through the short access corridor, paused, and pointed at the floor. It was made up of small stone tiles. "People used to use monolithic terrazzo in centers," he said. "But it cracked easily and was difficult to repair. Women, especially, tend to have thin soles. We found that they are very sensitive to the surface, and when they get on one of those terrazzo floors it's like a skating rink. They like to walk on the joints. The only direct contact you have with the building is through the floor. How you feel about it is very important." Then he looked up and pointed to the second floor of the mall. The handrails were transparent. "We don't want anything to disrupt the view," Taubman said. If you're walking on the first level, he explained, you have to be able, at all times, to have an unimpeded line of sight not just to the stores in front of you but also to the stores on the second level. The idea is to overcome what Taubman likes to call "threshold resistance," which is the physical and psychological barrier that stands between a shopper and the inside of a store. "You buy something because it is available and attractive," Taubman said. "You can't have any obstacles. The goods have to be all there." When Taubman was designing stores in Detroit, in the nineteen-forties, he realized that even the best arcades, like those Gruen designed on Fifth Avenue, weren't nearly as good at overcoming threshold resistance as an enclosed mall, because with an arcade you still had to get the customer through the door. "People assume we enclose the space because of air-conditioning and the weather, and that's important," Taubman said. "But the main reason is that it allows us to open up the store to the customer."

  Taubman began making his way down the mall. He likes the main corridors of his shopping malls to be no more than a thousand feet long--the equivalent of about three city blocks--because he believes that three blocks is about as far as peak shopping interest can be sustained, and as he walked he explained the logic behind what retailers like to call "adjacencies." There was Brooks Brothers, where a man might buy a six-hundred-dollar suit, right across from Johnston & Murphy, where the same man might buy a two-hundred-dollar pair of shoes. The Bose electronics store was next to Brookstone and across from the Sharper Image, so if you got excited about some electronic gizmo in one store you were steps away from getting even more excited by similar gizmos in two other stores. Gucci, Versace, and Chanel were placed near the highest-end department stores, Neiman Marcus and Saks. "Lots of developers just rent out their space like you'd cut a salami," Taubman explained. "They rent the space based on whether it fits, not necessarily on whether it makes any sense." Taubman shook his head. He gestured to a Legal Sea Foods restaurant, where he wanted to stop for lunch. It was off the main mall, at the far end of a short entry hallway, and it was down there for a reason. A woman about to spend five thousand dollars at Versace doesn't want to catch a whiff of sautéed grouper as she tries on an evening gown. More to the point, people eat at Legal Sea Foods only during the lunch and dinner hours--which means that if you put the restaurant in the thick of things, you'd have a dead spot in the middle of your mall for most of the day.

  At the far end of the mall is Neiman Marcus, and Taubman wandered in, exclaimed over a tray of men's ties, and delicately examined the stitching in the women's evening gowns in the designer department. "Hi, my name is Alfred Taubman--I'm your landlord," he said, bending over to greet a somewhat startled sales assistant. Taubman plainly loves Neiman Marcus, and with good reason: well-run department stores are the engines of malls. They have powerful brand names, advertise heavily, and carry extensive cosmetics lines (shopping malls are, at bottom, delivery systems for lipstick)--all of which generate enormous shopping traffic. The point of a mall--the reason so many stores are clustered together in one building--is to allow smaller, less powerful retailers to share in that traffic. A shopping center is an exercise in coöperative capitalism. It is considered successful (and the mall owner makes the most money) when the maximum number of department-store customers are lured into the mall.

  Why, for instance, are so many malls, like Short Hills, two stories? Back at his office, on Fifth Avenue, Taubman took a piece of paper and drew a simple cross-section of a two-story building. "You have two levels, all right? You have an escalator here and an escalator here." He drew escalators at both ends of the floors. "The customer comes into the mall, walks down the hall, gets on the escalator up to the second level. Goes back along the second floor, down the escalator, and now she's back where she started from. She's seen every store in the center, right? Now you put on a third level. Is there any reason to go up there? No." A full circuit of a two-level mall takes you back to the beginning. It encourages you to circulate through the whole building. A full circuit of a three-level mall leaves you at the opposite end of the mall from your car. Taubman was the first to put a ring road around the mall--which he did at his mall in Hayward--for the same reason: if you want to get shoppers into every part of the building, they should be distributed to as many different entry points as possible. At Short Hills--and at most Taubman malls--the ring road rises gently as you drive around the building, so at least half of the mall entrances are on the second floor. "We put fifteen per cent more parking on the upper level than on the first level, because people flow like water," Taubman said. "They go down much easier than they go up. And we put our vertical transportation--the escalators--on the ends, so shoppers have to make the full loop."

  This is the insight that drove the enthusiasm for the mall fifty years ago--that by putting everything under one roof, the retailer and the developer gained, for the first time, complete control over their environment. Taubman fusses about lighting, for instance: he believes that next to the skylights you have to put tiny lights that will go on when the natural light fades, so the dusk doesn't send an unwelcome signal to shoppers that it is time to go home; and you have to recess the skylights so that sunlight never reflects off the storefront glass, obscuring merchandise. Can you optimize lighting in a traditional downtown? The same goes for parking. Suppose that there was a downtown where the biggest draw was a major department store. Ideally, you ought to put the garage across the street and two blocks away, so shoppers, on their way from their cars and to their destination, would pass by the stores in between--dramatically increasing the traffic for all the intervening merchants. But in a downtown, obviously, you can't put a parking garage just anywhere, and even if you could, you couldn't insure that the stores in that high-traffic corridor had the optimal adjacencies, or that the sidewalk would feel right under the thin soles of women's shoes. And because the stores are arrayed along a road with cars on it, you don't really have a mall where customers can wander from side to side. And what happens when they get to the department store? It's four or five floors high, and shoppers are like water, remember: they flow downhill. So it's going to be hard to generate traffic on the upper levels. There is a tendency in America to wax nostalgic for the traditional downtown, but those who first believed in the mall--and understood its potential--found it hard to look at the old downtown with anything but frustration. "In Detroit, prior to the nineteen-fifties, the large department stores, like Hudson's, controlled everything, like zoning," Taubman said. "They were generous to local politicians. They had enormous clout, and that's why when Sears wanted to locate in downtown Detroit they were told they couldn't. So Sears put a store in Highland Park and on Oakland Boulevard, and built a store on the East Side, and it was able to get some other stores to come with them, and before long there were three mini-downtowns in the suburbs. They used to call them hot spots." This happened more than half a century ago. But it was clear that Taubman has never quite got over how irrational the world outside the mall can be: downtown Detroit chased away traffic.

  3.

  Planning and control were of even greater importance to Gruen. He was, after all, a socialist--and he was Viennese. In the middle of the nineteenth century, Vienna had demolished the walls and other fortifications that had ringed the city since medieval times, and in the resulting open space built the Ringstrasse--a meticulously articulated addition to the old city. Architects and urban planners solemnly outlined their ideas. There were apartment blocks, and public squares and government buildings, and shopping arcades, each executed in what was thought to be the historically appropriate style. The Rathaus was done in high Gothic; the Burgtheatre in early Baroque; the University was pure Renaissance; and the Parliament was classical Greek. It was all part of the official Viennese response to the populist uprisings of 1848: if Austria was to remake itself as a liberal democracy, Vienna had to be physically remade along democratic lines. The Parliament now faced directly onto the street. The walls that separated the élite of Vienna from the unwashed in the suburbs were torn down. And, most important, a ring road, or Ringstrasse--a grand mall--was built around the city, with wide sidewalks and expansive urban views, where Viennese of all backgrounds could mingle freely on their Sunday-afternoon stroll. To the Viennese reformers of the time, the quality of civic life was a function of the quality of the built environment, and Gruen thought that principle applied just as clearly to the American suburbs.

  Not long after Southdale was built, Gruen gave the keynote address at a Progressive Architecture awards ceremony in New Orleans, and he took the occasion to lash out at American suburbia, whose roads, he said, were "avenues of horror," "flanked by the greatest collection of vulgarity--billboards, motels, gas stations, shanties, car lots, miscellaneous industrial equipment, hot dog stands, wayside stores--ever collected by mankind." American suburbia was chaos, and the only solution to chaos was planning. When Gruen first drew up the plans for Southdale, he placed the shopping center at the heart of a tidy four-hundred-and-sixty-three-acre development, complete with apartment buildings, houses, schools, a medical center, a park, and a lake. Southdale was not a suburban alternative to downtown Minneapolis. It was the Minneapolis downtown you would get if you started over and corrected all the mistakes that were made the first time around. "There is nothing suburban about Southdale except its location," Architectural Record stated when it reviewed Gruen's new creation. It is an imaginative distillation of what makes downtown magnetic: the variety, the individuality, the lights, the color, even the crowds--for Southdale's pedestrian-scale spaces insure a busyness and a bustle.

  Added to this essence of existing downtowns are all kinds of things that ought to be there if downtown weren't so noisy and dirty and chaotic--sidewalk cafés, art, islands of planting, pretty paving. Other shopping centers, however pleasant, seem provincial in contrast with the real thing--the city downtown. But in Minneapolis, it is the downtown that appears pokey and provincial in contrast with Southdale's metropolitan character.

  One person who wasn't dazzled by Southdale was Frank Lloyd Wright. "What is this, a railroad station or a bus station?" he asked, when he came for a tour. "You've got a garden court that has all the evils of the village street and none of its charm." But no one much listened to Frank Lloyd Wright. When it came to malls, it was only Victor Gruen's vision that mattered.

  4.

  Victor Gruen's grand plan for Southdale was never realized. There were no parks or schools or apartment buildings--just that big box in a sea of parking. Nor, with a few exceptions, did anyone else plan the shopping mall as the centerpiece of a tidy, dense, multi-use development. Gruen was right about the transformative effect of the mall on retailing. But in thinking that he could reënact the lesson of the Ringstrasse in American suburbia he was wrong, and the reason was that in the mid-nineteen-fifties the economics of mall-building suddenly changed.

  At the time of Southdale, big shopping centers were a delicate commercial proposition. One of the first big postwar shopping centers was Shopper's World, in Framingham, Massachusetts, designed by an old business partner of Gruen's from his Fifth Avenue storefront days. Shopper's World was an open center covering seventy acres, with forty-four stores, six thousand parking spaces, and a two-hundred-and-fifty-thousand-square-foot Jordan Marsh department store--and within two years of its opening, in 1951, the developer was bankrupt. A big shopping center simply cost too much money, and it took too long for a developer to make that money back. Gruen thought of the mall as the centerpiece of a carefully planned new downtown because he felt that that was the only way malls would ever get built: you planned because you had to plan. Then, in the mid-fifties, something happened that turned the dismal economics of the mall upside down: Congress made a radical change in the tax rules governing depreciation.

  Under tax law, if you build an office building, or buy a piece of machinery for your factory, or make any capital purchase for your business, that investment is assumed to deteriorate and lose some part of its value from wear and tear every year. As a result, a business is allowed to set aside some of its income, tax-free, to pay for the eventual cost of replacing capital investments. For tax purposes, in the early fifties the useful life of a building was held to be forty years, so a developer could deduct one-fortieth of the value of his building from his income every year. A new forty-million-dollar mall, then, had an annual depreciation deduction of a million dollars. What Congress did in 1954, in an attempt to stimulate investment in manufacturing, was to "accelerate" the depreciation process for new construction. Now, using this and other tax loopholes, a mall developer could recoup the cost of his investment in a fraction of the time. As the historian Thomas Hanchett argues, in a groundbreaking paper in The American Historical Review, the result was a "bonanza" for developers. In the first few years after a shopping center was built, the depreciation deductions were so large that the mall was almost certainly losing money, at least on paper--which brought with it enormous tax benefits. For instance, in a front-page article in 1961 on the effect of the depreciation changes, the Wall Street Journal described the finances of a real-estate investment company called Kratter Corp. Kratter's revenue from its real-estate operations in 1960 was $9,997,043. Deductions from operating expenses and mortgage interest came to $4,836,671, which left a healthy income of $5.16 million. Then came depreciation, which amounted to $6.9 million, so Kratter's healthy profit was magically turned into a "loss" of $1.76 million. Imagine that you were one of five investors in Kratter. The company's policy was to distribute nearly all of its pre-depreciation revenue to its investors, so your share of those earnings would be roughly a million dollars. Ordinarily, you'd pay a good chunk of that in taxes. But that million dollars wasn't income. After depreciation, Kratter didn't make any money. That million dollars was "return on capital," and it was tax-free.
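
  To make the arithmetic concrete, here is a minimal sketch in Python that works only from the Kratter figures quoted above; the five-investor split is the article's own example, and the tax treatment is deliberately simplified to the single point being made (a paper loss turns the payout into a tax-free return of capital).

  # Sketch of the depreciation arithmetic described above, using the
  # Kratter Corp. figures quoted from the Wall Street Journal. Nothing
  # here is new data; the five-investor split is the article's example.

  revenue = 9_997_043        # 1960 revenue from real-estate operations
  expenses = 4_836_671       # operating expenses and mortgage interest
  depreciation = 6_900_000   # accelerated depreciation (rounded, as above)

  pre_depreciation_income = revenue - expenses              # about $5.16 million
  taxable_income = pre_depreciation_income - depreciation   # a paper "loss"

  investors = 5
  payout_per_investor = pre_depreciation_income / investors

  print(f"Pre-depreciation income:   ${pre_depreciation_income:,.0f}")
  print(f"Income after depreciation: ${taxable_income:,.0f}")
  print(f"Payout per investor:       ${payout_per_investor:,.0f}")
  # Because the company shows a loss on paper, the payout is treated as a
  # tax-free return of capital rather than as taxable income.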

  Suddenly it was possible to make much more money investing in things like shopping centers than buying stocks, so money poured into real-estate investment companies. Prices rose dramatically. Investors were putting up buildings, taking out as much money from them as possible using accelerated depreciation, then selling them four or five years later at a huge profit--whereupon they built an even bigger building, because the more expensive the building was, the more the depreciation allowance was worth.

  Under the circumstances, who cared whether the shopping center made economic sense for the venders? Shopping centers and strip malls became what urban planners call "catalytic," meaning that developers weren't building them to serve existing suburban communities; they were building them on the fringes of cities, beyond residential developments, where the land was cheapest. Hanchett points out, in fact, that in many cases the growth of malls appears to follow no demographic logic at all. Cortland, New York, for instance, barely grew at all between 1950 and 1970. Yet in those two decades Cortland gained six new shopping plazas, including the four-hundred-thousand-square-foot enclosed Cortlandville Mall. In the same twenty-year span, the Scranton area actually shrank by seventy-three thousand people while gaining thirty-one shopping centers, including three enclosed malls. In 1953, before accelerated depreciation was put in place, one major regional shopping center was built in the United States. Three years later, after the law was passed, that number was twenty-five. In 1953, new shopping-center construction of all kinds totalled six million square feet. By 1956, that figure had increased five hundred per cent. This was also the era when fast-food restaurants and Howard Johnsons and Holiday Inns and muffler shops and convenience stores began to multiply up and down the highways and boulevards of the American suburbs--and as these developments grew, others followed to share in the increased customer traffic. Malls led to malls, and in turn those malls led to the big stand-alone retailers like Wal-Mart and Target, and then to the "power centers" of three or four big-box retailers, like Circuit City, Staples, and Barnes & Noble. Victor Gruen intended Southdale to be a dense, self-contained downtown. Today, fifteen minutes down an "avenue of horror" from Southdale is the Mall of America, the largest mall in the country, with five hundred and twenty stores, fifty restaurants, and twelve thousand parking spaces--and one can easily imagine that one day it, too, may give way to something newer and bigger.

  5.

  Once, in the mid-fifties, Victor Gruen sat down with a writer from The New Yorker's Talk of the Town to give his thoughts on how to save New York City. The interview took place in Gruen's stylish offices on West Twelfth Street, in an old Stanford White building, and one can only imagine the reporter, rapt, as Gruen held forth, eyebrows bristling. First, Gruen said, Manhattan had to get rid of its warehouses and its light manufacturing. Then, all the surface traffic in midtown--the taxis, buses, and trucks--had to be directed into underground tunnels. He wanted to put superhighways around the perimeter of the island, buttressed by huge double-decker parking garages. The jumble of tenements and town houses and apartment blocks that make up Manhattan would be replaced by neat rows of hundred-and-fifty-story residential towers, arrayed along a ribbon of gardens, parks, walkways, theatres, and cafés.

  Mr. G. lowered his brows and glared at us. "You are troubled by all those tunnels, are you not?" he inquired. "You wonder whether there is room for them in the present underground jungle of pipes and wires. Did you never think how absurd it is to bury beneath tons of solid pavement equipment that is bound to go on the blink from time to time?" He leaped from his chair and thrust an imaginary pneumatic drill against his polished study floor. "Rat-a-tat-tat!" he exclaimed. "Night and day! Tear up the streets! Then pave them! Then tear 'em up again!" Flinging aside the imaginary drill, he threw himself back in his chair. "In my New York of the future, all pipes and wires will be strung along the upper sides of those tunnels, above a catwalk, accessible to engineers and painted brilliant colors to delight rather than appall the eye."

  Postwar America was an intellectually insecure place, and there was something intoxicating about Gruen's sophistication and confidence. That was what took him, so dramatically, from standing at New York Harbor with eight dollars in his pocket to Broadway, to Fifth Avenue, and to the heights of Northland and Southdale. He was a European intellectual, an émigré, and, in the popular mind, the European émigré represented vision, the gift of seeing something grand in the banality of postwar American life. When the European visionary confronted a drab and congested urban landscape, he didn't tinker and equivocate; he levelled warehouses and buried roadways and came up with a thrilling plan for making things right. "The chief means of travel will be walking," Gruen said, of his reimagined metropolis. "Nothing like walking for peace of mind." At Northland, he said, thousands of people would show up, even when the stores were closed, just to walk around. It was exactly like Sunday on the Ringstrasse. With the building of the mall, Old World Europe had come to suburban Detroit.

  What Gruen had, as well, was an unshakable faith in the American marketplace. Malls teach us, he once said, that "it's the merchants who will save our urban civilization. 'Planning' isn't a dirty word to them; good planning means good business." He went on, "Sometimes self-interest has remarkable spiritual consequences." Gruen needed to believe this, as did so many European intellectuals from that period, dubbed by the historian Daniel Horowitz "celebratory émigrés." They had fled a place of chaos and anxiety, and in American consumer culture they sought a bulwark against the madness across the ocean. They wanted to find in the jumble of the American marketplace something as grand as the Vienna they had lost--the place where the unconscious was meticulously dissected by Dr. Freud on Berggasse, and where shrines to European civilization--to the Gothic, the Baroque, the Renaissance, and the ancient Greek traditions--were erected on the Ringstrasse. To Americans, nothing was more flattering than this. Who didn't want to believe that the act of levelling warehouses and burying roadways had spiritual consequences? But it was, in the end, too good to be true. This wasn't the way America worked at all.

  A few months ago, Alfred Taubman gave a speech to a real-estate trade association in Detroit, about the prospects for the city's downtown, and one of the things he talked about was Victor Gruen's Northland. It was simply too big, Taubman said. Hudson's, the Northland anchor tenant, already had a flagship store in downtown Detroit. So why did Gruen build a six-hundred-thousand-square-foot satellite at Northland, just a twenty-minute drive away? Satellites were best at a hundred and fifty thousand to two hundred thousand square feet. But at six hundred thousand square feet they were large enough to carry every merchandise line that the flagship store carried, which meant no one had any reason to make the trek to the flagship anymore. Victor Gruen said the lesson of Northland was that the merchants would save urban civilization. He didn't appreciate that it made a lot more sense, for his client, to save civilization at a hundred and fifty thousand square feet than at six hundred thousand square feet. The lesson of America was that the grandest of visions could be derailed by the most banal of details, like the size of the retail footprint, or whether Congress set the depreciation allowance at forty years or twenty years.

  When, late in life, Gruen came to realize this, it was a powerfully disillusioning experience. He revisited one of his old shopping centers, and saw all the sprawling development around it, and pronounced himself in "severe emotional shock." Malls, he said, had been disfigured by "the ugliness and discomfort of the land-wasting seas of parking" around them. Developers were interested only in profit. "I refuse to pay alimony for those bastard developments," he said in a speech in London, in 1978. He turned away from his adopted country. He had fixed up a country house outside of Vienna, and soon he moved back home for good. But what did he find when he got there? Just south of old Vienna, a mall had been built--in his anguished words, a "gigantic shopping machine." It was putting the beloved independent shopkeepers of Vienna out of business. It was crushing the life of his city. He was devastated. Victor Gruen invented the shopping mall in order to make America more like Vienna. He ended up making Vienna more like America.

  How the S.U.V. ran over automotive safety.

  1.

  In the summer of 1996, the Ford Motor Company began building the Expedition, its new, full-sized S.U.V., at the Michigan Truck Plant, in the Detroit suburb of Wayne. The Expedition was essentially the F-150 pickup truck with an extra set of doors and two more rows of seats--and the fact that it was a truck was critical. Cars have to meet stringent fuel-efficiency regulations. Trucks don't. The handling and suspension and braking of cars have to be built to the demanding standards of drivers and passengers. Trucks only have to handle like, well, trucks. Cars are built with what is called unit-body construction. To be light enough to meet fuel standards and safe enough to meet safety standards, they have expensive and elaborately engineered steel skeletons, with built-in crumple zones to absorb the impact of a crash. Making a truck is a lot more rudimentary. You build a rectangular steel frame. The engine gets bolted to the front. The seats get bolted to the middle. The body gets lowered over the top. The result is heavy and rigid and not particularly safe. But it's an awfully inexpensive way to build an automobile. Ford had planned to sell the Expedition for thirty-six thousand dollars, and its best estimate was that it could build one for twenty-four thousand--which, in the automotive industry, is a terrifically high profit margin. Sales, the company predicted, weren't going to be huge. After all, how many Americans could reasonably be expected to pay a twelve-thousand-dollar premium for what was essentially a dressed-up truck? But Ford executives decided that the Expedition would be a highly profitable niche product. They were half right. The "highly profitable" part turned out to be true. Yet, almost from the moment Ford's big new S.U.V.s rolled off the assembly line in Wayne, there was nothing "niche" about the Expedition.

  Ford had intended to split the assembly line at the Michigan Truck Plant between the Expedition and the Ford F-150 pickup. But, when the first flood of orders started coming in for the Expedition, the factory was entirely given over to S.U.V.s. The orders kept mounting. Assembly-line workers were put on sixty- and seventy-hour weeks. Another night shift was added. The plant was now running twenty-four hours a day, six days a week. Ford executives decided to build a luxury version of the Expedition, the Lincoln Navigator. They bolted a new grille on the Expedition, changed a few body panels, added some sound insulation, took a deep breath, and charged forty-five thousand dollars--and soon Navigators were flying out the door nearly as fast as Expeditions. Before long, the Michigan Truck Plant was the most profitable of Ford's fifty-three assembly plants. By the late nineteen-nineties, it had become the most profitable factory of any industry in the world. In 1998, the Michigan Truck Plant grossed eleven billion dollars, almost as much as McDonald's made that year. Profits were $3.7 billion. Some factory workers, with overtime, were making two hundred thousand dollars a year. The demand for Expeditions and Navigators was so insatiable that even when a blizzard hit the Detroit region in January of 1999--burying the city in snow, paralyzing the airport, and stranding hundreds of cars on the freeway--Ford officials got on their radios and commandeered parts bound for other factories so that the Michigan Truck Plant assembly line wouldn't slow for a moment. The factory that had begun as just another assembly plant had become the company's crown jewel.

  In the history of the automotive industry, few things have been quite as unexpected as the rise of the S.U.V. Detroit is a town of engineers, and engineers like to believe that there is some connection between the success of a vehicle and its technical merits. But the S.U.V. boom was like Apple's bringing back the Macintosh, dressing it up in colorful plastic, and suddenly creating a new market. It made no sense to them. Consumers said they liked four-wheel drive. But the overwhelming majority of consumers don't need four-wheel drive. S.U.V. buyers said they liked the elevated driving position. But when, in focus groups, industry marketers probed further, they heard things that left them rolling their eyes. As Keith Bradsher writes in "High and Mighty"--perhaps the most important book about Detroit since Ralph Nader's "Unsafe at Any Speed"--what consumers said was "If the vehicle is up high, it's easier to see if something is hiding underneath or lurking behind it." Bradsher brilliantly captures the mixture of bafflement and contempt that many auto executives feel toward the customers who buy their S.U.V.s. Fred J. Schaafsma, a top engineer for General Motors, says, "Sport-utility owners tend to be more like 'I wonder how people view me,' and are more willing to trade off flexibility or functionality to get that." According to Bradsher, internal industry market research concluded that S.U.V.s tend to be bought by people who are insecure, vain, self-centered, and self-absorbed, who are frequently nervous about their marriages, and who lack confidence in their driving skills. Ford's S.U.V. designers took their cues from seeing "fashionably dressed women wearing hiking boots or even work boots while walking through expensive malls." Toyota's top marketing executive in the United States, Bradsher writes, loves to tell the story of how at a focus group in Los Angeles "an elegant woman in the group said that she needed her full-sized Lexus LX 470 to drive up over the curb and onto lawns to park at large parties in Beverly Hills." One of Ford's senior marketing executives was even blunter: "The only time those S.U.V.s are going to be off-road is when they miss the driveway at 3 a.m."

  The truth, underneath all the rationalizations, seemed to be that S.U.V. buyers thought of big, heavy vehicles as safe: they found comfort in being surrounded by so much rubber and steel. To the engineers, of course, that didn't make any sense, either: if consumers really wanted something that was big and heavy and comforting, they ought to buy minivans, since minivans, with their unit-body construction, do much better in accidents than S.U.V.s. (In a thirty-five m.p.h. crash test, for instance, the driver of a Cadillac Escalade--the G.M. counterpart to the Lincoln Navigator--has a sixteen-per-cent chance of a life-threatening head injury, a twenty-per-cent chance of a life-threatening chest injury, and a thirty-five-per-cent chance of a leg injury. The same numbers in a Ford Windstar minivan--a vehicle engineered from the ground up, as opposed to simply being bolted onto a pickup-truck frame--are, respectively, two per cent, four per cent, and one per cent.) But this desire for safety wasn't a rational calculation. It was a feeling. Over the past decade, a number of major automakers in America have relied on the services of a French-born cultural anthropologist, G. Clotaire Rapaille, whose speciality is getting beyond the rational--what he calls "cortex"--impressions of consumers and tapping into their deeper, "reptilian" responses. And what Rapaille concluded from countless, intensive sessions with car buyers was that when S.U.V. buyers thought about safety they were thinking about something that reached into their deepest unconscious. "The No. 1 feeling is that everything surrounding you should be round and soft, and should give," Rapaille told me. "There should be air bags everywhere. Then there's this notion that you need to be up high. That's a contradiction, because the people who buy these S.U.V.s know at the cortex level that if you are high there is more chance of a rollover. But at the reptilian level they think that if I am bigger and taller I'm safer. You feel secure because you are higher and dominate and look down. That you can look down is psychologically a very powerful notion. And what was the key element of safety when you were a child? It was that your mother fed you, and there was warm liquid. That's why cupholders are absolutely crucial for safety. If there is a car that has no cupholder, it is not safe. If I can put my coffee there, if I can have my food, if everything is round, if it's soft, and if I'm high, then I feel safe. It's amazing that intelligent, educated women will look at a car and the first thing they will look at is how many cupholders it has." During the design of Chrysler's PT Cruiser, one of the things Rapaille learned was that car buyers felt unsafe when they thought that an outsider could easily see inside their vehicles. So Chrysler made the back window of the PT Cruiser smaller. Of course, making windows smaller--and thereby reducing visibility--makes driving more dangerous, not less so. But that's the puzzle of what has happened to the automobile world: feeling safe has become more important than actually being safe.

  2.

  One day this fall, I visited the automobile-testing center of Consumers Union, the organization that publishes Consumer Reports. It is tucked away in the woods, in south-central Connecticut, on the site of the old Connecticut Speedway. The facility has two skid pads to measure cornering, a long straightaway for braking tests, a meandering "handling" course that winds around the back side of the track, and an accident-avoidance obstacle course made out of a row of orange cones. It is headed by a trim, white-haired Englishman named David Champion, who previously worked as an engineer with Land Rover and with Nissan. On the day of my visit, Champion set aside two vehicles: a silver 2003 Chevrolet TrailBlazer--an enormous five-thousand-pound S.U.V.--and a shiny blue two-seater Porsche Boxster convertible.

  We started with the TrailBlazer. Champion warmed up the Chevrolet with a few quick circuits of the track, and then drove it hard through the twists and turns of the handling course. He sat in the bucket seat with his back straight and his arms almost fully extended, and drove with practiced grace: every movement smooth and relaxed and unhurried. Champion, as an engineer, did not much like the TrailBlazer. "Cheap interior, cheap plastic," he said, batting the dashboard with his hand. "It's a little bit heavy, cumbersome. Quiet. Bit wallowy, side to side. Doesn't feel that secure. Accelerates heavily. Once it gets going, it's got decent power. Brakes feel a bit spongy." He turned onto the straightaway and stopped a few hundred yards from the obstacle course.

  Measuring accident avoidance is a key part of the Consumers Union evaluation. It's a simple setup. The driver has to navigate his vehicle through two rows of cones eight feet wide and sixty feet long. Then he has to steer hard to the left, guiding the vehicle through a gate set off to the side, and immediately swerve hard back to the right, and enter a second sixty-foot corridor of cones that are parallel to the first set. The idea is to see how fast you can drive through the course without knocking over any cones. "It's like you're driving down a road in suburbia," Champion said. "Suddenly, a kid on a bicycle veers out in front of you. You have to do whatever it takes to avoid the kid. But there's a tractor-trailer coming toward you in the other lane, so you've got to swing back into your own lane as quickly as possible. That's the scenario."

  Champion and I put on helmets. He accelerated toward the entrance to the obstacle course. "We do the test without brakes or throttle, so we can just look at handling," Champion said. "I actually take my foot right off the pedals." The car was now moving at forty m.p.h. At that speed, on the smooth tarmac of the raceway, the TrailBlazer was very quiet, and we were seated so high that the road seemed somehow remote. Champion entered the first row of cones. His arms tensed. He jerked the car to the left. The TrailBlazer's tires squealed. I was thrown toward the passenger-side door as the truck's body rolled, then thrown toward Champion as he jerked the TrailBlazer back to the right. My tape recorder went skittering across the cabin. The whole maneuver had taken no more than a few seconds, but it felt as if we had been sailing into a squall. Champion brought the car to a stop. We both looked back: the TrailBlazer had hit the cone at the gate. The kid on the bicycle was probably dead. Champion shook his head. "It's very rubbery. It slides a lot. I'm not getting much communication back from the steering wheel. It feels really ponderous, clumsy. I felt a little bit of tail swing."

  I drove the obstacle course next. I started at the conservative speed of thirty-five m.p.h. I got through cleanly. I tried again, this time at thirty-eight m.p.h., and that small increment of speed made a dramatic difference. I made the first left, avoiding the kid on the bicycle. But, when it came time to swerve back to avoid the hypothetical oncoming eighteen-wheeler, I found that I was wrestling with the car. The protests of the tires were jarring. I stopped, shaken. "It wasn't going where you wanted it to go, was it?" Champion said. "Did you feel the weight pulling you sideways? That's what the extra weight that S.U.V.s have tends to do. It pulls you in the wrong direction." Behind us was a string of toppled cones. Getting the TrailBlazer to travel in a straight line, after that sudden diversion, hadn't been easy. "I think you took out a few pedestrians," Champion said with a faint smile.

  Next up was the Boxster. The top was down. The sun was warm on my forehead. The car was low to the ground; I had the sense that if I dangled my arm out the window my knuckles would scrape on the tarmac. Standing still, the Boxster didn't feel safe: I could have been sitting in a go-cart. But when I ran it through the handling course I felt that I was in perfect control. On the straightaway, I steadied the Boxster at forty-five m.p.h., and ran it through the obstacle course. I could have balanced a teacup on my knee. At fifty m.p.h., I navigated the left and right turns with what seemed like a twitch of the steering wheel. The tires didn't squeal. The car stayed level. I pushed the Porsche up into the mid-fifties. Every cone was untouched. "Walk in the park!" Champion exclaimed as we pulled to a stop.

  Most of us think that S.U.V.s are much safer than sports cars. If you asked the young parents of America whether they would rather strap their infant child in the back seat of the TrailBlazer or the passenger seat of the Boxster, they would choose the TrailBlazer. We feel that way because in the TrailBlazer our chances of surviving a collision with a hypothetical tractor-trailer in the other lane are greater than they are in the Porsche. What we forget, though, is that in the TrailBlazer you're also much more likely to hit the tractor-trailer because you can't get out of the way in time. In the parlance of the automobile world, the TrailBlazer is better at "passive safety." The Boxster is better when it comes to "active safety," which is every bit as important.

  Consider the set of safety statistics compiled by Tom Wenzel, a scientist at Lawrence Berkeley National Laboratory, in California, and Marc Ross, a physicist at the University of Michigan. The numbers are expressed in fatalities per million cars, both for drivers of particular models and for the drivers of the cars they hit. (For example, in the first case, for every million Toyota Avalons on the road, forty Avalon drivers die in car accidents every year, and twenty people die in accidents involving Toyota Avalons.) The numbers below have been rounded:

  Make/Model                 Type         Driver   Other   Total
  Toyota Avalon              large            40      20      60
  Chrysler Town & Country    minivan          31      36      67
  Toyota Camry               mid-size         41      29      70
  Volkswagen Jetta           subcompact       47      23      70
  Ford Windstar              minivan          37      35      72
  Nissan Maxima              mid-size         53      26      79
  Honda Accord               mid-size         54      27      82
  Chevrolet Venture          minivan          51      34      85
  Buick Century              mid-size         70      23      93
  Subaru Legacy/Outback      compact          74      24      98
  Mazda 626                  compact          70      29      99
  Chevrolet Malibu           mid-size         71      34     105
  Chevrolet Suburban         S.U.V.           46      59     105
  Jeep Grand Cherokee        S.U.V.           61      44     106
  Honda Civic                subcompact       84      25     109
  Toyota Corolla             subcompact       81      29     110
  Ford Expedition            S.U.V.           55      57     112
  GMC Jimmy                  S.U.V.           76      39     114
  Ford Taurus                mid-size         78      39     117
  Nissan Altima              compact          72      49     121
  Mercury Marquis            large            80      43     123
  Nissan Sentra              subcompact       95      34     129
  Toyota 4Runner             S.U.V.           94      43     137
  Chevrolet Tahoe            S.U.V.           68      74     141
  Dodge Stratus              mid-size        103      40     143
  Lincoln Town Car           large           100      47     147
  Ford Explorer              S.U.V.           88      60     148
  Pontiac Grand Am           compact         118      39     157
  Toyota Tacoma              pickup          111      59     171
  Chevrolet Cavalier         subcompact      146      41     186
  Dodge Neon                 subcompact      161      39     199
  Pontiac Sunfire            subcompact      158      44     202
  Ford F-Series              pickup          110     128     238

  Are the best performers the biggest and heaviest vehicles on the road? Not at all. Among the safest cars are the midsize imports, like the Toyota Camry and the Honda Accord. Or consider the extraordinary performance of some subcompacts, like the Volkswagen Jetta. Drivers of the tiny Jetta die at a rate of just forty-seven per million, which is in the same range as drivers of the five-thousand-pound Chevrolet Suburban and almost half that of popular S.U.V. models like the Ford Explorer or the GMC Jimmy. In a head-on crash, an Explorer or a Suburban would crush a Jetta or a Camry. But, clearly, the drivers of Camrys and Jettas are finding a way to avoid head-on crashes with Explorers and Suburbans. The benefits of being nimble--of being in an automobile that's capable of staying out of trouble--are in many cases greater than the benefits of being big.
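
  As a minimal sketch of that comparison, the following Python snippet simply copies a handful of (driver, other) pairs from the Wenzel/Ross table above and recomputes the combined rate; because the snippet re-adds the two columns, a sum may differ by a point from the table's rounded total.

  # Combined fatality rates, per million vehicles per year, for a few of
  # the models in the table above. The (driver, other) pairs are copied
  # from the table; no new data is introduced.

  rates_per_million = {
      "Volkswagen Jetta (subcompact)": (47, 23),
      "Toyota Camry (mid-size)": (41, 29),
      "Chevrolet Suburban (S.U.V.)": (46, 59),
      "GMC Jimmy (S.U.V.)": (76, 39),
      "Ford Explorer (S.U.V.)": (88, 60),
  }

  # Sort by combined rate, lowest (safest overall) first.
  for model, (driver, other) in sorted(rates_per_million.items(),
                                       key=lambda item: sum(item[1])):
      print(f"{model:32s} driver {driver:3d}  other {other:3d}  "
            f"total {driver + other:3d}")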

  I had another lesson in active safety at the test track when I got in the TrailBlazer with another Consumers Union engineer, and we did three emergency-stopping tests, taking the Chevrolet up to sixty m.p.h. and then slamming on the brakes. It was not a pleasant exercise. Bringing five thousand pounds of rubber and steel to a sudden stop involves lots of lurching, screeching, and protesting. The first time, the TrailBlazer took 146.2 feet to come to a halt, the second time 151.6 feet, and the third time 153.4 feet. The Boxster can come to a complete stop from sixty m.p.h. in about 124 feet. That's a difference of about two car lengths, and it isn't hard to imagine any number of scenarios where two car lengths could mean the difference between life and death.

  3.

  The S.U.V. boom represents, then, a shift in how we conceive of safety--from active to passive. It's what happens when a larger number of drivers conclude, consciously or otherwise, that the extra thirty feet that the TrailBlazer takes to come to a stop don't really matter, that the tractor-trailer will hit them anyway, and that they are better off treating accidents as inevitable rather than avoidable. "The metric that people use is size," says Stephen Popiel, a vice-president of Millward Brown Goldfarb, in Toronto, one of the leading automotive market-research firms. "The bigger something is, the safer it is. In the consumer's mind, the basic equation is, If I were to take this vehicle and drive it into this brick wall, the more metal there is in front of me the better off I'll be."

  This is a new idea, and one largely confined to North America. In Europe and Japan, people think of a safe car as a nimble car. That's why they build cars like the Jetta and the Camry, which are designed to carry out the driver's wishes as directly and efficiently as possible. In the Jetta, the engine is clearly audible. The steering is light and precise. The brakes are crisp. The wheelbase is short enough that the car picks up the undulations of the road. The car is so small and close to the ground, and so dwarfed by other cars on the road, that an intelligent driver is constantly reminded of the necessity of driving safely and defensively. An S.U.V. embodies the opposite logic. The driver is seated as high and far from the road as possible. The vehicle is designed to overcome its environment, not to respond to it. Even four-wheel drive, seemingly the most beneficial feature of the S.U.V., serves to reinforce this isolation. Having the engine provide power to all four wheels, safety experts point out, does nothing to improve braking, although many S.U.V. owners erroneously believe this to be the case. Nor does the feature necessarily make it safer to turn across a slippery surface: that is largely a function of how much friction is generated by the vehicle's tires. All it really does is improve what engineers call tracking--that is, the ability to accelerate without slipping in perilous conditions or in deep snow or mud. Champion says that one of the occasions when he came closest to death was a snowy day, many years ago, just after he had bought a new Range Rover. "Everyone around me was slipping, and I was thinking, Yeahhh. And I came to a stop sign on a major road, and I was driving probably twice as fast as I should have been, because I could. I had traction. But I also weighed probably twice as much as most cars. And I still had only four brakes and four tires on the road. I slid right across a four-lane road." Four-wheel drive robs the driver of feedback. "The car driver whose wheels spin once or twice while backing out of the driveway knows that the road is slippery," Bradsher writes. "The SUV driver who navigates the driveway and street without difficulty until she tries to brake may not find out that the road is slippery until it is too late." Jettas are safe because they make their drivers feel unsafe. S.U.V.s are unsafe because they make their drivers feel safe. That feeling of safety isn't the solution; it's the problem.

  4.

  Perhaps the most troublesome aspect of S.U.V. culture is its attitude toward risk. "Safety, for most automotive consumers, has to do with the notion that they aren't in complete control," Popiel says. "There are unexpected events that at any moment in time can come out and impact them--an oil patch up ahead, an eighteen-wheeler turning over, something falling down. People feel that the elements of the world out of their control are the ones that are going to cause them distress."

  Of course, those things really aren't outside a driver's control: an alert driver, in the right kind of vehicle, can navigate the oil patch, avoid the truck, and swerve around the thing that's falling down. Traffic-fatality rates vary strongly with driver behavior. Drunks are 7.6 times more likely to die in accidents than non-drinkers. People who wear their seat belts are almost half as likely to die as those who don't buckle up. Forty-year-olds are ten times less likely to get into accidents than sixteen-year-olds. Drivers of minivans, Wenzel and Ross's statistics tell us, die at a fraction of the rate of drivers of pickup trucks. That's clearly because minivans are family cars, and parents with children in the back seat are less likely to get into accidents. Frank McKenna, a safety expert at the University of Reading, in England, has done experiments where he shows drivers a series of videotaped scenarios--a child running out the front door of his house and onto the street, for example, or a car approaching an intersection at too great a speed to stop at the red light--and asks people to press a button the minute they become aware of the potential for an accident. Experienced drivers press the button between half a second and a second faster than new drivers, which, given that car accidents are events measured in milliseconds, is a significant difference. McKenna's work shows that, with experience, we all learn how to exert some degree of control over what might otherwise appear to be uncontrollable events. Any conception of safety that revolves entirely around the vehicle, then, is incomplete. Is the Boxster safer than the TrailBlazer? It depends on who's behind the wheel. In the hands of, say, my very respectable and prudent middle-aged mother, the Boxster is by far the safer car. In my hands, it probably isn't. On the open road, my reaction to the Porsche's extraordinary road manners and the sweet, irresistible wail of its engine would be to drive much faster than I should. (At the end of my day at Consumers Union, I parked the Boxster, and immediately got into my own car to drive home. In my mind, I was still at the wheel of the Boxster. Within twenty minutes, I had a two-hundred-and-seventy-one-dollar speeding ticket.) The trouble with the S.U.V. ascendancy is that it excludes the really critical component of safety: the driver.

  In psychology, there is a concept called learned helplessness, which arose from a series of animal experiments in the nineteen-sixties at the University of Pennsylvania. Dogs were restrained by a harness, so that they couldn't move, and then repeatedly subjected to a series of electrical shocks. Then the same dogs were shocked again, only this time they could easily escape by jumping over a low hurdle. But most of them didn't; they just huddled in the corner, no longer believing that there was anything they could do to influence their own fate. Learned helplessness is now thought to play a role in such phenomena as depression and the failure of battered women to leave their husbands, but one could easily apply it more widely. We live in an age, after all, that is strangely fixated on the idea of helplessness: we're fascinated by hurricanes and terrorist acts and epidemics like SARS--situations in which we feel powerless to affect our own destiny. In fact, the risks posed to life and limb by forces outside our control are dwarfed by the factors we can control. Our fixation with helplessness distorts our perceptions of risk. "When you feel safe, you can be passive," Rapaille says of the fundamental appeal of the S.U.V. "Safe means I can sleep. I can give up control. I can relax. I can take off my shoes. I can listen to music." For years, we've all made fun of the middle-aged man who suddenly trades in his sedate family sedan for a shiny red sports car. That's called a midlife crisis. But at least it involves some degree of engagement with the act of driving. The man who gives up his sedate family sedan for an S.U.V. is saying something far more troubling--that he finds the demands of the road to be overwhelming. Is acting out really worse than giving up?

  5.

  On August 9, 2000, the Bridgestone Firestone tire company announced one of the largest product recalls in American history. Because of mounting concerns about safety, the company said, it was replacing some fourteen million tires that had been used primarily on the Ford Explorer S.U.V. The cost of the recall--and of a follow-up replacement program initiated by Ford a year later--ran into billions of dollars. Millions more were spent by both companies on fighting and settling lawsuits from Explorer owners, who alleged that their tires had come apart and caused their S.U.V.s to roll over. In the fall of that year, senior executives from both companies were called to Capitol Hill, where they were publicly berated. It was the biggest scandal to hit the automobile industry in years. It was also one of the strangest. According to federal records, the number of fatalities resulting from the failure of a Firestone tire on a Ford Explorer S.U.V., as of September, 2001, was two hundred and seventy-one. That sounds like a lot, until you remember that the total number of tires supplied by Firestone to the Explorer from the moment the S.U.V. was introduced by Ford, in 1990, was fourteen million, and that the average life span of a tire is forty-five thousand miles. The allegation against Firestone amounts to the claim that its tires failed, with fatal results, two hundred and seventy-one times in the course of six hundred and thirty billion vehicle miles. Manufacturers usually win prizes for failure rates that low. It's also worth remembering that during that same ten-year span almost half a million Americans died in traffic accidents. In other words, during the nineteen-nineties hundreds of thousands of people were killed on the roads because they drove too fast or ran red lights or drank too much. And, of those, a fair proportion involved people in S.U.V.s who were lulled by their four-wheel drive into driving recklessly on slick roads, who drove aggressively because they felt invulnerable, who disproportionately killed those they hit because they chose to drive trucks with inflexible steel-frame architecture, and who crashed because they couldn't bring their five-thousand-pound vehicles to a halt in time. Yet, out of all those fatalities, regulators, the legal profession, Congress, and the media chose to highlight the roughly 0.05 per cent that could be linked to an alleged defect in the vehicle.
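
  A back-of-the-envelope check of those figures, as a minimal sketch in Python: every input is one of the article's round numbers, the mileage total is simply tires times miles per tire, and the last line divides the two hundred and seventy-one deaths by the decade's traffic toll.

  # Back-of-the-envelope check of the Firestone/Explorer figures above.
  # Every input is a round number quoted in the paragraph; nothing new.

  tires = 14_000_000            # Firestone tires supplied for the Explorer
  miles_per_tire = 45_000       # average life span of a tire, in miles
  tire_failure_deaths = 271     # fatalities linked to tire failure (Sept. 2001)
  decade_road_deaths = 500_000  # "almost half a million" U.S. traffic deaths

  total_tire_miles = tires * miles_per_tire          # ~630 billion miles
  deaths_per_billion_miles = tire_failure_deaths / (total_tire_miles / 1e9)
  share_of_road_deaths = tire_failure_deaths / decade_road_deaths * 100

  print(f"Miles covered by the tires: {total_tire_miles / 1e9:.0f} billion")
  print(f"Fatal failures per billion miles: {deaths_per_billion_miles:.2f}")
  print(f"Share of the decade's traffic deaths: {share_of_road_deaths:.2f} per cent")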

  But should that come as a surprise? In the age of the S.U.V., this is what people worry about when they worry about safety--not risks, however commonplace, involving their own behavior but risks, however rare, involving some unexpected event. The Explorer was big and imposing. It was high above the ground. You could look down on other drivers. You could see if someone was lurking behind or beneath it. You could drive it up on someone's lawn with impunity. Didn't it seem like the safest vehicle in the world?

  The paradoxes of intelligence reform.

  1.

  In the fall of 1973, the Syrian Army began to gather a large number of tanks, artillery batteries, and infantry along its border with Israel. Simultaneously, to the south, the Egyptian Army cancelled all leaves, called up thousands of reservists, and launched a massive military exercise, building roads and preparing anti-aircraft and artillery positions along the Suez Canal. On October 4th, an Israeli aerial reconnaissance mission showed that the Egyptians had moved artillery into offensive positions. That evening, AMAN, the Israeli military intelligence agency, learned that portions of the Soviet fleet near Port Said and Alexandria had set sail, and that the Soviet government had begun airlifting the families of Soviet advisers out of Cairo and Damascus. Then, at four o'clock in the morning on October 6th, Israel's director of military intelligence received an urgent telephone call from one of the country's most trusted intelligence sources. Egypt and Syria, the source said, would attack later that day. Top Israeli officials immediately called a meeting. Was war imminent? The head of AMAN, Major General Eli Zeira, looked over the evidence and said he didn't think so. He was wrong. That afternoon, Syria attacked from the east, overwhelming the thin Israeli defenses in the Golan Heights, and Egypt attacked from the south, bombing Israeli positions and sending eight thousand infantry streaming across the Suez. Despite all the warnings of the previous weeks, Israeli officials were caught by surprise. Why couldn't they connect the dots?

  If you start on the afternoon of October 6th and work backward, the trail of clues pointing to an attack seems obvious; you'd have to conclude that something was badly wrong with the Israeli intelligence service. On the other hand, if you start several years before the Yom Kippur War and work forward, re-creating what people in Israeli intelligence knew in the same order that they knew it, a very different picture emerges. In the fall of 1973, Egypt and Syria certainly looked as if they were preparing to go to war. But, in the Middle East of the time, countries always looked as if they were going to war. In the fall of 1971, for instance, both Egypt's President and its minister of war stated publicly that the hour of battle was approaching. The Egyptian Army was mobilized. Tanks and bridging equipment were sent to the canal. Offensive positions were readied. And nothing happened. In December of 1972, the Egyptians mobilized again. The Army furiously built fortifications along the canal. A reliable source told Israeli intelligence that an attack was imminent. Nothing happened. In the spring of 1973, the President of Egypt told Newsweek that everything in his country "is now being mobilized in earnest for the resumption of battle." Egyptian forces were moved closer to the canal. Extensive fortifications were built along the Suez. Blood donors were rounded up. Civil-defense personnel were mobilized. Blackouts were imposed throughout Egypt. A trusted source told Israeli intelligence that an attack was imminent. It didn't come. Between January and October of 1973, the Egyptian Army mobilized nineteen times without going to war. The Israeli government couldn't mobilize its Army every time its neighbors threatened war. Israel is a small country with a citizen Army. Mobilization was disruptive and expensive, and the Israeli government was acutely aware that if its Army was mobilized and Egypt and Syria weren't serious about war, the very act of mobilization might cause them to become serious about war.

  Nor did the other signs seem remarkable. The fact that the Soviet families had been sent home could have signified nothing more than a falling-out between the Arab states and Moscow. Yes, a trusted source called at four in the morning, with definite word of a late afternoon attack, but his last two attack warnings had been wrong. What's more, the source said that the attack would come at sunset, and an attack so late in the day wouldn't leave enough time for opening air strikes. Israeli intelligence didn't see the pattern of Arab intentions, in other words, because, until Egypt and Syria actually attacked, on the afternoon of October 6, 1973, their intentions didn't form a pattern. They formed a Rorschach blot. What is clear in hindsight is rarely clear before the fact. It's an obvious point, but one that nonetheless bears repeating, particularly when we're in the midst of assigning blame for the surprise attack of September 11th.

  2.

  Of the many postmortems conducted after September 11th, the one that has received the most attention is "The Cell: Inside the 9/11 Plot, and Why the F.B.I. and C.I.A. Failed to Stop It" (Hyperion; $24.95), by John Miller, Michael Stone, and Chris Mitchell. The authors begin their tale with El Sayyid Nosair, the Egyptian who was arrested in November of 1990 for shooting Rabbi Meir Kahane, the founder of the Jewish Defense League, in the ballroom of the Marriott Hotel in midtown Manhattan. Nosair's apartment in New Jersey was searched, and investigators found sixteen boxes of files, including training manuals from the Army Special Warfare School; copies of teletypes that had been routed to the Joint Chiefs of Staff; bombmaking manuals; and maps, annotated in Arabic, of landmarks like the Statue of Liberty, Rockefeller Center, and the World Trade Center. According to "The Cell," Nosair was connected to gunrunners and to Islamic radicals in Brooklyn, who were in turn behind the World Trade Center bombing two and a half years later, which was masterminded by Ramzi Yousef, who then showed up in Manila in 1994, apparently plotting to kill the Pope, crash a plane into the Pentagon or the C.I.A., and bomb as many as twelve transcontinental airliners simultaneously. And who was Yousef associating with in the Philippines? Mohammed Khalifa, Wali Khan Amin Shah, and Ibrahim Munir, all of whom had fought alongside, pledged a loyalty oath to, or worked for a shadowy Saudi Arabian millionaire named Osama bin Laden.

  Miller was a network-television correspondent throughout much of the past decade, and the best parts of "The Cell" recount his own experiences in covering the terrorist story. He is an extraordinary reporter. At the time of the first World Trade Center attack, in February of 1993, he clapped a flashing light on the dashboard of his car and followed the wave of emergency vehicles downtown. (At the bombing site, he was continuously trailed by a knot of reporters - I was one of them - who had concluded that the best way to learn what was going on was to try to overhear his conversations.) Miller became friends with the F.B.I. agents who headed the New York counterterrorist office - Neil Herman and John O'Neill, in particular - and he became as obsessed with Al Qaeda as they were. He was in Yemen, with the F.B.I., after Al Qaeda bombed the U.S.S. Cole. In 1998, at the Marriott in Islamabad, he and his cameraman met someone known to them only as Akhtar, who spirited them across the border into the hills of Afghanistan to interview Osama bin Laden. In "The Cell," the period from 1990 through September 11th becomes a seamless, devastating narrative: the evolution of Al Qaeda. "How did this happen to us?" the book asks in its opening pages. The answer, the authors argue, can be found by following the "thread" connecting Kahane's murder to September 11th. In the events of the past decade, they declare, there is a clear "recurring pattern."

  The same argument is made by Senator Richard Shelby, vice-chairman of the Senate Select Committee on Intelligence, in his investigative report on September 11th, released this past December. The report is a lucid and powerful document, in which Shelby painstakingly points out all the missed or misinterpreted signals pointing to a major terrorist attack. The C.I.A. knew that two suspected Al Qaeda operatives, Khalid al-Mihdhar and Nawaf al-Hazmi, had entered the country, but the C.I.A. didn't tell the F.B.I. or the N.S.C. An F.B.I. agent in Phoenix sent a memo to headquarters that began with the sentence "The purpose of this communication is to advise the bureau and New York of the possibility of a coordinated effort by Osama Bin Laden to send students to the United States to attend civilian aviation universities and colleges." But the F.B.I. never acted on the information, and failed to connect it with reports that terrorists were interested in using airplanes as weapons. The F.B.I. took into custody the suspected terrorist Zacarias Moussaoui, on account of his suspicious behavior at flight school, but was unable to integrate his case into a larger picture of terrorist behavior. "The most fundamental problem . . . is our Intelligence Community's inability to 'connect the dots' available to it before September 11, 2001, about terrorists' interest in attacking symbolic American targets," the Shelby report states. The phrase "connect the dots" appears so often in the report that it becomes a kind of mantra. There was a pattern, as plain as day in retrospect, yet the vaunted American intelligence community simply could not see it.

  None of these postmortems, however, answer the question raised by the Yom Kippur War: Was this pattern obvious before the attack? This question - whether we revise our judgment of events after the fact - is something that psychologists have paid a great deal of attention to. For example, on the eve of Richard Nixon's historic visit to China, the psychologist Baruch Fischhoff asked a group of people to estimate the probability of a series of possible outcomes of the trip. What were the chances that the trip would lead to permanent diplomatic relations between China and the United States? That Nixon would meet with the leader of China, Mao Tse-tung, at least once? That Nixon would call the trip a success? As it turned out, the trip was a diplomatic triumph, and Fischhoff then went back to the same people and asked them to recall what their estimates of the different outcomes of the visit had been. He found that the subjects now, overwhelmingly, "remembered" being more optimistic than they had actually been. If you originally thought that it was unlikely that Nixon would meet with Mao, afterward, when the press was full of accounts of Nixon's meeting with Mao, you'd "remember" that you had thought the chances of a meeting were pretty good. Fischhoff calls this phenomenon "creeping determinism" - the sense that grows on us, in retrospect, that what has happened was actually inevitable - and the chief effect of creeping determinism, he points out, is that it turns unexpected events into expected events. As he writes, "The occurrence of an event increases its reconstructed probability and makes it less surprising than it would have been had the original probability been remembered."

  To read the Shelby report, or the seamless narrative from Nosair to bin Laden in "The Cell," is to be convinced that if the C.I.A. and the F.B.I. had simply been able to connect the dots, what happened on September 11th would not have been a surprise at all. Is this a fair criticism, or is it just a case of creeping determinism?

  3.

  On August 7, 1998, two Al Qaeda terrorists detonated a cargo truck filled with explosives outside the United States Embassy in Nairobi, killing two hundred and thirteen people and injuring more than four thousand. Miller, Stone, and Mitchell see the Kenyan Embassy bombing as a textbook example of intelligence failure. The C.I.A., they tell us, had identified an Al Qaeda cell in Kenya well before the attack, and its members were under surveillance. They had an eight-page letter, written by an Al Qaeda operative, speaking of the imminent arrival of "engineers" - the code word for bombmakers - in Nairobi. The United States Ambassador to Kenya, Prudence Bushnell, had begged Washington for more security. A prominent Kenyan lawyer and legislator says that the Kenyan intelligence service warned U.S. intelligence about the plot several months before August 7th, and in November of 1997 a man named Mustafa Mahmoud Said Ahmed, who worked for one of Osama bin Laden's companies, walked into the United States Embassy in Nairobi and told American intelligence of a plot to blow up the building. What did our officials do? They forced the leader of the Kenyan cell - a U.S. citizen - to return home, and then abruptly halted their surveillance of the group. They ignored the eight-page letter. They allegedly showed the Kenyan intelligence service's warning to the Mossad, which dismissed it, and after questioning Ahmed they decided that he wasn't credible. After the bombing, "The Cell" tells us, a senior State Department official phoned Bushnell and asked, "How could this have happened?"

  "For the first time since the blast," Miller, Stone, and Mitchell write, "Bushnell's horror turned to anger. There was too much history. 'I wrote you a letter,' she said."

  This is all very damning, but doesn't it fall into the creeping-determinism trap? It's an edited version of the past. What we don't hear about is all the other people whom American intelligence had under surveillance, how many other warnings they received, and how many other tips came in that seemed promising at the time but led nowhere. The central challenge of intelligence gathering has always been the problem of "noise": the fact that useless information is vastly more plentiful than useful information. Shelby's report mentions that the F.B.I.'s counterterrorism division has sixty-eight thousand outstanding and unassigned leads dating back to 1995. And, of those, probably no more than a few hundred are useful. Analysts, in short, must be selective, and the decisions made in Kenya, by that standard, do not seem unreasonable. Surveillance on the cell was shut down, but, then, its leader had left the country. Bushnell warned Washington - but, as "The Cell" admits, there were bomb warnings in Africa all the time. Officials at the Mossad thought the Kenyan intelligence was dubious, and the Mossad ought to know. Ahmed may have worked for bin Laden but he failed a polygraph test, and it was also learned that he had previously given similar - groundless - warnings to other embassies in Africa. When a man comes into your office, fails a lie-detector test, and is found to have shopped the same unsubstantiated story all over town, can you be blamed for turning him out?
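
  To get a feel for the scale of that noise problem, here is a rough back-of-the-envelope sketch, in Python. The lead counts echo the figures quoted above from Shelby's report; the ninety-five-per-cent accuracy rates are invented purely for illustration and come from nowhere in the record.

# A hypothetical triage filter applied to a backlog of leads.
# The lead counts are the figures cited above; the accuracy rates are assumptions.
total_leads = 68_000
useful_leads = 300                  # "probably no more than a few hundred"
useless_leads = total_leads - useful_leads

sensitivity = 0.95                  # assumed: share of useful leads the filter flags
specificity = 0.95                  # assumed: share of useless leads it screens out

true_flags = useful_leads * sensitivity
false_flags = useless_leads * (1 - specificity)
share_real = true_flags / (true_flags + false_flags)

print(f"Leads flagged for follow-up: {true_flags + false_flags:,.0f}")
print(f"Share of flagged leads that are actually useful: {share_real:.1%}")
# Even a filter that is right ninety-five per cent of the time leaves analysts with
# a stack of flagged leads of which only about eight per cent are real.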

  Miller, Stone, and Mitchell make the same mistake when they quote from a transcript of a conversation that was recorded by Italian intelligence in August of 2001 between two Al Qaeda operatives, Abdel Kader Es Sayed and a man known as al Hilal. This, they say, is yet another piece of intelligence that "seemed to forecast the September 11 attacks."

"I've been studying airplanes," al Hilal tells Es Sayed. "If God wills, I hope to be able to bring you a window or a piece of a plane the next time I see you."

"What, is there a jihad planned?" Es Sayed asks. "In the future, listen to the news and remember these words: 'Up above,'" al Hilal replies.

Es Sayed thinks that al Hilal is referring to an operation in his native Yemen, but al Hilal corrects him: "But the surprise attack will come from the other country, one of those attacks you will never forget."

A moment later al Hilal says about the plan, "It is something terrifying that goes from south to north, east to west. The person who devised this plan is a madman, but a genius. He will leave them frozen [in shock]."

  This is a tantalizing exchange. It would now seem that it refers to September 11th. But in what sense was it a "forecast"? It gave neither time nor place nor method nor target. It suggested only that there were terrorists out there who liked to talk about doing something dramatic with an airplane - which did not, it must be remembered, reliably distinguish them from any other terrorists of the past thirty years.

  In the real world, intelligence is invariably ambiguous. Information about enemy intentions tends to be short on detail. And information that's rich in detail tends to be short on intentions. In April of 1941, for instance, the Allies learned that Germany had moved a huge army up to the Russian front. The intelligence was beyond dispute: the troops could be seen and counted. But what did it mean? Churchill concluded that Hitler wanted to attack Russia. Stalin concluded that Hitler was serious about attacking, but only if the Soviet Union didn't meet the terms of the German ultimatum. The British foreign secretary, Anthony Eden, thought that Hitler was bluffing, in the hope of winning further Russian concessions. British intelligence thought - at least, in the beginning - that Hitler simply wanted to reinforce his eastern frontier against a possible Soviet attack. The only way for this piece of intelligence to have been definitive would have been if the Allies had a second piece of intelligence - like the phone call between al Hilal and Es Sayed - that demonstrated Germany's true purpose. Similarly, the only way the al Hilal phone call would have been definitive is if we'd also had intelligence as detailed as the Allied knowledge of German troop movements. But rarely do intelligence services have the luxury of both kinds of information. Nor are their analysts mind readers. It is only with hindsight that human beings acquire that skill.

  "The Cell" tells us that, in the final months before September 11th, Washington was frantic with worry:

A spike in phone traffic among suspected al Qaeda members in the early part of the summer [of 2001], as well as debriefings of [an al Qaeda operative in custody] who had begun cooperating with the government, convinced investigators that bin Laden was planning a significant operation - one intercepted al Qaeda message spoke of a "Hiroshima-type" event - and that he was planning it soon. Through the summer, the CIA repeatedly warned the White House that attacks were imminent.

  The fact that these worries did not protect us is not evidence of the limitations of the intelligence community. It is evidence of the limitations of intelligence.

  4.

  In the early nineteen-seventies, a professor of psychology at Stanford University named David L. Rosenhan gathered together a painter, a graduate student, a pediatrician, a psychiatrist, a housewife, and three psychologists. He told them to check into different psychiatric hospitals under aliases, with the complaint that they had been hearing voices. They were instructed to say that the voices were unfamiliar, and that they heard words like "empty," "thud," and "hollow." Apart from that initial story, the pseudo patients were instructed to answer every question truthfully, to behave as they normally would, and to tell the hospital staff - at every opportunity - that the voices were gone and that they had experienced no further symptoms. The eight subjects were hospitalized, on average, for nineteen days. One was kept for almost two months. Rosenhan wanted to find out if the hospital staffs would ever see through the ruse. They never did.

  Rosenhan's test is, in a way, a classic intelligence problem. Here was a signal (a sane person) buried in a mountain of conflicting and confusing noise (a mental hospital), and the intelligence analysts (the doctors) were asked to connect the dots - and they failed spectacularly. In the course of their hospital stay, the eight pseudo patients were given a total of twenty-one hundred pills. They underwent psychiatric interviews, and sober case summaries documenting their pathologies were written up. They were asked by Rosenhan to take notes documenting how they were treated, and this quickly became part of their supposed pathology. "Patient engaging in writing behavior," one nurse ominously wrote in her notes. Having been labelled as ill upon admission, they could not shake the diagnosis. "Nervous?" a friendly nurse asked one of the subjects as he paced the halls one day. "No," he corrected her, to no avail, "bored."

  The solution to this problem seems obvious enough. Doctors and nurses need to be made alert to the possibility that sane people sometimes get admitted to mental hospitals. So Rosenhan went to a research-and-teaching hospital and informed the staff that at some point in the next three months he would once again send over one or more of his pseudo patients. This time, of the hundred and ninety-three patients admitted in the three-month period, forty-one were identified by at least one staff member as being almost certainly sane. Once again, however, they were wrong. Rosenhan hadn't sent anyone over. In attempting to solve one kind of intelligence problem (overdiagnosis), the hospital simply created another problem (underdiagnosis). This is the second, and perhaps more serious, consequence of creeping determinism: in our zeal to correct what we believe to be the problems of the past, we end up creating new problems for the future.

  Pearl Harbor, for example, was widely considered to be an organizational failure. The United States had all the evidence it needed to predict the Japanese attack, but the signals were scattered throughout the various intelligence services. The Army and the Navy didn't talk to each other. They spent all their time arguing and competing. This was, in part, why the Central Intelligence Agency was created, in 1947 - to insure that all intelligence would be collected and processed in one place. Twenty years after Pearl Harbor, the United States suffered another catastrophic intelligence failure, at the Bay of Pigs: the Kennedy Administration grossly underestimated the Cubans' capacity to fight and their support for Fidel Castro. This time, however, the diagnosis was completely different. As Irving L. Janis concluded in his famous study of "groupthink," the root cause of the Bay of Pigs fiasco was that the operation was conceived by a small, highly cohesive group whose close ties inhibited the beneficial effects of argument and competition. Centralization was now the problem. One of the most influential organizational sociologists of the postwar era, Harold Wilensky, went out of his way to praise the "constructive rivalry" fostered by Franklin D. Roosevelt, which, he says, is why the President had such formidable intelligence on how to attack the economic ills of the Great Depression. In his classic 1967 work "Organizational Intelligence," Wilensky pointed out that Roosevelt would

use one anonymous informant's information to challenge and check another's, putting both on their toes; he recruited strong personalities and structured their work so that clashes would be certain. . . . In foreign affairs, he gave Moley and Welles tasks that overlapped those of Secretary of State Hull; in conservation and power, he gave Ickes and Wallace identical missions; in welfare, confusing both functions and initials, he assigned PWA to Ickes, WPA to Hopkins; in politics, Farley found himself competing with other political advisors for control over patronage. The effect: the timely advertisement of arguments, with both the experts and the President pressured to consider the main choices as they came boiling up from below.

  The intelligence community that we had prior to September 11th was the direct result of this philosophy. The F.B.I. and the C.I.A. were supposed to be rivals, just as Ickes and Wallace were rivals. But now we've changed our minds. The F.B.I. and the C.I.A., Senator Shelby tells us disapprovingly, argue and compete with one another. The September 11th story, his report concludes, "should be an object lesson in the perils of failing to share information promptly and efficiently between (and within) organizations." Shelby wants recentralization and more focus on coöperation. He wants a "central national level knowledge-compiling entity standing above and independent from the disputatious bureaucracies." He thinks the intelligence service should be run by a small, highly cohesive group, and so he suggests that the F.B.I. be removed from the counterterrorism business entirely. The F.B.I., according to Shelby, is governed by

deeply-entrenched individual mindsets that prize the production of evidence-supported narratives of defendant wrongdoing over the drawing of probabilistic inferences based on incomplete and fragmentary information in order to support decision-making. . . . Law enforcement organizations handle information, reach conclusions, and ultimately just think differently than intelligence organizations. Intelligence analysts would doubtless make poor policemen, and it has become very clear that policemen make poor intelligence analysts.

  In his State of the Union Message, President George W. Bush did what Shelby wanted, and announced the formation of the Terrorist Threat Integration Center - a special unit combining the antiterrorist activities of the F.B.I. and the C.I.A. The cultural and organizational diversity of the intelligence business, once prized, is now despised.

  The truth is, though, that it is just as easy, in the wake of September 11th, to make the case for the old system. Isn't it an advantage that the F.B.I. doesn't think like the C.I.A.? It was the F.B.I., after all, that produced two of the most prescient pieces of analysis - the request by the Minneapolis office for a warrant to secretly search Zacarias Moussaoui's belongings, and the now famous Phoenix memo. In both cases, what was valuable about the F.B.I.'s analysis was precisely the way in which it differed from the traditional "big picture," probabilistic inference-making of the analyst. The F.B.I. agents in the field focussed on a single case, dug deep, and came up with an "evidence-supported narrative of defendant wrongdoing" that spoke volumes about a possible Al Qaeda threat.

  The same can be said for the alleged problem of rivalry. "The Cell" describes what happened after police in the Philippines searched the apartment that Ramzi Yousef shared with his co-conspirator, Abdul Hakim Murad. Agents from the F.B.I.'s counterterrorism unit immediately flew to Manila and "bumped up against the C.I.A." As the old adage about the Bureau and the Agency has it, the F.B.I. wanted to string Murad up and the C.I.A. wanted to string him along. The two groups eventually worked together, but only because they had to. It was a relationship "marred by rivalry and mistrust." But what's wrong with this kind of rivalry? As Miller, Stone, and Mitchell tell us, the real objection of Neil Herman - the F.B.I.'s former domestic counterterrorism chief - to "working with the C.I.A. had nothing to do with procedure. He just didn't think the Agency was going to be of any help in finding Ramzi Yousef. 'Back then, I don't think the C.I.A. could have found a person in a bathroom,'" Herman says. "'Hell, I don't think they could have found the bathroom.'" The assumption of the reformers is always that the rivalry between the F.B.I. and the C.I.A. is essentially marital, that it is the dysfunction of people who ought to work together but can't. But it could equally be seen as a version of the marketplace rivalry that leads to companies working harder and making better products.

  There is no such thing as a perfect intelligence system, and every seeming improvement involves a tradeoff. A couple of months ago, for example, a suspect in custody in Canada, who was wanted in New York on forgery charges, gave police the names and photographs of five Arab immigrants, who he said had crossed the border into the United States. The F.B.I. put out an alert on December 29th, posting the names and photographs on its Web site, in the "war on terrorism" section. Even President Bush joined in, saying, "We need to know why they have been smuggled into the country, what they're doing in the country." As it turned out, the suspect in Canada had made the story up. Afterward, an F.B.I. official said that the agency circulated the photographs in order to "err on the side of caution." Our intelligence services today are highly sensitive. But this kind of sensitivity is not without its costs. As the political scientist Richard K. Betts wrote in his essay "Analysis, War, and Decision: Why Intelligence Failures Are Inevitable," "Making warning systems more sensitive reduces the risk of surprise, but increases the number of false alarms, which in turn reduces sensitivity." When we run out and buy duct tape to seal our windows against chemical attack, and nothing happens, and when the government's warning light is orange for weeks on end, and nothing happens, we soon begin to doubt every warning that comes our way. Why was the Pacific fleet at Pearl Harbor so unresponsive to signs of an impending Japanese attack? Because, in the week before December 7, 1941, they had checked out seven reports of Japanese submarines in the area - and all seven were false. Rosenhan's psychiatrists used to miss the sane; then they started to see sane people everywhere. That is a change, but it is not exactly progress.

  5.

  In the wake of the Yom Kippur War, the Israeli government appointed a special investigative commission, and one of the witnesses called was Major General Zeira, the head of AMAN. Why, they asked, had he insisted that war was not imminent? His answer was simple:

The Chief of Staff has to make decisions, and his decisions must be clear. The best support that the head of AMAN can give the Chief of Staff is to give a clear and unambiguous estimate, provided that it is done in an objective fashion. To be sure, the clearer and sharper the estimate, the clearer and sharper the mistake - but this is a professional hazard for the head of AMAN.

  The historians Eliot A. Cohen and John Gooch, in their book "Military Misfortunes," argue that it was Zeira's certainty that had proved fatal: "The culpable failure of AMAN's leaders in September and October 1973 lay not in their belief that Egypt would not attack but in their supreme confidence, which dazzled decision-makers. . . . Rather than impress upon the prime minister, the chief of staff and the minister of defense the ambiguity of the situation, they insisted - until the last day - that there would be no war, period."

  But, of course, Zeira gave an unambiguous answer to the question of war because that is what politicians and the public demanded of him. No one wants ambiguity. Today, the F.B.I. gives us color-coded warnings and speaks of "increased chatter" among terrorist operatives, and the information is infuriating to us because it is so vague. What does "increased chatter" mean? We want a prediction. We want to believe that the intentions of our enemies are a puzzle that intelligence services can piece together, so that a clear story emerges. But there rarely is a clear story - at least, not until afterward, when some enterprising journalist or investigative committee decides to write one.

  What does 'Saturday Night Live' have in common with German philosophy?

  1.

  Lorne Michaels, the creator of "Saturday Night Live," was married to one of the show's writers, Rosie Shuster. One day when the show was still young, an assistant named Paula Davis went to Shuster's apartment in New York and found Dan Aykroyd getting out of her bed - which was puzzling, not just because Shuster was married to Michaels but because Aykroyd was supposedly seeing another member of the original "S.N.L." cast, Laraine Newman. Aykroyd and Gilda Radner had also been an item, back when the two of them worked for the Second City comedy troupe in Toronto, although by the time they got to New York they were just friends, in the way that everyone was friends with Radner. Second City was also where Aykroyd met John Belushi, because Belushi, who was a product of the Second City troupe in Chicago, came to Toronto to recruit for the "National Lampoon Radio Hour," which he starred in along with Radner and Bill Murray (who were also an item for a while). The writer Michael O'Donoghue (who famously voiced his aversion to the appearance of the Muppets on "S.N.L." by saying, "I don't write for felt") also came from The National Lampoon, as did another of the original writers, Anne Beatts (who was, in the impeccably ingrown logic of "S.N.L.," living with O'Donoghue). Chevy Chase came from a National Lampoon spinoff called "Lemmings," which also starred Belushi, doing his legendary Joe Cocker impersonation. Lorne Michaels hired Belushi after Radner, among others, insisted on it, and he hired Newman because he had worked with her on a Lily Tomlin special, and he hired Aykroyd because Michaels was also from Canada and knew him from the comedy scene there. When Aykroyd got the word, he came down from Toronto on his Harley.

  In the early days of "S.N.L.," as Tom Shales and James Andrew Miller tell us in "Live from New York" (Little, Brown; $25.95), everyone knew everyone and everyone was always in everyone else's business, and that fact goes a long way toward explaining the extraordinary chemistry among the show's cast. Belushi would stay overnight at people's apartments, and he was notorious for getting hungry in the middle of the night and leaving spaghetti-sauce imprints all over the kitchen, or setting fires by falling asleep with a lit joint. Radner would go to Jane Curtin's house and sit and watch Curtin and her husband, as if they were some strange species of mammal, and say things like "Oh, now you are going to turn the TV on together. How will you decide what to watch?" Newman would hang out at Radner's house, and Radner would be eating a gallon of ice cream and Newman would be snorting heroin. Then Radner would go to the bathroom to make herself vomit, and say, "I'm so full, I can't hear." And they would laugh. "There we were," Newman recalls, "practicing our illnesses together."

  The place where they all really lived, though, was the "S.N.L." office, on the seventeenth floor of NBC headquarters, at Rockefeller Center. The staff turned it into a giant dormitory, installing bunk beds and fooling around in the dressing rooms and staying up all night. Monday night was the first meeting, where ideas were pitched. On Tuesday, the writing started after dinner and continued straight through the night. The first read-through took place on Wednesday at three in the afternoon. And then came blocking and rehearsals and revisions. "It was emotional," the writer Alan Zweibel tells Shales and Miller. "We were a colony. I don't mean this in a bad way, but we were Guyana on the seventeenth floor. We didn't go out. We stayed there. It was a stalag of some sort." Rosie Shuster remembers waking up at the office and then going outside with Aykroyd, to "walk each other like dogs around 30 Rock just to get a little fresh air." On Saturdays, after the taping was finished, the cast would head downtown to a storefront that Belushi and Aykroyd had rented and dubbed the Blues Bar. It was a cheerless dive, with rats and crumbling walls and peeling paint and the filthiest toilets in all of New York. But did anyone care? "It was the end of the week and, well, you were psyched," Shuster recalls. "It was like you were buzzing, you'd get turbocharged from the intense effort of it, and then there's like adrenal burnout later. I remember sleeping at the Blues Bar, you know, as the light broke." Sometimes it went even later. "I remember rolling down the armor at the Blues Bar and closing the building at eleven o'clock Sunday morning - you know, when it was at its height - and saying good morning to the cops and firemen," Aykroyd said. "S.N.L." was a television show, but it was also an adult fraternity house, united by bonds of drugs and sex and long hours and emotion and affection that went back years. "The only entrée to that boys club was basically by fucking somebody in the club," Anne Beatts tells Shales and Miller. "Which wasn't the reason you were fucking them necessarily. I mean, you didn't go, 'Oh, I want to get into this, I think I'll have to have sex with this person.' It was just that if you were drawn to funny people who were doing interesting things, then the only real way to get to do those things yourself was to make that connection."

  2.

  We are inclined to think that genuine innovators are loners, that they do not need the social reinforcement the rest of us crave. But that's not how it works, whether it's television comedy or, for that matter, the more exalted realms of art and politics and ideas. In his book "The Sociology of Philosophies," Randall Collins finds in all of known history only three major thinkers who appeared on the scene by themselves: the first-century Taoist metaphysician Wang Ch'ung, the fourteenth-century Zen mystic Bassui Tokusho, and the fourteenth-century Arabic philosopher Ibn Khaldun. Everyone else who mattered was part of a movement, a school, a band of followers and disciples and mentors and rivals and friends who saw each other all the time and had long arguments over coffee and slept with one another's spouses. Freud may have been the founder of psychoanalysis, but it really began to take shape in 1902, when Alfred Adler, Wilhelm Stekel, Max Kahane, and Rudolf Reitler would gather in Freud's waiting room on Wednesdays, to eat strudel and talk about the unconscious. The neo-Confucian movement of the Sung dynasty in China revolved around the brothers Ch'eng Hao and Ch'eng I, their teacher Chou Tun-i, their father's cousin Chang Tsai, and, of course, their neighbor Shao Yung. Pissarro and Degas enrolled in the École des Beaux-Arts at the same time, then Pissarro met Monet and, later, Cézanne at the Académie Suisse, Manet met Degas at the Louvre, Monet befriended Renoir at Charles Gleyre's studio, and Renoir, in turn, met Pissarro and Cézanne and soon enough everyone was hanging out at the Café Guerbois on the Rue des Batignolles. Collins's point is not that innovation attracts groups but that innovation is found in groups: that it tends to arise out of social interaction - conversation, validation, the intimacy of proximity, and the look in your listener's eye that tells you you're onto something. German Idealism, he notes, centered on Fichte, Schelling, and Hegel. Why? Because they all lived together in the same house. "Fichte takes the early lead," Collins writes,

  inspiring the others on a visit while they are young students at Tübingen in the 1790s, then turning Jena into a center for the philosophical movement to which a stream of the soon-to-be-eminent congregate; then on to Dresden in the heady years 1799-1800 to live with the Romantic circle of the Schlegel brothers (where August Schlegel's wife, Caroline, has an affair with Schelling, followed later by a scandalous divorce and remarriage). Fichte moves on to Berlin, allying with Schleiermacher (also of the Romantic circle) and with Humboldt to establish the new-style university; here Hegel eventually comes and founds his school, and Schopenhauer lectures fruitlessly in competition.

  There is a wonderful illustration of this social dimension of innovation in Jenny Uglow's new book, "The Lunar Men" (Farrar, Straus & Giroux; $30), which is the story of a remarkable group of friends in Birmingham in the mid-eighteenth century. Their leader was Erasmus Darwin, a physician, inventor, and scientist, who began thinking about evolution a full fifty years before his grandson Charles. Darwin met, through his medical practice, an industrialist named Matthew Boulton and, later, his partner James Watt, the steam-engine pioneer. They, in turn, got to know Josiah Wedgwood, he of the famous pottery, and Joseph Priestley, the preacher who isolated oxygen and became known as one of history's great chemists, and the industrialist Samuel Galton (whose son married Darwin's daughter and produced the legendary nineteenth-century polymath Francis Galton), and the innovative glass-and-chemicals entrepreneur James Keir, and on and on. They called themselves the Lunar Society because they arranged to meet at each full moon, when they would get together in the early afternoon to eat, piling the table high, Uglow tells us, with wine and "fish and capons, Cheddar and Stilton, pies and syllabubs." Their children played underfoot. Their wives chatted in the other room, and the Lunar men talked well into the night, clearing the table to make room for their models and plans and instruments. "They developed their own cryptic, playful language and Darwin, in particular, liked to phrase things as puzzles - like the charades and poetic word games people used to play," Uglow writes. "Even though they were down-to-earth champions of reason, a part of the delight was to feel they were unlocking esoteric secrets, exploring transmutations like alchemists of old."

  When they were not meeting, they were writing to each other with words of encouragement or advice or excitement. This was truly - in a phrase that is invariably and unthinkingly used in the pejorative - a mutual-admiration society. "Their inquiries ranged over the whole spectrum, from astronomy and optics to fossils and ferns," Uglow tells us, and she goes on:

  One person's passion - be it carriages, steam, minerals, chemistry, clocks - fired all the others. There was no neat separation of subjects. Letters between [William] Small and Watt were a kaleidoscope of invention and ideas, touching on steam-engines and cylinders; cobalt as a semi-metal; how to boil down copal, the resin of tropical trees, for varnish; lenses and clocks and colours for enamels; alkali and canals; acids and vapours - as well as the boil on Watt's nose.

  What were they doing? Darwin, in a lovely phrase, called it "philosophical laughing," which was his way of saying that those who depart from cultural or intellectual consensus need people to walk beside them and laugh with them to give them confidence. But there's more to it than that. One of the peculiar features of group dynamics is that clusters of people will come to decisions that are far more extreme than any individual member would have come to on his own. People compete with each other and egg each other on, showboat and grandstand; and along the way they often lose sight of what they truly believed when the meeting began. Typically, this is considered a bad thing, because it means that groups formed explicitly to find middle ground often end up someplace far away. But at times this quality turns out to be tremendously productive, because, after all, losing sight of what you truly believed when the meeting began is one way of defining innovation.

  Uglow tells us, for instance, that the Lunar men were active in the campaign against slavery. Wedgwood, Watt, and Darwin pushed for the building of canals, to improve transportation. Priestley came up with soda water and the rubber eraser, and James Keir was the man who figured out how to mass-produce soap, eventually building a twenty-acre soapworks in Tipton that produced a million pounds of soap a year. Here, surely, are all the hallmarks of group distortion. Somebody comes up with an ambitious plan for canals, and someone else tries to top that by building a really big soap factory, and in that feverish atmosphere someone else decides to top them all with the idea that what they should really be doing is fighting slavery.

  Uglow's book reveals how simplistic our view of groups really is. We divide them into cults and clubs, and dismiss the former for their insularity and the latter for their banality. The cult is the place where, cut off from your peers, you become crazy. The club is the place where, surrounded by your peers, you become boring. Yet if you can combine the best of those two - the right kind of insularity with the right kind of homogeneity - you create an environment both safe enough and stimulating enough to make great thoughts possible. You get Fichte, Schelling, and Hegel, and a revolution in Western philosophy. You get Darwin, Watt, Wedgwood, and Priestley, and the beginnings of the Industrial Revolution. And sometimes, on a more modest level, you get a bunch of people goofing around and bringing a new kind of comedy to network television.

  3.

  One of "S.N.L."'s forerunners was a comedy troupe based in San Francisco called the Committee. The Committee's heyday was in the nineteen-sixties, and its humor had the distinctive political bite of that period. In one of the group's memorable sketches, the actor Larry Hankin played a condemned prisoner being led to the electric chair by a warden, a priest, and a prison guard. Hankin was strapped in and the switch was thrown - and nothing happened. Hankin started to become abusive, and the three men huddled briefly together. Then, as Tony Hendra recounts, in "Going Too Far," his history of "boomer humor":

  They confer and throw the switch again. Still nothing. Hankin starts cackling with glee, doubly abusive. They throw it yet again. Nothing yet again. Hankin then demands to be set free - he can't be executed more than once, they're a bunch of assholes, double jeopardy, nyah-nyah, etc., etc. Totally desperate, the three confer once more, check that they're alone in the cell, and kick Hankin to death.

  Is that sketch funny? Some people thought so. When the Committee performed it at a benefit at the Vacaville prison, in California, the inmates laughed so hard they rioted. But others didn't, and even today it's clear that this humor is funny only to those who can appreciate the particular social and political sensibility of the Committee. We call new cultural or intellectual movements "circles" for a reason: the circle is a closed loop. You are either inside or outside. In "Live from New York," Lorne Michaels describes going to the White House to tape President Ford saying, "Live from New York, it's Saturday Night," the "S.N.L." intro: "We'd done two or three takes, and to relax him, I said to him - my sense of humor at the time - 'Mr. President, if this works out, who knows where it will lead?' Which was completely lost on him." In another comic era, the fact that Ford did not laugh would be evidence of the joke's failure. But when Michaels says the joke "was completely lost on him" it isn't a disclaimer - it's the punch line. He said what he said because he knew Ford would not get it. As the writers of "Saturday Night Live" worked on sketches deep into the night, they were sustained by something like what sustained the Lunar men and the idealists in Tübingen - the feeling that they all spoke a private language.

  To those on the inside, of course, nothing is funnier than an inside joke. But the real significance of inside jokes is what they mean for those who aren't on the inside. Laughing at a joke creates an incentive to join the joke-teller. But not laughing - not getting the joke - creates an even greater incentive. We all want to know what we're missing, and this is one of the ways that revolutions spread from the small groups that spawn them.

  "One of Michaels's rules was, no groveling to the audience either in the studio or at home," Shales and Miller write. "The collective approach of the show's creators could be seen as a kind of arrogance, a stance of defiance that said in effect, "We think this is funny, and if you don't, you're wrong.' . . . To viewers raised on TV that was forever cajoling, importuning, and talking down to them, the blunt and gutsy approach was refreshing, a virtual reinvention of the medium."

  The successful inside joke, however, can never last. In "A Great Silly Grin" (Public Affairs; $27.50), a history of nineteen-sixties British satire, Humphrey Carpenter relates a routine done at the comedy club the Establishment early in the decade. The sketch was about the rebuilt Coventry Cathedral, which had been destroyed in the war, and the speaker was supposed to be the Cathedral's architect, Sir Basil Spence:

  First of all, of course, we owe an enormous debt of gratitude to the German people for making this whole project possible in the first place. Second, we owe a debt of gratitude to the people of Coventry itself, who when asked to choose between having a cathedral and having hospitals, schools and houses, plumped immediately (I'm glad to say) for the cathedral, recognizing, I think, the need of any community to have a place where the whole community can gather together and pray for such things as hospitals, schools and houses.

  When that bit was first performed, many Englishmen would have found it offensive. Now, of course, hardly anyone would. Mocking British establishment pieties is no longer an act of rebellion. It is the norm. Successful revolutions contain the seeds of their demise: they attract so many followers, eager to be in on the joke as well, that the circle breaks down. The inside becomes indistinguishable from the outside. The allure of exclusivity is gone.

  At the same time, the special bonds that created the circle cannot last forever. Sooner or later, the people who slept together in every combination start to pair off. Those doing drugs together sober up (or die). Everyone starts going to bed at eleven o'clock, and bit by bit the intimacy that fuels innovation slips away. "I was involved with Gilda, yeah. I was in love with her," Aykroyd tells Shales and Miller. "We were friends, lovers, then friends again," and in a way that's the simplest and best explanation for the genius of the original "S.N.L." Today's cast is not less talented. It is simply more professional. "I think some people in the cast have fun crushes on other people, but nothing serious," Cheri Oteri, a cast member from the late nineteen-nineties, tells Shales and Miller, in what might well serve as the show's creative epitaph. "I guess we're kind of boring - no romances, no drugs. I had an audition once with somebody who used to work here. He's very, very big in the business now. And as soon as I went in for the audition, he went, 'Hey, you guys still doing coke over at SNL?' Because back when he was here, they were doing it. What are we doing, for crying out loud? Oh yeah. Thinking up characters."

  The great Chicago heat wave, and other unnatural disasters.

  1.

  In the first week of July, 1995, a strong high-pressure air mass developed over the plains of the Southwest and began moving slowly eastward toward Chicago. Illinois usually gets its warm summer air from the Gulf of Mexico, and the air coming off the ocean is relatively temperate. But this was a blast of western air that had been baked in the desert ovens of West Texas and New Mexico. It was hot, bringing temperatures in excess of a hundred degrees, and, because the preceding two months had been very wet in the Midwest and the ground was damp, the air steadily picked up moisture as it moved across the farmlands east of the Rockies. Ordinarily, this would not have been a problem, since humid air tends to become diluted as it mixes with the drier air higher up in the atmosphere. But it was Chicago's misfortune, in mid-July, to be in the grip of an unusually strong temperature inversion: the air in the first thousand feet above the city surface was cooler than the air at two and three thousand feet. The humid air could not rise and be diluted. It was trapped by the warmer air above. The United States has cities that are often humid - like Houston and New Orleans - without being tremendously hot. And it has very hot cities - like Las Vegas and Phoenix - that are almost never humid. But for one long week, beginning on Thursday, July 13, 1995, Chicago was both. Meteorologists measure humidity with what is called the dew point - the point at which the air is so saturated with moisture that it cannot cool without forming dew. On a typical Chicago summer day, the dew point is in the low sixties, and on a very warm, humid day it is in the low seventies. At Chicago's Midway Airport, during the heat wave of 1995, the dew point hit the low eighties - a figure reached regularly only in places like the coastal regions of the Middle East. In July of 1995, Chicago effectively turned into Dubai.

  As the air mass settled on the city, cars began to overheat and stall in the streets. Roads buckled. Hundreds of children developed heat exhaustion when school buses were stuck in traffic. More than three thousand fire hydrants were opened in poorer neighborhoods around the city, by people looking for relief from the heat, and this caused pressure to drop so precipitately that entire buildings were left without water. So many air-conditioners were turned on that the city's electrical infrastructure was overwhelmed. A series of rolling blackouts left thousands without power. As the heat took its toll, the city ran out of ambulances. More than twenty hospitals, mostly on Chicago's poorer South Side, shut their doors to new admissions. Callers to 911 were put on hold, and as the police and paramedics raced from one home to another it became clear that the heat was killing people in unprecedented numbers. The police took the bodies to the Cook County Medical Examiner's office, and a line of cruisers stretched outside the building. Students from a nearby mortuary school, and then ex-convicts looking to earn probation points, were brought in to help. The morgue ran out of bays in which to put the bodies. Office space was cleared. It wasn't enough. The owner of a local meatpacking firm offered the city his refrigerated trucks to help store the bodies. The first set wasn't enough. He sent another. It wasn't enough. In the end, there were nine forty-eight-foot meatpacking trailers in the morgue's parking lot. When the final statistics were tallied, the city calculated that in the seven days between July 14th and July 20th, the heat wave had resulted in the deaths of seven hundred and thirty-nine Chicagoans; on Saturday, July 15th, alone, three hundred and sixty-five people died from the heat. The chance intersection of a strong high-pressure ridge, a wet spring, and an intense temperature inversion claimed more lives than Hurricane Andrew, the crash of T.W.A. Flight 800, the Oklahoma City bombing, and the Northridge, California, earthquake combined.

  2.

  In "Heat Wave: A Social Autopsy of Disaster in Chicago" (Chicago; $27.50), the New York University sociologist Eric Klinenberg sets out to understand what happened during those seven days in July. He looks at who died, and where they died, and why they died. He goes to the county morgue and sifts through the dozens of boxes of unclaimed personal effects of heat-wave victims - "watches, wallets, letters, tax returns, photographs, and record books" - and reads the police reports on the victims, with their dry recitations of the circumstances of death. Here is one for a seventy-three-year-old white woman who was found on Monday, July 17th:

A recluse for 10 yrs, never left apartment, found today by son, apparently DOA. Conditions in apartment when R/O's [responding officers] arrived thermostat was registering over 90 degrees f. with no air circulation except for windows opened by son (after death).

  Here is another, for a seventy-nine-year-old black man found on Wednesday the 19th:

Victim did not respond to phone calls or knocks on victim's door since Sunday, 16 July 1995. Victim was known as quiet, [kept] to himself and at times, not to answer the door. Landlord . . . does not have any information to any relatives to victim. . . . Chain was on door. R/O was able to see victim on sofa with flies on victim and a very strong odor decay.

  The city's response to the crisis, Klinenberg argues, was to look at people like those two victims - the recluse who did not open her windows and the man who would not answer his door - and conclude that their deaths were inevitable, the result of an unavoidable collision between their own infirmity and an extreme environmental event. As one Health Department official put it at the time, "Government can't guarantee there won't be a heat wave." On the Friday, the human-services commissioner, Daniel Alvarez, told the press, "We're talking about people who die because they neglect themselves. We did everything possible. But some people didn't want to open their doors to us." In its official postmortem four months later, the city sounded the same fatalistic note: the disaster had been a "unique meteorological event" that proved that the "government alone cannot do it all."

  Klinenberg finds that conclusion unacceptably superficial. The disaster may look inevitable, but beneath the surface he sees numerous explanations for why it took the shape it did. One chapter of the book is devoted to a comparison of two adjoining low-income neighborhoods in Chicago, Little Village and North Lawndale. Statistically, the two are almost identical, each with heavy concentrations of poor, elderly people living alone, so it would seem that the heat wave should have taken a similar toll in both neighborhoods. But North Lawndale had ten times the fatality rate of Little Village. Why? Because Little Village is a bustling, relatively safe, close-knit Hispanic community; the elderly had family and friends nearby who could look in on them, and streets and stores where they could go to escape their stifling apartments. North Lawndale, by contrast, is a sprawling, underpopulated, drug-infested neighborhood. The elderly there were afraid to go outside, and had no one close by to visit them. The heat was deadly only in combination with particular social and physical circumstances.

  Klinenberg takes an equally close look at the city's ambulance shortage. The city could have nearly tripled the number of available ambulances by calling in reserves from the suburbs, but it was slow to realize that it had a disaster on its hands. "It's hot. It's very hot. But let's not blow it out of proportion": this was Mayor Richard Daley's assessment of the situation on Friday, July 14th. The streamlining of city governments like Chicago's, Klinenberg explains, isolated city officials. Social-services departments had been professionalized as if they were corporations. Responsibilities had been outsourced. "Police officers replace aldermen and precinct captains as the community sentries," he writes, and as a result political organizations began to lose contact with the needs of their constituents.

  Problem solving, in our day and age, brings with it the requirement of compression: we are urged to distill the most pertinent lessons from any experience. Klinenberg suggests that such distillation only obscures the truth, and by the end of "Heat Wave" he has traced the lines of culpability in dozens of directions, drawing a dense and subtle portrait of exactly what happened during that week in July. It is an approach that resembles, most of all, the way the heat wave was analyzed by meteorologists. They took hourly surface-airways observations of temperature, wind speed, and humidity, estimated radiation from cloud cover, and performed complex calculations using the Penman-Monteith formula to factor in soil-heat flux, latent heat of vaporization, stomatal resistance, and the von Kármán constant. Why, Klinenberg asks, can't we bring the same rigor to our study of the social causes of disaster?

  3.

  Take the question of air-conditioning. The Centers for Disease Control, in their Chicago investigation, concluded that the use of air-conditioners could have prevented more than half of the deaths. But many low-income people in Chicago couldn't afford to turn on an air-conditioner even if they had been given one for free. Many of those who did have air-conditioners, meanwhile, were hit by the power failures that week. Chicago had a problem with a vulnerable population: a lot of very old and very sick people. But it also, quite apart from this, had an air-conditioning problem. What was the cause of that problem?

  As it turns out, this is a particularly timely question, since there is a debate going on now in Washington over air-conditioners which bears directly on what happens during heat waves. All air-conditioners consist of a motor and a long coil that acts as a heat exchanger, taking hot air out of the room and replacing it with cold air. If you use a relatively unsophisticated motor and a small coil, an air-conditioner will be cheap to make but will use a lot of electricity. If you use a better motor and a larger heat exchanger, the air-conditioner will cost more to buy but far less to run. Rationally, consumers should buy the more expensive, energy-efficient units, because their slightly higher purchase price is dwarfed by the amount of money the owner pays over time in electric bills. But fifteen years ago Congress realized that this wasn't happening. The people who generally bought air-conditioners - builders and landlords - weren't the people who paid the utility bills to run them. Their incentive was to buy the cheapest unit. So Congress passed a minimum standard for air-conditioning efficiency. Residential central air-conditioning units now had to score at least 10 on a scale known as SEER - the seasonal energy-efficiency ratio. One of Bill Clinton's last acts as President was to raise that standard to 13. This spring, however, the Bush Administration cut the efficiency increase by a third, making SEER 12 the law.
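
  As a rough illustration of the arithmetic behind that tradeoff, here is a small sketch in Python. The definition of SEER - seasonal cooling output in B.T.U.s divided by the electricity consumed in watt-hours - is standard; the purchase prices, cooling load, electric rate, and service life below are invented for illustration only, not figures from this article or from the regulatory debate.

# A rough sketch of the buy-price vs. running-cost tradeoff described above.
# Every dollar figure and the cooling load are assumptions made up for illustration.
seasonal_cooling_btu = 36_000 * 1_000   # assumed: a 3-ton unit run 1,000 hours a season
electric_rate = 0.10                    # assumed: dollars per kilowatt-hour
years = 15                              # assumed service life

def seasonal_cost(seer):
    """Electricity cost for one cooling season at a given SEER rating."""
    kwh = seasonal_cooling_btu / seer / 1_000   # SEER = BTUs out per watt-hour in
    return kwh * electric_rate

for seer, price in [(10, 2_500), (12, 2_800), (13, 3_000)]:   # assumed purchase prices
    lifetime = price + years * seasonal_cost(seer)
    print(f"SEER {seer}: ${price:,} to buy, "
          f"${seasonal_cost(seer):,.0f} a season to run, "
          f"${lifetime:,.0f} over {years} years")
# Under these assumptions, the cheapest unit to buy is the most expensive to own:
# the efficient unit's higher sticker price is recovered in lower electric bills.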

  It should be said that SEER 13 is no more technologically difficult than SEER 12. SEER 12 is simply a bit cheaper to make, and SEER 13 is simply cheaper to operate. Nor is this a classic regulatory battle that pits corporate against consumer interests. The nation's largest air-conditioner manufacturer, Carrier, is in favor of 12. But the second-largest manufacturer, Goodman (which makes Amana air-conditioners), is in favor of 13. The Bush decision is really about politics, and the White House felt free to roll back the Clinton standard because most of the time the difference between the two standards is negligible. There is one exception, however: heat waves.

  Air-conditioning is, of course, the reason that electrical consumption soars on very hot days. On the worst day in August, electricity consumption in, say, Manhattan might be three or four times what it is on a cool spring day. For most of the year, a local utility can use the electricity from its own power plants, or sign stable, long-term contracts with other power companies. But the extra electricity a city needs on that handful of very hot days presents a problem. You can't build a power plant just to supply this surge - what would you do with it during the rest of the year? So, at peak periods, utilities buy the power they need on the "spot" market, and power bought on the spot market can cost fifty times as much as the power used on normal days. The amount of power that a utility has to buy for that handful of hot days every summer, in other words, is a huge factor in the size of our electric bills.

  For anyone wanting to make electricity cheaper, then, the crucial issue is not how to reduce average electrical consumption but how to reduce peak consumption. A recent study estimates that moving the SEER standard from 10 to 13 would have the effect of cutting peak demand by the equivalent of more than a hundred and fifty power plants. The Bush Administration's decision to cut the SEER upgrade by a third means that by 2020 demand will be fourteen thousand megawatts higher than it would have been, and that we'll have to build about fifty more power plants. The cost of those extra power plants - and of running a less efficient air-conditioner on hot days - is part of what will make air-conditioning less affordable for people who will someday desperately need it.

  The sheer volume of electricity required on a very hot day also puts enormous strain on a city's power-distribution system. On the Friday of the Chicago heat wave, when power demand peaked, one of the main problem areas was the transmission substation at California Avenue and Addison Street, in the city's northwest corner, known as TSS 114. It consists of a series of giant transformers - twenty feet high and fifteen feet across - that help convert the high-voltage electricity that comes into Chicago along power lines into the low-voltage power that is used in offices and homes. Throughout that Friday afternoon, the four transformers in the second terminal at TSS 114 were running at a hundred and eighteen per cent of capacity - that is, they were handling roughly a fifth more electricity than they were designed to carry. The chief side effect of overloading is heat. The more current you run through a transformer the hotter it gets, and, combined with the ambient temperature that afternoon, which averaged a hundred and eleven degrees, the heat turned the inside of terminal two into an oven.

  At 4:56 P.M., the heat overwhelmed a monitoring device known as a CT - a gauge almost small enough to fit in the palm of one's hand - on the first of the transformers. It tripped and shut down. The current that had been shared by four transformers had to be carried by just three, making them still hotter. The second transformer was now carrying a hundred and twenty-four per cent of its rated capacity. Fifty-one minutes later, a circuit breaker on the second transformer burst into flames. Transformers are engineered to handle extra loads for short periods of time, but there was just a little too much current and a little too much heat. At 6:19, two more CTs tripped on the third transformer and, as workmen struggled to get the terminal up and running, a CT failed on the fourth transformer. In all, forty-nine thousand customers and all of the people in those customers' houses and apartments and offices were without air-conditioning for close to twenty hours - and this is merely what happened at TSS 114.

  All around the city that week, between Wednesday and Sunday, there were 1,327 separate equipment failures that left an additional hundred and forty-nine thousand customers without power. Those are staggering numbers. But what is really staggering is how easy it would have been to avoid these power outages. Commonwealth Edison, the city's utility, had forecast a year earlier that electricity use in the summer of 1995 would peak at 18,600 megawatts. The actual high, on the Friday of the heat wave, was 19,201. The difference, in other words, between the demand that the utility was prepared to handle and the demand that brought the city to its knees was six hundred and one megawatts, or 3.2 per cent of the total - which is just about what a place like Chicago might save by having a city full of SEER 13 air-conditioners instead of SEER 12 air-conditioners.

  4.

  In 1928, a storm near Palm Beach, Florida, killed almost two thousand people, most of them black migrant workers on the shores of Lake Okeechobee. This was, the state comptroller declared, "an act of God." In 1935, the most severe hurricane in American history hit the Florida Keys, sending a storm surge fifteen to twenty feet high through a low-lying encampment of war veterans working on the highway. About four hundred people died. "The catastrophe must be characterized as an act of God and was by its very nature beyond the power of man," the Veterans Authority and Federal Emergency Relief Administration declared in an official report. In 1972, an earthen dam put up by a mining company in Logan County, West Virginia, collapsed in heavy rains, killing a hundred and thirty-nine people. It was an "act of God," a mining-company official said, disavowing any culpability. In 1974, a series of twisters swept across ten states, killing three hundred and fifteen people. Senator Thomas Eagleton, of Missouri, said at the time that his colleagues in Washington viewed the tornado "as an act of God where even the Congress can't intervene," explaining why the government would not fund an early-warning system.

  This is the way we have thought of catastrophes in the United States. The idea of an "act of God" suggests that any search for causes is unnecessary. It encourages us to see disasters, as the environmental historian Ted Steinberg writes in "Acts of God: The Unnatural History of Natural Disaster in America" (2000), simply as things that happen "from time to time." It suggests, too, that systems or institutions ought to be judged on the basis of how they perform most of the time, under "normal" conditions, rather than by how they perform under those rare moments of extreme stress. But this idea, as "Heat Wave" makes clear, is a grave mistake. Political systems and social institutions ought to be judged the way utilities are judged. The true test is how they perform on a blistering day in July.

  Klinenberg tells the story of Pauline Jankowitz, an elderly woman living alone in a third-floor apartment in a transitional neighborhood. Her air-conditioner was old and didn't work well. She had a bladder problem that left her incontinent, and she had to walk with a crutch because she had a weak leg. That made it difficult for her to get down the stairs, and once she was outside she was terrified of being mugged. "Chicago is just a shooting gallery," she said to Klinenberg. She left her apartment only about six times a year. Jankowitz was the prototypical heat-wave victim, and, as she told Klinenberg, that week in July was "the closest I've ever come to death." But she survived. A friend had told her to leave her apartment if it got too hot; so, early on what would turn out to be the worst of the seven days, she rose and crept down the stairs. She caught a city bus to a nearby store, which was air-conditioned, and there she bought fresh cherries and leaned on the shopping cart until she recovered her strength. On the trip home, she recalled, "climbing the stairs was almost impossible." Back in her apartment, she felt her body begin to swell and go numb. She telephoned a friend. She turned a fan on high, lay down on the floor, covered herself with wet towels, and dreamed that she was on a Caribbean cruise. She was poor and old and infirm, but she lived, and one of the many lessons of her story is that in order to survive that week in July she suddenly depended on services and supports that previously she had barely needed at all. Her old air-conditioner was useless most of the time. But that week it helped to keep her apartment at least habitable. She rarely travelled. But on that day the fact that there was a city bus, and that it came promptly and that it was air-conditioned, was of the greatest importance. She rarely went to the store; she had her groceries delivered. But now the proximity of a supermarket, where she could lean on the shopping cart and breathe in the cool air, was critical. Pauline Jankowitz's life depended not on the ordinary workings of the social institutions in her world but on their ability to perform at one critical moment of peak demand. On the hottest of all days, her neighborhood substation did not fail. Her bus came. Her grocery store was open. She was one of the lucky ones.

  Can you read people's thoughts just by looking at them?

  1.

  Some years ago, John Yarbrough was working patrol for the Los Angeles County Sheriff's Department. It was about two in the morning. He and his partner were in the Willowbrook section of South Central Los Angeles, and they pulled over a sports car. "Dark, nighttime, average stop," Yarbrough recalls. "Patrol for me was like going hunting. At that time of night in the area I was working, there was a lot of criminal activity, and hardly anyone had a driver's license. Almost everyone had something intoxicating in the car. We stopped drunk drivers all the time. You're hunting for guns or lots of dope, or suspects wanted for major things. You look at someone and you get an instinctive reaction. And the longer you've been working the stronger that instinctive reaction is."

  Yarbrough was driving, and in a two-man patrol car the procedure is for the driver to make the approach and the officer on the passenger side to provide backup. He opened the door and stepped out onto the street, walking toward the vehicle with his weapon drawn. Suddenly, a man jumped out of the passenger side and pointed a gun directly at him. The two of them froze, separated by no more than a few yards. "There was a tree behind him, to his right," Yarbrough recalls. "He was about seventeen. He had the gun in his right hand. He was on the curb side. I was on the other side, facing him. It was just a matter of who was going to shoot first. I remember it clear as day. But for some reason I didn't shoot him." Yarbrough is an ex-marine with close-cropped graying hair and a small mustache, and he speaks in measured tones. "Is he a danger? Sure. He's standing there with a gun, and what person in his right mind does that facing a uniformed armed policeman? If you looked at it logically, I should have shot him. But logic had nothing to do with it. Something just didn't feel right. It was a gut reaction not to shoot - a hunch that at that exact moment he was not an imminent threat to me." So Yarbrough stopped, and, sure enough, so did the kid. He pointed a gun at an armed policeman on a dark street in South Central L.A., and then backed down.

  Yarbrough retired last year from the sheriff's department after almost thirty years, sixteen of which were in homicide. He now lives in western Arizona, in a small, immaculate house overlooking the Colorado River, with pictures of John Wayne, Charles Bronson, Clint Eastwood, and Dale Earnhardt on the wall. He has a policeman's watchfulness: while he listens to you, his eyes alight on your face, and then they follow your hands, if you move them, and the areas to your immediate left and right - and then back again, in a steady cycle. He grew up in an affluent household in the San Fernando Valley, the son of two doctors, and he is intensely analytical: he is the sort to take a problem and break it down, working it over slowly and patiently in his mind, and the incident in Willowbrook is one of those problems. Policemen shoot people who point guns directly at them at two in the morning. But something he saw held him back, something that ninety-nine people out of a hundred wouldn't have seen.

  Many years later, Yarbrough met with a team of psychologists who were conducting training sessions for law enforcement. They sat beside him in a darkened room and showed him a series of videotapes of people who were either lying or telling the truth. He had to say who was doing what. One tape showed people talking about their views on the death penalty and on smoking in public. Another featured a series of nurses who were all talking about a nature film they were supposedly watching, even though some of them were actually watching grisly documentary footage about burn victims and amputees. It may sound as if the tests should have been easy, because we all think we can tell whether someone is lying. But these were not the obvious fibs of a child, or the prevarications of people whose habits and tendencies we know well. These were strangers who were motivated to deceive, and the task of spotting the liars turns out to be fantastically difficult. There is just too much information - words, intonation, gestures, eyes, mouth - and it is impossible to know how the various cues should be weighted, or how to put them all together, and in any case it's all happening so quickly that you can't even follow what you think you ought to follow. The tests have been given to policemen, customs officers, judges, trial lawyers, and psychotherapists, as well as to officers from the F.B.I., the C.I.A., the D.E.A., and the Bureau of Alcohol, Tobacco, and Firearms - people one would have thought would be good at spotting lies. On average, they score fifty per cent, which is to say that they would have done just as well if they hadn't watched the tapes at all and just guessed. But every now and again - roughly one time in a thousand - someone scores off the charts. A Texas Ranger named David Maxwell did extremely well, for example, as did an ex-A.T.F. agent named J.J. Newberry, a few therapists, an arbitrator, a vice cop - and John Yarbrough, which suggests that what happened in Willowbrook may have been more than a fluke or a lucky guess. Something in our faces signals whether we're going to shoot, say, or whether we're lying about the film we just saw. Most of us aren't very good at spotting it. But a handful of people are virtuosos. What do they see that we miss?

  2.

  All of us, a thousand times a day, read faces. When someone says "I love you," we look into that person's eyes to judge his or her sincerity. When we meet someone new, we often pick up on subtle signals, so that, even though he or she may have talked in a normal and friendly manner, afterward we say, "I don't think he liked me," or "I don't think she's very happy." We easily parse complex distinctions in facial expression. If you saw me grinning, for example, with my eyes twinkling, you'd say I was amused. But that's not the only way we interpret a smile. If you saw me nod and smile exaggeratedly, with the corners of my lips tightened, you would take it that I had been teased and was responding sarcastically. If I made eye contact with someone, gave a small smile and then looked down and averted my gaze, you would think I was flirting. If I followed a remark with an abrupt smile and then nodded, or tilted my head sideways, you might conclude that I had just said something a little harsh, and wanted to take the edge off it. You wouldn't need to hear anything I was saying in order to reach these conclusions. The face is such an extraordinarily efficient instrument of communication that there must be rules that govern the way we interpret facial expressions. But what are those rules? And are they the same for everyone?

  In the nineteen-sixties, a young San Francisco psychologist named Paul Ekman began to study facial expression, and he discovered that no one knew the answers to those questions. Ekman went to see Margaret Mead, climbing the stairs to her tower office at the American Museum of Natural History. He had an idea. What if he travelled around the world to find out whether people from different cultures agreed on the meaning of different facial expressions? Mead, he recalls, "looked at me as if I were crazy." Like most social scientists of her day, she believed that expression was culturally determined - that we simply used our faces according to a set of learned social conventions. Charles Darwin had discussed the face in his later writings; in his 1872 book, "The Expression of the Emotions in Man and Animals," he argued that all mammals show emotion reliably in their faces. But in the nineteen-sixties academic psychologists were more interested in motivation and cognition than in emotion or its expression. Ekman was undaunted; he began travelling to places like Japan, Brazil, and Argentina, carrying photographs of men and women making a variety of distinctive faces. Everywhere he went, people agreed on what those expressions meant. But what if people in the developed world had all picked up the same cultural rules from watching the same movies and television shows? So Ekman set out again, this time making his way through the jungles of Papua New Guinea, to the most remote villages, and he found that the tribesmen there had no problem interpreting the expressions, either. This may not sound like much of a breakthrough. But in the scientific climate of the time it was a revelation. Ekman had established that expressions were the universal products of evolution. There were fundamental lessons to be learned from the face, if you knew where to look.

  Paul Ekman is now in his sixties. He is clean-shaven, with closely set eyes and thick, prominent eyebrows, and although he is of medium build, he seems much larger than he is: there is something stubborn and substantial in his demeanor. He grew up in Newark, the son of a pediatrician, and entered the University of Chicago at fifteen. He speaks deliberately: before he laughs, he pauses slightly, as if waiting for permission. He is the sort to make lists, and number his arguments. His academic writing has an orderly logic to it; by the end of an Ekman essay, each stray objection and problem has been gathered up and catalogued. In the mid-sixties, Ekman set up a lab in a ramshackle Victorian house at the University of California at San Francisco, where he holds a professorship. If the face was part of a physiological system, he reasoned, the system could be learned. He set out to teach himself. He treated the face as an adventurer would a foreign land, exploring its every crevice and contour. He assembled a videotape library of people's facial expressions, which soon filled three rooms in his lab, and studied them to the point where he could look at a face and pick up a flicker of emotion that might last no more than a fraction of a second. Ekman created the lying tests. He filmed the nurses talking about the movie they were watching and the movie they weren't watching. Working with Maureen O'Sullivan, a psychologist from the University of San Francisco, and other colleagues, he located people who had a reputation for being uncannily perceptive, and put them to the test, and that's how Yarbrough and the other high-scorers were identified. O'Sullivan and Ekman call this study of gifted face readers the Diogenes Project, after the Greek philosopher of antiquity who used to wander around Athens with a lantern, peering into people's faces as he searched for an honest man. Ekman has taken the most vaporous of sensations - the hunch you have about someone else - and sought to give them definition. Most of us don't trust our hunches, because we don't know where they came from. We think they can't be explained. But what if they can?

  3.

  Paul Ekman got his start in the face-reading business because of a man named Silvan Tomkins, and Silvan Tomkins may have been the best face reader there ever was. Tomkins was from Philadelphia, the son of a dentist from Russia. He was short, and slightly thick around the middle, with a wild mane of white hair and huge black plastic-rimmed glasses. He taught psychology at Princeton and Rutgers, and was the author of "Affect, Imagery, Consciousness," a four-volume work so dense that its readers were evenly divided between those who understood it and thought it was brilliant and those who did not understand it and thought it was brilliant. He was a legendary talker. At the end of a cocktail party, fifteen people would sit, rapt, at Tomkins's feet, and someone would say, "One more question!" and they would all sit there for another hour and a half, as Tomkins held forth on, say, comic books, a television sitcom, the biology of emotion, his problem with Kant, and his enthusiasm for the latest fad diets, all enfolded into one extended riff. During the Depression, in the midst of his doctoral studies at Harvard, he worked as a handicapper for a horse-racing syndicate, and was so successful that he lived lavishly on Manhattan's Upper East Side. At the track, where he sat in the stands for hours, staring at the horses through binoculars, he was known as the Professor. "He had a system for predicting how a horse would do based on what horse was on either side of him, based on their emotional relationship," Ekman said. If a male horse, for instance, had lost to a mare in his first or second year, he would be ruined if he went to the gate with a mare next to him in the lineup. (Or something like that - no one really knew for certain.) Tomkins felt that emotion was the code to life, and that with enough attention to particulars the code could be cracked. He thought this about the horses, and, more important, he thought this about the human face.

  Tomkins, it was said, could walk into a post office, go over to the "Wanted" posters, and, just by looking at mug shots, tell you what crimes the various fugitives had committed. "He would watch the show 'To Tell the Truth,' and without fail he could always pick the person who was lying and who his confederates were," his son, Mark, recalls. "He actually wrote the producer at one point to say it was too easy, and the man invited him to come to New York, go backstage, and show his stuff." Virginia Demos, who teaches psychology at Harvard, recalls having long conversations with Tomkins. "We would sit and talk on the phone, and he would turn the sound down as Jesse Jackson was talking to Michael Dukakis, at the Democratic National Convention. And he would read the faces and give his predictions on what would happen. It was profound."

  Ekman's most memorable encounter with Tomkins took place in the late sixties. Ekman had just tracked down a hundred thousand feet of film that had been shot by the virologist Carleton Gajdusek in the remote jungles of Papua New Guinea. Some of the footage was of a tribe called the South Fore, who were a peaceful and friendly people. The rest was of the Kukukuku, who were hostile and murderous and who had a homosexual ritual where pre-adolescent boys were required to serve as courtesans for the male elders of the tribe. Ekman was still working on the problem of whether human facial expressions were universal, and the Gajdusek film was invaluable. For six months, Ekman and his collaborator, Wallace Friesen, sorted through the footage. They cut extraneous scenes, focussing just on closeups of the faces of the tribesmen, and when the editing was finished Ekman called in Tomkins.

  The two men, protégé and mentor, sat at the back of the room, as faces flickered across the screen. Ekman had told Tomkins nothing about the tribes involved; all identifying context had been edited out. Tomkins looked on intently, peering through his glasses. At the end, he went up to the screen and pointed to the faces of the South Fore. "These are a sweet, gentle people, very indulgent, very peaceful," he said. Then he pointed to the faces of the Kukukuku. "This other group is violent, and there is lots of evidence to suggest homosexuality." Even today, a third of a century later, Ekman cannot get over what Tomkins did. "My God! I vividly remember saying, 'Silvan, how on earth are you doing that?' " Ekman recalls. "And he went up to the screen and, while we played the film backward, in slow motion, he pointed out the particular bulges and wrinkles in the face that he was using to make his judgment. That's when I realized, 'I've got to unpack the face.' It was a gold mine of information that everyone had ignored. This guy could see it, and if he could see it, maybe everyone else could, too."

  Ekman and Friesen decided that they needed to create a taxonomy of facial expressions, so day after day they sat across from each other and began to make every conceivable face they could. Soon, though, they realized that their efforts weren't enough. "I met an anthropologist, Wade Seaford, told him what I was doing, and he said, 'Do you have this movement?'" - and here Ekman contracted what's called the triangularis, which is the muscle that depresses the corners of the lips, forming an arc of distaste - "and it wasn't in my system, because I had never seen it before. I had built a system not on what the face can do but on what I had seen. I was devastated. So I came back and said, 'I've got to learn the anatomy.' " Friesen and Ekman then combed through medical textbooks that outlined each of the facial muscles, and identified every distinct muscular movement that the face could make. There were forty-three such movements. Ekman and Friesen called them "action units." Then they sat across from each other again, and began manipulating each action unit in turn, first locating the muscle in their mind and then concentrating on isolating it, watching each other closely as they did, checking their movements in a mirror, making notes of how the wrinkle patterns on their faces would change with each muscle movement, and videotaping the movement for their records. On the few occasions when they couldn't make a particular movement, they went next door to the U.C.S.F. anatomy department, where a surgeon they knew would stick them with a needle and electrically stimulate the recalcitrant muscle. "That wasn't pleasant at all," Ekman recalls. When each of those action units had been mastered, Ekman and Friesen began working action units in combination, layering one movement on top of another. The entire process took seven years. "There are three hundred combinations of two muscles," Ekman says. "If you add in a third, you get over four thousand. We took it up to five muscles, which is over ten thousand visible facial configurations." Most of those ten thousand facial expressions don't mean anything, of course. They are the kind of nonsense faces that children make. But, by working through each action-unit combination, Ekman and Friesen identified about three thousand that did seem to mean something, until they had catalogued the essential repertoire of human emotion.

  4.

  On a recent afternoon, Ekman sat in his office at U.C.S.F., in what is known as the Human Interaction Laboratory, a standard academic's lair of books and files, with photographs of his two heroes, Tomkins and Darwin, on the wall. He leaned forward slightly, placing his hands on his knees, and began running through the action-unit configurations he had learned so long ago. "Everybody can do action unit four," he began. He lowered his brow, using his depressor glabellae, depressor supercilli, and corrugator. "Almost everyone can do A.U. nine." He wrinkled his nose, using his levator labii superioris, alaeque nasi. "Everybody can do five." He contracted his levator palpebrae superioris, raising his upper eyelid.

  I was trying to follow along with him, and he looked up at me. "You've got a very good five," he said generously. "The more deeply set your eyes are, the harder it is to see the five. Then there's seven." He squinted. "Twelve." He flashed a smile, activating the zygomatic major. The inner parts of his eyebrows shot up. "That's A.U. one - distress, anguish." Then he used his frontalis, pars lateralis, to raise the outer half of his eyebrows. "That's A.U. two. It's also very hard, but it's worthless. It's not part of anything except Kabuki theatre. Twenty-three is one of my favorites. It's the narrowing of the red margin of the lips. Very reliable anger sign. It's very hard to do voluntarily." He narrowed his lips. "Moving one ear at a time is still the hardest thing to do. I have to really concentrate. It takes everything I've got." He laughed. "This is something my daughter always wanted me to do for her friends. Here we go." He wiggled his left ear, then his right ear. Ekman does not appear to have a particularly expressive face. He has the demeanor of a psychoanalyst, watchful and impassive, and his ability to transform his face so easily and quickly was astonishing. "There is one I can't do," he went on. "It's A.U. thirty-nine. Fortunately, one of my postdocs can do it. A.U. thirty-eight is dilating the nostrils. Thirty-nine is the opposite. It's the muscle that pulls them down." He shook his head and looked at me again. "Oooh! You've got a fantastic thirty-nine. That's one of the best I've ever seen. It's genetic. There should be other members of your family who have this heretofore unknown talent. You've got it, you've got it." He laughed again. "You're in a position to flash it at people. See, you should try that in a singles bar!"

  Ekman then began to layer one action unit on top of another, in order to compose the more complicated facial expressions that we generally recognize as emotions. Happiness, for instance, is essentially A.U. six and twelve - contracting the muscles that raise the cheek (orbicularis oculi, pars orbitalis) in combination with the zygomatic major, which pulls up the corners of the lips. Fear is A.U. one, two and four, or, more fully, one, two, four, five, and twenty, with or without action units twenty-five, twenty-six, or twenty-seven. That is: the inner brow raiser (frontalis, pars medialis) plus the outer brow raiser (frontalis, pars lateralis) plus the brow-lowering depressor supercilli plus the levator palpebrae superioris (which raises the upper lid), plus the risorius (which stretches the lips), the parting of the lips (depressor labii), and the masseter (which drops the jaw). Disgust? That's mostly A.U. nine, the wrinkling of the nose (levator labii superioris, alaeque nasi), but it can sometimes be ten, and in either case may be combined with A.U. fifteen or sixteen or seventeen.

  Ekman and Friesen ultimately assembled all these combinations - and the rules for reading and interpreting them - into the Facial Action Coding System, or FACS, and wrote them up in a five-hundred-page binder. It is a strangely riveting document, full of details like the possible movements of the lips (elongate, de-elongate, narrow, widen, flatten, protrude, tighten and stretch); the four different changes of the skin between the eyes and the cheeks (bulges, bags, pouches, and lines); or the critical distinctions between infraorbital furrows and the nasolabial furrow. Researchers have employed the system to study everything from schizophrenia to heart disease; it has even been put to use by computer animators at Pixar ("Toy Story") and at DreamWorks ("Shrek"). FACS takes weeks to master in its entirety, and only five hundred people around the world have been certified to use it in research. But for those who have, the experience of looking at others is forever changed. They learn to read the face the way that people like John Yarbrough did intuitively. Ekman compares it to the way you start to hear a symphony once you've been trained to read music: an experience that used to wash over you becomes particularized and nuanced.

  Ekman recalls the first time he saw Bill Clinton, during the 1992 Democratic primaries. "I was watching his facial expressions, and I said to my wife, 'This is Peck's Bad Boy,' " Ekman says. "This is a guy who wants to be caught with his hand in the cookie jar, and have us love him for it anyway. There was this expression that's one of his favorites. It's that hand-in-the-cookie-jar, love-me-Mommy-because-I'm-a-rascal look. It's A.U. twelve, fifteen, seventeen, and twenty-four, with an eye roll." Ekman paused, then reconstructed that particular sequence of expressions on his face. He contracted his zygomatic major, A.U. twelve, in a classic smile, then tugged the corners of his lips down with his triangularis, A.U. fifteen. He flexed the mentalis, A.U. seventeen, which raises the chin, slightly pressed his lips together in A.U. twenty-four, and finally rolled his eyes - and it was as if Slick Willie himself were suddenly in the room. "I knew someone who was on his communications staff. So I contacted him. I said, 'Look, Clinton's got this way of rolling his eyes along with a certain expression, and what it conveys is "I'm a bad boy." I don't think it's a good thing. I could teach him how not to do that in two to three hours.' And he said, 'Well, we can't take the risk that he's known to be seeing an expert on lying.' I think it's a great tragedy, because . . ." Ekman's voice trailed off. It was clear that he rather liked Clinton, and that he wanted Clinton's trademark expression to have been no more than a meaningless facial tic. Ekman shrugged. "Unfortunately, I guess, he needed to get caught - and he got caught."

  5.

  Early in his career, Paul Ekman filmed forty psychiatric patients, including a woman named Mary, a forty-two-year-old housewife. She had attempted suicide three times, and survived the last attempt - an overdose of pills - only because someone found her in time and rushed her to the hospital. Her children had left home and her husband was inattentive, and she was depressed. When she first went to the hospital, she simply sat and cried, but she seemed to respond well to therapy. After three weeks, she told her doctor that she was feeling much better and wanted a weekend pass to see her family. The doctor agreed, but just before Mary was to leave the hospital she confessed that the real reason she wanted to go on weekend leave was so that she could make another suicide attempt. Several years later, a group of young psychiatrists asked Ekman how they could tell when suicidal patients were lying. He didn't know, but, remembering Mary, he decided to try to find out. If the face really was a reliable guide to emotion, shouldn't he be able to look back on the film and tell that she was lying? Ekman and Friesen began to analyze the film for clues. They played it over and over for dozens of hours, examining in slow motion every gesture and expression. Finally, they saw it. As Mary's doctor asked her about her plans for the future, a look of utter despair flashed across her face so quickly that it was almost imperceptible.

  Ekman calls that kind of fleeting look a "microexpression," and one cannot understand why John Yarbrough did what he did on that night in South Central without also understanding the particular role and significance of microexpressions. Many facial expressions can be made voluntarily. If I'm trying to look stern as I give you a tongue-lashing, I'll have no difficulty doing so, and you'll have no difficulty interpreting my glare. But our faces are also governed by a separate, involuntary system. We know this because stroke victims who suffer damage to what is known as the pyramidal neural system will laugh at a joke, but they cannot smile if you ask them to. At the same time, patients with damage to another part of the brain have the opposite problem. They can smile on demand, but if you tell them a joke they can't laugh. Similarly, few of us can voluntarily do A.U. one, the sadness sign. (A notable exception, Ekman points out, is Woody Allen, who uses his frontalis, pars medialis, to create his trademark look of comic distress.) Yet we raise our inner eyebrows all the time, without thinking, when we are unhappy. Watch a baby just as he or she starts to cry, and you'll often see the frontalis, pars medialis, shoot up, as if it were on a string.

  Perhaps the most famous involuntary expression is what Ekman has dubbed the Duchenne smile, in honor of the nineteenth-century French neurologist Guillaume Duchenne, who first attempted to document the workings of the muscles of the face with the camera. If I ask you to smile, you'll flex your zygomatic major. By contrast, if you smile spontaneously, in the presence of genuine emotion, you'll not only flex your zygomatic but also tighten the orbicularis oculi, pars orbitalis, which is the muscle that encircles the eye. It is almost impossible to tighten the orbicularis oculi, pars orbitalis, on demand, and it is equally difficult to stop it from tightening when we smile at something genuinely pleasurable. This kind of smile "does not obey the will," Duchenne wrote. "Its absence unmasks the false friend." When we experience a basic emotion, a corresponding message is automatically sent to the muscles of the face. That message may linger on the face for just a fraction of a second, or be detectable only if you attached electrical sensors to the face, but it's always there. Silvan Tomkins once began a lecture by bellowing, "The face is like the penis!" and this is what he meant - that the face has, to a large extent, a mind of its own. This doesn't mean we have no control over our faces. We can use our voluntary muscular system to try to suppress those involuntary responses. But, often, some little part of that suppressed emotion - the sense that I'm really unhappy, even though I deny it - leaks out. Our voluntary expressive system is the way we intentionally signal our emotions. But our involuntary expressive system is in many ways even more important: it is the way we have been equipped by evolution to signal our authentic feelings.

  "You must have had the experience where somebody comments on your expression and you didn't know you were making it,"Ekman says. "Somebody tells you, "What are you getting upset about?' "Why are you smirking?' You can hear your voice, but you can't see your face. If we knew what was on our face, we would be better at concealing it. But that wouldn't necessarily be a good thing. Imagine if there were a switch that all of us had, to turn off the expressions on our face at will. If babies had that switch, we wouldn't know what they were feeling. They' d be in trouble. You could make an argument, if you wanted to, that the system evolved so that parents would be able to take care of kids. Or imagine if you were married to someone with a switch? It would be impossible. I don't think mating and infatuation and friendships and closeness would occur if our faces didn't work that way."

  Ekman slipped a tape taken from the O.J. Simpson trial into the VCR. It was of Kato Kaelin, Simpson's shaggy-haired house guest, being examined by Marcia Clark, one of the prosecutors in the case. Kaelin sits in the witness box, with his trademark vacant look. Clark asks a hostile question. Kaelin leans forward and answers softly. "Did you see that?" Ekman asked me. I saw nothing, just Kato being Kato - harmless and passive. Ekman stopped the tape, rewound it, and played it back in slow motion. On the screen, Kaelin moved forward to answer the question, and in that fraction of a second his face was utterly transformed. His nose wrinkled, as he flexed his levator labii superioris, alaeque nasi. His teeth were bared, his brows lowered. "It was almost totally A.U. nine," Ekman said. "It's disgust, with anger there as well, and the clue to that is that when your eyebrows go down, typically your eyes are not as open as they are here. The raised upper eyelid is a component of anger, not disgust. It's very quick." Ekman stopped the tape and played it again, peering at the screen. "You know, he looks like a snarling dog."

  Ekman said that there was nothing magical about his ability to pick up an emotion that fleeting. It was simply a matter of practice. "I could show you forty examples, and you could pick it up. I have a training tape, and people love it. They start it, and they can't see any of these expressions. Thirty-five minutes later, they can see them all. What that says is that this is an accessible skill."

  Ekman showed another clip, this one from a press conference given by Kim Philby in 1955. Philby had not yet been revealed as a Soviet spy, but two of his colleagues, Donald Maclean and Guy Burgess, had just defected to the Soviet Union. Philby is wearing a dark suit and a white shirt. His hair is straight and parted to the left. His face has the hauteur of privilege.

  "Mr. Philby," he is asked. "Mr. Macmillan, the foreign secretary, said there was no evidence that you were the so-called third man who allegedly tipped off Burgess and Maclean. Are you satisfied with that clearance that he gave you?"

  Philby answers confidently, in the plummy tones of the English upper class. "Yes, I am."

  "Well, if there was a third man, were you in fact the third man?"

  "No," Philby says, just as forcefully. "I was not."

  Ekman rewound the tape, and replayed it in slow motion. "Look at this," he said, pointing to the screen. "Twice, after being asked serious questions about whether he's committed treason, he's going to smirk. He looks like the cat who ate the canary." The expression was too brief to see normally. But at quarter speed it was painted on his face - the lips pressed together in a look of pure smugness. "He's enjoying himself, isn't he?" Ekman went on. "I call this - duping delight - the thrill you get from fooling other people." Ekman started the VCR up again. "There's another thing he does." On the screen, Philby was answering another question. "In the second place, the Burgess-Maclean affair has raised issues of great" - he pauses - "delicacy." Ekman went back to the pause, and froze the tape. "Here it is," he said. "A very subtle microexpression of distress or unhappiness. It's only in the eyebrows - in fact, just in one eyebrow." Sure enough, Philby's right inner eyebrow was raised in an unmistakable A.U. one. "It's very brief," Ekman said. "He's not doing it voluntarily. And it totally contradicts all his confidence and assertiveness. It comes when he's talking about Burgess and Maclean, whom he had tipped off. It's a hot spot that suggests, 'You shouldn't trust what you hear.' "

  A decade ago, Ekman joined forces with J. J. Newberry - the ex-A.T.F. agent who is one of the high-scorers in the Diogenes Project - to put together a program for educating law-enforcement officials around the world in the techniques of interviewing and lie detection. In recent months, they have flown to Washington, D.C., to assist the C.I.A. and the F.B.I. in counter-terrorism training. At the same time, the Defense Advanced Research Projects Agency (DARPA) has asked Ekman and his former student Mark Frank, now at Rutgers, to develop experimental scenarios for studying deception that would be relevant to counter-terrorism. The objective is to teach people to look for discrepancies between what is said and what is signalled - to pick up on the difference between Philby's crisp denials and his fleeting anguish. It's a completely different approach from the shouting cop we see on TV and in the movies, who threatens the suspect and sweeps all of the papers and coffee cups off the battered desk. The Hollywood interrogation is an exercise in intimidation, and its point is to force the suspect to tell you what you need to know. It does not take much to see the limitations of this strategy. It depends for its success on the coöperation of the suspect - when, of course, the suspect's involuntary communication may be just as critical. And it privileges the voice over the face, when the voice and the face are equally significant channels in the same system.

  Ekman received his most memorable lesson in this truth when he and Friesen first began working on expressions of anger and distress. "It was weeks before one of us finally admitted feeling terrible after a session where we'd been making one of those faces all day," Friesen says. "Then the other realized that he'd been feeling poorly, too, so we began to keep track." They then went back and began monitoring their bodies during particular facial movements. "Say you do A.U. one, raising the inner eyebrows, and six, raising the cheeks, and fifteen, the lowering of the corner of the lips," Ekman said, and then did all three. "What we discovered is that that expression alone is sufficient to create marked changes in the autonomic nervous system. When this first occurred, we were stunned. We weren't expecting this at all. And it happened to both of us. We felt terrible. What we were generating was sadness, anguish. And when I lower my brows, which is four, and raise the upper eyelid, which is five, and narrow the eyelids, which is seven, and press the lips together, which is twenty-four, I'm generating anger. My heartbeat will go up ten to twelve beats. My hands will get hot. As I do it, I can't disconnect from the system. It's very unpleasant, very unpleasant."

  Ekman, Friesen, and another colleague, Robert Levenson, who teaches at Berkeley, published a study of this effect in Science. They monitored the bodily indices of anger, sadness, and fear - heart rate and body temperature - in two groups. The first group was instructed to remember and relive a particularly stressful experience. The other was told to simply produce a series of facial movements, as instructed by Ekman - to "assume the position," as they say in acting class. The second group, the people who were pretending, showed the same physiological responses as the first. A few years later, a German team of psychologists published a similar study. They had a group of subjects look at cartoons, either while holding a pen between their lips - an action that made it impossible to contract either of the two major smiling muscles, the risorius and the zygomatic major - or while holding a pen clenched between their teeth, which had the opposite effect and forced them to smile. The people with the pen between their teeth found the cartoons much funnier. Emotion doesn't just go from the inside out. It goes from the outside in. What's more, neither the subjects "assuming the position" nor the people with pens in their teeth knew they were making expressions of emotion. In the facial-feedback system, an expression you do not even know that you have can create an emotion you did not choose to feel.

  It is hard to talk to anyone who knows FACS without this point coming up again and again. Face-reading depends not just on seeing facial expressions but also on taking them seriously. One reason most of us - like the TV cop - do not closely attend to the face is that we view its evidence as secondary, as an adjunct to what we believe to be real emotion. But there's nothing secondary about the face, and surely this realization is what set John Yarbrough apart on the night that the boy in the sports car came at him with a gun. It's not just that he saw a microexpression that the rest of us would have missed. It's that he took what he saw so seriously that he was able to overcome every self-protective instinct in his body, and hold his fire.

  6.

  Yarbrough has a friend in the L.A. County Sheriff's Department, Sergeant Bob Harms, who works in narcotics in Palmdale. Harms is a member of the Diogenes Project as well, but the two men come across very differently. Harms is bigger than Yarbrough, taller and broader in the chest, with soft brown eyes and dark, thick hair. Yarbrough is restoring a Corvette and wears Rush Limbaugh ties, and he says that if he hadn't been a cop he would have liked to stay in the Marines. Harms came out of college wanting to be a commercial artist; now he plans to open a bed-and-breakfast in Vermont with his wife when he retires. On the day we met, Harms was wearing a pair of jean shorts and a short-sleeved patterned shirt. His badge was hidden inside his shirt. He takes notes not on a yellow legal pad, which he considers unnecessarily intimidating to witnesses, but on a powder-blue one. "I always get teased because I'm the touchy-feely one," Harms said. "John Yarbrough is very analytical. He thinks before he speaks. There is a lot going on inside his head. He's constantly thinking four or five steps ahead, then formulating whatever his answers are going to be. That's not how I do my interviews. I have a conversation. It's not 'Where were you on Friday night?' Because that's the way we normally communicate. I never say, 'I'm Sergeant Harms.' I always start by saying, 'I'm Bob Harms, and I'm here to talk to you about your case,' and the first thing I do is smile."

  The sensation of talking to the two men, however, is surprisingly similar. Normal conversation is like a game of tennis: you talk and I listen, you listen and I talk, and we feel scrutinized by our conversational partner only when the ball is in our court. But Yarbrough and Harms never stop watching, even when they're doing the talking. Yarbrough would comment on my conversational style, noting where I held my hands as I talked, or how long I would wait out a lull in the conversation. At one point, he stood up and soundlessly moved to the door - which he could have seen only in his peripheral vision - opening it just before a visitor rang the doorbell. Harms gave the impression that he was deeply interested in me. It wasn't empathy. It was a kind of powerful curiosity. "I remember once, when I was in prison custody, I used to shake prisoners' hands," Harms said. "The deputies thought I was crazy. But I wanted to see what happened, because that's what these men are starving for, some dignity and respect."

  Some of what sets Yarbrough and Harms and the other face readers apart is no doubt innate. But the fact that people can be taught so easily to recognize microexpressions, and can learn FACS, suggests that we all have at least the potential capacity for this kind of perception. Among those who do very well at face-reading, tellingly, are some aphasics, such as stroke victims who have lost the ability to understand language. Collaborating with Ekman on a paper that was recently published in Nature, the psychologist Nancy Etcoff, of Massachusetts General Hospital, described how a group of aphasics trounced a group of undergraduates at M.I.T. on the nurses tape. Robbed of the power to understand speech, the stroke victims had apparently been forced to become far more sensitive to the information written on people's faces. "They are compensating for the loss in one channel through these other channels," Etcoff says. "We could hypothesize that there is some kind of rewiring in the brain, but I don't think we need that explanation. They simply exercise these skills much more than we do." Ekman has also done work showing that some abused children are particularly good at reading faces as well: like the aphasics in the study, they developed "interpretive strategies" - in their case, so they could predict the behavior of their volatile parents.

  What appears to be a kind of magical, effortless intuition about faces, then, may not really be effortless and magical at all. This kind of intuition is a product of desire and effort. Silvan Tomkins took a sabbatical from Princeton when his son Mark was born, and stayed in his house on the Jersey Shore, staring into his son's face, long and hard, picking up the patterns of emotion - the cycles of interest, joy, sadness, and anger - that flash across an infant's face in the first few months of life. He taught himself the logic of the furrows and the wrinkles and the creases, the subtle differences between the pre-smile and the pre-cry face. Later, he put together a library of thousands of photographs of human faces, in every conceivable expression. He developed something called the Picture Arrangement Test, which was his version of the Rorschach blot: a patient would look at a series of pictures and be asked to arrange them in a sequence and then tell a story based on what he saw. The psychologist was supposed to interpret the meaning of the story, but Tomkins would watch a videotape of the patient with the sound off, and by studying the expressions on the patient's face teach himself to predict what the story was. Face-reading, for those who have mastered it, becomes a kind of compulsion; it becomes hard to be satisfied with the level and quality of information that most of us glean from normal social encounters. "Whenever we get together," Harms says of spending time with other face readers, "we debrief each other. We're constantly talking about cases, or some of these videotapes of Ekman's, and we say, 'I missed that, did you get that?' Maybe there's an emotion attached there. We're always trying to place things, and replaying interviews in our head."

  This is surely why the majority of us don't do well at reading faces: we feel no need to make that extra effort. People fail at the nurses tape, Ekman says, because they end up just listening to the words. That's why, when Tomkins was starting out in his quest to understand the face, he always watched television with the sound turned off. "We are such creatures of language that what we hear takes precedence over what is supposed to be our primary channel of communication, the visual channel," he once said. "Even though the visual channel provides such enormous information, the fact is that the voice preëmpts the individual's attention, so that he cannot really see the face while he listens." We prefer that way of dealing with the world because it does not challenge the ordinary boundaries of human relationships. Ekman, in one of his essays, writes of what he learned from the legendary sociologist Erving Goffman. Goffman said that part of what it means to be civilized is not to "steal" information that is not freely given to us. When someone picks his nose or cleans his ears, out of unthinking habit, we look away. Ekman writes that for Goffman the spoken word is "the acknowledged information, the information for which the person who states it is willing to take responsibility," and he goes on:

  When the secretary who is miserable about a fight with her husband the previous night answers, "Just fine," when her boss asks, "How are you this morning?" - that false message may be the one relevant to the boss's interactions with her. It tells him that she is going to do her job. The true message - that she is miserable - he may not care to know about at all as long as she does not intend to let it impair her job performance.

  What would the boss gain by reading the subtle and contradictory microexpressions on his secretary's face? It would be an invasion of her privacy and an act of disrespect. More than that, it would entail an obligation. He would be obliged to do something, or say something, or feel something that might otherwise be avoided entirely. To see what is intended to be hidden, or, at least, what is usually missed, opens up a world of uncomfortable possibilities. This is the hard part of being a face reader. People like that have more faith in their hunches than the rest of us do. But faith is not certainty. Sometimes, on a routine traffic stop late at night, you end up finding out that your hunch was right. But at other times you'll never know. And you can't even explain it properly, because what can you say? You did something the rest of us would never have done, based on something the rest of us would never have seen.

  "I was working in West Hollywood once, in the nineteen-eighties," Harms said. "I was with a partner, Scott. I was driving. I had just recently come off the prostitution team, and we spotted a man in drag. He was on Sunset, and I didn't recognize him. At that time, Sunset was normally for females. So it was kind of odd. It was a cold night in January. There was an all-night restaurant on Sunset called Ben Franks, so I asked my partner to roll down the window and ask the guy if he was going to Ben Franks - just to get a reaction. And the guy immediately keys on Scott, and he's got an overcoat on, and he's all bundled up, and he starts walking over to the car. It had been raining so much that the sewers in West Hollywood had backed up, and one of the manhole covers had been cordoned off because it was pumping out water. The guy comes over to the squad car, and he's walking right through that. He's fixated on Scott. So we asked him what he was doing. He says, "I was out for a walk.' And then he says, "I have something to show you.'"

  Later, after the incident was over, Harms and his partner learned that the man had been going around Hollywood making serious threats, that he was unstable and had just attempted suicide, that he was in all likelihood about to erupt. A departmental inquiry into the incident would affirm that Harms and his partner had been in danger: the man was armed with a makeshift flamethrower, and what he had in mind, evidently, was to turn the inside of the squad car into an inferno. But at the time all Harms had was a hunch, a sense from the situation and the man's behavior and what he glimpsed inside the man's coat and on the man's face - something that was the opposite of whatever John Yarbrough saw in the face of the boy in Willowbrook. Harms pulled out his gun and shot the man through the open window. "Scott looked at me and was, like, 'What did you do?' because he didn't perceive any danger," Harms said. "But I did."

  Are smart people overrated?

  1.

  Five years ago, several executives at McKinsey & Company, America's largest and most prestigious management-consulting firm, launched what they called the War for Talent. Thousands of questionnaires were sent to managers across the country. Eighteen companies were singled out for special attention, and the consultants spent up to three days at each firm, interviewing everyone from the C.E.O. down to the human-resources staff. McKinsey wanted to document how the top-performing companies in America differed from other firms in the way they handle matters like hiring and promotion. But, as the consultants sifted through the piles of reports and questionnaires and interview transcripts, they grew convinced that the difference between winners and losers was more profound than they had realized. "We looked at one another and suddenly the light bulb blinked on," the three consultants who headed the project - Ed Michaels, Helen Handfield-Jones, and Beth Axelrod - write in their new book, also called "The War for Talent." The very best companies, they concluded, had leaders who were obsessed with the talent issue. They recruited ceaselessly, finding and hiring as many top performers as possible. They singled out and segregated their stars, rewarding them disproportionately, and pushing them into ever more senior positions. "Bet on the natural athletes, the ones with the strongest intrinsic skills," the authors approvingly quote one senior General Electric executive as saying. "Don't be afraid to promote stars without specifically relevant experience, seemingly over their heads." Success in the modern economy, according to Michaels, Handfield-Jones, and Axelrod, requires "the talent mind-set": the "deep-seated belief that having better talent at all levels is how you outperform your competitors."

  This "talent mind-set" is the new orthodoxy of American management. It is the intellectual justification for why such a high premium is placed on degrees from first-tier business schools, and why the compensation packages for top executives have become so lavish. In the modern corporation, the system is considered only as strong as its stars, and, in the past few years, this message has been preached by consultants and management gurus all over the world. None, however, have spread the word quite so ardently as McKinsey, and, of all its clients, one firm took the talent mind-set closest to heart. It was a company where McKinsey conducted twenty separate projects, where McKinsey's billings topped ten million dollars a year, where a McKinsey director regularly attended board meetings, and where the C.E.O. himself was a former McKinsey partner. The company, of course, was Enron.

  The Enron scandal is now almost a year old. The reputations of Jeffrey Skilling and Kenneth Lay, the company's two top executives, have been destroyed. Arthur Andersen, Enron's auditor, has been driven out of business, and now investigators have turned their attention to Enron's investment bankers. The one Enron partner that has escaped largely unscathed is McKinsey, which is odd, given that it essentially created the blueprint for the Enron culture. Enron was the ultimate "talent" company. When Skilling started the corporate division known as Enron Capital and Trade, in 1990, he "decided to bring in a steady stream of the very best college and M.B.A. graduates he could find to stock the company with talent," Michaels, Handfield-Jones, and Axelrod tell us. During the nineties, Enron was bringing in two hundred and fifty newly minted M.B.A.s a year. "We had these things called Super Saturdays," one former Enron manager recalls. "I'd interview some of these guys who were fresh out of Harvard, and these kids could blow me out of the water. They knew things I'd never heard of." Once at Enron, the top performers were rewarded inordinately, and promoted without regard for seniority or experience. Enron was a star system. "The only thing that differentiates Enron from our competitors is our people, our talent," Lay, Enron's former chairman and C.E.O., told the McKinsey consultants when they came to the company's headquarters, in Houston. Or, as another senior Enron executive put it to Richard Foster, a McKinsey partner who celebrated Enron in his 2001 book, "Creative Destruction," "We hire very smart people and we pay them more than they think they are worth."

  The management of Enron, in other words, did exactly what the consultants at McKinsey said that companies ought to do in order to succeed in the modern economy. It hired and rewarded the very best and the very brightest - and it is now in bankruptcy. The reasons for its collapse are complex, needless to say. But what if Enron failed not in spite of its talent mind-set but because of it? What if smart people are overrated?

  2.

  At the heart of the McKinsey vision is a process that the War for Talent advocates refer to as "differentiation and affirmation." Employers, they argue, need to sit down once or twice a year and hold a "candid, probing, no-holds-barred debate about each individual," sorting employees into A, B, and C groups. The A's must be challenged and disproportionately rewarded. The B's need to be encouraged and affirmed. The C's need to shape up or be shipped out. Enron followed this advice almost to the letter, setting up internal Performance Review Committees. The members got together twice a year, and graded each person in their section on ten separate criteria, using a scale of one to five. The process was called "rank and yank." Those graded at the top of their unit received bonuses two-thirds higher than those in the next thirty per cent; those who ranked at the bottom received no bonuses and no extra stock options - and in some cases were pushed out.

  How should that ranking be done? Unfortunately, the McKinsey consultants spend very little time discussing the matter. One possibility is simply to hire and reward the smartest people. But the link between, say, I.Q. and job performance is distinctly underwhelming. On a scale where 0.1 or below means virtually no correlation and 0.7 or above implies a strong correlation (your height, for example, has a 0.7 correlation with your parents' height), the correlation between I.Q. and occupational success is between 0.2 and 0.3. "What I.Q. doesn't pick up is effectiveness at common-sense sorts of things, especially working with people," Richard Wagner, a psychologist at Florida State University, says. "In terms of how we evaluate schooling, everything is about working by yourself. If you work with someone else, it's called cheating. Once you get out in the real world, everything you do involves working with other people."

  Wagner and Robert Sternberg, a psychologist at Yale University, have developed tests of this practical component, which they call "tacit knowledge." Tacit knowledge involves things like knowing how to manage yourself and others, and how to navigate complicated social situations. Here is a question from one of their tests:

You have just been promoted to head of an important department in your organization. The previous head has been transferred to an equivalent position in a less important department. Your understanding of the reason for the move is that the performance of the department as a whole has been mediocre. There have not been any glaring deficiencies, just a perception of the department as so-so rather than very good. Your charge is to shape up the department. Results are expected quickly. Rate the quality of the following strategies for succeeding at your new position.

a) Always delegate to the most junior person who can be trusted with the task.
b) Give your superiors frequent progress reports.
c) Announce a major reorganization of the department that includes getting rid of whomever you believe to be "dead wood."
d) Concentrate more on your people than on the tasks to be done.
e) Make people feel completely responsible for their work.

  Wagner finds that how well people do on a test like this predicts how well they will do in the workplace: good managers pick (b) and (e); bad managers tend to pick (c). Yet there's no clear connection between such tacit knowledge and other forms of knowledge and experience. The process of assessing ability in the workplace is a lot messier than it appears.

  An employer really wants to assess not potential but performance. Yet that's just as tricky. In "The War for Talent," the authors talk about how the Royal Air Force used the A, B, and C ranking system for its pilots during the Battle of Britain. But ranking fighter pilots - for whom there are a limited and relatively objective set of performance criteria (enemy kills, for example, and the ability to get their formations safely home) - is a lot easier than assessing how the manager of a new unit is doing at, say, marketing or business development. And whom do you ask to rate the manager's performance? Studies show that there is very little correlation between how someone's peers rate him and how his boss rates him. The only rigorous way to assess performance, according to human-resources specialists, is to use criteria that are as specific as possible. Managers are supposed to take detailed notes on their employees throughout the year, in order to remove subjective personal reactions from the process of assessment. You can grade someone's performance only if you know their performance. And, in the freewheeling culture of Enron, this was all but impossible. People deemed "talented" were constantly being pushed into new jobs and given new challenges. Annual turnover from promotions was close to twenty per cent. Lynda Clemmons, the so-called "weather babe" who started Enron's weather derivatives business, jumped, in seven quick years, from trader to associate to manager to director and, finally, to head of her own business unit. How do you evaluate someone's performance in a system where no one is in a job long enough to allow such evaluation?

  The answer is that you end up doing performance evaluations that aren't based on performance. Among the many glowing books about Enron written before its fall was the best-seller "Leading the Revolution," by the management consultant Gary Hamel, which tells the story of Lou Pai, who launched Enron's power-trading business. Pai's group began with a disaster: it lost tens of millions of dollars trying to sell electricity to residential consumers in newly deregulated markets. The problem, Hamel explains, is that the markets weren't truly deregulated: "The states that were opening their markets to competition were still setting rules designed to give their traditional utilities big advantages." It doesn't seem to have occurred to anyone that Pai ought to have looked into those rules more carefully before risking millions of dollars. He was promptly given the chance to build the commercial electricity-outsourcing business, where he ran up several more years of heavy losses before cashing out of Enron last year with two hundred and seventy million dollars. Because Pai had "talent," he was given new opportunities, and when he failed at those new opportunities he was given still more opportunities . . . because he had "talent." "At Enron, failure - even of the type that ends up on the front page of the Wall Street Journal - doesn't necessarily sink a career," Hamel writes, as if that were a good thing. Presumably, companies that want to encourage risk-taking must be willing to tolerate mistakes. Yet if talent is defined as something separate from an employee's actual performance, what use is it, exactly?

  3.

  What the War for Talent amounts to is an argument for indulging A employees, for fawning over them. "You need to do everything you can to keep them engaged and satisfied - even delighted," Michaels, Handfield-Jones, and Axelrod write. "Find out what they would most like to be doing, and shape their career and responsibilities in that direction. Solve any issues that might be pushing them out the door, such as a boss that frustrates them or travel demands that burden them." No company was better at this than Enron. In one oft-told story, Louise Kitchin, a twenty-nine-year-old gas trader in Europe, became convinced that the company ought to develop an online-trading business. She told her boss, and she began working in her spare time on the project, until she had two hundred and fifty people throughout Enron helping her. After six months, Skilling was finally informed. "I was never asked for any capital," Skilling said later. "I was never asked for any people. They had already purchased the servers. They had already started ripping apart the building. They had started legal reviews in twenty-two countries by the time I heard about it." It was, Skilling went on approvingly, "exactly the kind of behavior that will continue to drive this company forward."

  Kitchin's qualification for running EnronOnline, it should be pointed out, was not that she was good at it. It was that she wanted to do it, and Enron was a place where stars did whatever they wanted. "Fluid movement is absolutely necessary in our company. And the type of people we hire enforces that," Skilling told the team from McKinsey. "Not only does this system help the excitement level for each manager, it shapes Enron's business in the direction that its managers find most exciting." Here is Skilling again: "If lots of [employees] are flocking to a new business unit, that's a good sign that the opportunity is a good one. . . . If a business unit can't attract people very easily, that's a good sign that it's a business Enron shouldn't be in." You might expect a C.E.O. to say that if a business unit can't attract customers very easily that's a good sign it's a business the company shouldn't be in. A company's business is supposed to be shaped in the direction that its managers find most profitable. But at Enron the needs of the customers and the shareholders were secondary to the needs of its stars.

  A dozen years ago, the psychologists Robert Hogan, Robert Raskin, and Dan Fazzini wrote a brilliant essay called "The Dark Side of Charisma." It argued that flawed managers fall into three types. One is the High Likability Floater, who rises effortlessly in an organization because he never takes any difficult decisions or makes any enemies. Another is the Homme de Ressentiment, who seethes below the surface and plots against his enemies. The most interesting of the three is the Narcissist, whose energy and self-confidence and charm lead him inexorably up the corporate ladder. Narcissists are terrible managers. They resist accepting suggestions, thinking it will make them appear weak, and they don't believe that others have anything useful to tell them. "Narcissists are biased to take more credit for success than is legitimate," Hogan and his co-authors write, and "biased to avoid acknowledging responsibility for their failures and shortcomings for the same reasons that they claim more success than is their due." Moreover:

Narcissists typically make judgments with greater confidence than other people . . . and, because their judgments are rendered with such conviction, other people tend to believe them and the narcissists become disproportionately more influential in group situations. Finally, because of their self-confidence and strong need for recognition, narcissists tend to "self-nominate"; consequently, when a leadership gap appears in a group or organization, the narcissists rush to fill it.

  Tyco Corporation and WorldCom were the Greedy Corporations: they were purely interested in short-term financial gain. Enron was the Narcissistic Corporation - a company that took more credit for success than was legitimate, that did not acknowledge responsibility for its failures, that shrewdly sold the rest of us on its genius, and that substituted self-nomination for disciplined management. At one point in "Leading the Revolution," Hamel tracks down a senior Enron executive, and what he breathlessly recounts - the braggadocio, the self-satisfaction - could be an epitaph for the talent mind-set:

"You cannot control the atoms within a nuclear fusion reaction," said Ken Rice when he was head of Enron Capital and Trade Resources (ECT), America's largest marketer of natural gas and largest buyer and seller of electricity. Adorned in a black T-shirt, blue jeans, and cowboy boots, Rice drew a box on an office whiteboard that pictured his business unit as a nuclear reactor. Little circles in the box represented its "contract originators," the gunslingers charged with doing deals and creating new businesses. Attached to each circle was an arrow. In Rice's diagram the arrows were pointing in all different directions. "We allow people to go in whichever direction that they want to go."

  The distinction between the Greedy Corporation and the Narcissistic Corporation matters, because the way we conceive our attainments helps determine how we behave. Carol Dweck, a psychologist at Columbia University, has found that people generally hold one of two fairly firm beliefs about their intelligence: they consider it either a fixed trait or something that is malleable and can be developed over time. Five years ago, Dweck did a study at the University of Hong Kong, where all classes are conducted in English. She and her colleagues approached a large group of social-sciences students, told them their English-proficiency scores, and asked them if they wanted to take a course to improve their language skills. One would expect all those who scored poorly to sign up for the remedial course. The University of Hong Kong is a demanding institution, and it is hard to do well in the social sciences without strong English skills. Curiously, however, only the ones who believed in malleable intelligence expressed interest in the class. The students who believed that their intelligence was a fixed trait were so concerned about appearing to be deficient that they preferred to stay home. "Students who hold a fixed view of their intelligence care so much about looking smart that they act dumb," Dweck writes, "for what could be dumber than giving up a chance to learn something that is essential for your own success?"

  In a similar experiment, Dweck gave a class of preadolescent students a test filled with challenging problems. After they were finished, one group was praised for its effort and another group was praised for its intelligence. Those praised for their intelligence were reluctant to tackle difficult tasks, and their performance on subsequent tests soon began to suffer. Then Dweck asked the children to write a letter to students at another school, describing their experience in the study. She discovered something remarkable: forty per cent of those students who were praised for their intelligence lied about how they had scored on the test, adjusting their grade upward. They weren't naturally deceptive people, and they weren't any less intelligent or self-confident than anyone else. They simply did what people do when they are immersed in an environment that celebrates them solely for their innate "talent." They begin to define themselves by that description, and when times get tough and that self-image is threatened they have difficulty with the consequences. They will not take the remedial course. They will not stand up to investors and the public and admit that they were wrong. They'd sooner lie.

  4.

  The broader failing of McKinsey and its acolytes at Enron is their assumption that an organization's intelligence is simply a function of the intelligence of its employees. They believe in stars, because they don't believe in systems. In a way, that's understandable, because our lives are so obviously enriched by individual brilliance. Groups don't write great novels, and a committee didn't come up with the theory of relativity. But companies work by different rules. They don't just create; they execute and compete and coördinate the efforts of many different people, and the organizations that are most successful at that task are the ones where the system is the star.

  There is a wonderful example of this in the story of the so-called Eastern Pearl Harbor, of the Second World War. During the first nine months of 1942, the United States Navy suffered a catastrophe. German U-boats, operating just off the Atlantic coast and in the Caribbean, were sinking our merchant ships almost at will. U-boat captains marvelled at their good fortune. "Before this sea of light, against this footlight glare of a carefree new world were passing the silhouettes of ships recognizable in every detail and sharp as the outlines in a sales catalogue," one U-boat commander wrote. "All we had to do was press the button."

  What made this such a puzzle is that, on the other side of the Atlantic, the British had much less trouble defending their ships against U-boat attacks. The British, furthermore, eagerly passed on to the Americans everything they knew about sonar and depth-charge throwers and the construction of destroyers. And still the Germans managed to paralyze America's coastal zones.

  You can imagine what the consultants at McKinsey would have concluded: they would have said that the Navy did not have a talent mind-set, that President Roosevelt needed to recruit and promote top performers into key positions in the Atlantic command. In fact, he had already done that. At the beginning of the war, he had pushed out the solid and unspectacular Admiral Harold R. Stark as Chief of Naval Operations and replaced him with the legendary Ernest Joseph King. "He was a supreme realist with the arrogance of genius," Ladislas Farago writes in "The Tenth Fleet," a history of the Navy's U-boat battles in the Second World War. "He had unbounded faith in himself, in his vast knowledge of naval matters and in the soundness of his ideas. Unlike Stark, who tolerated incompetence all around him, King had no patience with fools."

  The Navy had plenty of talent at the top, in other words. What it didn't have was the right kind of organization. As Eliot A. Cohen, a scholar of military strategy at Johns Hopkins, writes in his brilliant book "Military Misfortunes in the Atlantic":

To wage the antisubmarine war well, analysts had to bring together fragments of information, direction-finding fixes, visual sightings, decrypts, and the "flaming datum" of a U-boat attack - for use by a commander to coordinate the efforts of warships, aircraft, and convoy commanders. Such synthesis had to occur in near "real time" - within hours, even minutes in some cases.

  The British excelled at the task because they had a centralized operational system. The controllers moved the British ships around the Atlantic like chess pieces, in order to outsmart U-boat "wolf packs." By contrast, Admiral King believed strongly in a decentralized management structure: he held that managers should never tell their subordinates " 'how' as well as what to 'do.' " In today's jargon, we would say he was a believer in "loose-tight" management, of the kind celebrated by the McKinsey consultants Thomas J. Peters and Robert H. Waterman in their 1982 best-seller, "In Search of Excellence." But "loose-tight" doesn't help you find U-boats. Throughout most of 1942, the Navy kept trying to act smart by relying on technical know-how, and stubbornly refused to take operational lessons from the British. The Navy also lacked the organizational structure necessary to apply the technical knowledge it did have to the field. Only when the Navy set up the Tenth Fleet - a single unit to coördinate all anti-submarine warfare in the Atlantic - did the situation change. In the year and a half before the Tenth Fleet was formed, in May of 1943, the Navy sank thirty-six U-boats. In the six months afterward, it sank seventy-five. "The creation of the Tenth Fleet did not bring more talented individuals into the field of ASW" - anti-submarine warfare - "than had previous organizations," Cohen writes. "What Tenth Fleet did allow, by virtue of its organization and mandate, was for these individuals to become far more effective than previously." The talent myth assumes that people make organizations smart. More often than not, it's the other way around.

  5.

  There is ample evidence of this principle among America's most successful companies. Southwest Airlines hires very few M.B.A.s, pays its managers modestly, and gives raises according to seniority, not "rank and yank." Yet it is by far the most successful of all United States airlines, because it has created a vastly more efficient organization than its competitors have. At Southwest, the time it takes to get a plane that has just landed ready for takeoff - a key index of productivity - is, on average, twenty minutes, and requires a ground crew of four, and two people at the gate. (At United Airlines, by contrast, turnaround time is closer to thirty-five minutes, and requires a ground crew of twelve and three agents at the gate.)

  In the case of the giant retailer Wal-Mart, one of the most critical periods in its history came in 1976, when Sam Walton "unretired," pushing out his handpicked successor, Ron Mayer. Mayer was just over forty. He was ambitious. He was charismatic. He was, in the words of one Walton biographer, "the boy-genius financial officer." But Walton was convinced that Mayer was, as people at McKinsey would say, "differentiating and affirming" in the corporate suite, in defiance of Wal-Mart's inclusive culture. Mayer left, and Wal-Mart survived. After all, Wal-Mart is an organization, not an all-star team. Walton brought in David Glass, late of the Army and Southern Missouri State University, as C.E.O.; the company is now ranked No. 1 on the Fortune 500 list.

  Procter & Gamble doesn't have a star system, either. How could it? Would the top M.B.A. graduates of Harvard and Stanford move to Cincinnati to work on detergent when they could make three times as much reinventing the world in Houston? Procter & Gamble isn't glamorous. Its C.E.O. is a lifer - a former Navy officer who began his corporate career as an assistant brand manager for Joy dishwashing liquid - and, if Procter & Gamble's best played Enron's best at Trivial Pursuit, no doubt the team from Houston would win handily. But Procter & Gamble has dominated the consumer-products field for close to a century, because it has a carefully conceived managerial system, and a rigorous marketing methodology that has allowed it to win battles for brands like Crest and Tide decade after decade. In Procter & Gamble's Navy, Admiral Stark would have stayed. But a cross-divisional management committee would have set the Tenth Fleet in place before the war ever started.

  6.

  Among the most damning facts about Enron, in the end, was something its managers were proudest of. They had what, in McKinsey terminology, is called an "open market" for hiring. In the open-market system - McKinsey's assault on the very idea of a fixed organization - anyone could apply for any job that he or she wanted, and no manager was allowed to hold anyone back. Poaching was encouraged. When an Enron executive named Kevin Hannon started the company's global broadband unit, he launched what he called Project Quick Hire. A hundred top performers from around the company were invited to the Houston Hyatt to hear Hannon give his pitch. Recruiting booths were set up outside the meeting room. "Hannon had his fifty top performers for the broadband unit by the end of the week," Michaels, Handfield-Jones, and Axelrod write, "and his peers had fifty holes to fill." Nobody, not even the consultants who were paid to think about the Enron culture, seemed worried that those fifty holes might disrupt the functioning of the affected departments, that stability in a firm's existing businesses might be a good thing, that the self-fulfillment of Enron's star employees might possibly be in conflict with the best interests of the firm as a whole.

  These are the sort of concerns that management consultants ought to raise. But Enron's management consultant was McKinsey, and McKinsey was as much a prisoner of the talent myth as its clients were. In 1998, Enron hired ten Wharton M.B.A.s; that same year, McKinsey hired forty. In 1999, Enron hired twelve from Wharton; McKinsey hired sixty-one. The consultants at McKinsey were preaching at Enron what they believed about themselves. "When we would hire them, it wouldn't just be for a week," one former Enron manager recalls, of the brilliant young men and women from McKinsey who wandered the hallways at the company's headquarters. "It would be for two to four months. They were always around." They were there looking for people who had the talent to think outside the box. It never occurred to them that, if everyone had to think outside the box, maybe it was the box that needed fixing.

  Big business and the myth of the lone inventor

  1.

  Philo T. Farnsworth was born in 1906, and he looked the way an inventor of that era was supposed to look: slight and gaunt, with bright-blue exhausted eyes, and a mane of brown hair swept back from his forehead. He was nervous and tightly wound. He rarely slept. He veered between fits of exuberance and depression. At the age of three, he was making precise drawings of the internal mechanisms of locomotives. At six, he declared his intention to follow in the footsteps of Thomas Edison and Alexander Graham Bell. At fourteen, while tilling a potato field on his family's farm in Idaho, he saw the neat, parallel lines of furrows in front of him, and it occurred to him - in a single, blinding moment - that a picture could be sent electronically through the airwaves in the same way, broken down into easily transmitted lines and then reassembled into a complete picture at the other end. He went to see his high-school science teacher, and covered the blackboard with drawings and equations. At nineteen, after dropping out of college, he impressed two local investors with his brilliance and his conviction. He moved to California and set up shop in a tiny laboratory. He got married on an impulse. On his wedding night, he seized his bride by the shoulders and looked at her with those bright-blue eyes. "Pemmie," he said. "I have to tell you. There is another woman in my life - and her name is Television."

  Philo T. Farnsworth was the inventor of television. Through the nineteen-thirties and forties, he engaged in a heroic battle to perfect and commercialize his discovery, fending off creditors and predators, and working himself to the point of emotional and physical exhaustion. His nemesis was David Sarnoff, the head of RCA, then one of the most powerful American electronics companies. Sarnoff lived in an enormous Upper East Side mansion and smoked fat cigars and travelled by chauffeured limousine. His top television researcher was Vladimir Zworykin, the scion of a wealthy Russian family, who wore elegant three-piece suits and round spectacles, had a Ph.D. in physics, and apprenticed with the legendary Boris Rosing at the St. Petersburg Institute of Technology. Zworykin was never more than half a step behind Farnsworth: he filed for a patent on his own version of electronic television two years after Farnsworth had his potato-field vision. At one point, Sarnoff sent Zworykin to Farnsworth's tiny laboratory, on Green Street in San Francisco, and he stayed for three days, asking suspiciously detailed questions. He had one of Farnsworth's engineers build the heart of Farnsworth's television system - the so-called image dissector - before his eyes, and then picked the tube up and turned it over in his hands and said, ominously, "This is a beautiful instrument. I wish I had invented it myself." Soon Sarnoff himself came out to Green Street, swept imperially through the laboratory, and declared, "There's nothing here we'll need." It was, of course, a lie. In the nineteen-thirties, television was not possible without Philo Farnsworth's work. But in the end it didn't much matter. Farnsworth's company was forced out of the TV business. Farnsworth had a nervous breakdown, and Sarnoff used his wealth and power to declare himself the father of television.

  The life of Philo Farnsworth is the subject of two new books, "The Last Lone Inventor," by Evan I. Schwartz (HarperCollins; $24.95), and "The Boy Genius and the Mogul," by Daniel Stashower (Broadway; $24.95). It is a wonderful tale, riveting and bittersweet. But its lessons, on closer examination, are less straightforward than the clichés of the doomed inventor and the villainous mogul might suggest. Philo Farnsworth's travails make a rather strong case for big corporations, not against them.

  2.

  The idea of television arose from two fundamental discoveries. The first was photoconductivity. In 1872, Joseph May and Willoughby Smith discovered that the electrical resistance of certain metals varied according to their exposure to light. And, since everyone knew how to transmit electricity from one place to another, it made sense that images could be transmitted as well. The second discovery was what is called visual persistence. In 1880, the French engineer Maurice LeBlanc pointed out that, because the human eye retains an image for about a tenth of a second, if you wanted to transmit a picture you didn't have to send it all at once. You could scan it, one line at a time, and, as long as you put all those lines back together at the other end within that fraction of a second, the human eye would be fooled into thinking that it was seeing a complete picture.

  The hard part was figuring out how to do the scanning. In 1883, the German engineer Paul Nipkow devised an elaborate and ultimately unworkable system using a spinning metal disk. The disk was punctured with a spiral of small holes, and, as it spun, one line of light after another was projected through the holes onto a photocell. In 1908, a British electrical engineer named A. A. Campbell Swinton suggested that it would make more sense to scan images electronically, using a cathode ray. Philo Farnsworth was the first to work out how to do that. His image dissector was a vacuum tube with a lens at one end, a photoelectric plate right in front of the lens to convert the image from light to electricity, and then an "anode finger" to scan the electrical image line by line. After setting up his laboratory, Farnsworth tinkered with his makeshift television camera day and night for months. Finally, on September 7, 1927, he was ready. His wife, Pem, was by his side. His tiny television screen was in front of him. His brother-in-law, Cliff Gardner, was manning the television camera in a room at the other end of the lab. Stashower writes:

  Squaring his shoulders, Farnsworth took his place at the controls and flicked a series of switches. A small, bluish patch of light appeared at the end of the receiving tube. Farnsworth lifted his head and began calling out instructions to Gardner in the next room.

  "Put in the slide, Cliff," Farnsworth said.

  "Okay, it's in," Gardner answered. "Can you see it?"

  A faint but unmistakable line appeared across the receiving end of the tube. As Farnsworth made some adjustments, the line became more distinct.

  "Turn the slide a quarter turn, Cliff," Farnsworth called. Seconds later, the line on the receiving tube rotated ninety degrees. Farnsworth looked up from the tube. "That's it, folks," he announced with a tremor in his voice. "We've done it - there you have electronic television."

  Both Stashower and Schwartz talk about how much meaning Farnsworth attached to this moment. He was a romantic, and in the romance of invention the creative process consists of two discrete, euphoric episodes, linked by long years of grit and hard work. First is the magic moment of conception: Farnsworth in the potato field. Second is the moment of execution: the day in the lab. If you had the first of those moments and not the second, you were a visionary. But if you had both you were in a wholly different category. Farnsworth must have known the story of King Gillette, the bottle-cap salesman, who woke up one morning in the summer of 1895 to find his razor dull. Gillette had a sudden vision: if all he wanted was a sharp edge, then why should he have to refashion the whole razor? Gillette later recalled:

  As I stood there with the razor in my hand, my eyes resting on it as lightly as a bird settling down on its nest, the Gillette razor was born - more with the rapidity of a dream than by a process of reasoning. In a moment I saw it all: the way the blade could be held in a holder; the idea of sharpening the two opposite edges on the thin piece of steel; the clamping plates for the blade, with a handle half-way between the two edges of the blade. . . . I stood there before the mirror in a trance of joy. My wife was visiting Ohio and I hurriedly wrote to her: "I've got it! Our fortune is made!"

  If you had the vision and you made the vision work, then the invention was yours - that was what Farnsworth believed. It belonged to you, just as the safety razor belonged to King Gillette.

  But this was Farnsworth's mistake, because television wasn't at all like the safety razor. It didn't belong to one person. May and Smith stumbled across photoconductivity, and inspired LeBlanc, who, in turn, inspired Swinton, and Swinton's idea inspired inventors around the world. Then there was Zworykin, of course, and his mentor Boris Rosing, and the team of Max Dieckmann and Rudolf Hell, in Germany, who tried to patent something in the mid-twenties that was virtually identical to the image dissector. In 1931, when Zworykin perfected his own version of the television camera, called the Iconoscope, RCA did a worldwide patent search and found very similar patent applications from a Hungarian named Kolomon Tihany, a Canadian named François Henrouteau, a Japanese inventor named Kenjiro Takayanagi, two Englishmen, and a Russian. Everyone was working on television and everyone was reading everyone else's patent applications, and, because television was such a complex technology, nearly everyone had something new to add. Farnsworth came up with the first camera. Zworykin had the best early picture tube. And when Zworykin finally came up with his own camera it was not as good as Farnsworth's camera in some respects, but it was better in others. In September of 1939, when RCA finally licensed the rights to Farnsworth's essential patents, it didn't replace the Iconoscope with Farnsworth's image dissector. It took the best parts of both.

  It is instructive to compare the early history of television with the development, some seventy-five years earlier, of the sewing machine. As the historian Grace Rogers Cooper points out, a sewing machine is really six different mechanisms in one - a means of supporting the cloth, a needle and a combining device to form the stitch, a feeding mechanism to allow one stitch to follow another, a means of insuring the even delivery of thread, and a governing mechanism to insure that each of the previous five steps is performed in sequence. Cooper writes in her book "The Sewing Machine":

  Weisenthal had added a point to the eye-end of the needle. Saint supported the fabric by placing it in a horizontal position with a needle entering vertically, Duncan successfully completed a chainstitch for embroidery purposes, Chapman used a needle with an eye at its point and did not pass it completely through the fabric, Krems stitched circular caps with an eye-pointed needle used with a hook to form a chainstitch, Thimmonier used the hooked needle to form a chainstitch on a fabric laid horizontally, and Hunt created a new stitch that was more readily adapted to sewing by machine than the hand stitches had been.

  The man generally credited with combining and perfecting these elements is Elias Howe, a machinist from Boston. But even Howe's patents were quickly superseded by a new round of patents, each taking one of the principles of his design and either augmenting it or replacing it. The result was legal and commercial gridlock, broken only when, in 1856, Howe and three of the leading sewing-machine manufacturers (among them Isaac Merritt Singer, who gave the world the sewing-machine foot pedal) agreed to pool their patents and form a trust. It was then that the sewing-machine business took off. For the sewing machine to succeed, in other words, those who saw themselves as sewing-machine inventors had to swallow their pride and concede that the machine was larger than they were - that groups, not individuals, invent complex technologies. That was what Farnsworth could not do, and it explains the terrible turn that his life took.

  3.

  David Sarnoff's RCA had a very strict policy on patents. If you worked for RCA and you invented something patentable, it belonged to RCA. Your name was on the patent, and you got credit for your work. But you had to sign over your rights for one dollar. In "The Last Lone Inventor," Schwartz tells the story of an RCA engineer who thought the system was so absurd that he would paste his one-dollar checks to the wall of his office - until the accounting department, upset with the unresolved balance on its books, steamed them off and forced him to cash them. At the same time, Sarnoff was a patient and generous benefactor. When Zworykin and Sarnoff discussed television for the first time, in 1929, Zworykin promised the RCA chief that he would create a working system in two years, at a cost of a hundred thousand dollars. In fact, it took more than ten years and fifty million dollars, and through all those years - which just happened to coincide with the Depression - Sarnoff's support never wavered. Sarnoff "hired the best engineers out of the best universities," Schwartz writes. "He paid them competitive salaries, provided them with ample research budgets, and offered them a chance to join his crusade to change the world, working in the most dynamic industry the world had ever seen." What Sarnoff presented was a compromise. In exchange for control over the fruits of invention, he gave his engineers the freedom to invent.

  Farnsworth didn't want to relinquish that control. Both RCA and General Electric offered him a chance to work on television in their laboratories. He turned them both down. He wanted to go it alone. This was the practical consequence of his conviction that television was his, and it was, in retrospect, a grievous error. It meant that Farnsworth was forced to work in a state of chronic insecurity. He never had enough money. He feuded constantly with his major investor, a man named Jesse McCargar, who didn't have the resources to play the television game. At the time of what should have been one of Farnsworth's greatest triumphs - the granting of his principal patents - McCargar showed up at the lab complaining about costs, and made Farnsworth fire his three star engineers. When, in 1928, the Green Street building burned down, a panicked Farnsworth didn't know whether or not his laboratory was insured. It was, as it happened, but a second laboratory, in Maine, wasn't, and when it burned down, years later, he lost everything. Twice, he testified before Congress. The first time, he rambled off on a tangent about transmission bandwidth, which left people scratching their heads. The second time, he passed up a perfect opportunity to register his complaints about RCA, and launched, instead, into a sentimental account of his humble origins. He simply did not understand how to play politics, just as he did not understand how to raise money or run a business or organize his life. All he really knew how to do was invent, which was something that, as a solo operator, he too seldom had time for.

  This is the reason that so many of us work for big companies, of course: in a big company, there is always someone to do what we do not want to do or do not do well - someone to answer the phone, and set up our computer, and arrange our health insurance, and clean our office at night, and make sure the building is insured. In a famous 1937 essay, "The Nature of the Firm," the economist Ronald Coase said that the reason we have corporations is to reduce the everyday transaction costs of doing business: a company puts an accountant on the staff so that if a staffer needs to check the books all he has to do is walk down the hall. It's an obvious point, but one that is consistently overlooked, particularly by those who periodically rail, in the name of efficiency, against corporate bloat and superfluous middle managers. Yes, the middle manager does not always contribute directly to the bottom line. But he does contribute to those who contribute to the bottom line, and only an absurdly truncated account of human productivity - one that assumes real work to be somehow possible when phones are ringing, computers are crashing, and health insurance is expiring - does not see that secondary contribution as valuable.

  In April, 1931, Sarnoff showed up at the Green Street laboratory to review Farnsworth's work. This was, by any measure, an extraordinary event. Farnsworth was twenty-four, and working out of a ramshackle building. Sarnoff was one of the leading industrialists of his day. It was as if Bill Gates were to get in his private jet and visit a software startup in a garage across the country. But Farnsworth wasn't there. He was in New York, trapped there by a court order resulting from a frivolous lawsuit filed by a shady would-be investor. Stashower calls this one of the great missed opportunities of Farnsworth's career, because he almost certainly would have awed Sarnoff with his passion and brilliance, winning a lucrative licensing deal. Instead, an unimpressed Sarnoff made a token offer of a hundred thousand dollars for Farnsworth's patents, and Farnsworth dismissed the offer out of hand. This, too, is a reason that inventors ought to work for big corporations: big corporations have legal departments to protect their employees against being kept away from their laboratories by frivolous lawsuits. A genius is a terrible thing to waste.

  4.

  In 1939, at the World's Fair in New York City, David Sarnoff set up a nine-thousand-square-foot pavilion to showcase the new technology of television. The pavilion, shaped like a giant radio tube, was covered with RCA logos, and stood next to the Perisphere Theatre, the centerpiece of the fairgrounds. On opening day, thirty thousand people gathered to hear from President Roosevelt and Albert Einstein. The gala was televised by RCA, beamed across the New York City area from the top of the Empire State Building. As it happened, Farnsworth was in New York City that day, and he caught the opening ceremonies on a television in a department-store window. He saw Sarnoff introducing both Roosevelt and Einstein, and effectively claiming this wondrous new technology as his own. "Farnsworth's entire existence seemed to be annulled in this moment," Schwartz writes:

  The dreams of a farm boy, the eureka moment in a potato field, the confession to a teacher, the confidence in him shown by businessmen and bankers and investors, the breakthroughs in the laboratory, all the years of work, the decisions of the official patent examiners, those hard-fought victories, all of those demonstrations that had come and gone, the entire vision of the future. All of it was being negated by Sarnoff's performance at the World's Fair. Would the public ever know the truth? . . . The agony of it set off sharp pains in his stomach.

  Finally, later that summer, RCA settled with Farnsworth. It agreed to pay him a million dollars for the rights to his main patents, plus royalties on every television set sold. But it was too late. Something had died in him. "It's come to the point of choosing whether I want to be a drunk or go crazy," he told his wife. One doctor prescribed chloral hydrate, which destroyed his appetite and left him dangerously thin. Another doctor prescribed cigarettes, to soothe his nerves. A third prescribed uppers. He became addicted to the painkiller Pantopon. He committed himself to a sanitarium in Massachusetts, where he was given a course of shock therapy. After the war, his brother died in a plane crash. His patents expired, drying up his chief source of income. His company, unable to compete with RCA, was forced out of the television business. He convinced himself that he could unlock the secrets of nuclear fusion, and launched another private research project, mortgaging his home, selling his stock, and cashing in his life insurance to fund the project. But nothing came of it. He died in 1971 - addicted to alcohol, deeply depressed, and all but forgotten. He was sixty-four.

  In "Tube," a history of television, David E. Fisher and Marshall Jon Fisher point out that Farnsworth was not the only television pioneer to die in misery. So did two others - John Logie Baird and Charles Francis Jenkins - who had tried and failed to produce mechanical television. This should not come as a surprise. The creative enterprise is a hazardous journey, and those who venture on it alone do so at their peril. Baird and Jenkins and Farnsworth risked their psychological and financial well-being on the romantic notion of the solitary inventor, and when that idea failed them what resources did they have left? Zworykin had his share of setbacks as well. He took on Farnsworth in court, and lost. He promised television in two years for a hundred thousand dollars and he came in eight years and fifty million dollars over budget. But he ended his life a prosperous and contented man, lauded and laurelled with awards and honorary degrees. He had the cocoon of RCA to protect him: a desk and a paycheck and a pension and a secretary and a boss with the means to rewrite history in his favor. This is perhaps a more important reason that we have companies - or, for that matter, that we have universities and tenure. Institutions are not just the best environment for success; they are also the safest environment for failure - and, much of the time, failure is what lies in store for innovators and visionaries. Philo Farnsworth should have gone to work for RCA. He would still have been the father of television, and he might have died a happy man.

  How Nassim Taleb turned the inevitability of disaster into an investment strategy

  1.

  One day in 1996, a Wall Street trader named Nassim Nicholas Taleb went to see Victor Niederhoffer. Victor Niederhoffer was one of the most successful money managers in the country. He lived and worked out of a thirteen-acre compound in Fairfield County, Connecticut, and when Taleb drove up that day from his home in Larchmont he had to give his name at the gate, and then make his way down a long, curving driveway. Niederhoffer had a squash court and a tennis court and a swimming pool and a colossal, faux-alpine mansion in which virtually every square inch of space was covered with eighteenth- and nineteenth-century American folk art. In those days, he played tennis regularly with the billionaire financier George Soros. He had just written a best-selling book, "The Education of a Speculator," dedicated to his father, Artie Niederhoffer, a police officer from Coney Island. He had a huge and eclectic library and a seemingly insatiable desire for knowledge. When Niederhoffer went to Harvard as an undergraduate, he showed up for the very first squash practice and announced that he would someday be the best in that sport; and, sure enough, he soon beat the legendary Shariff Khan to win the U.S. Open squash championship. That was the kind of man Niederhoffer was. He had heard of Taleb's growing reputation in the esoteric field of options trading, and summoned him to Connecticut. Taleb was in awe.

  "He didn't talk much, so I observed him," Taleb recalls. "I spent seven hours watching him trade. Everyone else in his office was in his twenties, and he was in his fifties, and he had the most energy of them all. Then, after the markets closed, he went out to hit a thousand backhands on the tennis court." Taleb is Greek-Orthodox Lebanese and his first language was French, and in his pronunciation the name Niederhoffer comes out as the slightly more exotic Nieder hoffer. "Here was a guy living in a mansion with thousands of books, and that was my dream as a child," Taleb went on. "He was part chevalier, part scholar. My respect for him was intense." There was just one problem, however, and it is the key to understanding the strange path that Nassim Taleb has chosen, and the position he now holds as Wall Street's principal dissident. Despite his envy and admiration, he did not want to be Victor Niederhoffer — not then, not now, and not even for a moment in between. For when he looked around him, at the books and the tennis court and the folk art on the walls — when he contemplated the countless millions that Niederhoffer had made over the years — he could not escape the thought that it might all have been the result of sheer, dumb luck.

  Taleb knew how heretical that thought was. Wall Street was dedicated to the principle that when it came to playing the markets there was such a thing as expertise, that skill and insight mattered in investing just as skill and insight mattered in surgery and golf and flying fighter jets. Those who had the foresight to grasp the role that software would play in the modern world bought Microsoft in 1985, and made a fortune. Those who understood the psychology of investment bubbles sold their tech stocks at the end of 1999 and escaped the Nasdaq crash. Warren Buffett was known as the "sage of Omaha" because it seemed incontrovertible that if you started with nothing and ended up with billions then you had to be smarter than everyone else: Buffett was successful for a reason. Yet how could you know, Taleb wondered, whether that reason was responsible for someone's success, or simply a rationalization invented after the fact? George Soros seemed to be successful for a reason, too. He used to say that he followed something called "the theory of reflexivity." But then, later, Soros wrote that in most situations his theory "is so feeble that it can be safely ignored." An old trading partner of Taleb's, a man named Jean-Manuel Rozan, once spent an entire afternoon arguing about the stock market with Soros. Soros was vehemently bearish, and he had an elaborate theory to explain why, which turned out to be entirely wrong. The stock market boomed. Two years later, Rozan ran into Soros at a tennis tournament. "Do you remember our conversation?" Rozan asked. "I recall it very well," Soros replied. "I changed my mind, and made an absolute fortune." He changed his mind! The truest thing about Soros seemed to be what his son Robert had once said:

  My father will sit down and give you theories to explain why he does this or that. But I remember seeing it as a kid and thinking, Jesus Christ, at least half of this is bullshit. I mean, you know the reason he changes his position on the market or whatever is because his back starts killing him. It has nothing to do with reason. He literally goes into a spasm, and it's this early warning sign.

  For Taleb, then, the question why someone was a success in the financial marketplace was vexing. Taleb could do the arithmetic in his head. Suppose that there were ten thousand investment managers out there, which is not an outlandish number, and that every year half of them, entirely by chance, made money and half of them, entirely by chance, lost money. And suppose that every year the losers were tossed out, and the game replayed with those who remained. At the end of five years, there would be three hundred and thirteen people who had made money in every one of those years, and after ten years there would be nine people who had made money every single year in a row, all out of pure luck. Niederhoffer, like Buffett and Soros, was a brilliant man. He had a Ph.D. in economics from the University of Chicago. He had pioneered the idea that through close mathematical analysis of patterns in the market an investor could identify profitable anomalies. But who was to say that he wasn't one of those lucky nine? And who was to say that in the eleventh year Niederhoffer would be one of the unlucky ones, who suddenly lost it all, who suddenly, as they say on Wall Street, "blew up"?
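  Purely as an illustration, here is a minimal sketch, in Python, of the back-of-the-envelope arithmetic Taleb describes - ten thousand managers, a pure coin flip each year, the losers dropped - showing where the figures of roughly three hundred and thirteen and nine come from:

    # A back-of-the-envelope check of the survivorship arithmetic above:
    # ten thousand managers, half of whom, purely by chance, make money
    # each year; the losers are tossed out and the game is replayed.
    managers = 10_000
    for years in (5, 10):
        survivors = managers / 2 ** years  # expected number with a perfect record
        print(f"After {years} years: roughly {survivors:.1f} managers, by luck alone")
    # Prints roughly 312.5 and 9.8, which correspond to the "three hundred
    # and thirteen" and "nine" cited in the passage above.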

  Taleb remembered his childhood in Lebanon and watching his country turn, as he puts it, from "paradise to hell" in six months. His family once owned vast tracts of land in northern Lebanon. All of that was gone. He remembered his grandfather, the former Deputy Prime Minister of Lebanon and the son of a Deputy Prime Minister of Lebanon and a man of great personal dignity, living out his days in a dowdy apartment in Athens. That was the problem with a world in which there was so much uncertainty about why things ended up the way they did: you never knew whether one day your luck would turn and it would all be washed away.

  So here is what Taleb took from Niederhoffer. He saw that Niederhoffer was a serious athlete, and he decided that he would be, too. He would bicycle to work and exercise in the gym. Niederhoffer was a staunch empiricist, who turned to Taleb that day in Connecticut and said to him sternly, "Everything that can be tested must be tested," and so when Taleb started his own hedge fund, a few years later, he called it Empirica. But that is where it stopped. Nassim Taleb decided that he could not pursue an investment strategy that had any chance of blowing up.

  2.

  Nassim Taleb is a tall, muscular man in his early forties, with a salt-and-pepper beard and a balding head. His eyebrows are heavy and his nose is long. His skin has the olive hue of the Levant. He is a man of moods, and when his world turns dark the eyebrows come together and the eyes narrow and it is as if he were giving off an electrical charge. It is said, by some of his friends, that he looks like Salman Rushdie, although at his office his staff have pinned to the bulletin board a photograph of a mullah they swear is Taleb's long-lost twin, while Taleb himself maintains, wholly implausibly, that he resembles Sean Connery. He lives in a four-bedroom Tudor with twenty-six Russian Orthodox icons, nineteen Roman heads, and four thousand books, and he rises at dawn to spend an hour writing. He is the author of two books, the first a technical and highly regarded work on derivatives, and the second a treatise entitled "Fooled by Randomness," which was published last year and is to conventional Wall Street wisdom approximately what Martin Luther's ninety-five theses were to the Catholic Church. Some afternoons, he drives into the city and attends a philosophy lecture at City University. During the school year, in the evenings, he teaches a graduate course in finance at New York University, after which he can often be found at the bar at Odeon Café in Tribeca, holding forth, say, on the finer points of stochastic volatility or his veneration of the Greek poet C. P. Cavafy.

  Taleb runs Empirica Capital out of an anonymous, concrete office park somewhere in the woods outside Greenwich, Connecticut. His offices consist, principally, of a trading floor about the size of a Manhattan studio apartment. Taleb sits in one corner, in front of a laptop, surrounded by the rest of his team — Mark Spitznagel, the chief trader, another trader named Danny Tosto, a programmer named Winn Martin, and a graduate student named Pallop Angsupun. Mark Spitznagel is perhaps thirty. Winn, Danny, and Pallop look as if they belonged in high school. The room has an overstuffed bookshelf in one corner, and a television muted and tuned to CNBC. There are two ancient Greek heads, one next to Taleb's computer and the other, somewhat bafflingly, on the floor, next to the door, as if it were being set out for the trash. There is almost nothing on the walls, except for a slightly battered poster for an exhibition of Greek artifacts, the snapshot of the mullah, and a small pen-and-ink drawing of the patron saint of Empirica Capital, the philosopher Karl Popper.

  On a recent spring morning, the staff of Empirica were concerned with solving a thorny problem, having to do with the square root of n, where n is a given number of random observations, and what relation n might have to a speculator's confidence in his estimations. Taleb was up at a whiteboard by the door, his marker squeaking furiously as he scribbled possible solutions. Spitznagel and Pallop looked on intently. Spitznagel is blond and from the Midwest and does yoga: in contrast to Taleb, he exudes a certain laconic levelheadedness. In a bar, Taleb would pick a fight. Spitznagel would break it up. Pallop is of Thai extraction and is doing a Ph.D. in financial mathematics at Princeton. He has longish black hair, and a slightly quizzical air. "Pallop is very lazy," Taleb will remark, to no one in particular, several times over the course of the day, although this is said with such affection that it suggests that "laziness," in the Talebian nomenclature, is a synonym for genius. Pallop's computer was untouched and he often turned his chair around, so that he faced completely away from his desk. He was reading a book by the cognitive psychologists Amos Tversky and Daniel Kahneman, whose arguments, he said a bit disappointedly, were "not really quantifiable." The three argued back and forth about the solution. It appeared that Taleb might be wrong, but before the matter could be resolved the markets opened. Taleb returned to his desk and began to bicker with Spitznagel about what exactly would be put on the company boom box. Spitznagel plays the piano and the French horn and has appointed himself the Empirica d.j. He wanted to play Mahler, and Taleb does not like Mahler. "Mahler is not good for volatility," Taleb complained. "Bach is good. St. Matthew's Passion!" Taleb gestured toward Spitznagel, who was wearing a gray woollen turtleneck. "Look at him. He wants to be like von Karajan, like someone who wants to live in a castle. Technically superior to the rest of us. No chitchatting. Top skier. That's Mark!" As Spitznagel rolled his eyes, a man whom Taleb refers to, somewhat mysteriously, as Dr. Wu wandered in. Dr. Wu works for another hedge fund, down the hall, and is said to be brilliant. He is thin and squints through black-rimmed glasses. He was asked his opinion on the square root of n but declined to answer. "Dr. Wu comes here for intellectual kicks and to borrow books and to talk music with Mark," Taleb explained after their visitor had drifted away. He added darkly, "Dr. Wu is a Mahlerian."

  Empirica follows a very particular investment strategy. It trades options, which is to say that it deals not in stocks and bonds but with bets on stocks and bonds. Imagine, for example, that General Motors stock is trading at fifty dollars, and imagine that you are a major investor on Wall Street. An options trader comes up to you with a proposition. What if, within the next three months, he decides to sell you a share of G.M. at forty-five dollars? How much would you charge for agreeing to buy it at that price? You would look at the history of G.M. and see that in a three-month period it has rarely dropped ten per cent, and obviously the trader is only going to make you buy his G.M. at forty-five dollars if the stock drops below that point. So you say you'll make that promise, or sell that option, for a relatively small fee, say, a dime. You are betting on the high probability that G.M. stock will stay relatively calm over the next three months, and if you are right you'll pocket the dime as pure profit. The trader, on the other hand, is betting on the unlikely event that G.M. stock will drop a lot, and if that happens his profits are potentially huge. If the trader bought a million options from you at a dime each and G.M. drops to thirty-five dollars, he'll buy a million shares at thirty-five dollars and turn around and force you to buy them at forty-five dollars, making himself suddenly very rich and you substantially poorer.
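
  The payoff arithmetic in that example can be sketched in a few lines of Python. This is only an illustration of the scenario described above (the forty-five-dollar strike, the dime premium, the million options); the function name is invented for the purpose:

    def put_seller_pnl(strike, premium, contracts, final_price):
        # The seller pockets the premium on every contract, but must make up the
        # gap between the strike price and the market price if the stock ends up below the strike.
        payout = max(strike - final_price, 0) * contracts
        return premium * contracts - payout

    # A calm three months: G.M. stays above forty-five dollars and the dimes are pure profit.
    print(put_seller_pnl(strike=45, premium=0.10, contracts=1_000_000, final_price=50))   # 100000.0
    # The unlikely event: G.M. falls to thirty-five dollars and the seller is out nearly ten million.
    print(put_seller_pnl(strike=45, premium=0.10, contracts=1_000_000, final_price=35))   # -9900000.0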

  That particular transaction is called, in the argot of Wall Street, an "out-of-the-money option." But an option can be configured in a vast number of ways. You could sell the trader a G.M. option at thirty dollars, or, if you wanted to bet against G.M. stock going up, you could sell a G.M. option at sixty dollars. You could sell or buy options on bonds, on the S. & P. index, on foreign currencies or on mortgages, or on the relationship among any number of financial instruments of your choice; you can bet on the market booming, or the market crashing, or the market staying the same. Options allow investors to gamble heavily and turn one dollar into ten. They also allow investors to hedge their risk. The reason your pension fund may not be wiped out in the next crash is that it has protected itself by buying options. What drives the options game is the notion that the risks represented by all of these bets can be quantified; that by looking at the past behavior of G.M. you can figure out the exact chance of G.M. hitting forty-five dollars in the next three months, and whether at a dollar that option is a good or a bad investment. The process is a lot like the way insurance companies analyze actuarial statistics in order to figure out how much to charge for a life-insurance premium, and to make those calculations every investment bank has, on staff, a team of Ph.D.s, physicists from Russia, applied mathematicians from China, computer scientists from India. On Wall Street, those Ph.D.s are called "quants."

  Nassim Taleb and his team at Empirica are quants. But they reject the quant orthodoxy, because they don't believe that things like the stock market behave in the way that physical phenomena like mortality statistics do. Physical events, whether death rates or poker games, are the predictable function of a limited and stable set of factors, and tend to follow what statisticians call a "normal distribution," a bell curve. But do the ups and downs of the market follow a bell curve? The economist Eugene Fama once studied stock prices and pointed out that if they followed a normal distribution you'd expect a really big jump, what he specified as a movement five standard deviations from the mean, once every seven thousand years. In fact, jumps of that magnitude happen in the stock market every three or four years, because investors don't behave with any kind of statistical orderliness. They change their mind. They do stupid things. They copy each other. They panic. Fama concluded that if you charted the ups and downs of the stock market the graph would have a "fat tail," meaning that at the upper and lower ends of the distribution there would be many more outlying events than statisticians used to modelling the physical world would have imagined.
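
  Fama's seven-thousand-year figure follows directly from the normal curve. A back-of-the-envelope sketch in Python; the assumption of roughly two hundred and fifty trading days a year is mine, not the article's:

    import math

    # Probability, under a normal distribution, of a daily move of at least
    # five standard deviations in either direction.
    p = math.erfc(5 / math.sqrt(2))        # two-sided tail, roughly 5.7e-7
    trading_days_per_year = 250            # assumed round number, not a figure from the article
    years_between_jumps = 1 / (p * trading_days_per_year)
    print(round(years_between_jumps))      # on the order of seven thousand years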

  In the summer of 1997, Taleb predicted that hedge funds like Long Term Capital Management were headed for trouble, because they did not understand this notion of fat tails. Just a year later, L.T.C.M. sold an extraordinary number of options, because its computer models told it that the markets ought to be calming down. And what happened? The Russian government defaulted on its bonds; the markets went crazy; and in a matter of weeks L.T.C.M. was finished. Spitznagel, Taleb's head trader, says that he recently heard one of the former top executives of L.T.C.M. give a lecture in which he defended the gamble that the fund had made. "What he said was, Look, when I drive home every night in the fall I see all these leaves scattered around the base of the trees," Spitznagel recounts. "There is a statistical distribution that governs the way they fall, and I can be pretty accurate in figuring out what that distribution is going to be. But one day I came home and the leaves were in little piles. Does that falsify my theory that there are statistical rules governing how leaves fall? No. It was a man-made event." In other words, the Russians, by defaulting on their bonds, did something that they were not supposed to do, a once-in-a-lifetime, rule-breaking event. But this, to Taleb, is just the point: in the markets, unlike in the physical universe, the rules of the game can be changed. Central banks can decide to default on government-backed securities.

  One of Taleb's earliest Wall Street mentors was a short-tempered Frenchman named Jean-Patrice, who dressed like a peacock and had an almost neurotic obsession with risk. Jean-Patrice would call Taleb from Regine's at three in the morning, or take a meeting in a Paris nightclub, sipping champagne and surrounded by scantily clad women, and once Jean-Patrice asked Taleb what would happen to his positions if a plane crashed into his building. Taleb was young then and brushed him aside. It seemed absurd. But nothing, Taleb soon realized, is absurd. Taleb likes to quote David Hume: "No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion." Because L.T.C.M. had never seen a black swan in Russia, it thought no Russian black swans existed. Taleb, by contrast, has constructed a trading philosophy predicated entirely on the existence of black swans, on the possibility of some random, unexpected event sweeping the markets. He never sells options, then. He only buys them. He's never the one who can lose a great deal of money if G.M. stock suddenly plunges. Nor does he ever bet on the market moving in one direction or another. That would require Taleb to assume that he understands the market, and he doesn't. He hasn't Warren Buffett's confidence. So he buys options on both sides, on the possibility of the market moving both up and down. And he doesn't bet on minor fluctuations in the market. Why bother? If everyone else is vastly underestimating the possibility of rare events, then an option on G.M. at, say, forty dollars is going to be undervalued. So Taleb buys out-of-the-money options by the truckload. He buys them for hundreds of different stocks, and if they expire before he gets to use them he simply buys more. Taleb doesn't even invest in stocks, not for Empirica and not for his own personal account. Buying a stock, unlike buying an option, is a gamble that the future will represent an improved version of the past. And who knows whether that will be true? So all of Taleb's personal wealth, and the hundreds of millions that Empirica has in reserve, is in Treasury bills. Few on Wall Street have taken the practice of buying options to such extremes. But if anything completely out of the ordinary happens to the stock market, if some random event sends a jolt through all of Wall Street and pushes G.M. to, say, twenty dollars, Nassim Taleb will not end up in a dowdy apartment in Athens. He will be rich.

  Not long ago, Taleb went to a dinner in a French restaurant just north of Wall Street. The people at the dinner were all quants: men with bulging pockets and open-collared shirts and the serene and slightly detached air of those who daydream in numbers. Taleb sat at the end of the table, drinking pastis and discussing French literature. There was a chess grand master at the table, with a shock of white hair, who had once been one of Anatoly Karpov's teachers, and another man who over the course of his career had worked, in order, at Stanford University, Exxon, Los Alamos National Laboratory, Morgan Stanley, and a boutique French investment bank. They talked about mathematics and chess and fretted about one of their party who had not yet arrived and who had the reputation, as one of the quants worriedly said, of "not being able to find the bathroom." When the check came, it was given to a man who worked in risk management at a big Wall Street bank, and he stared at it for a long time, with a slight mixture of perplexity and amusement, as if he could not remember what it was like to deal with a mathematical problem of such banality. The men at the table were in a business that was formally about mathematics but was really about epistemology, because to sell or to buy an option requires each party to confront the question of what it is he truly knows. Taleb buys options because he is certain that, at root, he knows nothing, or, more precisely, that other people believe they know more than they do. But there were plenty of people around that table who sold options, who thought that if you were smart enough to set the price of the option properly you could win so many of those one-dollar bets on General Motors that, even if the stock ever did dip below forty-five dollars, you'd still come out far ahead. They believe that the world is a place where, at the end of the day, leaves fall more or less in a predictable pattern.

  The distinction between these two sides is the divide that emerged between Taleb and Niederhoffer all those years ago in Connecticut. Niederhoffer's hero is the nineteenth-century scientist Francis Galton. Niederhoffer called his eldest daughter Galt, and there is a full-length portrait of Galton in his library. Galton was a statistician and a social scientist (and a geneticist and a meteorologist), and if he was your hero you believed that by marshalling empirical evidence, by aggregating data points, you could learn whatever it was you needed to know. Taleb's hero, on the other hand, is Karl Popper, who said that you could not know with any certainty that a proposition was true; you could only know that it was not true. Taleb makes much of what he learned from Niederhoffer, but Niederhoffer insists that his example was wasted on Taleb. "In one of his cases, Rumpole of the Bailey talked about being tried by the bishop who doesn't believe in God," Niederhoffer says. "Nassim is the empiricist who doesn't believe in empiricism." What is it that you claim to learn from experience, if you believe that experience cannot be trusted? Today, Niederhoffer makes a lot of his money selling options, and more often than not the person he sells those options to is Nassim Taleb. If one of them is up a dollar one day, in other words, that dollar is likely to have come from the other. The teacher and pupil have become predator and prey.

  3.

  Years ago, Nassim Taleb worked at the investment bank First Boston, and one of the things that puzzled him was what he saw as the mindless industry of the trading floor. A trader was supposed to come in every morning and buy and sell things, and on the basis of how much money he made buying and selling he was given a bonus. If he went too many weeks without showing a profit, his peers would start to look at him funny, and if he went too many months without showing a profit he would be gone. The traders were often well educated, and wore Savile Row suits and Ferragamo ties. They dove into the markets with a frantic urgency. They read the Wall Street Journal closely and gathered around the television to catch breaking news. "The Fed did this, the Prime Minister of Spain did that," Taleb recalls. "The Italian Finance Minister says there will be no competitive devaluation, this number is higher than expected, Abby Cohen just said this." It was a scene that Taleb did not understand.

  "He was always so conceptual about what he was doing," says Howard Savery, who was Taleb?s assistant at the French bank Indosuez in the nineteen-eighties. "He used to drive our floor trader (his name was Tim) crazy. Floor traders are used to precision: "Sell a hundred futures at eighty-seven." Nassim would pick up the phone and say, "Tim, sell some." And Tim would say, "How many?" And he would say, "Oh, a social amount." It was like saying, "I don't have a number in mind, I just know I want to sell." There would be these heated arguments in French, screaming arguments. Then everyone would go out to dinner and have fun. Nassim and his group had this attitude that we're not interested in knowing what the new trade number is. When everyone else was leaning over their desks, listening closely to the latest figures, Nassim would make a big scene of walking out of the room."

  At Empirica, then, there are no Wall Street Journals to be found. There is very little active trading, because the options that the fund owns are selected by computer. Most of those options will be useful only if the market does something dramatic, and, of course, on most days the market doesn't. So the job of Taleb and his team is to wait and to think. They analyze the company's trading policies, back-test various strategies, and construct ever-more sophisticated computer models of options pricing. Danny, in the corner, occasionally types things into the computer. Pallop looks dreamily off into the distance. Spitznagel takes calls from traders, and toggles back and forth between screens on his computer. Taleb answers e-mails and calls one of the firm's brokers in Chicago, affecting, as he does, the kind of Brooklyn accent that people from Brooklyn would have if they were actually from northern Lebanon: "Howyoudoin?" It is closer to a classroom than to a trading floor.

  "Pallop, did you introspect?" Taleb calls out as he wanders back in from lunch. Pallop is asked what his Ph.D. is about. "Pretty much this," he says, waving a languid hand around the room.

  "It looks like we will have to write it for him," Taleb chimes in, "because Pollop is very lazy."

  What Empirica has done is to invert the traditional psychology of investing. You and I, if we invest conventionally in the market, have a fairly large chance of making a small amount of money in a given day from dividends or interest or the general upward trend of the market. We have almost no chance of making a large amount of money in one day, and there is a very small, but real, possibility that if the market collapses we could blow up. We accept that distribution of risks because, for fundamental reasons, it feels right. In the book that Pallop was reading by Kahneman and Tversky, for example, there is a description of a simple experiment, where a group of people were told to imagine that they had three hundred dollars. They were then given a choice between (a) receiving another hundred dollars or (b) tossing a coin, where if they won they got two hundred dollars and if they lost they got nothing. Most of us, it turns out, prefer (a) to (b). But then Kahneman and Tversky did a second experiment. They told people to imagine that they had five hundred dollars, and then asked them if they would rather (c) give up a hundred dollars or (d) toss a coin and pay two hundred dollars if they lost and nothing at all if they won. Most of us now prefer (d) to (c). What is interesting about those four choices is that, from a probabilistic standpoint, they are identical. They all yield an expected outcome of four hundred dollars. Nonetheless, we have strong preferences among them. Why? Because we're more willing to gamble when it comes to losses, but are risk averse when it comes to our gains. That's why we like small daily winnings in the stock market, even if that requires that we risk losing everything in a crash.
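
  That the four choices are probabilistically identical is a matter of simple expected value, which a few lines of Python make explicit; the labels below just restate the gambles described above:

    # Expected final amount for each of the four framed choices.
    choices = {
        "(a) start with 300, take a sure 100":                    300 + 100,
        "(b) start with 300, flip a coin for 200 or nothing":     300 + 0.5 * 200,
        "(c) start with 500, give up 100":                        500 - 100,
        "(d) start with 500, flip a coin to lose 200 or nothing": 500 - 0.5 * 200,
    }
    for label, expected in choices.items():
        print(label, "->", expected)       # each one works out to four hundred dollars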

  At Empirica, by contrast, every day brings a small but real possibility that they'll make a huge amount of money in a day; no chance that they'll blow up; and a very large possibility that they'll lose a small amount of money. All those dollar, and fifty-cent, and nickel options that Empirica has accumulated, few of which will ever be used, soon begin to add up. By looking at a particular column on the computer screens showing Empirica's positions, anyone at the firm can tell you precisely how much money Empirica has lost or made so far that day. At 11:30 A.M., for instance, they had recovered just twenty-eight per cent of the money they had spent that day on options. By 12:30, they had recovered forty per cent, meaning that the day was not yet half over and Empirica was already in the red to the tune of several hundred thousand dollars. The day before that, it had made back eighty-five per cent of its money; the day before that, forty-eight per cent; the day before that, sixty-five per cent; and the day before that also sixty-five per cent; and, in fact, with a few notable exceptions, like the few days when the market reopened after September 11th, Empirica has done nothing but lose money since last April. "We cannot blow up, we can only bleed to death," Taleb says, and bleeding to death, absorbing the pain of steady losses, is precisely what human beings are hardwired to avoid. "Say you've got a guy who is long on Russian bonds," Savery says. "He's making money every day. One day, lightning strikes and he loses five times what he made. Still, on three hundred and sixty-four out of three hundred and sixty-five days he was very happily making money. It's much harder to be the other guy, the guy losing money three hundred and sixty-four days out of three hundred and sixty-five, because you start questioning yourself. Am I ever going to make it back? Am I really right? What if it takes ten years? Will I even be sane ten years from now?" What the normal trader gets from his daily winnings is feedback, the pleasing illusion of progress. At Empirica, there is no feedback. "It's like you're playing the piano for ten years and you still can't play chopsticks," Spitznagel says, "and the only thing you have to keep you going is the belief that one day you'll wake up and play like Rachmaninoff." Was it easy knowing that Niederhoffer — who represented everything they thought was wrong — was out there getting rich while they were bleeding away? Of course it wasn't. If you watched Taleb closely that day, you could see the little ways in which the steady drip of losses takes a toll. He glanced a bit too much at the Bloomberg. He leaned forward a bit too often to see the daily loss count. He succumbs to an array of superstitious tics. If the going is good, he parks in the same space every day; he turned against Mahler because he associates Mahler with the last year's long dry spell. "Nassim says all the time that he needs me there, and I believe him," Spitznagel says. He is there to remind Taleb that there is a point to waiting, to help Taleb resist the very human impulse to abandon everything and stanch the pain of losing. "Mark is my cop," Taleb says. So is Pallop: he is there to remind Taleb that Empirica has the intellectual edge.

  "The key is not having the ideas but having the recipe to deal with your ideas," Taleb says. "We don't need moralizing. We need a set of tricks." His trick is a protocol that stipulates precisely what has to be done in every situation. "We built the protocol, and the reason we did was to tell the guys, Don't listen to me, listen to the protocol. Now, I have the right to change the protocol, but there is a protocol to changing the protocol. We have to be hard on ourselves to do what we do. The bias we see in Niederhoffer we see in ourselves." At the quant dinner, Taleb devoured his roll, and as the busboy came around with more rolls Taleb shouted out "No, no!" and blocked his plate. It was a never-ending struggle, this battle between head and heart. When the waiter came around with wine, he hastily covered the glass with his hand. When the time came to order, he asked for steak frites — without the frites, please! — and then immediately tried to hedge his choice by negotiating with the person next to him for a fraction of his frites.

  The psychologist Walter Mischel has done a series of experiments where he puts a young child in a room and places two cookies in front of him, one small and one large. The child is told that if he wants the small cookie he need only ring a bell and the experimenter will come back into the room and give it to him. If he wants the better treat, though, he has to wait until the experimenter returns on his own, which might be anytime in the next twenty minutes. Mischel has videotapes of six-year-olds, sitting in the room by themselves, staring at the cookies, trying to persuade themselves to wait. One girl starts to sing to herself. She whispers what seems to be the instructions — that she can have the big cookie if she can only wait. She closes her eyes. Then she turns her back on the cookies. Another little boy swings his legs violently back and forth, and then picks up the bell and examines it, trying to do anything but think about the cookie he could get by ringing it. The tapes document the beginnings of discipline and self-control — the techniques we learn to keep our impulses in check — and to watch all the children desperately distracting themselves is to experience the shock of recognition: that's Nassim Taleb!

  There is something else as well that helps to explain Taleb's resolve — more than the tics and the systems and the self-denying ordinances. It happened a year or so before he went to see Niederhoffer. Taleb had been working as a trader at the Chicago Mercantile Exchange, and developed a persistently hoarse throat. At first, he thought nothing of it: a hoarse throat was an occupational hazard of spending every day in the pit. Finally, when he moved back to New York, he went to see a doctor, in one of those Upper East Side prewar buildings with a glamorous façade. Taleb sat in the office, staring out at the plain brick of the courtyard, reading the medical diplomas on the wall over and over, waiting and waiting for the verdict. The doctor returned and spoke in a low, grave voice: "I got the pathology report. It's not as bad as it sounds . . ." But, of course, it was: he had throat cancer. Taleb's mind shut down. He left the office. It was raining outside. He walked and walked and ended up at a medical library. There he read frantically about his disease, the rainwater forming a puddle under his feet. It made no sense. Throat cancer was the disease of someone who had spent a lifetime smoking heavily. But Taleb was young, and he barely smoked at all. His risk of getting throat cancer was something like one in a hundred thousand, almost unimaginably small. He was a black swan! The cancer is now beaten, but the memory of it is also Taleb's secret, because once you have been a black swan — not just seen one, but lived and faced death as one — it becomes easier to imagine another on the horizon.

  As the day came to an end, Taleb and his team turned their attention once again to the problem of the square root of n. Taleb was back at the whiteboard. Spitznagel was looking on. Pallop was idly peeling a banana. Outside, the sun was beginning to settle behind the trees. "You do a conversion to p1 and p2," Taleb said. His marker was once again squeaking across the whiteboard. "We say we have a Gaussian distribution, and you have the market switching from a low-volume regime to a high-volume. P21. P22. You have your eigenvalue." He frowned and stared at his handiwork. The markets were now closed. Empirica had lost money, which meant that somewhere off in the woods of Connecticut Niederhoffer had no doubt made money. That hurt, but if you steeled yourself, and thought about the problem at hand, and kept in mind that someday the market would do something utterly unexpected because in the world we live in something utterly unexpected always happens, then the hurt was not so bad. Taleb eyed his equations on the whiteboard, and arched an eyebrow. It was a very difficult problem. "Where is Dr. Wu? Should we call in Dr. Wu?"

  4.

  A year after Nassim Taleb came to visit him, Victor Niederhoffer blew up. He sold a very large number of options on the S. & P. index, taking millions of dollars from other traders in exchange for promising to buy a basket of stocks from them at current prices, if the market ever fell. It was an unhedged bet, or what was called on Wall Street a "naked put," meaning that he bet everything on one outcome: he bet in favor of the large probability of making a small amount of money, and against the small probability of losing a large amount of money, and he lost. On October 27, 1997, the market plummeted eight per cent, and all of the many, many people who had bought those options from Niederhoffer came calling all at once, demanding that he buy back their stocks at pre-crash prices. He ran through a hundred and thirty million dollars — his cash reserves, his savings, his other stocks — and when his broker came and asked for still more he didn't have it. In a day, one of the most successful hedge funds in America was wiped out. Niederhoffer had to shut down his firm. He had to mortgage his house. He had to borrow money from his children. He had to call Sotheby's and sell his prized silver collection — the massive nineteenth-century Brazilian "sculptural group of victory" made for the Visconde de Figueiredo, the massive silver bowl designed in 1887 by Tiffany & Company for the James Gordon Bennett Cup yacht race, and on and on. He stayed away from the auction. He couldn't bear to watch.

  "It was one of the worst things that has ever happened to me in my life, right up there with the death of those closest to me," Niederhoffer said recently. It was a Saturday in March, and he was in the library of his enormous house. Two weary-looking dogs wandered in and out. He is a tall man, an athlete, thick through the upper body and trunk, with a long, imposing face and baleful, hooded eyes. He was shoeless. One collar on his shirt was twisted inward, and he looked away as he talked. "I let down my friends. I lost my business. I was a major money manager. Now I pretty much have had to start from ground zero." He paused. "Five years have passed. The beaver builds a dam. The river washes it away, so he tries to build a better foundation, and I think I have. But I'm always mindful of the possibility of more failures." In the distance, there was a knock on the door. It was a man named Milton Bond, an artist who had come to present Niederhoffer with a painting he had done of Moby Dick ramming the Pequod. It was in the folk-art style that Niederhoffer likes so much, and he went to meet Bond in the foyer, kneeling down in front of the painting as Bond unwrapped it. Niederhoffer has other paintings of the Pequod in his house, and paintings of the Essex, the ship on which Melville's story was based. In his office, on a prominent wall, is a painting of the Titanic. They were, he said, his way of staying humble. "One of the reasons I've paid lots of attention to the Essex is that it turns out that the captain of the Essex, as soon as he got back to Nantucket, was given another job," Niederhoffer said. "They thought he did a good job in getting back after the ship was rammed. The captain was asked, `How could people give you another ship?' And he said, `I guess on the theory that lightning doesn't strike twice.' It was a fairly random thing. But then he was given the other ship, and that one foundered, too. Got stuck in the ice. At that time, he was a lost man. He wouldn't even let them save him. They had to forcibly remove him from the ship. He spent the rest of his life as a janitor in Nantucket. He became what on Wall Street they call a ghost." Niederhoffer was back in his study now, his lanky body stretched out, his feet up on the table, his eyes a little rheumy. "You see? I can't afford to fail a second time. Then I'll be a total washout. That's the significance of the Pequod."

  A month or so before he blew up, Taleb had dinner with Niederhoffer at a restaurant in Westport, and Niederhoffer told him that he had been selling naked puts. You can imagine the two of them across the table from each other, Niederhoffer explaining that his bet was an acceptable risk, that the odds of the market going down so heavily that he would be wiped out were minuscule, and Taleb listening and shaking his head, and thinking about black swans. "I was depressed when I left him," Taleb said. "Here is a guy who goes out and hits a thousand backhands. He plays chess like his life depends on it. Here is a guy who, whatever he wakes up in the morning and decides to do, ends up doing it better than anyone else. I was talking to my hero . . ." This was the reason Taleb didn't want to be Niederhoffer when Niederhoffer was at his height — the reason he didn't want the silver and the house and the tennis matches with George Soros. He could see all too clearly where it all might end up. In his mind's eye, he could envision Niederhoffer borrowing money from his children, and selling off his silver, and talking in a hollow voice about letting down his friends, and Taleb did not know if he had the strength to live with that possibility. Unlike Niederhoffer, Taleb never thought he was invincible. You couldn't if you had watched your homeland blow up, and had been the one person in a hundred thousand who gets throat cancer, and so for Taleb there was never any alternative to the painful process of insuring himself against catastrophe.

  This kind of caution does not seem heroic, of course. It seems like the joyless prudence of the accountant and the Sunday-school teacher. The truth is that we are drawn to the Niederhoffers of this world because we are all, at heart, like Niederhoffer: we associate the willingness to risk great failure — and the ability to climb back from catastrophe — with courage. But in this we are wrong. That is the lesson of Taleb and Niederhoffer, and also the lesson of our volatile times. There is more courage and heroism in defying the human impulse, in taking the purposeful and painful steps to prepare for the unimaginable.

  Last fall, Niederhoffer sold a large number of options, betting that the markets would be quiet, and they were, until out of nowhere two planes crashed into the World Trade Center. "I was exposed. It was nip and tuck." Niederhoffer shook his head, because there was no way to have anticipated September 11th. "That was a totally unexpected event."

  Looking for method in the mess.

  1.

  On a busy day, a typical air-traffic controller might be in charge of as many as twenty-five airplanes at a time — some ascending, some descending, each at a different altitude and travelling at a different speed. He peers at a large, monochromatic radar console, tracking the movement of tiny tagged blips moving slowly across the screen. He talks to the sector where a plane is headed, and talks to the pilots passing through his sector, and talks to the other controllers about any new traffic on the horizon. And, as a controller juggles all those planes overhead, he scribbles notes on little pieces of paper, moving them around on his desk as he does. Air-traffic control depends on computers and radar. It also depends, heavily, on paper and ink.

  When people talk about the need to modernize the American air-traffic-control system, this is, in large part, what they are referring to. Whenever a plane takes off, the basic data about the flight — the type of plane, the radar I.D. number, the requested altitude, the destination — are printed out on a stiff piece of paper, perhaps one and a half by six and a half inches, known as a flight strip. And as the plane passes through each sector of the airspace the controller jots down, using a kind of shorthand, everything new that is happening to the plane — its speed, say, and where it's heading, clearances from ground control, holding instructions, comments on the pilot. It's a method that dates back to the days before radar, and it drives critics of the air-traffic-control system crazy. Why, in this day and age, are planes being handled like breakfast orders in a roadside diner?

  This is one of the great puzzles of the modern workplace. Computer technology was supposed to replace paper. But that hasn't happened. Every country in the Western world uses more paper today, on a per-capita basis, than it did ten years ago. The consumption of uncoated free-sheet paper, for instance — the most common kind of office paper — rose almost fifteen per cent in the United States between 1995 and 2000. This is generally taken as evidence of how hard it is to eradicate old, wasteful habits and of how stubbornly resistant we are to the efficiencies offered by computerization. A number of cognitive psychologists and ergonomics experts, however, don't agree. Paper has persisted, they argue, for very good reasons: when it comes to performing certain kinds of cognitive tasks, paper has many advantages over computers. The dismay people feel at the sight of a messy desk — or the spectacle of air-traffic controllers tracking flights through notes scribbled on paper strips — arises from a fundamental confusion about the role that paper plays in our lives.

  2.

  The case for paper is made most eloquently in "The Myth of the Paperless Office" (M.I.T.; $24.95), by two social scientists, Abigail Sellen and Richard Harper. They begin their book with an account of a study they conducted at the International Monetary Fund, in Washington, D.C. Economists at the I.M.F. spend most of their time writing reports on complicated economic questions, work that would seem to be perfectly suited to sitting in front of a computer. Nonetheless, the I.M.F. is awash in paper, and Sellen and Harper wanted to find out why. Their answer is that the business of writing reports — at least at the I.M.F. — is an intensely collaborative process, involving the professional judgments and contributions of many people. The economists bring drafts of reports to conference rooms, spread out the relevant pages, and negotiate changes with one another. They go back to their offices and jot down comments in the margin, taking advantage of the freedom offered by the informality of the handwritten note. Then they deliver the annotated draft to the author in person, taking him, page by page, through the suggested changes. At the end of the process, the author spreads out all the pages with comments on his desk and starts to enter them on the computer — moving the pages around as he works, organizing and reorganizing, saving and discarding.

  Without paper, this kind of collaborative, iterative work process would be much more difficult. According to Sellen and Harper, paper has a unique set of "affordances" — that is, qualities that permit specific kinds of uses. Paper is tangible: we can pick up a document, flip through it, read little bits here and there, and quickly get a sense of it. (In another study on reading habits, Sellen and Harper observed that in the workplace, people almost never read a document sequentially, from beginning to end, the way they would read a novel.) Paper is spatially flexible, meaning that we can spread it out and arrange it in the way that suits us best. And it's tailorable: we can easily annotate it, and scribble on it as we read, without altering the original text. Digital documents, of course, have their own affordances. They can be easily searched, shared, stored, accessed remotely, and linked to other relevant material. But they lack the affordances that really matter to a group of people working together on a report. Sellen and Harper write:

  Because paper is a physical embodiment of information, actions performed in relation to paper are, to a large extent, made visible to one's colleagues. Reviewers sitting around a desk could tell whether a colleague was turning toward or away from a report; whether she was flicking through it or setting it aside. Contrast this with watching someone across a desk looking at a document on a laptop. What are they looking at? Where in the document are they? Are they really reading their e-mail? Knowing these things is important because they help a group coördinate its discussions and reach a shared understanding of what is being discussed.

  3.

  Paper enables a certain kind of thinking. Picture, for instance, the top of your desk. Chances are that you have a keyboard and a computer screen off to one side, and a clear space roughly eighteen inches square in front of your chair. What covers the rest of the desktop is probably piles — piles of papers, journals, magazines, binders, postcards, videotapes, and all the other artifacts of the knowledge economy. The piles look like a mess, but they aren't. When a group at Apple Computer studied piling behavior several years ago, they found that even the most disorderly piles usually make perfect sense to the piler, and that office workers could hold forth in great detail about the precise history and meaning of their piles. The pile closest to the cleared, eighteen-inch-square working area, for example, generally represents the most urgent business, and within that pile the most important document of all is likely to be at the top. Piles are living, breathing archives. Over time, they get broken down and resorted, sometimes chronologically and sometimes thematically and sometimes chronologically and thematically; clues about certain documents may be physically embedded in the pile by, say, stacking a certain piece of paper at an angle or inserting dividers into the stack.

  But why do we pile documents instead of filing them? Because piles represent the process of active, ongoing thinking. The psychologist Alison Kidd, whose research Sellen and Harper refer to extensively, argues that "knowledge workers" use the physical space of the desktop to hold "ideas which they cannot yet categorize or even decide how they might use." The messy desk is not necessarily a sign of disorganization. It may be a sign of complexity: those who deal with many unresolved ideas simultaneously cannot sort and file the papers on their desks, because they haven't yet sorted and filed the ideas in their head. Kidd writes that many of the people she talked to use the papers on their desks as contextual cues to "recover a complex set of threads without difficulty and delay" when they come in on a Monday morning, or after their work has been interrupted by a phone call. What we see when we look at the piles on our desks is, in a sense, the contents of our brains.

  Sellen and Harper arrived at similar findings when they did some consulting work with a chocolate manufacturer. The people in the firm they were most interested in were the buyers — the staff who handled the company's relationships with its venders, from cocoa and sugar manufacturers to advertisers. The buyers kept folders (containing contracts, correspondence, meeting notes, and so forth) on every supplier they had dealings with. The company wanted to move the information in those documents online, to save space and money, and make it easier for everyone in the firm to have access to it. That sounds like an eminently rational thing to do. But when Sellen and Harper looked at the folders they discovered that they contained all kinds of idiosyncratic material — advertising paraphernalia, printouts of e-mails, presentation notes, and letters — much of which had been annotated in the margins with thoughts and amendments and, they write, "perhaps most important, comments about problems and issues with a supplier's performance not intended for the supplier's eyes." The information in each folder was organized — if it was organized at all — according to the whims of the particular buyer. Whenever other people wanted to look at a document, they generally had to be walked through it by the buyer who "owned" it, because it simply wouldn't make sense otherwise. The much advertised advantage of digitizing documents — that they could be made available to anyone, at any time — was illusory: documents cannot speak for themselves. "All of this emphasized that most of what constituted a buyer's expertise resulted from involvement with the buyer's own suppliers through a long history of phone calls and meetings," Sellen and Harper write:

  The correspondence, notes, and other documents such discussions would produce formed a significant part of the documents buyers kept. These materials therefore supported rather than constituted the expertise of the buyers. In other words, the knowledge existed not so much in the documents as in the heads of the people who owned them — in their memories of what the documents were, in their knowledge of the history of that supplier relationship, and in the recollections that were prompted whenever they went through the files.

  4.

  This idea that paper facilitates a highly specialized cognitive and social process is a far cry from the way we have historically thought about the stuff. Paper first began to proliferate in the workplace in the late nineteenth century as part of the move toward "systematic management." To cope with the complexity of the industrial economy, managers were instituting company-wide policies and demanding monthly, weekly, or even daily updates from their subordinates. Thus was born the monthly sales report, and the office manual and the internal company newsletter. The typewriter took off in the eighteen-eighties, making it possible to create documents in a fraction of the time it had previously taken, and that was followed closely by the advent of carbon paper, which meant that a typist could create ten copies of that document simultaneously. If you were, say, a railroad company, then you would now have a secretary at the company headquarters type up a schedule every week, setting out what train was travelling in what direction at what time, because in the mid-nineteenth century collisions were a terrible problem. Then the secretary would make ten carbon copies of that schedule and send them out to the stations along your railway line. Paper was important not to facilitate creative collaboration and thought but as an instrument of control.

  Perhaps no one embodied this notion more than the turn-of-the-century reformer Melvil Dewey. Dewey has largely been forgotten by history, perhaps because he was such a nasty fellow — an outspoken racist and anti-Semite — but in his day he dominated America's thinking about the workplace. He invented the Dewey decimal system, which revolutionized the organization of libraries. He was an ardent advocate of shorthand and of the metric system, and was so obsessed with time-saving and simplification that he changed his first name from Melville to the more logical Melvil. (He also pushed for the adoption of "catalog" in place of "catalogue," and of "thruway" to describe major highways, a usage that survives to this day in New York State). Dewey's principal business was something called the Library Bureau, which was essentially the Office Depot of his day, selling card catalogues, cabinets, office chairs and tables, pre-printed business forms, and, most important, filing cabinets. Previously, businessmen had stored their documents in cumbersome cases, or folded and labelled the pieces of paper and stuck them in the pigeonholes of the secretary desks so common in the Victorian era. What Dewey proposed was essentially an enlarged version of a card catalogue, where paper documents hung vertically in long drawers.

  The vertical file was a stunning accomplishment. In those efficiency-obsessed days, it prompted books and articles and debates and ended up winning a gold medal at the 1893 World's Fair, because it so neatly addressed the threat of disorder posed by the proliferation of paper. What good was that railroad schedule, after all, if it was lost on someone's desk? Now a railroad could buy one of Dewey's vertical filing cabinets, and put the schedule under "S," where everyone could find it. In "Scrolling Forward: Making Sense of Documents in the Digital Age" (Arcade; $24.95), the computer scientist David M. Levy argues that Dewey was the anti-Walt Whitman, and that his vision of regularizing and standardizing life ended up being just as big a component of the American psyche as Whitman's appeal to embrace the world just as it is. That seems absolutely right. The fact is, the thought of all those memos and reports and manuals made Dewey anxious, and that anxiety has never really gone away, even in the face of evidence that paper is no longer something to be anxious about.

  When Thomas Edison invented the phonograph, for example, how did he imagine it would be used? As a dictation device that a businessman could pass around the office in place of a paper memo. In 1945, the computer pioneer Vannevar Bush imagined what he called a "memex" — a mechanized library and filing cabinet, on which an office worker would store all his relevant information without the need for paper files at all. So, too, with the information-technology wizards who have descended on the workplace in recent years. Instead of a real desktop, they have offered us the computer desktop, where cookie-cutter icons run in orderly rows across a soothing background, implicitly promising to bring order to the chaos of our offices.

  Sellen and Harper include in their book a photograph of an office piled high with stacks of paper. The occupant of the office — a researcher in Xerox's European research facility — was considered neither ineffective nor inefficient. Quite the contrary: he was, they tell us, legendary for being able to find any document in his office very quickly. But the managers of the laboratory were uncomfortable with his office because of what it said about their laboratory. They were, after all, an organization looking to develop digital workplace solutions. "They wanted to show that this was a workplace reaching out to the future rather than being trapped in an inefficient past," Sellen and Harper write. "Yet, if this individual's office was anything to go by, the reality was that this workplace of the future was full of paper." Whenever senior colleagues came by the office, then, the man with the messy desk was instructed to put his papers in boxes and hide them under the stairs. The irony is, of course, that it was not the researcher who was trapped in an inefficient past but the managers. They were captives of the nineteenth-century notion that paper was most useful when it was put away. They were channelling Melvil Dewey. But this is a different era. In the tasks that face modern knowledge workers, paper is most useful out in the open, where it can be shuffled and sorted and annotated and spread out. The mark of the contemporary office is not the file. It's the pile.

  5.

  Air-traffic controllers are quintessential knowledge workers. They perform a rarefied version of the task faced by the economists at the I.M.F. when they sit down at the computer with the comments and drafts of five other people spread around them, or the manager when she gets to her office on Monday morning, looks at the piles of papers on her desk, and tries to make sense of all the things she has to do in the coming week. When an air-traffic controller looks at his radar, he sees a two-dimensional picture of where the planes in his sector are. But what he needs to know is where his planes will be. He has to be able to take the evidence from radar, what he hears from the pilots and other controllers, and what he has written down on the flight strips in front of him, and construct a three-dimensional "picture" of all the planes in his sector. Psychologists call the ability to create that mental picture "situation awareness." "Situation awareness operates on three levels," says Mica Endsley, the president of S.A. Technologies, in Georgia, and perhaps the country's leading expert on the subject. "One is perceiving. Second is understanding what the information means — analogous to reading comprehension. That's where you or I would have problems. We'd see the blips on the screen, and it wouldn't mean anything to us. The highest level, though, is projection — the ability to predict which aircraft are coming in and when. You've got to be able to look into the future, probably by as much as five minutes."

  Psychologists believe that those so-called flight strips play a major role in helping controllers achieve this situation awareness. Recently, for example, Wendy Mackay, a computer scientist now working in Paris, spent several months at an air-traffic-control facility near Orly Airport, in Paris. The French air-traffic-control system is virtually identical to the American system. One controller, the radar controller, is responsible for the radar. He has a partner, the planning controller, whose job is to alert the radar controller to incoming traffic, and what Mackay observed was how beautifully the strips enable efficient interaction between these two people. The planning controller, for instance, overhears what his partner is saying on the radio, and watches him annotate strips. If he has a new strip, he might keep it just out of his partner's visual field until it is relevant. "She [the planner] moves it into his peripheral view if the strip should be dealt with soon, but not immediately," Mackay writes. "If the problem is urgent, she will physically move it into his focal view, placing the strip on top of the stripboard or, rarely, inserting it."

  Those strips moving in and out of the peripheral view of the controller serve as cognitive cues, which the controller uses to help keep the "picture" of his sector clear in his head. When taking over a control position, controllers touch and rearrange the strips in front of them. When they are given a new strip, they are forced mentally to register a new flight and the new traffic situation. By writing on the strips, they can off-load information, keeping their minds free to attend to other matters. The controller's flight strips are like the piles of paper on a desk: they are the physical manifestations of what goes on inside his head. Is it any wonder that the modernization of the air-traffic-control system has taken so long? No one wants to do anything that might disrupt that critical mental process.

  This is, of course, a difficult conclusion for us to accept. Like the managers of the office-technology lab, we have in our heads the notion that an air-traffic-control center ought to be a pristine and gleaming place, full of the latest electronic gadgetry. We think of all those flight strips as cluttering and confusing the work of the office, and we fret about where all that paper will go. But, as Sellen and Harper point out, we needn't worry. It is only if paper's usefulness is in the information written directly on it that it must be stored. If its usefulness lies in the promotion of ongoing creative thinking, then, once that thinking is finished, the paper becomes superfluous. The solution to our paper problem, they write, is not to use less paper but to keep less paper. Why bother filing at all? Everything we know about the workplace suggests that few if any knowledge workers ever refer to documents again once they have filed them away, which should come as no surprise, since paper is a lousy way to archive information. It's too hard to search and it takes up too much space. Besides, we all have the best filing system ever invented, right there on our desks — the personal computer. That is the irony of the P.C.: the workplace problem that it solves is the nineteenth-century anxiety. It's a better filing cabinet than the original vertical file, and if Dewey were alive today, he'd no doubt be working very happily in an information-technology department somewhere. The problem that paper solves, by contrast, is the problem that most concerns us today, which is how to support knowledge work. In fretting over paper, we have been tripped up by a historical accident of innovation, confused by the assumption that the most important invention is always the most recent. Had the computer come first — and paper second — no one would raise an eyebrow at the flight strips cluttering our air-traffic-control centers.

  What Stanley H. Kaplan taught us about the SAT.

  1.

  Once, in fourth grade, Stanley Kaplan got a B-plus on his report card and was so stunned that he wandered aimlessly around the neighborhood, ashamed to show his mother. This was in Brooklyn, on Avenue K in Flatbush, between the wars. Kaplan's father, Julius, was from Slutsk, in Belorussia, and ran a plumbing and heating business. His mother, Ericka, ninety pounds and four feet eight, was the granddaughter of the chief rabbi of the synagogue of Prague, and Stanley loved to sit next to her on the front porch, immersed in his schoolbooks while his friends were off playing stickball. Stanley Kaplan had Mrs. Holman for fifth grade, and when she quizzed the class on math equations, he would shout out the answers. If other students were having problems, Stanley would take out pencil and paper and pull them aside. He would offer them a dime, sometimes, if they would just sit and listen. In high school, he would take over algebra class, and the other kids, passing him in the hall, would call him Teach. One classmate, Aimee Rubin, was having so much trouble with math that she was in danger of being dropped from the National Honor Society. Kaplan offered to help her, and she scored a ninety-five on her next exam. He tutored a troubled eleven-year-old named Bob Linker, and Bob Linker ended up a successful businessman. In Kaplan's sophomore year at City College, he got a C in biology and was so certain that there had been a mistake that he marched in to see the professor and proved that his true grade, an A, had accidentally been switched with that of another, not quite so studious, Stanley Kaplan. Thereafter, he became Stanley H. Kaplan, and when people asked him what the "H" stood for he would say "Higher scores!" or, with a sly wink, "Preparation!" He graduated Phi Beta Kappa and hung a shingle outside his parents' house on Avenue K, "Stanley H. Kaplan Educational Center," and started tutoring kids in the basement. In 1946, a high-school junior named Elizabeth, from Coney Island, came to him for help on an exam he was unfamiliar with. It was called the Scholastic Aptitude Test, and from that moment forward the business of getting into college in America was never quite the same.

  The S.A.T., at that point, was just beginning to go into widespread use. Unlike existing academic exams, it was intended to measure innate ability - not what a student had learned but what a student was capable of learning - and it stated clearly in the instructions that "cramming or last-minute reviewing" was pointless. Kaplan was puzzled. In Flatbush you always studied for tests. He gave Elizabeth pages of math problems and reading-comprehension drills. He grilled her over and over, doing what the S.A.T. said should not be done. And what happened? On test day, she found the S.A.T. "a piece of cake," and promptly told all her friends, and her friends told their friends, and soon word of Stanley H. Kaplan had spread throughout Brooklyn.

  A few years later, Kaplan married Rita Gwirtzman, who had grown up a mile away, and in 1951 they moved to a two-story brick-and-stucco house on Bedford Avenue, a block from his alma mater, James Madison High School. He renovated his basement, dividing it into classrooms. When the basement got too crowded, he rented a podiatrist's office near King's Highway, at the Brighton Beach subway stop. In the nineteen-seventies, he went national, setting up educational programs throughout the country, creating an S.A.T.-preparation industry that soon became crowded with tutoring companies and study manuals. Kaplan has now written a memoir, "Test Pilot" (Simon & Schuster; $19), which has as its subtitle "How I Broke Testing Barriers for Millions of Students and Caused a Sonic Boom in the Business of Education." That actually understates his importance. Stanley Kaplan changed the rules of the game.

  2.

  The S.A.T. is now seventy-five years old, and it is in trouble. Earlier this year, the University of California - the nation's largest public-university system - stunned the educational world by proposing a move toward a "holistic" admissions system, which would mean abandoning its heavy reliance on standardized-test scores. The school backed up its proposal with a devastating statistical analysis, arguing that the S.A.T. is virtually useless as a tool for making admissions decisions.

  The report focussed on what is called predictive validity, a statistical measure of how well a high-school student's performance in any given test or program predicts his or her performance as a college freshman. If you wanted to, for instance, you could calculate the predictive validity of prowess at Scrabble, or the number of books a student reads in his senior year, or, more obviously, high-school grades. What the Educational Testing Service (which creates the S.A.T.) and the College Board (which oversees it) have always argued is that most performance measures are so subjective and unreliable that only by adding aptitude-test scores into the admissions equation can a college be sure it is picking the right students.

  This is what the U.C. study disputed. It compared the predictive validity of three numbers: a student's high-school G.P.A., his or her score on the S.A.T. (or, as it is formally known, the S.A.T. I), and his or her score on what is known as the S.A.T. II, which is a so-called achievement test, aimed at gauging mastery of specific areas of the high-school curriculum. Drawing on the transcripts of seventy-eight thousand University of California freshmen from 1996 through 1999, the report found that, over all, the most useful statistic in predicting freshman grades was the S.A.T. II, which explained sixteen per cent of the "variance" (which is another measure of predictive validity). The second most useful was high-school G.P.A., at 15.4 per cent. The S.A.T. was the least useful, at 13.3 per cent. Combining high-school G.P.A. and the S.A.T. II explained 22.2 per cent of the variance in freshman grades. Adding in S.A.T. I scores increased that number by only 0.1 per cent. Nor was the S.A.T. better at what one would have thought was its strong suit: identifying high-potential students from bad schools. In fact, the study found that achievement tests were ten times more useful than the S.A.T. in predicting the success of students from similar backgrounds. "Achievement tests are fairer to students because they measure accomplishment rather than promise," Richard Atkinson, the president of the University of California, told a conference on college admissions last month. "They can be used to improve performance; they are less vulnerable to charges of cultural or socioeconomic bias; and they are more appropriate for schools because they set clear curricular guidelines and clarify what is important for students to learn. Most important, they tell students that a college education is within the reach of anyone with the talent and determination to succeed."
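
  The "variance explained" figures in the U.C. report are, in essence, R-squared values: fit a regression of freshman grades on a set of predictors and ask what share of the spread in those grades the fit accounts for. The sketch below, in Python, illustrates the idea with invented numbers - they are not the U.C. data, and every variable name is hypothetical - including the tell-tale pattern in which a predictor that is highly correlated with one already in the model (as the S.A.T. I is with the S.A.T. II) adds almost nothing.

import numpy as np

# Illustrative only: synthetic stand-ins for predictors and freshman grades.
rng = np.random.default_rng(0)
n = 1000
gpa = rng.normal(3.3, 0.4, n)              # high-school G.P.A.
sat2 = rng.normal(600, 80, n)              # S.A.T. II score
sat1 = 0.9 * sat2 + rng.normal(0, 40, n)   # S.A.T. I, strongly correlated with S.A.T. II
grades = 0.5 * gpa + 0.004 * sat2 + rng.normal(0, 0.7, n)

def r_squared(predictors, outcome):
    """Share of the variance in `outcome` captured by a least-squares fit."""
    X = np.column_stack([np.ones(len(outcome))] + predictors)
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    residuals = outcome - X @ beta
    return 1 - residuals.var() / outcome.var()

base = r_squared([gpa, sat2], grades)
full = r_squared([gpa, sat2, sat1], grades)
print(f"G.P.A. + S.A.T. II explain {base:.1%} of the variance")
print(f"Adding the S.A.T. I raises that to {full:.1%}")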

  This argument has been made before, of course. The S.A.T. has been under attack, for one reason or another, since its inception. But what is happening now is different. The University of California is one of the largest single customers of the S.A.T. It was the U.C. system's decision, in 1968, to adopt the S.A.T. that affirmed the test's national prominence in the first place. If U.C. defects from the S.A.T., it is not hard to imagine it being followed by a stampede of other colleges. Seventy-five years ago, the S.A.T. was instituted because we were more interested, as a society, in what a student was capable of learning than in what he had already learned. Now, apparently, we have changed our minds, and few people bear more responsibility for that shift than Stanley H. Kaplan.

  3.

  From the moment he set up shop on Avenue K, Stanley Kaplan was a pariah in the educational world. Once, in 1956, he went to a meeting for parents and teachers at a local high school to discuss the upcoming S.A.T., and one of the teachers leading the meeting pointed his finger at Kaplan and shouted, "I refuse to continue until THAT MAN leaves the room." When Kaplan claimed that his students routinely improved their scores by a hundred points or more, he was denounced by the testing establishment as a "quack" and "the cram king" and a "snake oil salesman." At the Educational Testing Service, "it was a cherished assumption that the S.A.T. was uncoachable," Nicholas Lemann writes in his history of the S.A.T., "The Big Test":

  The whole idea of psychometrics was that mental tests are a measurement of a psychical property of the brain, analogous to taking a blood sample. By definition, the test-taker could not affect the result. More particularly, E.T.S.'s main point of pride about the S.A.T. was its extremely high test-retest reliability, one of the best that any standardized test had ever achieved. . . . So confident of the S.A.T.'s reliability was E.T.S. that the basic technique it developed for catching cheaters was simply to compare first and second scores, and to mount an investigation in the case of any very large increase. E.T.S. was sure that substantially increasing one's score could be accomplished only by nefarious means.

  But Kaplan wasn't cheating. His great contribution was to prove that the S.A.T. was eminently coachable - that whatever it was that the test was measuring was less like a blood sample than like a heart rate, a vital sign that could be altered through the right exercises. In those days, for instance, the test was a secret. Students walking in to take the S.A.T. were often in a state of terrified ignorance about what to expect. (It wasn't until the early eighties that the E.T.S. was forced to release copies of old test questions to the public.) So Kaplan would have "Thank Goodness It's Over" pizza parties after each S.A.T. As his students talked about the questions they had faced, he and his staff would listen and take notes, trying to get a sense of how better to structure their coaching. "Every night I stayed up past midnight writing new questions and study materials," he writes. "I spent hours trying to understand the design of the test, trying to think like the test makers, anticipating the types of questions my students would face." His notes were typed up the next day, cranked out on a Gestetner machine, hung to dry in the office, then snatched off the line and given to waiting students. If students knew what the S.A.T. was like, he reasoned, they would be more confident. They could skip the instructions and save time. They could learn how to pace themselves. They would guess more intelligently. (For a question with five choices, a right answer is worth one point but a wrong answer results in minus one-quarter of a point - which is why students were always warned that guessing was penalized. In reality, of course, if a student can eliminate even one obviously wrong possibility from the list of choices, guessing becomes an intelligent strategy.) The S.A.T. was a test devised by a particular institution, by a particular kind of person, operating from a particular mind-set. It had an ideology, and Kaplan realized that anyone who understood that ideology would have a tremendous advantage.
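
  The guessing arithmetic is worth spelling out. Under the old scoring rule - plus one point for a right answer, minus a quarter-point for a wrong one - a blind guess among five choices has an expected value of exactly zero, and eliminating even a single wrong choice tips the expectation into positive territory. A small illustrative calculation (mine, not anything from Kaplan's materials) makes the point:

# Expected value of guessing on a five-choice question under the old
# S.A.T. rule: +1 point if right, -1/4 point if wrong.
def expected_guess_value(choices_remaining: int) -> float:
    p_right = 1 / choices_remaining
    return p_right * 1.0 + (1 - p_right) * (-0.25)

for remaining in (5, 4, 3, 2):
    print(f"{remaining} choices left: expected value {expected_guess_value(remaining):+.3f} points")
# 5 left: 0.000   4 left: +0.062   3 left: +0.167   2 left: +0.375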

  Critics of the S.A.T. have long made a kind of parlor game of seeing how many questions on the reading-comprehension section (where a passage is followed by a series of multiple-choice questions about its meaning) can be answered without reading the passage. David Owen, in the anti-S.A.T. account "None of the Above," gives the following example, adapted from an actual S.A.T. exam:

1.
The main idea of the passage is that:
A) a constricted view of [this novel] is natural and acceptable
B) a novel should not depict a vanished society
C) a good novel is an intellectual rather than an emotional experience
D) many readers have seen only the comedy [in this novel]
E) [this novel] should be read with sensitivity and an open mind

  If you've never seen an S.A.T. before, it might be difficult to guess the right answer. But if, through practice and exposure, you have managed to assimilate the ideology of the S.A.T. - the kind of decent, middlebrow earnestness that permeates the test - it's possible to develop a kind of gut feeling for the right answer, the confidence to predict, in the pressure and rush of examination time, what the S.A.T. is looking for. A is suspiciously postmodern. B is far too dogmatic. C is something that you would never say to an eager, college-bound student. Is it D? Perhaps, but D seems too small a point. It's probably E - and, sure enough, it is.

  With that in mind, try this question:

2.
The author of [this passage] implies that a work of art is properly judged on the basis of its:
A) universality of human experience truthfully recorded
B) popularity and critical acclaim in its own age
C) openness to varied interpretations, including seemingly contradictory ones
D) avoidance of political and social issues of minor importance
E) continued popularity through different eras and with different societies

  Is it any surprise that the answer is A? Bob Schaeffer, the public education director of the anti-test group FairTest, says that when he got a copy of the latest version of the S.A.T., the first thing he did was try the reading-comprehension section blind. He got twelve out of thirteen questions right. The math portion of the S.A.T. is perhaps a better example of how coachable the test can be. Here is another question, cited by Owen, from an old S.A.T.:

In how many different color combinations can 3 balls be painted if each ball is painted one color and there are 3 colors available? (Order is not considered; e.g. red, blue, red is considered the same combination as red, red, blue.)

A) 4
B) 6
C) 9
D) 10
E) 27

  This was, Owen points out, the twenty-fifth question in a twenty-five-question math section. S.A.T.s - like virtually all standardized tests - rank their math questions from easiest to hardest. If the hardest questions came first, the theory goes, weaker students would be so intimidated as they began the test that they might throw up their hands in despair. So this is a "hard" question. The second thing to understand about the S.A.T. is that it only really works if good students get the hard questions right and poor students get the hard questions wrong. If anyone can guess or blunder his way into the right answer to a hard question, then the test isn't doing its job. So this is the second clue: the answer to this question must not be something that an average student might blunder into answering correctly. With these two facts in mind, Owen says, don't focus on the question. Just look at the numbers: there are three balls and three colors. The average student is most likely to guess by doing one of three things - adding three and three, multiplying three times three, or, if he is feeling more adventurous, multiplying three by three by three. So six, nine, and twenty-seven are out. That leaves four and ten. Now, he says, read the problem. It can't be four, since anyone can think of more than four combinations. The correct answer must be D, 10.
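
  Owen's test-taking shortcut and the actual mathematics happen to agree. A brute-force enumeration - purely an illustration, not anything from Owen's book - confirms that there are exactly ten unordered ways to paint three balls with three available colors:

from itertools import combinations_with_replacement

# Count the unordered colorings of 3 balls with 3 available colors
# (red, red, blue is the same combination as red, blue, red).
colors = ["red", "blue", "green"]
combos = list(combinations_with_replacement(colors, 3))
print(len(combos))    # 10
for combo in combos:
    print(combo)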

  Does being able to answer that question mean that a student has a greater "aptitude" for math? Of course not. It just means that he had a clever teacher. Kaplan once determined that the testmakers were fond of geometric problems involving the Pythagorean theorem. So an entire generation of Kaplan students were taught "boo, boo, boo, square root of two," to help them remember how the Pythagorean formula applies to an isosceles right triangle. "It was usually not lack of ability," Kaplan writes, "but poor study habits, inadequate instruction or a combination of the two that jeopardized students' performance." The S.A.T. was not an aptitude test at all.
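
  The mnemonic, incidentally, encodes the fact that the sides of an isosceles right triangle are in the ratio 1 : 1 : the square root of two - a quick check, again purely illustrative:

import math

# "Boo, boo, boo, square root of two": legs of equal length, hypotenuse
# equal to a leg times the square root of two.
leg = 1.0
hypotenuse = math.hypot(leg, leg)                      # Pythagorean theorem
print(hypotenuse)                                      # 1.4142135...
print(math.isclose(hypotenuse, leg * math.sqrt(2)))    # True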

  4.

  In proving that the S.A.T. was coachable, Stanley Kaplan did something else, which was of even greater importance. He undermined the use of aptitude tests as a means of social engineering. In the years immediately before and after the First World War, for instance, the country's élite colleges faced what became known as "the Jewish problem." They were being inundated with the children of Eastern European Jewish immigrants. These students came from the lower middle class and they disrupted the genteel Wasp sensibility that had been so much a part of the Ivy League tradition. They were guilty of "underliving and overworking." In the words of one writer, they "worked far into each night [and] their lessons next morning were letter perfect." They were "socially untrained," one Harvard professor wrote, "and their bodily habits are not good." But how could a college keep Jews out? Columbia University had a policy that the New York State Regents Examinations - the statewide curriculum-based high-school-graduation examination - could be used as the basis for admission, and the plain truth was that Jews did extraordinarily well on the Regents Exams. One solution was simply to put a quota on the number of Jews, which is what Harvard explored. The other idea, which Columbia followed, was to require applicants to take an aptitude test. According to Herbert Hawkes, the dean of Columbia College during this period, because the typical Jewish student was simply a "grind," who excelled on the Regents Exams because he worked so hard, a test of innate intelligence would put him back in his place. "We have not eliminated boys because they were Jews and do not propose to do so," Hawkes wrote in 1918:

  We have honestly attempted to eliminate the lowest grade of applicant and it turns out that a good many of the low grade men are New York City Jews. It is a fact that boys of foreign parentage who have no background in many cases attempt to educate themselves beyond their intelligence. Their accomplishment is over 100% of their ability on account of their tremendous energy and ambition. I do not believe however that a College would do well to admit too many men of low mentality who have ambition but not brains.

  Today, Hawkes's anti-Semitism seems absurd, but he was by no means the last person to look to aptitude tests as a means of separating ambition from brains. The great selling point of the S.A.T. has always been that it promises to reveal whether the high-school senior with a 3.0 G.P.A. is someone who could have done much better if he had been properly educated or someone who is already at the limit of his abilities. We want to know that information because, like Hawkes, we prefer naturals to grinds: we think that people who achieve based on vast reserves of innate ability are somehow more promising and more worthy than those who simply work hard.

  But is this distinction real? Some years ago, a group headed by the British psychologist John Sloboda conducted a study of musical talent. The group looked at two hundred and fifty-six young musicians, between the ages of ten and sixteen, drawn from élite music academies and public-school music programs alike. They interviewed all the students and their parents and recorded how each student did in England's national music-examination system, which, the researchers felt, gave them a relatively objective measure of musical ability. "What we found was that the best predictor of where you were on that scale was the number of hours practiced," Sloboda says. This is, if you think about it, a little hard to believe. We conceive musical ability to be a "talent" - people have an aptitude for music - and so it would make sense that some number of students could excel at the music exam without practicing very much. Yet Sloboda couldn't find any. The kids who scored the best on the test were, on average, practicing eight hundred per cent more than the kids at the bottom. "People have this idea that there are those who learn better than others, can get further on less effort," Sloboda says. "On average, our data refuted that. Whether you're a dropout or at the best school, where you end up can be predicted by how much you practice."

  Sloboda found another striking similarity among the "musical" children. They all had parents who were unusually invested in their musical education. It wasn't necessarily the case that the parents were themselves musicians or musically inclined. It was simply that they wanted their children to be that way. "The parents of the high achievers did things that most parents just don't do," he said. "They didn't simply drop their child at the door of the teacher. They went into the practice room. They took notes on what the teacher said, and when they got home they would say, Remember when your teacher said do this and that. There was a huge amount of time and motivational investment by the parents." Does this mean that there is no such thing as musical talent? Of course not. Most of those hardworking children with pushy parents aren't going to turn out to be Itzhak Perlmans; some will be second violinists in their community orchestra. The point is that when it comes to a relatively well-defined and structured task - like playing an instrument or taking an exam - how hard you work and how supportive your parents are have a lot more to do with success than we ordinarily imagine. Ability cannot be separated from effort. The testmakers never understood that, which is why they thought they could weed out the grinds. But educators increasingly do, and that is why college admissions are now in such upheaval. The Texas state-university system, for example, has, since 1997, automatically admitted any student who places in the top ten per cent of his or her high-school class - regardless of S.A.T. score. Critics of the policy said that it would open the door to students from marginal schools whose S.A.T. scores would normally have been too low for admission to the University of Texas - and that is exactly what happened. But so what? The "top ten percenters," as they are known, may have lower S.A.T. scores, but they get excellent grades. In fact, their college G.P.A.s are the equal of students who scored two hundred to three hundred points higher on the S.A.T. In other words, the determination and hard work that propel someone to the top of his high-school class - even in cases where that high school is impoverished - are more important to succeeding in college (and, for that matter, in life) than whatever abstract quality the S.A.T. purports to measure. The importance of the Texas experience cannot be overstated. Here, at last, is an intelligent alternative to affirmative action, a way to find successful minority students without sacrificing academic performance. But we would never have got this far without Stanley Kaplan - without someone first coming along and puncturing the mystique of the S.A.T. "Acquiring test-taking skills is the same as learning to play the piano or ride a bicycle," Kaplan writes. "It requires practice, practice, practice. Repetition breeds familiarity. Familiarity breeds confidence." In this, as in so many things, the grind was the natural.

  To read Kaplan's memoir is to be struck by what a representative figure he was in the postwar sociological miracle that was Jewish Brooklyn. This is the lower-middle-class, second- and third-generation immigrant world, stretching from Prospect Park to Sheepshead Bay, that ended up peopling the upper reaches of American professional life. Thousands of students from those neighborhoods made their way through Kaplan's classroom in the fifties and sixties, many along what Kaplan calls the "heavily traveled path" from Brooklyn to Cornell, Yale, and the University of Michigan. Kaplan writes of one student who increased his score by three hundred and forty points, and ended up with a Ph.D. and a position as a scientist at Xerox. "Debbie" improved her S.A.T. by five hundred points, got into the University of Chicago, and earned a Ph.D. in clinical psychology. Arthur Levine, the president of Teachers College at Columbia University, raised his S.A.T.s by two hundred and eighty-two points, "making it possible," he writes on the book's jacket, "for me to attend a better university than I ever would have imagined." Charles Schumer, the senior senator from New York, studied while he worked the mimeograph machine in Kaplan's office, and ended up with close to a perfect sixteen hundred.

  These students faced a system designed to thwart the hard worker, and what did they do? They got together with their pushy parents and outworked it. Kaplan says that he knew a "strapping athlete who became physically ill before taking the S.A.T. because his mother was so demanding." There was the mother who called him to say, "Mr. Kaplan, I think I'm going to commit suicide. My son made only a 1000 on the S.A.T." "One mother wanted her straight-A son to have an extra edge, so she brought him to my basement for years for private tutoring in basic subjects," Kaplan recalls. "He was extremely bright and today is one of the country's most successful ophthalmologists." Another student was "so nervous that his mother accompanied him to class armed with a supply of terry-cloth towels. She stood outside the classroom and when he emerged from our class sessions dripping in sweat, she wiped him dry and then nudged him back into the classroom." Then, of course, there was the formidable four-foot-eight figure of Ericka Kaplan, granddaughter of the chief rabbi of the synagogue of Prague. "My mother was a perfectionist whether she was keeping the company books or setting the dinner table," Kaplan writes, still in her thrall today. "She was my best cheerleader, the reason I performed so well, and I constantly strove to please her." What chance did even the most artfully constructed S.A.T. have against the mothers of Brooklyn?

  5.

  Stanley Kaplan graduated No. 2 in his class at City College, and won the school's Award for Excellence in Natural Sciences. He wanted to be a doctor, and he applied to five medical schools, confident that he would be accepted. To his shock, he was rejected by every single one. Medical schools did not take public colleges like City College seriously. More important, in the forties there was a limit to how many Jews they were willing to accept. "The term meritocracy - or success based on merit rather than heritage, wealth, or social status - wasn't even coined yet," Kaplan writes, "and the methods of selecting students based on talent, not privilege, were still evolving."

  That's why Stanley Kaplan was always pained by those who thought that what went on in his basement was somehow subversive. He loved the S.A.T. He thought that the test gave people like him the best chance of overcoming discrimination. As he saw it, he was simply giving the middle-class students of Brooklyn the same shot at a bright future that their counterparts in the private schools of Manhattan had. In 1983, after years of hostility, the College Board invited him to speak at its annual convention. It was one of the highlights of Kaplan's life. "Never, in my wildest dreams," he began, "did I ever think I'd be speaking to you here today."

  The truth is, however, that Stanley Kaplan was wrong. What he did in his basement was subversive. The S.A.T. was designed as an abstract intellectual tool. It never occurred to its makers that aptitude was a social matter: that what people were capable of was affected by what they knew, and what they knew was affected by what they were taught, and what they were taught was affected by the industry of their teachers and parents. And if what the S.A.T. was measuring, in no small part, was the industry of teachers and parents, then what did it mean? Stanley Kaplan may have loved the S.A.T. But when he stood up and recited "boo, boo, boo, square root of two," he killed it.

  The disposable diaper and the meaning of progress.

  1.

  The best way to explore the mystery of the Huggies Ultratrim disposable diaper is to unfold it and then cut it in half, widthwise, across what is known as the diaper's chassis. At Kimberly-Clark's Lakeview plant, in Neenah, Wisconsin, where virtually all the Huggies in the Midwest are made, there is a quality-control specialist who does this all day long, culling diapers from the production line, pinning them up against a lightboard, and carefully dismembering them with a pair of scissors. There is someone else who does a "visual cull," randomly picking out Huggies and turning them over to check for flaws. But a surface examination tells you little. A diaper is not like a computer that makes satisfying burbling noises from time to time, hinting at great inner complexity. It feels like papery underwear wrapped around a thin roll of Cottonelle. But peel away the soft fabric on the top side of the diaper, the liner, which receives what those in the trade delicately refer to as the "insult." You'll find a layer of what's called polyfilm, which is thinner than a strip of Scotch tape. This layer is one of the reasons the garment stays dry: it has pores that are large enough to let air flow in, so the diaper can breathe, but small enough to keep water from flowing out, so the diaper doesn't leak.

  Or run your hands along that liner. It feels like cloth. In fact, the people at Kimberly-Clark make the liner out of a special form of plastic, a polyresin. But they don't melt the plastic into a sheet, as one would for a plastic bag. They spin the resin into individual fibres, and then use the fibres to create a kind of microscopic funnel, channelling the insult toward the long, thick rectangular pad that runs down the center of the chassis, known as the absorbent core. A typical insult arrives at a rate of seven millilitres a second, and might total seventy millilitres of fluid. The liner can clear that insult in less than twenty seconds. The core can hold three or more of those insults, with a chance of leakage in the single digits. The baby's skin will remain almost perfectly dry, and that is critical, because prolonged contact between the baby and the insult (in particular, ammonium hydroxide, a breakdown product of urine) is what causes diaper rash. And all this will be accomplished by a throwaway garment measuring, in the newborn size, just seven by thirteen inches. This is the mystery of the modern disposable diaper: how does something so small do so much?

  2.

  Thirty-seven years ago, the Silicon Valley pioneer Gordon Moore made a famous prediction. The number of transistors that engineers could fit onto a microchip, he said, would double every two years. It seemed like a foolhardy claim: it was not clear that you could keep making transistors smaller and smaller indefinitely. It also wasn't clear that it would make sense to do so. Most of the time when we make things smaller, after all, we pay a price. A smaller car is cheaper and more fuel-efficient, and easier to park and maneuver, but it will never be as safe as a larger car. In the nineteen-fifties and sixties, the transistor radio was all the rage; it could fit inside your pocket and run on a handful of batteries. But, because it was so small, the sound was terrible, and virtually all the other mini-electronics turn out to be similarly imperfect. Tiny cell phones are hard to dial. Tiny televisions are hard to watch. In making an object smaller, we typically compromise its performance. The remarkable thing about chips, though, was that there was no drawback: if you could fit more and more transistors onto a microchip, then instead of using ten or twenty or a hundred microchips for a task you could use just one. This meant, in turn, that you could fit microchips in all kinds of places (such as cellular phones and laptops) that you couldn't before, and, because you were using one chip and not a hundred, computer power could be had at a fraction of the price, and because chips were now everywhere and in such demand they became even cheaper to make - and so on and so on. Moore's Law, as it came to be called, describes that rare case in which there is no trade-off between size and performance. Microchips are what might be termed a perfect innovation.

  In the past twenty years, diapers have got smaller and smaller, too. In the early eighties, they were three times bulkier than they are now, thicker and substantially wider in the crotch. But in the mid-eighties Huggies and Procter & Gamble's Pampers were reduced in bulk by fifty per cent; in the mid-nineties they shrank by a third or so; and in the next few years they may shrink still more. It seems reasonable that there should have been a downside to this, just as there was to the shrinking of cars and radios: how could you reduce the amount of padding in a diaper and not, in some way, compromise its ability to handle an insult? Yet, as diapers got smaller, they got better, and that fact elevates the diaper above nearly all the thousands of other products on the supermarket shelf.

  Kimberly-Clark's Lakeview plant is a huge facility, just down the freeway from Green Bay. Inside, it is as immaculate as a hospital operating room. The walls and floors have been scrubbed white. The stainless-steel machinery gleams. The employees are dressed in dark-blue pants, starched light-blue button-down shirts, and tissue-paper caps. There are rows of machines in the plant, each costing more than fifteen million dollars - a dizzying combination of conveyor belts and whirling gears and chutes stretching as long as a city block and creating such a din that everyone on the factory floor wears headsets and communicates by radio. Computers monitor a million data points along the way, insuring that each of those components is precisely cut and attached according to principles and processes and materials protected, on the Huggies Ultratrim alone, by hundreds of patents. At the end of the line, the Huggies come gliding out of the machine, stacked upright, one after another in an endless row, looking like exquisitely formed slices of white bread in a toast rack. For years, because of Moore's Law, we have considered the microchip the embodiment of the technological age. But if the diaper is also a perfect innovation, doesn't it deserve a place beside the chip?

  3.

  The modern disposable diaper was invented twice, first by Victor Mills and then by Carlyle Harmon and Billy Gene Harper. Mills worked for Procter & Gamble, and he was a legend. Ivory soap used to be made in an expensive and time-consuming batch-by-batch method. Mills figured out a simpler, continuous process. Duncan Hines cake mixes used to have a problem blending flour, sugar, and shortening in a consistent mixture. Mills introduced the machines used for milling soap, which ground the ingredients much more finely than before, and the result was New, Improved Duncan Hines cake mix. Ever wonder why Pringles, unlike other potato chips, are all exactly the same shape? Because they are made like soap: the potato is ground into a slurry, then pressed, baked, and wrapped - and that was Victor Mills's idea, too.

  In 1957, Procter & Gamble bought the Charmin Paper Company, of Green Bay, Wisconsin, and Mills was told to think of new products for the paper business. Since he was a grandfather - and had always hated washing diapers - he thought of a disposable diaper. "One of the early researchers told me that among the first things they did was go out to a toy store and buy one of those Betsy Wetsy-type dolls, where you put water in the mouth and it comes out the other end," Ed Rider, the head of the archives department at Procter & Gamble, says. "They brought it back to the lab, hooked up its legs on a treadmill to make it walk, and tested diapers on it." The end result was Pampers, which were launched in Peoria, in 1961. The diaper had a simple rectangular shape. Its liner, which lay against the baby's skin, was made of rayon. The outside material was plastic. In between were multiple layers of crêped tissue. The diaper was attached with pins and featured what was known as a Z fold, meaning that the edges of the inner side were pleated, to provide a better fit around the legs.

  In 1968, Kimberly-Clark brought out Kimbies, which took the rectangular diaper and shaped it to more closely fit a baby's body. In 1976, Procter & Gamble brought out Luvs, which elasticized the leg openings to prevent leakage. But diapers still adhered to the basic Millsian notion of an absorbent core made out of paper - and that was a problem. When paper gets wet, the fluid soaks right through, which makes diaper rash worse. And if you put any kind of pressure on paper - if you squeeze it, or sit on it - it will surrender some of the water it has absorbed, which creates further difficulties, because a baby, in the usual course of squirming and crawling and walking, might place as much as five kilopascals of pressure on the absorbent core of a diaper. Diaper-makers tried to address this shortcoming by moving from crêped tissue to what they called fluff, which was basically finely shredded cellulose. Then they began to compensate for paper's failing by adding more and more of it, until diapers became huge. But they now had Moore's Law in reverse: in order to get better, they had to get bigger - and bigger still wasn't very good.

  Carlyle Harmon worked for Johnson & Johnson and Billy Gene Harper worked for Dow Chemical, and they had a solution. In 1966, each filed separate but virtually identical patent applications, proposing that the best way to solve the diaper puzzle was with a peculiar polymer that came in the form of little pepperlike flakes and had the remarkable ability to absorb up to three hundred times its weight in water.

  In the Dow patent, Harper and his team described how they sprinkled two grams of the superabsorbent polymer between two twenty-inch-square sheets of nylon broadcloth, and then quilted the nylon layers together. The makeshift diaper was "thereafter put into use in personal management of a baby of approximately 6 months age." After four hours, the diaper was removed. It now weighed a hundred and twenty grams, meaning the flakes had soaked up sixty times their weight in urine.

  Harper and Harmon argued that it was quite unnecessary to solve the paper problem by stuffing the core of the diaper with thicker and thicker rolls of shredded pulp. Just a handful of superabsorbent polymer would do the job. Thus was the modern diaper born. Since the mid-eighties, Kimberly-Clark and Procter & Gamble have made diapers the Harper and Harmon way, pulling out paper and replacing it with superabsorbent polymer. The old, paper-filled diaper could hold, at most, two hundred and seventy-five millilitres of fluid, or a little more than a cup. Today, a diaper full of superabsorbent polymer can handle as much as five hundred millilitres, almost twice that. The chief characteristic of the Mills diaper was its simplicity: the insult fell directly into the core. But the presence of the polymer has made the diaper far more complex. It takes longer for the polymer to absorb an insult fully than it does for paper, for instance. So another component was added, the acquisition layer, between the liner and the core. The acquisition layer acts like blotting paper, holding the insult while the core slowly does its work, and distributing the fluid over its full length.

  Diaper researchers sometimes perform what is called a re-wet test, where they pour a hundred millilitres of fluid onto the surface of a diaper and then apply a piece of filter paper to the diaper liner with five kilopascals of pressure - the average load a baby would apply to a diaper during ordinary use. In a contemporary superabsorbent diaper, like a Huggies or a Pampers, the filter paper will come away untouched after one insult. After two insults, there might be 0.1 millilitres of fluid on the paper. After three insults, the diaper will surrender, at most, only two millilitres of moisture - which is to say that, with the aid of superabsorbents, a pair of Huggies or Pampers can effortlessly hold, even under pressure, a baby's entire night's work.

  The heir to the legacy of Billy Gene Harper at Dow Chemical is Fredric Buchholz, who works in Midland, Michigan, a small town two hours northwest of Detroit, where Dow has its headquarters. His laboratory is in the middle of the sprawling chemical works, a mile or two away from corporate headquarters, in a low, unassuming brick building. "We still don't understand perfectly how these polymers work," Buchholz said on a recent fall afternoon. What we do know, he said, is that superabsorbent polymers appear, on a microscopic level, to be like a tightly bundled fisherman's net. In the presence of water, that net doesn't break apart into thousands of pieces and dissolve, like sugar. Rather, it just unravels, the way a net would open up if you shook it out, and as it does the water gets stuck in the webbing. That ability to hold huge amounts of water, he said, could make superabsorbent polymers useful in fire fighting or irrigation, because slightly gelled water is more likely to stay where it's needed. There are superabsorbents mixed in with the sealant on the walls of the Chunnel between England and France, so if water leaks in, the polymer will absorb the water and plug the hole.

  Right now, one of the major challenges facing diaper technology, Buchholz said, is that urine is salty, and salt impairs the unravelling of the netting: superabsorbents can handle only a tenth as much salt water as fresh water. "One idea is to remove the salt from urine. Maybe you could have a purifying screen," he said. If the molecular structure of the superabsorbent were optimized, he went on, its absorptive capacity could increase by another five hundred per cent. "Superabsorbents could go from absorbing three hundred times their weight to absorbing fifteen hundred times their weight. We could have just one perfect particle of super-absorbent in a diaper. If you are going to dream, why not make the diaper as thin as a pair of underwear?"

  Buchholz was in his laboratory, and he held up a small plastic cup filled with a few tablespoons of superabsorbent flakes, each not much larger than a grain of salt. "It's just a granular material, totally nontoxic," he said. "This is about two grams." He walked over to the sink and filled a large beaker with tap water, and poured the contents of the beaker into the jar of superabsorbent. At first, nothing happened. The amounts were so disproportionate that it looked as if the water would simply engulf the flakes. But, slowly and steadily, the water began to thicken. "Look," Buchholz said. "It's becoming soupy." Sure enough, little beads of gel were forming. Nothing else was happening: there was no gas given off, no burbling or sizzling as the chemical process took place. The superabsorbent polymer was simply swallowing up the water, and within minutes the contents of the cup had thickened into what looked like slightly lumpy, spongy pudding. Buchholz picked up the jar and tilted it, to show that nothing at all was coming out. He pushed and prodded the mass with his finger. The water had disappeared. To soak up that much liquid, the Victor Mills diaper would have needed a thick bundle of paper towelling. Buchholz had used a few tablespoons of superabsorbent flakes. Superabsorbent was not merely better; it was smaller.

  4.

  Why does it matter that the diaper got so small? It seems a trivial thing, chiefly a matter of convenience to the parent taking a bag of diapers home from the supermarket. But it turns out that size matters a great deal. There's a reason that there are now "new, improved concentrated" versions of laundry detergent, and that some cereals now come in smaller boxes. Smallness is one of those changes that send ripples through the whole economy. The old disposable diapers, for example, created a transportation problem. Tractor-trailers are prohibited by law from weighing more than eighty thousand pounds when loaded. That's why a truck carrying something heavy and compact like bottled water or Campbell's soup is "full," when the truck itself is still half empty. But the diaper of the eighties was what is known as a "high cube" item. It was bulky and not very heavy, meaning that a diaper truck was full before it reached its weight limit. By cutting the size of a diaper in half, companies could fit twice as many diapers on a truck, and cut transportation expenses in half. They could also cut the amount of warehouse space and labor they needed in half. And companies could begin to rethink their manufacturing operations. "Distribution costs used to force you to have plants in lots of places," Dudley Lehman, who heads the Kimberly-Clark diaper business, says. "As that becomes less and less of an issue, you say, 'Do I really need all my plants?' In the United States, it used to take eight. Now it takes five." (Kimberly-Clark didn't close any plants. But other manufacturers did, and here, perhaps, is a partial explanation for the great wave of corporate restructuring that swept across America in the late eighties and early nineties: firms could downsize their workforce because they had downsized their products.) And, because using five plants to make diapers is more efficient than using eight, it became possible to improve diapers without raising diaper prices - which is important, because the sheer number of diapers parents have to buy makes it a price-sensitive product. Until recently, diapers were fastened with little pieces of tape, and if the person changing the diapers got lotion or powder on her fingers the tape wouldn't work. A hook-and-loop, Velcro-like fastener doesn't have this problem. But it was years before the hook-and-loop fastener was incorporated into the diaper chassis: until over-all manufacturing costs were reduced, it was just too expensive.
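
  The trucking arithmetic above is the classic "weigh out or cube out" calculation in freight logistics: a trailer is full when it hits either the legal weight limit or its physical volume, whichever comes first. The toy sketch below uses invented figures - the payload and trailer capacity are assumptions, not Kimberly-Clark or Procter & Gamble numbers - to show why halving a bulky product's volume doubles what fits on a truck, while a dense product like bottled water gains nothing:

# Does a load "weigh out" or "cube out"? All figures are illustrative assumptions.
MAX_PAYLOAD_LBS = 45_000      # rough cargo weight allowed under an 80,000-lb gross limit
TRAILER_CUBIC_FEET = 3_800    # rough capacity of a 53-foot trailer

def cases_per_truck(case_weight_lbs, case_cubic_feet):
    by_weight = MAX_PAYLOAD_LBS // case_weight_lbs
    by_volume = TRAILER_CUBIC_FEET // case_cubic_feet
    return int(min(by_weight, by_volume))

print(cases_per_truck(40, 1.0))   # dense freight: limited by weight (1125 cases)
print(cases_per_truck(8, 4.0))    # bulky 1980s-style diaper cases: limited by volume (950)
print(cases_per_truck(8, 2.0))    # same weight, half the bulk: twice as many (1900)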

  Most important, though, is how size affects the way diapers are sold. The shelves along the aisles of a supermarket are divided into increments of four feet, and the space devoted to a given product category is almost always a multiple of that. Diapers, for example, might be presented as a twenty-foot set. But when diapers were at their bulkiest the space reserved for them was never enough. "You could only get a limited number on the shelf," says Sue Klug, the president of Catalina Marketing Solutions and a former executive for Albertson's and Safeway. "Say you only had six bags. Someone comes in and buys a few, and then someone else comes in and buys a few more. Now you're out of stock until someone reworks the shelf, which in some supermarkets might be a day or two." Out-of-stock rates are already a huge problem in the retail business. At any given time, only about ninety-two per cent of the products that a store is supposed to be carrying are actually on the shelf - which, if you consider that the average supermarket has thirty-five thousand items, works out to twenty-eight hundred products that are simply not there. (For a highly efficient retailer like Wal-Mart, in-stock rates might be as high as ninety-nine per cent; for a struggling firm, they might be in the low eighties.) But, for a fast-moving, bulky item like diapers, the problem of restocking was much worse. Supermarkets could have allocated more shelf space to diapers, of course, but diapers aren't a particularly profitable category for retailers - profit margins are about half what they are for the grocery department. So retailers would much rather give more shelf space to a growing and lucrative category like bottled water. "It's all a trade-off," Klug says. "If you expand diapers four feet, you've got to give up four feet of something else." The only way diaper-makers could insure that their products would actually be on the shelves was to make the products smaller, so they could fit twelve bags into the space of six. And if you can fit twelve bags on a shelf, you can introduce different kinds of diapers. You can add pull-ups and premium diapers and low-cost private-label diapers, all of which give parents more options.

  "We cut the cost of trucking in half," says Ralph Drayer, who was in charge of logistics for Procter & Gamble for many years and now runs his own supply-chain consultancy in Cincinnati. "We cut the cost of storage in half. We cut handling in half, and we cut the cost of the store shelf in half, which is probably the most expensive space in the whole chain." Everything in the diaper world, from plant closings and trucking routes to product improvements and consumer choice and convenience, turns, in the end, on the fact that Harmon and Harper's absorbent core was smaller than Victor Mills's.

  The shame of it, though, is that Harmon and Harper have never been properly celebrated for their accomplishment. Victor Mills is the famous one. When he died, he was given a Times obituary, in which he was called "the father of disposable diapers." When Carlyle Harmon died, seven months earlier, he got four hundred words in Utah's Deseret News, stressing his contributions to the Mormon Church. We tend to credit those who create an idea, not those who perfect it, forgetting that it is often only in the perfection of an idea that true progress occurs. Putting sixty-four transistors on a chip allowed people to dream of the future. Putting four million transistors on a chip actually gave them the future. The diaper is no different. The paper diaper changed parenting. But a diaper that could hold four insults without leakage, keep a baby's skin dry, clear an insult in twenty seconds flat, and would nearly always be in stock, even if you arrived at the supermarket at eight o'clock in the evening - and that would keep getting better at all those things, year in and year out - was another thing altogether. This was more than a good idea. This was something like perfection.

  If you are wondering what to worry about when it comes to biological weapons, you should concern yourself, first of all, with things that are easy to deliver. Biological agents are really dangerous only when they can reach lots of people, and very few bioweapons can easily do that. In 1990, members of Japan's Aum Shinrikyo cult drove around the Parliament buildings in Tokyo in an automobile rigged to disseminate botulinum toxin. It didn't work. The same group also tried, repeatedly, to release anthrax from a rooftop, and that didn't work, either. It's simply too complicated to make anthrax in the fine, "mist" form that is the most lethal. And the spores are destroyed so quickly by sunlight that any kind of mass administration of anthrax is extremely difficult.

  A much scarier biological weapon would be something contagious: something a few infected people could spread, unwittingly, in ever widening and more terrifying circles. Even with a contagious agent, though, you don't really have to worry about pathogens that are what scientists call stable - that are easy to identify and that don't change from place to place or year to year - because those kinds of biological agents are easy to defend against. That's why you shouldn't worry quite so much about smallpox. Deadly as it is, smallpox is so well understood that the vaccine is readily made and extraordinarily effective, and works for decades. If we wanted to, we could all be inoculated against smallpox in a matter of years.

  What you really should worry about, then, is something that is highly contagious and highly unstable, a biological agent that kills lots of people and isn't easy to treat, that mutates so rapidly that each new bout of terror requires a brand-new vaccine. What you should worry about, in other words, is the influenza virus.

  If there is an irony to America's current frenzy over anthrax and biological warfare - the paralyzed mailrooms, the endless talk-show discussions, the hoarding of antibiotics, and the closed halls of Congress - it is that it has occurred right at the beginning of the flu season, the time each year when the democracies of the West are routinely visited by one of the most deadly of all biological agents. This year, around twenty thousand Americans will die of the flu, and if this is one of those years, like 1957 or 1968, when we experience an influenza pandemic, that number may hit fifty thousand. The victims will primarily be the very old and the very young, although there will be a significant number of otherwise healthy young adults among them, including many pregnant women. All will die horrible deaths, racked by raging fevers, infections, headaches, chills, and sweats. And the afflicted, as they suffer, will pass their illness on to others, creating a wave of sickness that will cost the country billions of dollars. Influenza "quietly kills tens of thousands of people every year," Edwin Kilbourne, a research professor at New York Medical College and one of the country's leading flu experts, says. "And those who don't die are incapacitated for weeks. It mounts a silent and pervasive assault."

  That we have chosen to worry more about anthrax than about the flu is hardly surprising. The novel is always scarier than the familiar, and the flu virus, as far as we know, isn't being sent through the mail by terrorists. But it is a strange kind of public-health policy that concerns itself more with the provenance of illness than with its consequences; and the consequences of the flu, year in, year out, dwarf everything but the most alarmist bioterror scenarios. If even a fraction of the energy and effort now being marshalled against anthrax were directed instead at the flu, we could save thousands of lives. Kilbourne estimates that at least half the deaths each year from the flu are probably preventable: vaccination rates among those most at risk under the age of fifty are a shameful twenty-three per cent, and for asthmatic children, who are also at high risk, the vaccination rate is ten per cent. And vaccination has been shown to save money: the costs of hospitalization for those who get sick far exceed the costs of inoculating everyone else. Why, under the circumstances, this country hasn't mounted an aggressive flu-vaccination program is a question that Congress might want to consider, when it returns to its newly fumigated, anthrax-free chambers. Not all threats to health and happiness come from terrorists in faraway countries. Many are the result of what, through simple indifference, we do to ourselves.
GO TO TOP MENU

  How far can airline safety go?

  1.

  On November 24, 1971, a man in a dark suit, white shirt, and sunglasses bought a ticket in the name of Dan Cooper on the 2:50 P.M. Northwest Orient flight from Portland to Seattle. Once aboard the plane, he passed a note to a flight attendant. He was carrying a bomb, he said, and he wanted two hundred thousand dollars, four parachutes, and "no funny stuff." In Seattle, the passengers and flight attendants were allowed to leave, and the F.B.I. handed over the parachutes and the money in used twenty-dollar bills. Cooper then told the pilot to fly slowly at ten thousand feet in the direction of Nevada, and not long after takeoff, somewhere over southwest Washington, he gathered up the ransom, lowered the plane's back stairs, and parachuted into the night.

  In the aftermath of Cooper's leap, "para-jacking," as it was known, became an epidemic in American skies. Of the thirty-one hijackings in the United States the following year, nineteen were attempts at Cooper-style extortion, and in fifteen of those cases the hijackers demanded parachutes so that they, too, could leap to freedom. It was a crime wave unlike any America had seen, and in response Boeing installed a special latch on its 727 model which prevented the tail stairs from being lowered in flight. The latch was known as the Cooper Vane, and it seemed, at the time, to be an effective response to the reign of terror in the skies. Of course, it was not. The Cooper Vane just forced hijackers to come up with ideas other than parachuting out of planes.

  This is the great paradox of law enforcement. The better we are at preventing and solving the crimes before us, the more audacious criminals become. Put alarms and improved locks on cars, and criminals turn to the more dangerous sport of carjacking. Put guards and bulletproof screens in banks, and bank robbery gets taken over by high-tech hackers. In the face of resistance, crime falls in frequency but rises in severity, and few events better illustrate this tradeoff than the hijackings of September 11th. The way in which those four planes were commandeered that Tuesday did not simply reflect a failure of our security measures; it reflected their success. When you get very good at cracking down on ordinary hijacking — when you lock the stairs at the back of the aircraft with a Cooper Vane — what you are left with is extraordinary hijacking.

  2.

  The first serious push for airport security began in late 1972, in the wake of a bizarre hijacking of a DC-9 flight out of Birmingham, Alabama. A group of three men — one an escaped convict and two awaiting trial for rape — demanded a ransom of ten million dollars and had the pilot circle the Oak Ridge, Tennessee, nuclear facility for five hours, threatening to crash the plane if their demands were not met. Until that point, security at airports had been minimal, but, as the director of the Federal Aviation Administration said at the time, "The Oak Ridge odyssey has cleared the air." In December of that year, the airlines were given sixty days to post armed security officers at passenger-boarding checkpoints. On January 5, 1973, all passengers and all carry-on luggage were required by law to be screened, and X-ray machines and metal detectors began to be installed in airports.

  For a time, the number of hijackings dropped significantly. But it soon became clear that the battle to make flying safer was only beginning. In the 1985 hijacking of TWA Flight 847 out of Athens — which lasted seventeen days — terrorists bypassed the X-ray machines and the metal detectors by using members of the cleaning staff to stash guns and grenades in a washroom of the plane. In response, the airlines started to require background checks and accreditation of ground crews. In 1986, El Al security officers at London's Heathrow Airport found ten pounds of high explosives in the luggage of an unwitting and pregnant Irish girl, which had been placed there by her Palestinian boyfriend. Now all passengers are asked if they packed their bags themselves. In a string of bombings in the mid-eighties, terrorists began checking explosives-filled bags onto planes without boarding the planes themselves. Airlines responded by introducing "bag matching" on international flights — stipulating that no luggage can be loaded on a plane unless its owner is on board as well. As an additional safety measure, the airlines started X-raying and searching checked bags for explosives. But in the 1988 bombing of Pan Am Flight 103 over Lockerbie, Scotland, terrorists beat that system by hiding plastic explosives inside a radio. As a result, the airlines have now largely switched to using CT scanners, a variant of the kind used in medical care, which take a three-dimensional picture of the interior of every piece of luggage and screen it with pattern-recognition software. The days when someone could stroll onto a plane with a bag full of explosives are long gone.

  3.

  These are the security obstacles that confront terrorists planning an attack on an airline. They can't bomb an international flight with a checked bag, because they know that there is a good chance the bag will be intercepted. They can't check the bag and run, because the bomb will never get on board. And they can't hijack the plane with a gun, because there is no sure way of getting that weapon on board. The contemporary hijacker, in other words, must either be capable of devising a weapon that can get past security or be willing to go down with the plane. Most terrorists have neither the cleverness to meet the first criterion nor the audacity to meet the second, which is why the total number of hijackings has been falling for the past thirty years. During the nineties, in fact, the number of civil aviation "incidents" worldwide — hijackings, bombings, shootings, attacks, and so forth — dropped by more than seventy per cent. But this is where the law-enforcement paradox comes in: Even as the number of terrorist acts has diminished, the number of people killed in hijackings and bombings has steadily increased. And, despite all the improvements in airport security, the percentage of terrorist hijackings foiled by airport security in the years between 1987 and 1996 was at its lowest point in thirty years. Airport-security measures have simply chased out the amateurs and left the clever and the audacious. "A look at the history of attacks on commercial aviation reveals that new terrorist methods of attack have virtually never been foreseen by security authorities," the Israeli terrorism expert Ariel Merari writes, in the recent book "Aviation Terrorism and Security."

  The security system was caught by surprise when an airliner was first hijacked for political extortion; it was unprepared when an airliner was attacked on the tarmac by a terrorist team firing automatic weapons; when terrorists, who arrived as passengers, collected their luggage from the conveyer belt, took out weapons from their suitcases, and strafed the crowd in the arrivals hall; when a parcel bomb sent by mail exploded in an airliner's cargo hold in mid-flight; when a bomb was brought on board by an unwitting passenger. . . . The history of attacks on aviation is the chronicle of a cat-and-mouse game, where the cat is busy blocking old holes and the mouse always succeeds in finding new ones.

  And no hole was bigger than the one found on September 11th.

  4.

  What the attackers understood was the structural weakness of the passenger-gate security checkpoint, particularly when it came to the detection of knives. Hand-luggage checkpoints use X-ray machines, which do a good job of picking out a large, dense, and predictable object like a gun. Now imagine looking at a photograph of a knife. From the side, the shape is unmistakable. But if the blade edge is directly facing the camera, what you'll see is just a thin line. "If you stand the knife on its edge, it could be anything," says Harry Martz, who directs the Center for Nondestructive Characterization at Lawrence Livermore Laboratories. "It could be a steel ruler. Then you put in computers, hair dryers, pens, clothes hangers, and it makes it even more difficult to pick up the pattern."

  The challenge of detecting something like a knife blade is made harder still by the psychological demands on X-ray operators. What they are looking for — weapons — is called the "signal," and a well-documented principle of human-factors research is that as the "signal rate" declines, detection accuracy declines as well. If there was a gun in every second bag, for instance, you could expect the signals to be detected with almost perfect accuracy: the X-ray operator would be on his toes. But guns are almost never found in bags, which means that the vigilance of the operator inevitably falters. This is a significant problem in many fields, from nuclear-plant inspection to quality-control in manufacturing plants — where the job of catching defects on, say, a car becomes harder and harder as cars become better made. "I've studied this in people who look for cracks in the rotor disks of airplane engines," says Colin Drury, a human-factors specialist at the University of Buffalo. "Remember the DC-10 crash at Sioux City? That was a rotor disk. Well, the probability of that kind of crack happening is incredibly small. Most inspectors won't see one in their lifetime, so it's very difficult to remain alert to that." The F.A.A. periodically plants weapons in baggage to see whether they are detected. But it's not clear what effect that kind of test has on vigilance. In the wake of the September attacks, some commentators called for increased training for X-ray security operators. Yet the problem is not just a lack of expertise; it is the paucity of signals. "Better training is only going to get you so far," explains Douglas Harris, chairman of Anacapa Sciences, a California-based human-factors firm. "If it now takes a day to teach people the techniques they need, adding another day isn't going to make much difference."

  A sophisticated terrorist wanting to smuggle knives on board, in other words, has a good shot at "gaming" the X-ray machine by packing his bags cleverly and exploiting the limitations of the operator. If he chooses, he can also beat the metal detector by concealing on his person knives made of ceramic or plastic, which wouldn't trip the alarm. The knife strategy has its drawbacks, of course. It's an open question how long a group of terrorists armed only with knives can hold off a cabin full of passengers. But if all they need is to make a short flight from Boston to downtown Manhattan, knives would suffice.

  5.

  Can we close the loopholes that led to the September 11th attack? Logistically, an all-encompassing security system is probably impossible. A new safety protocol that adds thirty seconds to the check-in time of every passenger would add more than three hours to the preparation time for a 747, assuming that there are no additional checkpoints. Reforms that further encumber the country's already overstressed air-traffic system are hardly reforms; they are self-inflicted wounds. People have suggested that we station armed federal marshals on more flights. This could be an obstacle for some terrorists but an opportunity for others, who could overcome a marshal to gain possession of a firearm.
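  The arithmetic behind that thirty-second claim is simple multiplication across a full cabin. Here is a minimal sketch in Python; the assumption that a fully loaded 747 boards roughly four hundred passengers is an illustrative one, not a figure from the text.

    # Rough check of the delay arithmetic: extra screening time per passenger,
    # multiplied across a full wide-body cabin. The 400-passenger load is an
    # illustrative assumption.
    extra_seconds_per_passenger = 30
    passengers_on_747 = 400

    added_hours = extra_seconds_per_passenger * passengers_on_747 / 3600
    print(f"Added preparation time: {added_hours:.1f} hours")  # about 3.3 hours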

  What we ought to do is beef up security for a small percentage of passengers deemed to be high-risk. The airlines already have in place a screening technology of this sort, known as CAPPS — Computer-Assisted Passenger Prescreening System. When a ticket is purchased on a domestic flight in the United States, the passenger is rated according to approximately forty pieces of data. Though the parameters are classified, they appear to include the traveller's address, credit history, and destination; whether he or she is travelling alone; whether the ticket was paid for in cash; how long before the departure it was bought; and whether it is one way. (A recent review by the Department of Justice affirmed that the criteria are not discriminatory on the basis of ethnicity.) A sixty-eight-year-old male who lives on Park Avenue, has a fifty-thousand-dollar limit on his credit card, and has flown on the Washington-New York shuttle twice a week for the past eight years, for instance, is never going to get flagged by the CAPPS system. Probably no more than a handful of people per domestic flight ever are, but those few have their checked luggage treated with the kind of scrutiny that, until this month, was reserved for international flights. Their bags are screened for explosives and held until the passengers are actually on board. It would be an easy step to use the CAPPS ratings at the gate as well. Those dubbed high-risk could have their hand luggage scrutinized by the slower but much more comprehensive CT scanner, which would make hiding knives or other weapons in hand luggage all but impossible.
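  To make the idea of such a rating concrete, here is a hypothetical sketch of how a CAPPS-style score might be computed. The real criteria and weights are classified; every field and number below is an invented stand-in for the kinds of data described above.

    # Hypothetical CAPPS-style prescreening score. Fields and weights are
    # illustrative only; the real parameters are classified.
    def risk_score(p):
        score = 0
        if p.get("paid_cash"):
            score += 2
        if p.get("one_way"):
            score += 2
        if p.get("days_before_departure", 365) <= 1:   # last-minute purchase
            score += 2
        if p.get("travelling_alone"):
            score += 1
        if p.get("years_of_shuttle_history", 0) >= 5:  # long, stable record
            score -= 3
        return score

    # Only passengers above some threshold get the extra scrutiny, so the
    # frequent Washington-New York shuttle flyer is never flagged.
    shuttle_regular = {"travelling_alone": True, "years_of_shuttle_history": 8}
    print(risk_score(shuttle_regular) >= 4)  # False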

  At the same time, high-risk passengers could be asked to undergo an electronic strip search known as a body scan. In a conventional X-ray, the rays pass through the body, leaving an imprint on a detector on the other side. In a body scanner, the X-rays are much weaker, penetrating clothes but not the body, so they bounce back and leave an imprint of whatever lies on the surface of the skin. A body scanner would have picked up a ceramic knife in an instant. Focussing on a smaller group of high-risk people would have the additional benefit of improving the detection accuracy of the security staff: it would raise the signal rate.

  We may never know, of course, whether an expanded CAPPS system would have flagged the September 11th terrorists, but certainly those who planned the attack would have had to take that possibility seriously. The chief distinction between American and Israeli airport defense, at the moment, is that the American system focusses on technological examination of the baggage while the Israeli system focusses on personal interrogation and assessment of the passenger — which has resulted in El Al's having an almost unblemished record against bombings and hijackings over the past twenty years. Wider use of CAPPS profiling would correct that shortcoming, and narrow still further the options available for any would-be terrorist. But we shouldn't delude ourselves that these steps will end hijackings, any more than the Cooper Vane did thirty years ago. Better law enforcement doesn't eliminate crime. It forces the criminals who remain to come up with something else. And, as we have just been reminded, that something else, all too frequently, is something worse.
GO TO TOP MENU

  One of the most striking aspects of the automobile industry is the precision with which it makes calculations of life and death. The head restraint on the back of a car seat has been determined to reduce an occupant's risk of dying in an accident by 0.36 per cent. The steel beams in a car's side doors cut fatalities by 1.7 per cent. The use of a seat belt in a right-front collision reduces the chances of a front-seat passenger's being killed through ejection by fourteen per cent, with a margin of error of plus or minus one per cent. When auto engineers discuss these numbers, they use detailed charts and draw curves on quadrille paper, understanding that it is through the exact and dispassionate measurement of fatality effects and the resulting technical tinkering that human lives are saved. They could wax philosophical about the sanctity of life, but what would that accomplish? Sometimes progress in matters of social policy occurs when the moralizers step back and the tinkerers step forward. In the face of the right-to-life debate in the country and show trials like the Bush Administration's recent handling of the stem-cell controversy, it's worth wondering what would happen if those involved in that debate were to learn the same lesson.

  Suppose, for example, that, instead of focussing on the legality of abortion, we focussed on the number of abortions in this country. That's the kind of thing that tinkerers do: they focus not on the formal status of social phenomena but on their prevalence. And the prevalence of abortion in this country is striking. In 1995, for example, American adolescents terminated pregnancies at a rate roughly a third greater than their Canadian, English, and Swedish counterparts, around triple that of French teen-agers, and six times that of Dutch and Italian adolescents.

  This is not because abortions are more readily available in America. The European countries with the lowest abortion rates are almost all places where abortions are easier to get than they are in the United States. And it's not because pregnant European teen-agers are more likely to carry a child to term than Americans. (If anything, the opposite is true.) Nor is it because American teen-agers have more sex than Europeans: sexual behavior, in the two places, appears to be much the same. American teen-agers have more abortions because they get pregnant more than anyone else: they simply don't use enough birth control.

  Bringing the numbers down is by no means an insurmountable problem. Many Western European countries managed to reduce birth rates among teen-agers by more than seventy per cent between 1970 and 1995, and reproductive-health specialists say that there's no reason we couldn't follow suit. Since the early nineteen-seventies, for instance, the federal Title X program has funded thousands of family-planning clinics around the country, and in the past twenty years the program has been responsible for preventing an estimated nine million abortions. It could easily be expanded. There is also solid evidence that a comprehensive, national sex-education curriculum could help to reduce unintended pregnancies still further. If these steps succeeded in bringing our teen-age-pregnancy rates into line with those in Canada and England, the number of abortions in this country could drop by about five hundred thousand a year. For those who believe that a fetus is a human being, this is like saying that if we could find a few hundred million dollars, and face the fact that, yes, teen-agers have sex, we could save the equivalent of the population of Arizona within a decade.

  But this is not, unfortunately, the way things are viewed in Washington. Since the eighties, Title X has been under constant attack. Taking inflation into account, its level of funding is now about sixty per cent lower than it was twenty years ago, and the Bush Administration's budget appropriation does little to correct that shortfall. As for sex education, the President's stated preference is that a curriculum instructing teen-agers to abstain from sex be given parity with forms of sex education that mention the option of contraception. The chief distinguishing feature of abstinence-only programs is that there's no solid evidence that they do any good. The right's squeamishness about sex has turned America into the abortion capital of the West.

  But, then, this is the same movement that considered Ronald Reagan to be an ally and Bill Clinton a foe. And what does the record actually show? In the eight years of President Reagan's Administration, there was an average of 1.6 million abortions a year; by the end of President Clinton's first term, when the White House was much more favorably disposed toward the kinds of policies that are now anathema in Washington, that annual figure had dropped by more than two hundred thousand. A tinkerer would look at those numbers and wonder whether we need a new definition of "pro-life."
GO TO TOP MENU

  To beat the competition, first you have to beat the drug test.

  1.

  At the age of twelve, Christiane Knacke-Sommer was plucked from a small town in Saxony to train with the elite SC Dynamo swim club, in East Berlin. After two years of steady progress, she was given regular injections and daily doses of small baby-blue pills, which she was required to take in the presence of a trainer. Within weeks, her arms and shoulders began to thicken. She developed severe acne. Her pubic hair began to spread over her abdomen. Her libido soared out of control. Her voice turned gruff. And her performance in the pool began to improve dramatically, culminating in a bronze medal in the hundred-metre butterfly at the 1980 Moscow Olympics. But then the Wall fell and the truth emerged about those little blue pills. In a new book about the East German sports establishment, "Faust's Gold," Steven Ungerleider recounts the moment in 1998 when Knacke-Sommer testified in Berlin at the trial of her former coaches and doctors:

  "Did defendant Gläser or defendant Binus ever tell you that the blue pills were the anabolic steroid known as Oral-Turinabol?" the prosecutor asked. "They told us they were vitamin tablets," Christiane said, "just like they served all the girls with meals." "Did defendant Binus ever tell you the injection he gave was Depot-Turinabol?" "Never," Christiane said, staring at Binus until the slight, middle-aged man looked away. "He said the shots were another kind of vitamin." "He never said he was injecting you with the male hormone testosterone?" the prosecutor persisted. "Neither he nor Herr Gläser ever mentioned Oral-Turinabol or Depot-Turinabol," Christiane said firmly. "Did you take these drugs voluntarily?" the prosecutor asked in a kindly tone. "I was fifteen years old when the pills started," she replied, beginning to lose her composure. "The training motto at the pool was, 'You eat the pills, or you die.' It was forbidden to refuse."

  As her testimony ended, Knacke-Sommer pointed at the two defendants and shouted, "They destroyed my body and my mind!" Then she rose and threw her Olympic medal to the floor.

  Anabolic steroids have been used to enhance athletic performance since the early sixties, when an American physician gave the drugs to three weight lifters, who promptly jumped from mediocrity to world records. But no one ever took the use of illegal drugs quite so far as the East Germans. In a military hospital outside the former East Berlin, in 1991, investigators discovered a ten-volume archive meticulously detailing every national athletic achievement from the mid-sixties to the fall of the Berlin Wall, each entry annotated with the name of the drug and the dosage given to the athlete. An average teen-age girl naturally produces somewhere around half a milligram of testosterone a day. The East German sports authorities routinely prescribed steroids to young adolescent girls in doses of up to thirty-five milligrams a day. As the investigation progressed, former female athletes, who still had masculinized physiques and voices, came forward with tales of deformed babies, inexplicable tumors, liver dysfunction, internal bleeding, and depression. German prosecutors handed down hundreds of indictments of former coaches, doctors, and sports officials, and won numerous convictions. It was the kind of spectacle that one would have thought would shock the sporting world. Yet it didn't. In a measure of how much the use of drugs in competitive sports has changed in the past quarter century, the trials caused barely a ripple.

  Today, coaches no longer have to coerce athletes into taking drugs. Athletes take them willingly. The drugs themselves are used in smaller doses and in creative combinations, leaving few telltale physical signs, and drug testers concede that it is virtually impossible to catch all the cheaters, or even, at times, to do much more than guess when cheating is taking place. Among the athletes, meanwhile, there is growing uncertainty about what exactly is wrong with doping. When the cyclist Lance Armstrong asserted last year, after his second consecutive Tour de France victory, that he was drug-free, some doubters wondered whether he was lying, and others simply assumed he was, and wondered why he had to. The moral clarity of the East German scandal - with its coercive coaches, damaged athletes, and corrupted competitions - has given way to shades of gray. In today's climate, the most telling moment of the East German scandal was not Knacke-Sommer's outburst. It was when one of the system's former top officials, at the beginning of his trial, shrugged and quoted Brecht: "Competitive sport begins where healthy sport ends."

  2.

  Perhaps the best example of how murky the drug issue has become is the case of Ben Johnson, the Canadian sprinter who won the one hundred metres at the Seoul Olympics, in 1988. Johnson set a new world record, then failed a post-race drug test and was promptly stripped of his gold medal and suspended from international competition. No athlete of Johnson's calibre has ever been exposed so dramatically, but his disgrace was not quite the victory for clean competition that it appeared to be.

  Johnson was part of a group of world-class sprinters based in Toronto in the nineteen-seventies and eighties and trained by a brilliant coach named Charlie Francis. Francis was driven and ambitious, eager to give his athletes the same opportunities as their competitors from the United States and Eastern Europe, and in 1979 he began discussing steroids with one of his prize sprinters, Angella Taylor. Francis felt that Taylor had the potential that year to run the two hundred metres in close to 22.90 seconds, a time that would put her within striking distance of the two best sprinters in the world, Evelyn Ashford, of the United States, and Marita Koch, of East Germany. But, seemingly out of nowhere, Ashford suddenly improved her two-hundred-metre time by six-tenths of a second. Then Koch ran what Francis calls, in his autobiography, "Speed Trap," a "science fictional" 21.71. In the sprints, individual improvements are usually measured in hundredths of a second; athletes, once they have reached their early twenties, typically improve their performance in small, steady increments, as experience and strength increase. But these were quantum leaps, and to Francis the explanation was obvious. "Angella wasn't losing ground because of a talent gap," he writes; "she was losing because of a drug gap, and it was widening by the day." (In the case of Koch, at least, he was right. In the East German archives, investigators found a letter from Koch to the director of research at V.E.B. Jenapharm, an East German pharmaceutical house, in which she complained, "My drugs were not as potent as the ones that were given to my opponent Bärbel Eckert, who kept beating me." In East Germany, Ungerleider writes, this particular complaint was known as "dope-envy.") Later, Francis says, he was confronted at a track meet by Brian Oldfield, then one of the world's best shot-putters:

  "When are you going to start getting serious?" he demanded. "When are you going to tell your guys the facts of life?" I asked him how he could tell they weren't already using steroids. He replied that the muscle density just wasn't there. "Your guys will never be able to compete against the Americans - their careers will be over," he persisted.

  Among world-class athletes, the lure of steroids is not that they magically transform performance - no drug can do that - but that they make it possible to train harder. An aging baseball star, for instance, may realize that what he needs to hit a lot more home runs is to double the intensity of his weight training. Ordinarily, this might actually hurt his performance. "When you're under that kind of physical stress," Charles Yesalis, an epidemiologist at Pennsylvania State University, says, "your body releases corticosteroids, and when your body starts making those hormones at inappropriate times it blocks testosterone. And instead of being anabolic - instead of building muscle - corticosteroids are catabolic. They break down muscle. That's clearly something an athlete doesn't want." Taking steroids counteracts the impact of corticosteroids and helps the body bounce back faster. If that home-run hitter was taking testosterone or an anabolic steroid, he'd have a better chance of handling the extra weight training.

  It was this extra training that Francis and his sprinters felt they needed to reach the top. Angella Taylor was the first to start taking steroids. Ben Johnson followed in 1981, when he was twenty years old, beginning with a daily dose of five milligrams of the steroid Dianabol, in three-week on-and-off cycles. Over time, that protocol grew more complex. In 1984, Taylor visited a Los Angeles doctor, Robert Kerr, who was famous for his willingness to provide athletes with pharmacological assistance. He suggested that the Canadians use human growth hormone, the pituitary extract that promotes lean muscle and that had become, in Francis's words, "the rage in elite track circles." Kerr also recommended three additional substances, all of which were believed to promote the body's production of growth hormone: the amino acids arginine and ornithine and the dopamine precursor L-dopa. "I would later learn," Francis writes, "that one group of American women was using three times as much growth hormone as Kerr had suggested, in addition to 15 milligrams per day of Dianabol, another 15 milligrams of Anavar, large amounts of testosterone, and thyroxine, the synthetic thyroid hormone used by athletes to speed the metabolism and keep people lean." But the Canadians stuck to their initial regimen, making only a few changes: Vitamin B12, a non-steroidal muscle builder called inosine, and occasional shots of testosterone were added; Dianabol was dropped in favor of a newer steroid called Furazabol; and L-dopa, which turned out to cause stiffness, was replaced with the blood-pressure drug Dixarit.

  Going into the Seoul Olympics, then, Johnson was a walking pharmacy. But - and this is the great irony of his case - none of the drugs that were part of his formal pharmaceutical protocol resulted in his failed drug test. He had already reaped the benefit of the steroids in intense workouts leading up to the games, and had stopped Furazabol and testosterone long enough in advance that all traces of both supplements should have disappeared from his system by the time of his race - a process he sped up by taking the diuretic Moduret. Human growth hormone wasn't - and still isn't - detectable by a drug test, and arginine, ornithine, and Dixarit were legal. Johnson should have been clean. The most striking (and unintentionally hilarious) moment in "Speed Trap" comes when Francis describes his bewilderment at being informed that his star runner had failed a drug test - for the anabolic steroid stanozolol. "I was floored," Francis writes:

  To my knowledge, Ben had never injected stanozolol. He occasionally used Winstrol, an oral version of the drug, but for no more than a few days at a time, since it tended to make him stiff. He'd always discontinued the tablets at least six weeks before a meet, well beyond the accepted "clearance time." . . . After seven years of using steroids, Ben knew what he was doing. It was inconceivable to me that he might take stanozolol on his own and jeopardize the most important race of his life.

  Francis suggests that Johnson's urine sample might have been deliberately contaminated by a rival, a charge that is less preposterous than it sounds. Documents from the East German archive show, for example, that in international competitions security was so lax that urine samples were sometimes switched, stolen from a "clean" athlete, or simply "borrowed" from a noncompetitor. "The pure urine would either be infused by a catheter into the competitor's bladder (a rather painful procedure) or be held in condoms until it was time to give a specimen to the drug control lab," Ungerleider writes. (The top East German sports official Manfred Höppner was once in charge of urine samples at an international weight-lifting competition. When he realized that several of his weight lifters would not pass the test, he broke open the seal of their specimens, poured out the contents, and, Ungerleider notes, "took a nice long leak of pure urine into them.") It is also possible that Johnson's test was simply botched. Two years later, in 1990, track and field's governing body claimed that Butch Reynolds, the world's four-hundred-metre record holder, had tested positive for the steroid nandrolone, and suspended him for two years. It did so despite the fact that half of his urine-sample data had been misplaced, that the testing equipment had failed during analysis of the other half of his sample, and that the lab technician who did the test identified Sample H6 as positive - and Reynolds's sample was numbered H5. Reynolds lost the prime years of his career.

  We may never know what really happened with Johnson's assay, and perhaps it doesn't much matter. He was a doper. But clearly this was something less than a victory for drug enforcement. Here was a man using human growth hormone, Dixarit, inosine, testosterone, and Furazabol, and the only substance that the testers could find in him was stanozolol - which may have been the only illegal drug that he hadn't used. Nor is it encouraging that Johnson was the only prominent athlete caught for drug use in Seoul. It is hard to believe, for instance, that the sprinter Florence Griffith Joyner, the star of the Seoul games, was clean. Before 1988, her best times in the hundred metres and the two hundred metres were, respectively, 10.96 and 21.96. In 1988, a suddenly huskier FloJo ran 10.49 and 21.34, times that no runner since has even come close to equalling. In other words, at the age of twenty-eight - when most athletes are beginning their decline - Griffith Joyner transformed herself in one season from a career-long better-than-average sprinter to the fastest female sprinter in history. Of course, FloJo never failed a drug test. But what does that prove? FloJo went on to make a fortune as a corporate spokeswoman. Johnson's suspension cost him an estimated twenty-five million dollars in lost endorsements. The real lesson of the Seoul Olympics may simply have been that Johnson was a very unlucky man.

  3.

  The basic problem with drug testing is that testers are always one step behind athletes. It can take years for sports authorities to figure out what drugs athletes are using, and even longer to devise effective means of detecting them. Anabolic steroids weren't banned by the International Olympic Committee until 1975, almost a decade after the East Germans started using them. In 1996, at the Atlanta Olympics, five athletes tested positive for what we now know to be the drug Bromantan, but they weren't suspended, because no one knew at the time what Bromantan was. (It turned out to be a Russian-made psycho-stimulant.) Human growth hormone, meanwhile, has been around for twenty years, and testers still haven't figured out how to detect it.

  Perhaps the best example of the difficulties of drug testing is testosterone. It has been used by athletes to enhance performance since the fifties, and the International Olympic Committee announced that it would crack down on testosterone supplements in the early nineteen-eighties. This didn't mean that the I.O.C. was going to test for testosterone directly, though, because the testosterone that athletes were getting from a needle or a pill was largely indistinguishable from the testosterone they produce naturally. What was proposed, instead, was to compare the level of testosterone in urine with the level of another hormone, epitestosterone, to determine what's called the T/E ratio. For most people, under normal circumstances, that ratio is 1:1, and so the theory was that if testers found a lot more testosterone than epitestosterone it would be a sign that the athlete was cheating. Since a small number of people have naturally high levels of testosterone, the I.O.C. avoided the risk of falsely accusing anyone by setting the legal limit at 6:1.

  Did this stop testosterone use? Not at all. Through much of the eighties and nineties, most sports organizations conducted their drug testing only at major competitions. Athletes taking testosterone would simply do what Johnson did, and taper off their use in the days or weeks prior to those events. So sports authorities began randomly showing up at athletes' houses or training sites and demanding urine samples. To this, dopers responded by taking extra doses of epitestosterone with their testosterone, so their T/E would remain in balance. Testers, in turn, began treating elevated epitestosterone levels as suspicious, too. But that still left athletes with the claim that they were among the few with naturally elevated testosterone. Testers, then, were forced to take multiple urine samples, measuring an athlete's T/E ratio over several weeks. Someone with a naturally elevated T/E ratio will have fairly consistent ratios from week to week. Someone who is doping will have telltale spikes - times immediately after taking shots or pills when the level of the hormone in his blood soars. Did all these precautions mean that cheating stopped? Of course not. Athletes have now switched from injection to transdermal testosterone patches, which administer a continuous low-level dose of the hormone, smoothing over the old, incriminating spikes. The patch has another advantage: once you take it off, your testosterone level will drop rapidly, returning to normal, depending on the dose and the person, in as little as an hour. "It's the peaks that get you caught," says Don Catlin, who runs the U.C.L.A. Olympic Analytical Laboratory. "If you took a pill this morning and an unannounced test comes this afternoon, you'd better have a bottle of epitestosterone handy. But, if you are on the patch and you know your own pharmacokinetics, all you have to do is pull it off." In other words, if you know how long it takes for you to get back under the legal limit and successfully stall the test for that period, you can probably pass the test. And if you don't want to take that chance, you can just keep your testosterone below 6:1, which, by the way, still provides a whopping performance benefit. "The bottom line is that only careless and stupid people ever get caught in drug tests," Charles Yesalis says. "The élite athletes can hire top medical and scientific people to make sure nothing bad happens, and you can't catch them."
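  The reasoning the testers rely on in those multiple samples can be put in a few lines. This is only a toy sketch with invented numbers; the point is the shape of the logic, not the real thresholds: a naturally elevated ratio stays steady from week to week, while doping produces spikes.

    # Toy version of the multi-sample T/E logic: steady ratios look natural,
    # sudden spikes look like doping. The readings and the spike factor are
    # invented for illustration.
    def looks_like_doping(weekly_te_ratios, spike_factor=2.0):
        baseline = min(weekly_te_ratios)
        return any(r > baseline * spike_factor for r in weekly_te_ratios)

    naturally_high = [5.5, 5.8, 5.6, 5.7]   # consistently elevated
    doper          = [1.2, 1.1, 4.8, 1.3]   # spike right after a dose

    print(looks_like_doping(naturally_high))  # False
    print(looks_like_doping(doper))           # True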

  4.

  But here is where the doping issue starts to get complicated, for there's a case to be made that what looks like failure really isn't - that regulating aggressive doping, the way the 6:1 standard does, is a better idea than trying to prohibit drug use. Take the example of erythropoietin, or EPO. EPO is a hormone released by your kidneys that stimulates the production of red blood cells, the body's oxygen carriers. A man-made version of the hormone is given to those with suppressed red-blood-cell counts, like patients undergoing kidney dialysis or chemotherapy. But over the past decade it has also become the drug of choice for endurance athletes, because its ability to increase the amount of oxygen that the blood can carry to the muscles has the effect of postponing fatigue. "The studies that have attempted to estimate EPO's importance say it's worth about a three-, four-, or five-per-cent advantage, which is huge," Catlin says. EPO also has the advantage of being a copy of a naturally occurring substance, so it's very hard to tell if someone has been injecting it. (A cynic would say that this had something to do with the spate of remarkable times in endurance races during that period.)

  So how should we test for EPO? One approach, which was used in the late nineties by the International Cycling Union, is a test much like the T/E ratio for testosterone. The percentage of your total blood volume which is taken up by red blood cells is known as your hematocrit. The average adult male has a hematocrit of between thirty-eight and forty-four per cent. In 1995, the cycling authorities declared that any rider who had a hematocrit above fifty per cent would be suspended - a deliberately generous standard (like the T/E ratio) meant to avoid falsely accusing someone with a naturally high hematocrit. The hematocrit rule also had the benefit of protecting athletes' health. If you take too much EPO, the profusion of red blood cells makes the blood sluggish and heavy, placing enormous stress on the heart. In the late eighties, at least fifteen professional cyclists died from suspected EPO overdoses. A fifty-per-cent hematocrit limit is below the point at which EPO becomes dangerous.

  But, like the T/E standard, the hematocrit standard had a perverse effect: it set the legal limit so high that it actually encouraged cyclists to titrate their drug use up to that limit. After all, if you are riding for three weeks through the mountains of France and Spain, there's a big difference between a hematocrit of forty-four per cent and one of 49.9 per cent. This is why Lance Armstrong faced so many hostile questions about EPO from the European press - and why eyebrows were raised at his five-year relationship with an Italian doctor who was thought to be an expert on performance-enhancing drugs. If Armstrong had, say, a hematocrit of forty-four per cent, the thinking went, why wouldn't he have raised it to 49.9, particularly since the rules (at least, in 2000) implicitly allowed him to do so? And, if he didn't, how on earth did he win?

  The problems with hematocrit testing have inspired a second strategy, which was used on a limited basis at the Sydney Olympics and this summer's World Track and Field Championships. This test measures a number of physiological markers of EPO use, including the presence of reticulocytes, which are the immature red blood cells produced in large numbers by EPO injections. If you have a lot more reticulocytes than normal, then there's a good chance you've used EPO recently. The blood work is followed by a confirmatory urinalysis. The test has its weaknesses. It's really only useful in picking up EPO used in the previous week or so, whereas the benefits of taking the substance persist for a month. But there's no question that, if random EPO testing were done aggressively in the weeks leading to a major competition, it would substantially reduce cheating.

  On paper, this second strategy sounds like a better system. But there's a perverse effect here as well. By discouraging EPO use, the test is simply pushing savvy athletes toward synthetic compounds called hemoglobin-based oxygen carriers, which serve much the same purpose as EPO but for which there is no test at the moment. "I recently read off a list of these new blood-oxygen expanders to a group of toxicologists, and none had heard of any of them," Yesalis says. "That's how fast things are moving." The attempt to prevent EPO use actually promotes inequity: it gives an enormous advantage to those athletes with the means to keep up with the next wave of pharmacology. By contrast, the hematocrit limit, though more permissive, creates a kind of pharmaceutical parity. The same is true of the T/E limit. At the 1986 world swimming championships, the East German Kristin Otto set a world record in the hundred-metre freestyle, with an extraordinary display of power in the final leg of the race. According to East German records, on the day of her race Otto had a T/E ratio of 18:1. Testing can prevent that kind of aggressive doping; it can insure no one goes above 6:1. That is a less than perfect outcome, of course, but international sports is not a perfect world. It is a place where Ben Johnson is disgraced and FloJo runs free, where Butch Reynolds is barred for two years and East German coaches pee into cups - and where athletes without access to the cutting edge of medicine are condemned to second place. Since drug testers cannot protect the purity of sport, the very least they can do is to make sure that no athlete can cheat more than any other.

  5.

  The first man to break the four-minute mile was the Englishman Roger Bannister, on a windswept cinder track at Oxford, nearly fifty years ago. Bannister is in his early seventies now, and one day last summer he returned to the site of his historic race along with the current world-record holder in the mile, Morocco's Hicham El Guerrouj. The two men chatted and compared notes and posed for photographs. "I feel as if I am looking at my mirror image," Bannister said, indicating El Guerrouj's similarly tall, high-waisted frame. It was a polite gesture, an attempt to suggest that he and El Guerrouj were part of the same athletic lineage. But, as both men surely knew, nothing could be further from the truth.

  Bannister was a medical student when he broke the four-minute mile in 1954. He did not have time to train every day, and when he did he squeezed in his running on his hour-long midday break at the hospital. He had no coach or trainer or entourage, only a group of running partners who called themselves "the Paddington lunch time club." In a typical workout, they might run ten consecutive quarter miles - ten laps - with perhaps two minutes of recovery between each repetition, then gobble down lunch and hurry back to work. Today, that training session would be considered barely adequate for a high-school miler. A month or so before his historic mile, Bannister took a few days off to go hiking in Scotland. Five days before he broke the four-minute barrier, he stopped running entirely, in order to rest. The day before the race, he slipped and fell on his hip while working in the hospital. Then he ran the most famous race in the history of track and field. Bannister was what runners admiringly call an "animal," a natural.

  El Guerrouj, by contrast, trains five hours a day, in two two-and-a-half-hour sessions. He probably has a team of half a dozen people working with him: at the very least, a masseur, a doctor, a coach, an agent, and a nutritionist. He is not in medical school. He does not go hiking in rocky terrain before major track meets. When Bannister told him, last summer, how he had prepared for his four-minute mile, El Guerrouj was stunned. "For me, a rest day is perhaps when I train in the morning and spend the afternoon at the cinema," he said. El Guerrouj certainly has more than his share of natural ability, but his achievements are a reflection of much more than that: of the fact that he is better coached and better prepared than his opponents, that he trains harder and more intelligently, that he has found a way to stay injury free, and that he can recover so quickly from one day of five-hour workouts that he can follow it, the next day, with another five-hour workout.

  Of these two paradigms, we have always been much more comfortable with the first: we want the relation between talent and achievement to be transparent, and we worry about the way ability is now so aggressively managed and augmented. Steroids bother us because they violate the honesty of effort: they permit an athlete to train too hard, beyond what seems reasonable. EPO fails the same test. For years, athletes underwent high-altitude training sessions, which had the same effect as EPO - promoting the manufacture of additional red blood cells. This was considered acceptable, while EPO is not, because we like to distinguish between those advantages which are natural or earned and those which come out of a vial.

  Even as we assert this distinction on the playing field, though, we defy it in our own lives. We have come to prefer a world where the distractable take Ritalin, the depressed take Prozac, and the unattractive get cosmetic surgery to a world ruled, arbitrarily, by those fortunate few who were born focussed, happy, and beautiful. Cosmetic surgery is not "earned" beauty, but then natural beauty isn't earned, either. One of the principal contributions of the late twentieth century was the moral deregulation of social competition - the insistence that advantages derived from artificial and extraordinary intervention are no less legitimate than the advantages of nature. All that athletes want, for better or worse, is the chance to play by those same rules.
GO TO TOP MENU

  In the early morning of July 7th, the thirty-year-old publicist Lizzie Grubman backed her father's brand-new Mercedes-Benz S.U.V. into a crowd outside a Southampton night club, injuring sixteen people. Shortly before the incident, Grubman had had a loud argument with the night club's bouncers, one of whom wanted her to move her car from the fire lane. She allegedly told him, "Fuck you, white trash," and then hit the accelerator hard. To the tabloids, the event has been irresistible - Grubman's father is a famous entertainment lawyer; she is a bottle blonde; she represents Britney Spears - and for the past two weeks the city has been full of gleeful philosophizing about entitlement, arrogance, and the perils of spoiled rich kids getting angry behind the wheel of Daddy's S.U.V. But what, exactly, happened in the Mercedes that night? As it turns out, Grubman's argument that it was an accident has foundation. She appears to have been the victim of a well-understood problem known as "unintended acceleration," or "pedal error."

  This does not mean, as some have surmised, that Grubman put the car in reverse, thinking that it was in drive. If that was the case, why didn't she just hit the brake as she sped backward across the parking lot? Pedal error, by contrast, occurs when a driver has her right foot on what she thinks is the brake but is actually the accelerator. When the car begins to move, the driver responds by pressing down on the pedal further still, in an attempt to stop the car. But that, of course, only makes the problem worse.

  Why do people make pedal errors? In a comprehensive analysis of the phenomenon that appeared in the June, 1989, issue of the journal Human Factors, the U.C.L.A. psychologist Richard A. Schmidt argues that any number of relatively innocent factors can cause a "low-level variability in the foot trajectory toward the brake" - meaning that a driver aims for the brake and misses. In "Unintended Acceleration: A Review of Human Factors Contributions," Schmidt writes, "If the head is turned to the left, as it might be while looking in the left side mirror, reaching for the seatbelt, or other, similar maneuvers in the initiation of the driving sequence, the result could be systematic biases to the right in the perceived position of the brake pedal. This bias could be as large as 6 cm in a driver of average height if the angular bias were 5 deg." That bias is more than enough to cause pedal error.

  It is worth noting that there are five factors that have been associated with an increased probability of unintentional acceleration. It happens more frequently to older people, to women, to short people, to people who are unfamiliar with the cars they are driving, and to people who have just got into a car and started it up. Grubman, who is on the short side and had reportedly driven her father's car only twice before, qualifies on four of those five grounds.

  Here, then, is a perfectly plausible explanation for what happened that night. Grubman gets into the car, puts it in reverse, and then twists around to see if anyone is behind her, her foot slipping off the pedal as she does so. As a result, the trajectory of her right foot is thrown off by a few inches, and when she puts her foot back down, what she thinks is the brake is actually the accelerator. The car leaps backward. She panics. She presses harder on the accelerator, trying to stop the car. But her action makes the car speed up. Grubman was parked approximately fifty feet from the night club, and if we assume that she was accelerating at a rate of .4 g's (not unlikely, given her 342-horsepower vehicle), she would have covered that fifty feet in roughly 2.8 seconds. Wade Bartlett, an expert in mechanical forensics who has studied more than three dozen cases of unintended acceleration, says, "When faced with a completely new situation, it would not be unusual for someone to require three seconds to figure out what's going on and what to do about it." In some instances, it's been reported, drivers have mistakenly continued to press the accelerator for up to twelve seconds. Grubman's accident is a textbook case of pedal error.
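  The 2.8-second figure follows from ordinary constant-acceleration kinematics, d = ½at². A quick check, converting the numbers above to metric, reproduces it; the only assumption added here is the standard value of g.

    import math

    # Time to cover fifty feet from rest at 0.4 g, from d = 0.5 * a * t**2.
    g = 9.81                   # m/s^2, standard gravity
    a = 0.4 * g                # the acceleration assumed above
    d = 50 * 0.3048            # fifty feet, in metres

    t = math.sqrt(2 * d / a)
    print(f"{t:.1f} seconds")  # about 2.8 seconds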

  Understanding pedal error may help to explain Grubman's actions that night. Of course, nothing in the scientific literature explains why someone would park in a fire lane, swear at a bouncer, leave the scene of an accident, and dodge a Breathalyzer. For that, we have the lawyers and the New York Post.
GO TO TOP MENU

  How caffeine created the modern world.

  1.

  The original Coca-Cola was a late-nineteenth-century concoction known as Pemberton's French Wine Coca, a mixture of alcohol, the caffeine-rich kola nut, and coca, the raw ingredient of cocaine. In the face of social pressure, first the wine and then the coca were removed, leaving the more banal modern beverage in its place: carbonated, caffeinated sugar water with less kick to it than a cup of coffee. But is that the way we think of Coke? Not at all. In the nineteen-thirties, a commercial artist named Haddon Sundblom had the bright idea of posing a portly retired friend of his in a red Santa Claus suit with a Coke in his hand, and plastering the image on billboards and advertisements across the country. Coke, magically, was reborn as caffeine for children, caffeine without any of the weighty adult connotations of coffee and tea. It was - as the ads with Sundblom's Santa put it - "the pause that refreshes." It added life. It could teach the world to sing.

  One of the things that have always made drugs so powerful is their cultural adaptability, their way of acquiring meanings beyond their pharmacology. We think of marijuana, for example, as a drug of lethargy, of disaffection. But in Colombia, the historian David T. Courtwright points out in "Forces of Habit" (Harvard; $24.95), "peasants boast that cannabis helps them to quita el cansancio or reduce fatigue; increase their fuerza and ánimo, force and spirit; and become incansable, tireless." In Germany right after the Second World War, cigarettes briefly and suddenly became the equivalent of crack cocaine. "Up to a point, the majority of the habitual smokers preferred to do without food even under extreme conditions of nutrition rather than to forgo tobacco," according to one account of the period. "Many housewives . . . bartered fat and sugar for cigarettes." Even a drug as demonized as opium has been seen in a more favorable light. In the eighteen-thirties, Franklin Delano Roosevelt's grandfather Warren Delano II made the family fortune exporting the drug to China, and Delano was able to sugarcoat his activities so plausibly that no one ever accused his grandson of being the scion of a drug lord. And yet, as Bennett Alan Weinberg and Bonnie K. Bealer remind us in their marvellous new book "The World of Caffeine" (Routledge; $27.50), there is no drug quite as effortlessly adaptable as caffeine, the Zelig of chemical stimulants.

  At one moment, in one form, it is the drug of choice of café intellectuals and artists; in another, of housewives; in another, of Zen monks; and, in yet another, of children enthralled by a fat man who slides down chimneys. King Gustav III, who ruled Sweden in the latter half of the eighteenth century, was so convinced of the particular perils of coffee over all other forms of caffeine that he devised an elaborate experiment. A convicted murderer was sentenced to drink cup after cup of coffee until he died, with another murderer sentenced to a lifetime of tea drinking, as a control. (Unfortunately, the two doctors in charge of the study died before anyone else did; then Gustav was murdered; and finally the tea drinker died, at eighty-three, of old age - leaving the original murderer alone with his espresso, and leaving coffee's supposed toxicity in some doubt.) Later, the various forms of caffeine began to be divided up along sociological lines. Wolfgang Schivelbusch, in his book "Tastes of Paradise," argues that, in the eighteenth century, coffee symbolized the rising middle classes, whereas its great caffeinated rival in those years - cocoa, or, as it was known at the time, chocolate - was the drink of the aristocracy. "Goethe, who used art as a means to lift himself out of his middle class background into the aristocracy, and who as a member of a courtly society maintained a sense of aristocratic calm even in the midst of immense productivity, made a cult of chocolate, and avoided coffee," Schivelbusch writes. "Balzac, who despite his sentimental allegiance to the monarchy, lived and labored for the literary marketplace and for it alone, became one of the most excessive coffee-drinkers in history. Here we see two fundamentally different working styles and means of stimulation - fundamentally different psychologies and physiologies." Today, of course, the chief cultural distinction is between coffee and tea, which, according to a list drawn up by Weinberg and Bealer, have come to represent almost entirely opposite sensibilities:

  Coffee Aspect        Tea Aspect
  Male                 Female
  Boisterous           Decorous
  Indulgence           Temperance
  Hardheaded           Romantic
  Topology             Geometry
  Heidegger            Carnap
  Beethoven            Mozart
  Libertarian          Statist
  Promiscuous          Pure

  That the American Revolution began with the symbolic rejection of tea in Boston Harbor, in other words, makes perfect sense. Real revolutionaries would naturally prefer coffee. By contrast, the freedom fighters of Canada, a hundred years later, were most definitely tea drinkers. And where was Canada's autonomy won? Not on the blood-soaked fields of Lexington and Concord but in the genteel drawing rooms of Westminster, over a nice cup of Darjeeling and small, triangular cucumber sandwiches.

  2.

  All this is a bit puzzling. We don't fetishize the difference between salmon eaters and tuna eaters, or people who like their eggs sunny-side up and those who like them scrambled. So why invest so much importance in the way people prefer their caffeine? A cup of coffee has somewhere between a hundred and two hundred and fifty milligrams of caffeine; black tea brewed for four minutes has between forty and a hundred milligrams. But the disparity disappears if you consider that many tea drinkers drink from a pot, and have more than one cup. Caffeine is caffeine. "The more it is pondered," Weinberg and Bealer write, "the more paradoxical this duality within the culture of caffeine appears. After all, both coffee and tea are aromatic infusions of vegetable matter, served hot or cold in similar quantities; both are often mixed with cream or sugar; both are universally available in virtually any grocery or restaurant in civilized society; and both contain the identical psychoactive alkaloid stimulant, caffeine."

  It would seem to make more sense to draw distinctions based on the way caffeine is metabolized rather than on the way it is served. Caffeine, whether it is in coffee or tea or a soft drink, moves easily from the stomach and intestines into the bloodstream, and from there to the organs, and before long has penetrated almost every cell of the body. This is the reason that caffeine is such a wonderful stimulant. Most substances can't cross the blood-brain barrier, which is the body's defensive mechanism, preventing viruses or toxins from entering the central nervous system. Caffeine does so easily. Within an hour or so, it reaches its peak concentration in the brain, and there it does a number of things - principally, blocking the action of adenosine, the neuromodulator that makes you sleepy, lowers your blood pressure, and slows down your heartbeat. Then, as quickly as it builds up in your brain and tissues, caffeine is gone - which is why it's so safe. (Caffeine in ordinary quantities has never been conclusively linked to serious illness.)

  But how quickly it washes away differs dramatically from person to person. A two-hundred-pound man who drinks a cup of coffee with a hundred milligrams of caffeine will have a maximum caffeine concentration of one milligram per kilogram of body weight. A hundred-pound woman having the same cup of coffee will reach a caffeine concentration of two milligrams per kilogram of body weight, or twice as high. In addition, when women are on the Pill, the rate at which they clear caffeine from their bodies slows considerably. (Some of the side effects experienced by women on the Pill may in fact be caffeine jitters caused by their sudden inability to tolerate as much coffee as they could before.) Pregnancy reduces a woman's ability to process caffeine still further. The half-life of caffeine in an adult is roughly three and a half hours. In a pregnant woman, it's eighteen hours. (Even a four-month-old child processes caffeine more efficiently.) An average man and woman sitting down for a cup of coffee are thus not pharmaceutical equals: in effect, the woman is under the influence of a vastly more powerful drug. Given these differences, you'd think that, instead of contrasting the caffeine cultures of tea and coffee, we'd contrast the caffeine cultures of men and women.
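
  (A rough sketch, in Python, of the arithmetic above. The hundred-milligram dose and the half-lives are the figures quoted in the text; the pound-to-kilogram conversion and the simple exponential-decay model are standard assumptions, and the function names are invented for illustration, not taken from any source.)

LB_TO_KG = 0.4536  # standard pounds-to-kilograms conversion

def peak_mg_per_kg(dose_mg, weight_lb):
    # Peak caffeine load relative to body weight, as described above.
    return dose_mg / (weight_lb * LB_TO_KG)

def caffeine_left(dose_mg, half_life_hours, hours_elapsed):
    # Simple exponential decay: half the caffeine is cleared every half-life.
    return dose_mg * 0.5 ** (hours_elapsed / half_life_hours)

print(round(peak_mg_per_kg(100, 200), 1))   # ~1.1 mg/kg -- the two-hundred-pound man
print(round(peak_mg_per_kg(100, 100), 1))   # ~2.2 mg/kg -- the hundred-pound woman
print(round(caffeine_left(100, 3.5, 7)))    # ~25 mg left after seven hours (typical adult)
print(round(caffeine_left(100, 18.0, 7)))   # ~76 mg left after seven hours (pregnant woman)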

  3.

  But we don't, and with good reason. To parse caffeine along gender lines does not do justice to its capacity to insinuate itself into every aspect of our lives, not merely to influence culture but even to create it. Take coffee's reputation as the "thinker's" drink. This dates from eighteenth-century Europe, where coffeehouses played a major role in the egalitarian, inclusionary spirit that was then sweeping the continent. They sprang up first in London, so alarming Charles II that in 1676 he tried to ban them. It didn't work. By 1700, there were hundreds of coffeehouses in London, their subversive spirit best captured by a couplet from a comedy of the period: "In a coffeehouse just now among the rabble / I bluntly asked, which is the treason table." The movement then spread to Paris, and by the end of the eighteenth century coffeehouses numbered in the hundreds - most famously, the Café de la Régence, near the Palais Royal, which counted among its customers Robespierre, Napoleon, Voltaire, Victor Hugo, Théophile Gautier, Rousseau, and the Duke of Richelieu. Previously, when men had gathered together to talk in public places, they had done so in bars, which drew from specific socioeconomic niches and, because of the alcohol they served, created a specific kind of talk. The new coffeehouses, by contrast, drew from many different classes and trades, and they served a stimulant, not a depressant. "It is not extravagant to claim that it was in these gathering spots that the art of conversation became the basis of a new literary style and that a new ideal of general education in letters was born," Weinberg and Bealer write.

  It is worth noting, as well, that in the original coffeehouses nearly everyone smoked, and nicotine also has a distinctive physiological effect. It moderates mood and extends attention, and, more important, it doubles the rate of caffeine metabolism: it allows you to drink twice as much coffee as you could otherwise. In other words, the original coffeehouse was a place where men of all types could sit all day; the tobacco they smoked made it possible to drink coffee all day; and the coffee they drank inspired them to talk all day. Out of this came the Enlightenment. (The next time we so perfectly married pharmacology and place, we got Joan Baez.)

  In time, caffeine moved from the café to the home. In America, coffee triumphed because of the country's proximity to the new Caribbean and Latin American coffee plantations, and the fact that throughout the nineteenth century duties were negligible. Beginning in the eighteen-twenties, Courtwright tells us, Brazil "unleashed a flood of slave-produced coffee. American per capita consumption, three pounds per year in 1830, rose to eight pounds by 1859."

  What this flood of caffeine did, according to Weinberg and Bealer, was to abet the process of industrialization - to help "large numbers of people to coordinate their work schedules by giving them the energy to start work at a given time and continue it as long as necessary." Until the eighteenth century, it must be remembered, many Westerners drank beer almost continuously, even beginning their day with something called "beer soup." (Bealer and Weinberg helpfully provide the following eighteenth-century German recipe: "Heat the beer in a saucepan; in a separate small pot beat a couple of eggs. Add a chunk of butter to the hot beer. Stir in some cool beer to cool it, then pour over the eggs. Add a bit of salt, and finally mix all the ingredients together, whisking it well to keep it from curdling.") Now they began each day with a strong cup of coffee. One way to explain the industrial revolution is as the inevitable consequence of a world where people suddenly preferred being jittery to being drunk. In the modern world, there was no other way to keep up. That's what Edison meant when he said that genius was ninety-nine per cent perspiration and one per cent inspiration. In the old paradigm, working with your mind had been associated with leisure. It was only the poor who worked hard. (The quintessential pre-industrial narrative of inspiration belonged to Archimedes, who made his discovery, let's not forget, while taking a bath.) But Edison was saying that the old class distinctions no longer held true - that in the industrialized world there was as much toil associated with the life of the mind as there had once been with the travails of the body.

  In the twentieth century, the professions transformed themselves accordingly: medicine turned the residency process into an ordeal of sleeplessness, the legal profession borrowed a page from the manufacturing floor and made its practitioners fill out time cards like union men. Intellectual heroics became a matter of endurance. "The pace of computation was hectic," James Gleick writes of the Manhattan Project in "Genius," his biography of the physicist Richard Feynman. "Feynman's day began at 8:30 and ended fifteen hours later. Sometimes he could not leave the computing center at all. He worked through for thirty-one hours once and the next day found that an error minutes after he went to bed had stalled the whole team. The routine allowed just a few breaks." Did Feynman's achievements reflect a greater natural talent than his less productive forebears had? Or did he just drink a lot more coffee? Paul Hoffman, in "The Man Who Loved Only Numbers," writes of the legendary twentieth-century mathematician Paul Erdös that "he put in nineteen-hour days, keeping himself fortified with 10 to 20 milligrams of Benzedrine or Ritalin, strong espresso and caffeine tablets. 'A mathematician,' Erdös was fond of saying, 'is a machine for turning coffee into theorems.'" Once, a friend bet Erdös five hundred dollars that he could not quit amphetamines for a month. Erdös took the bet and won, but, during his time of abstinence, he found himself incapable of doing any serious work. "You've set mathematics back a month," he told his friend when he collected, and immediately returned to his pills.

  Erdös's unadulterated self was less real and less familiar to him than his adulterated self, and that is a condition that holds, more or less, for the rest of society as well. Part of what it means to be human in the modern age is that we have come to construct our emotional and cognitive states not merely from the inside out - with thought and intention - but from the outside in, with chemical additives. The modern personality is, in this sense, a synthetic creation: skillfully regulated and medicated and dosed with caffeine so that we can always be awake and alert and focussed when we need to be. On a bet, no doubt, we could walk away from caffeine if we had to. But what would be the point? The lawyers wouldn't make their billable hours. The young doctors would fall behind in their training. The physicists might still be stuck out in the New Mexico desert. We'd set the world back a month.

  4.

  That the modern personality is synthetic is, of course, a disquieting notion. When we talk of synthetic personality - or of constructing new selves through chemical means - we think of hard drugs, not caffeine. Timothy Leary used to make such claims about LSD, and the reason his revolution never took flight was that most of us found the concept of tuning in, turning on, and dropping out to be a bit creepy. Here was this shaman, this visionary - and yet, if his consciousness was so great, why was he so intent on altering it? More important, what exactly were we supposed to be tuning in to? We were given hints, with psychedelic colors and deep readings of "Lucy in the Sky with Diamonds," but that was never enough. If we are to re-create ourselves, we would like to know what we will become.

  Caffeine is the best and most useful of our drugs because in every one of its forms it can answer that question precisely. It is a stimulant that blocks the action of adenosine, and comes in a multitude of guises, each with a ready-made story attached, a mixture of history and superstition and whimsy which infuses the daily ritual of adenosine blocking with meaning and purpose. Put caffeine in a red can and it becomes refreshing fun. Brew it in a teapot and it becomes romantic and decorous. Extract it from little brown beans and, magically, it is hardheaded and potent. "There was a little known Russian émigré, Trotsky by name, who during World War I was in the habit of playing chess in Vienna's Café Central every evening," Bealer and Weinberg write, in one of the book's many fascinating café yarns:

  A typical Russian refugee, who talked too much but seemed utterly harmless, indeed, a pathetic figure in the eyes of the Viennese. One day in 1917 an official of the Austrian Foreign Ministry rushed into the minister's room, panting and excited, and told his chief, "Your excellency . . . Your excellency . . . Revolution has broken out in Russia." The minister, less excitable and less credulous than his official, rejected such a wild claim and retorted calmly, "Go away . . . Russia is not a land where revolutions break out. Besides, who on earth would make a revolution in Russia? Perhaps Herr Trotsky from the Café Central?"

  The minister should have known better. Give a man enough coffee and he's capable of anything.

  Sumner Redstone and the rules of the corporate memoir.

  1.

  In the early nineteen-nineties, Sumner Redstone, the head of Viacom, wanted to merge his company with Paramount Communications. The problem was that the chairman of Paramount, Martin Davis, was being difficult. As Redstone recounts in his new autobiography, "A Passion to Win" (Simon & Schuster; $26), he and Davis would meet at, say, a charitable function, and Davis would make it sound as if the deal were imminent. Then, abruptly, he would back away. According to Redstone, Davis was a ditherer, a complicated and emotionally cold man who couldn't bear to part with his company. Yet, somehow in the course of their dealings, Redstone writes, he and Davis developed a "mutual respect and fond friendship." They became, he says a page later, "friends" who "enjoyed each other's company and were developing a close working rapport," and who had, he tells us two pages after that, "a great affection for each other." The turning point in the talks comes when Davis and Redstone have dinner in a dining room at Morgan Stanley, and Redstone is once more struck by how Davis "had a genuine affection for me." When the two have dinner again, this time at Redstone's suite in the Carlyle Hotel, Davis looks out over the spectacular lights of nighttime Manhattan and says, "You know, Sumner, when this deal gets done, they'll build a big statue of you in the middle of Central Park and I'll be forgotten." "No, Martin," Redstone replies. "They'll build statues of both of us and I will be looking up to you in admiration." Davis laughs. "It was just the right touch," Redstone reports, and one can almost imagine him at that point throwing a brawny forearm around Davis's shoulders and giving him a manly squeeze.

  2.

  "A Passion to Win," which Redstone wrote with Peter Knobler, is an account of a man's rise to the top of a multibillion-dollar media empire. It is the tale of the complex negotiations, blinding flashes of insight, and lengthy dinners at exclusive Manhattan hotels which created the colossus that is Viacom. But mostly it is a story about the value of friendship, about how very, very powerful tycoons like Redstone have the surprising ability to transcend their own egos and see each other, eye to eye, as human beings.

  For instance, Gerald Levin, the head of Time Warner, might look like a rival of Redstone's. Not at all. He is, Redstone tells us, "a very close friend." Redstone says that he and Sherry Lansing, who heads Paramount Pictures, a division of Viacom, "are not just business associates, we are extremely close friends." So, too, with Geraldine Laybourne, who used to head Viacom's Nickelodeon division. "We were not just business associates," he writes, in the plainspoken manner that is his trademark. "We were friends." The singer Tony Bennett was one of Redstone's idols for years, and then one day Redstone's employees threw him a surprise birthday party and there was Bennett, who had come thousands of miles to sing a song. "Now," Redstone says proudly, "he is my friend." The producer Bob Evans? "A good friend." Aaron Spelling? "One of my closest friends." Bill and Hillary? "I have come to know and like the Clintons." Redstone's great friend Martin Davis warned him once about Barry Diller: "Don't trust him. He's got too big an ego." But Redstone disagreed. "Barry Diller and I were extremely friendly," he says. Ted Kennedy he met years ago, at a get-together of business executives. Everyone else was flattering Kennedy. Not so Redstone. A true friend is never disingenuous. As he recalls, he said to Kennedy,

  "Look, I don't want to disagree with everybody, but, Senator, the problem is that you believe . . . that you can solve any problem by just throwing money at it. It doesn't work that way."

  Conversation ceased, glances were exchanged. Everyone was appalled. Then Senator Kennedy said: "Sumner's right." . . . After that, Senator Kennedy called me regularly when he came to Boston and we developed a lasting friendship.

  You might think that Redstone simply becomes friends with anyone he meets who is rich or famous. This is not the case. Once, Redstone was invited to dinner at the office of Robert Maxwell, the British press baron. It was no small matter, since Redstone is one of those moguls for whom dinner has enormous symbolic importance - it is the crucible in which friendships are forged. But Maxwell didn't show up until halfway through the meal: not a friend. On another occasion, Redstone hires a highly touted executive away from the retailing giant Wal-Mart to run his Blockbuster division, and then learns that the man is eating dinner alone in his hotel dining room and isn't inviting his fellow-executives to join him. Dinner alone? Redstone was worried. That was not friendly behavior. The executive, needless to say, was not long for Viacom.

  What Redstone likes most in a friend is someone who reminds him of himself. "I respected Malone for having started with nothing and rising to become chairman of the very successful Tele-Communications, Inc.," Redstone writes of John Malone, the billionaire cable titan. "I had admired Kerkorian's success over the years," he says of Kirk Kerkorian, the billionaire corporate raider. "He started with nothing, and I have a special affection for people who start with nothing and create empires. . . . Today Kirk Kerkorian and I are friends." (They are so friendly, in fact, that they recently had a meal together at Spago.) Of his first meeting with John Antioco, whom Redstone would later hire to run Blockbuster - replacing the executive who ate dinner alone - Redstone writes, "We hit it off immediately. . . . He had come from humble beginnings, which I empathized with." Soon, the two men are dining together. Look at my life, Redstone seems to marvel again and again in "A Passion to Win." You think you see a hard-nosed mogul, selflessly wringing the last dollar out of megadeals on behalf of his shareholders. But inside that mogul beats a heart of warmth and compassion. Ever wonder how Redstone was able to pull off his recent mammoth merger with CBS? He happens to have lunch with the CBS chief, Mel Karmazin, in an exclusive Viacom corporate dining room, and discovers that he and Karmazin are kindred spirits. "Both of us had started with nothing," Redstone writes, "and ended up in control of major corporations." Can you believe it?

  3.

  In 1984, Lee Iacocca, the chairman of the resurgent Chrysler Corporation, published his autobiography, "Iacocca," with the writer William Novak. It was a charming book, in which Iacocca came across as a homespun, no-nonsense man of the people, and it sold more copies than any other business book in history. This was good news for Iacocca, because it made him a household name. But it was bad news for the rest of us, because it meant that an entire class of C.E.O.s promptly signed up ghostwriters and turned out memoirs designed to portray themselves as homespun, no-nonsense men of the people.

  "Iacocca" began with a brief, dramatic prologue, in which he described his last day at Ford, where he had worked his entire life. He had just been fired by Henry Ford II, and it was a time of great personal crisis. "Before I left the house," he wrote, establishing the conflict between him and Henry Ford that would serve as the narrative engine of the book, "I kissed my wife, Mary, and my two daughters, Kathi and Lia. . . . Even today, their pain is what stays with me. It's like the lioness and her cubs. If the hunter knows what's good for him, he will leave the little ones alone. Henry Ford made my kids suffer, and for that I'll never forgive him." Now every C.E.O. book begins with a short, dramatic prologue, in which the author describes a day of great personal crisis that is intended to serve as the narrative engine of the book. In "Work in Progress," by Michael Eisner, Disney's C.E.O., it's the day he suffered chest pains at the Herb Allen conference in Sun Valley: "I spent much of dinner at Herb Allen's talking to Tom Brokaw, the NBC anchorman, who told me a long story about fly-fishing with his friend Robert Redford. . . . The pain in my arms returned." In "A Passion to Win," it's the day Redstone clung to a ledge during a fire at the Copley Plaza Hotel, in Boston, eventually suffering third-degree burns over forty-five per cent of his body: "The pain was excruciating but I refused to let go. That way was death."

  Iacocca followed the dramatic prologue with a chapter on his humble origins. It opens, "Nicola Iacocca, my father, arrived in this country in 1902 at the age of twelve - poor, alone, and scared." Now every C.E.O. has humble origins. Then Iacocca spoke of an early mentor, a gruff, no-nonsense man who instilled lessons that guide him still. His name was Charlie Beacham, and he was "the kind of guy you'd charge up the hill for even though you knew very well you could get killed in the process. He had the rare gift of being tough and generous at the same time." Sure enough, everywhere now there are gruff, no-nonsense men instilling lessons that guide C.E.O.s to this day. ("Nobbe, who was in his sixties, was a stern disciplinarian and a tough guy who didn't take crap from anyone," writes the former Scott Paper and Sunbeam C.E.O. Al Dunlap, in his book "Mean Business." "He was always chewing me out. . . . Still, Nobbe rapidly won my undying respect and admiration because he wore his bastardness like a well-earned badge of honor.")

  The legacy of "Iacocca" wouldn't matter so much if most C.E.O.s were, in fact, homespun men of the people who had gruff mentors, humble beginnings, and searing personal crises that shaped their lives and careers. But they aren't. Redstone's attempt to play the humble-beginnings card, for instance, is compromised by the fact that he didn't exactly have humble beginnings. Although his earliest years were spent in a tenement, his family's fortunes rapidly improved. He went to Harvard and Harvard Law School. His father was a highly successful businessman, and it was his father's company that served as the basis for the Viacom empire. (Just why Redstone continues to think that he comes from nothing, under the circumstances, is an interesting case study in the psychology of success: perhaps, if you are worth many billions, an upper-middle-class upbringing simply feels like nothing.) Eisner's personal crisis ends with him driving himself to Cedars-Sinai Hospital in Los Angeles - one of the best hospitals in the world - where he is immediately met by not one but two cardiologists, who take him to a third cardiologist, who tells Eisner that the procedure he is about to undergo, an angiogram, a common surgical procedure, is ninety-eight per cent safe. In "On the Firing Line," the former Apple C.E.O. Gil Amelio's day of personal crisis is triggered merely by walking down the halls of his new company: "In each of the offices near mine toiled some key executive I was just coming to know, wrestling with problems that would only gradually be revealed to me. I wondered what caged alligators they would let loose at me on some future date." Dunlap, meanwhile, tells us that one of his first acts as C.E.O. of Scott Paper was, in an orgy of unpretentiousness, to throw out the bookshelves in his office and replace them with Aboriginal paintings from Australia: "To me, the paintings made a lot more sense. They showed people who had to survive by their wits, people who couldn't call out for room service." Among Dunlap's gruff mentors was the Australian multimillionaire Kerry Packer, and one day, while playing tennis with Packer, Dunlap has a personal crisis. He pops a tendon. Packer rushes over, picks him up, and carries him to a lounge chair. "This was not only a wealthy man and a man who had political power, this was a physically powerful man," Dunlap reports. "In the end," he adds, taking Iacocca's lioness and Amelio's alligators to the next level, "Kerry and I split because we were just too similar. We were like two strong-willed, dominant animals who hunted together and brought down the biggest prey, but, when not hunting, fought each other." It is hard to read passages like these and not shudder at the prospect of the General Electric chairman Jack Welch's upcoming memoir, for which Warner Books paid a seven-milliondollar advance. Who will be tapped as the gruff mentor? What was Welch's career-altering personal crisis? What wild-animal metaphors will he employ? ("As I looked around the room, I felt like a young wildebeest being surveyed by a group of older and larger - but less nimble - wildebeests, whose superior market share and penetration of the herd were no match for my greater hunger, born of my impoverished middle-class upbringing in the jungles of suburban Boston.")

  The shame of it is that many of these books could have been fascinating. Scattered throughout Eisner's "Work in Progress," for example, are numerous hints about how wonderfully weird and obsessive Eisner is. He hears that Universal is thinking of building a rival theme park four miles from Disney in Orlando, and he and his assistant climb the fence at the Universal construction site at three in the morning to check on its progress. He sneaks into performances of the musical "Beauty and the Beast" in Los Angeles at least a dozen times, and when the stage version of "The Lion King" has its first tryout, in Minneapolis, he flies there from Los Angeles half a dozen times during the course of one summer to give his "notes." When he is thinking of building Euro Disney, outside Paris, he is told that it takes half an hour to travel by Métro from the Arc de Triomphe to the end of the line, six miles from the Disney site. Eisner gets on the Métro to see for himself. He sets his watch. It takes twenty-five minutes.

  By the end of the book, the truth is spilling out from under the breezy façade: Eisner is a compulsive, detail-oriented control freak. That fact, of course, says a lot about why Disney is successful. But you cannot escape the sense, while reading "Work in Progress," that you weren't supposed to reach that conclusion - that the bit about climbing the fence was supposed to be evidence of Eisner's boyish enthusiasm, and the bit about seeing "Beauty and the Beast" a dozen times was supposed to make it look as if he just loved the theatre. This is the sorry state of C.E.O. memoirs in the post-Iacocca era. It's only when they fail at their intended task that they truly succeed.

  4.

  "A Passion to Win" ought to have been a terrific book, because Redstone has a terrific story to tell. He graduated from Boston Latin High School with the highest grade-point average in the school's three-hundred-year history. During the war, he was a cryptographer, part of the team that successfully cracked Japanese military and diplomatic codes. After the war, he had a brilliant career as a litigator, arguing a case before the Supreme Court. The mobster Bugsy Siegel once offered him a job. Then Redstone took over his father's business, and, through a series of breathtaking acquisitions - Viacom, Paramount Communications, Blockbuster, and then CBS - turned himself, in the space of twenty years, into one of the richest men in the world.

  What links all these successes, it becomes clear, is a very particular and subtle intelligence. Here, in one of the book's best passages, is Redstone's description of his dealings with Wayne Huizenga, the man from whom he bought Blockbuster. Huizenga, he writes, put together his empire by buying out local video stores around the country:

  He and his Blockbuster associates would swoop in on some video guy who saw money for his store dangling from Huizenga's pockets. When negotiations came to an impasse, rather than say, "We have a problem with the proposal," and make a counteroffer, he would say, "Sorry we couldn't do a deal. Good luck to you," shake the guy's hand, pull on the leather coat and head for the elevator.

  Seeing the deal about to fall apart, the video operator, who only moments before was seeing dollar signs, would run after him. "Wait, don't go. Come back. Let's talk about it." Huizenga hadn't hit the down button. He had been waiting. That's how he got his concessions.

  When Redstone was negotiating for Blockbuster, Huizenga pulled the same stunt. It would be 2 a.m., Redstone says, and Huizenga would put on his coat and head for the exit. But Redstone was wise to him:

  Huizenga would get to the elevator and no one would run after him. One time he waited there for fifteen minutes before it dawned on him that we weren't going to chase him. He got to his car. Nothing.

  He would soon find some excuse to call - he left papers in our office - waiting for us to say, "Why don't you come back." Still, nothing. Once he was literally on his plane, perhaps even circling the neighborhood, when he phoned and said he had to be back in New York for a Merrill Lynch dinner anyway and maybe we could get together.

  Redstone has a great intuitive grasp of people. He understood immediately that Huizenga was simply a bully. This kind of insight is hardly rare among people who make their living at the negotiating table. It's the skill of the poker player. But poker is a game of manipulation and exploitation - and Redstone doesn't seem to manipulate or exploit. He persuades and seduces: he would concede that your straight flush beat his three of a kind, but then, over a very long dinner at Spago, he would develop such a rapport with you that you'd willingly split the pot with him. It's no accident that, of Paramount's many suitors, Redstone won the day, because he realized that what Martin Davis needed was the assurance of friendship: he needed to hear about the two statues in Central Park, one gazing in admiration at the other. Redstone's peculiar gift also explains why he seems to have ended up as "friends" with so many of the people with whom he's done business. In Redstone's eyes, these people really are his friends. At the moment when he looked into Davis's eyes that night at the Carlyle, he absolutely believed they had a special bond - and, more important, he made Davis believe it, too. Redstone's heart happily follows his mind, and that's a powerful gift for someone whose success depends on the serial seduction of takeover targets.

  Most of us, needless to say, don't think of friendships this way. Our hearts don't always follow our minds; they go off in crazy directions, and we develop loyalties that make no functional sense. But there's little of that fuzziness in Redstone's world, and perhaps that's why "A Passion to Win" is sometimes so chilling. A picture runs in the Post, Redstone tells us, that shows him walking down a street in Paris with "a beautiful woman." Phyllis, his wife of fifty-two years, files for divorce. The news hits him "like a bullet," he says. "I could not believe that she wanted to end it." It takes him only a few sentences, though, to recover from his wounds. "Of course, divorce settlement or no, my interest in Viacom's parent company, National Amusements, had been structured in such a way that events in Phyllis's and my personal life would not affect the ownership, control or management of Viacom," he assures us. Redstone says that he considered Frank Biondi, his longtime deputy at Viacom, "my friend." But one day he decides to get rid of Biondi, and immediately the gibes and cheap shots appear. Biondi is lazy. Biondi cannot negotiate deals. Biondi is not C.E.O. material. "Frank took the news calmly, almost as if he expected it," Redstone writes of the firing. "But I was shocked to learn that the first person he called was not his wife, but his lawyer to determine his rights under his contract. We were prepared to honor his contract to the fullest, so that was not an issue, but I found this implicit statement of his priorities to be revealing." What kind of person says this about a friend? Redstone aligns his passions with his interests, and when his interests change, so do his friendships.

  At the very end of "A Passion to Win," Redstone recounts Viacom's merger with CBS. The deal meant that the network's C.E.O., Mel Karmazin, would come aboard as chief operating officer of Viacom. But that in turn meant that two of Redstone's most trusted executives, Tom Dooley and Philippe Dauman, would have to give up their posts as deputy chairmen. Redstone says that he was "shocked" when he was told this. Dooley and Dauman were not just business associates; they were his "close friends." Redstone says that he could not accept this, that there was "no way" he could agree to the deal if it meant losing his deputies. At this point, though, we simply don't believe him - we don't believe that someone as smart as Redstone wouldn't have realized this going into the deal with CBS, and we don't believe that Redstone's entirely instrumental friendships could possibly stand in the way of his getting bigger and richer. "A Passion to Win" would have told us much more about Redstone, and about business, if it had confronted this fact and tried to make sense of it. But Redstone is a supremely unself-conscious man, and that trait, which has served him so well in the business world, is fatal in an author. Karmazin comes. Dauman and Dooley go. Redstone moves blithely on to make new best friends.

  Millions of people owe their lives to Fred Soper. Why isn't he a hero?

  1.

  In the late nineteen-thirties, a chemist who worked for the J.R. Geigy company, in Switzerland, began experimenting with an odorless white crystalline powder called dichloro-diphenyl-trichloroethane. The chemist, Paul Müller, wanted to find a way to protect woollens against moths, and his research technique was to coat the inside of a glass box with whatever chemical he was testing, and then fill it with houseflies. To his dismay, the flies seemed unaffected by the new powder. But, in one of those chance decisions on which scientific discovery so often turns, he continued his experiment overnight - and in the morning all the flies were dead. He emptied the box, and put in a fresh batch of flies. By the next morning, they, too, were dead. He added more flies, and then a handful of other insects. They all died. He scrubbed the box with an acetone solvent, and repeated the experiment with a number of closely related compounds that he had been working with. The flies kept dying. Now he was excited: had he come up with a whole line of potent new insecticides? As it turned out, he hadn't. The new candidate chemicals were actually useless. To his amazement, what was killing the flies in the box were scant traces of the first compound, dichloro-diphenyl-trichloroethane - or, as it would come to be known, DDT.

  In 1942, Geigy sent a hundred kilograms of the miracle powder to its New York office. The package lay around, undisturbed, until another chemist, Victor Froelicher, happened to translate the extraordinary claims for DDT into English, and then passed on a sample to the Department of Agriculture, which in turn passed it on to its entomology research station, in Orlando, Florida. The Orlando laboratory had been charged by the Army to develop new pesticides, because the military, by this point in the war, was desperate for a better way to protect its troops against insect-borne disease. Typhus - the lethal fever spread by lice - had killed millions of people during and after the First World War and was lurking throughout the war zones. Worse, in almost every theatre of operations, malaria-carrying mosquitoes were causing havoc. As Robert Rice recounted in this magazine almost fifty years ago, the First Marine Division had to be pulled from combat in 1942 and sent to Melbourne to recuperate because, out of seventeen thousand men, ten thousand were incapacitated with malarial headaches, fevers, and chills. Malaria hit eighty-five per cent of the men holding onto Bataan. In fact, at any one time in the early stages of the war, according to General Douglas MacArthur, two-thirds of his troops in the South Pacific were sick with malaria. Unless something was done, MacArthur complained to the malariologist Paul Russell, it was going to be "a long war." Thousands of candidate insecticides were tested at Orlando, and DDT was by far the best.

  To gauge a chemical's potential against insects, the Orlando researchers filled a sleeve with lice and a candidate insecticide, slipped the sleeve over a subject's arm, and taped it down at both ends. After twenty-four hours, the dead lice were removed and fresh lice were added. A single application of DDT turned out to kill lice for a month, almost four times longer than the next-best insecticide. As Rice described it, researchers filled twelve beakers with mosquito larvae, and placed descending amounts of DDT in each receptacle - with the last beaker DDT free. The idea was to see how much chemical was needed to kill the mosquitoes. The mosquito larvae in every beaker died. Why? Because just the few specks of chemical that floated through the air and happened to land in the last beaker while the experiment was being set up were enough to kill the mosquitoes. Quickly, a field test was scheduled. Two duck ponds were chosen, several miles apart. One was treated with DDT. One was not. Spraying was done on a day when the wind could not carry the DDT from the treated to the untreated pond. The mosquito larvae in the first pond soon died. But a week later mosquito larvae in the untreated pond also died: when ducks from the first pond visited the second pond, there was enough DDT residue on their feathers to kill mosquitoes there as well.

  The new compound was administered to rabbits and cats. Rice tells how human volunteers slathered themselves with it, and sat in vaults for hours, inhaling the fumes. Tests were done to see how best to apply it. "It was put in solution or suspension, depending on what we were trying to do," Geoffrey Jeffery, who worked on DDT at the Tennessee Valley Authority, recalls. "Sometimes we'd use some sort of petroleum-based carrier, even diesel oil, or add water to a paste or concentration and apply it on the wall with a Hudson sprayer." Under conditions of great secrecy, factories were set up, to manufacture the new chemical by the ton. It was rushed to every Allied theatre. In Naples, in 1944, the Army averted a catastrophic typhus epidemic by "dusting" more than a million people with DDT powder. The Army Air Force built DDT "bombs," attaching six-hundred-and-twenty-five-gallon tanks to the underside of the wings of B-25s and C-47s, and began spraying Pacific beachheads in advance of troop arrivals. In Saipan, invading marines were overtaken by dengue, a debilitating fever borne by the Aedes variety of mosquito. Five hundred men were falling sick every day, each incapacitated for four to five weeks. The medical officer called in a DDT air strike that saturated the surrounding twenty-five square miles with nearly nine thousand gallons of five-per-cent DDT solution. The dengue passed. The marines took Saipan.

  It is hard to overestimate the impact that DDT's early success had on the world of public health. In the nineteen-forties, there was still malaria in the American South. There was malaria throughout Europe, Asia, and the Caribbean. In India alone, malaria killed eight hundred thousand people a year. When, in 1920, William Gorgas, the man who cleansed the Panama Canal Zone of malaria, fell mortally ill during a trip through England, he was knighted on his deathbed by King George V and given an official state funeral at St. Paul's Cathedral - and this for an American who just happened to be in town when he died. That is what it meant to be a malaria fighter in the first half of the last century. And now there was a chemical - the first successful synthetic pesticide - that seemed to have an almost magical ability to kill mosquitoes. In 1948, Müller won the Nobel Prize for his work with DDT, and over the next twenty years his discovery became the centerpiece of the most ambitious public-health campaign in history.

  Today, of course, DDT is a symbol of all that is dangerous about man's attempts to interfere with nature. Rachel Carson, in her landmark 1962 book, "Silent Spring," wrote memorably of the chemical's environmental consequences, how its unusual persistence and toxicity had laid waste to wildlife and aquatic ecosystems. Only two countries - India and China - continue to manufacture the substance, and only a few dozen more still use it. In May, at the Stockholm Convention on Persistent Organic Pollutants, more than ninety countries signed a treaty, placing DDT on a restricted-use list, and asking all those still using the chemical to develop plans for phasing it out entirely. On the eve of its burial, however - and at a time when the threat of insect-borne disease around the world seems to be resurgent - it is worth remembering that people once felt very differently about DDT, and that between the end of the Second World War and the beginning of the nineteen-sixties it was considered not a dangerous pollutant but a lifesaver. The chief proponent of that view was a largely forgotten man named Fred Soper, who ranks as one of the unsung heroes of the twentieth century. With DDT as his weapon, Soper almost saved the world from one of its most lethal afflictions. Had he succeeded, we would not today be writing DDT's obituary. We would view it in the same heroic light as penicillin and the polio vaccine.

  2.

  Fred Soper was a physically imposing man. He wore a suit, it was said, like a uniform. His hair was swept straight back from his forehead. His eyes were narrow. He had large wire-rimmed glasses, and a fastidiously maintained David Niven mustache. Soper was born in Kansas in 1893, received a doctorate from the Johns Hopkins School of Public Health, and spent the better part of his career working for the Rockefeller Foundation, which in the years before the Second World War - before the establishment of the United Nations and the World Health Organization - functioned as the world's unofficial public-health directorate, using its enormous resources to fight everything from yellow fever in Colombia to hookworm in Thailand.

  In those years, malaria warriors fell into one of two camps. The first held that the real enemy was the malaria parasite - the protozoan that mosquitoes pick up from the blood of an infected person and transmit to others. The best way to break the chain of infection, this group argued, was to treat the sick with antimalarial drugs, to kill the protozoan so there was nothing for mosquitoes to transmit. The second camp held, to the contrary, that the mosquito was the real enemy, since people would not get malaria in the first place if there were no mosquitoes around to bite them. Soper belonged to the latter group, and his special contribution was to raise the killing of mosquitoes to an art. Gorgas, Soper's legendary predecessor, said that in order to fight malaria you had to learn to think like a mosquito. Soper disagreed. Fighting malaria, he said, had very little to do with the intricacies of science and biology. The key was learning to think like the men he hired to go door-to-door and stream-to-stream, killing mosquitoes. His method was to apply motivation, discipline, organization, and zeal in understanding human nature. Fred Soper was the General Patton of entomology.

  While working in South America in 1930, Soper had enforced a rigorous protocol for inspecting houses for mosquito infestation, which involved checking cisterns and climbing along roof gutters. (He pushed himself so hard perfecting the system in the field that he lost twenty-seven pounds in three months.) He would map an area to be cleansed of mosquitoes, give each house a number, and then assign each number to a sector. A sector, in turn, would be assigned to an inspector, armed with the crude pesticides then available; the inspector's schedule for each day was planned to the minute, in advance, and his work double-checked by a supervisor. If a supervisor found a mosquito that the inspector had missed, he received a bonus. And if the supervisor found that the inspector had deviated by more than ten minutes from his preassigned schedule the inspector was docked a day's pay. Once, in the state of Rio de Janeiro, a large ammunition dump - the Niterói Arsenal - blew up. Soper, it was said, heard the explosion in his office, checked the location of the arsenal on one of his maps, verified by the master schedule that an inspector was at the dump at the time of the accident, and immediately sent condolences and a check to the widow. The next day, the inspector showed up for work, and Soper fired him on the spot - for being alive. Soper, in one memorable description, "seemed equally capable of browbeating man or mosquito." He did not engage in small talk. In 1973, at Soper's eightieth-birthday party, a former colleague recounted how much weight he had lost working for Soper; another told a story of how Soper looked at him uncomprehendingly when he asked to go home to visit his ailing wife; a third spoke of Soper's betting prowess. "He was very cold and very formal," remembers Andrew Spielman, a senior investigator in tropical disease at the Harvard School of Public Health and the author, with Michael D'Antonio, of the marvellous new book "Mosquito: A Natural History of Our Most Persistent and Deadly Foe." "He always wore a suit and tie. With that thin little mustache and big long upper lip, he scared the hell out of me."

  One of Soper's greatest early victories came in Brazil, in the late nineteen-thirties, when he took on a particularly vicious strain of mosquito known as Anopheles gambiae. There are about twenty-five hundred species of mosquito in the world, each with its own habits and idiosyncrasies - some like running water, some like standing water, some bite around the ankles, some bite on the arms, some bite indoors, some bite outdoors - but only mosquitoes of the genus Anopheles are capable of carrying the human malaria parasite. And, of the sixty species of Anopheles that can transmit malaria, gambiae is the variety best adapted to spreading the disease. In California, there is a strain of Anopheles known as freeborni, which is capable of delivering a larger dose of malaria parasite than gambiae ever could. But freeborni is not a good malaria vector, because it prefers animals to people. Gambiae, by contrast, bites humans ninety-five per cent of the time. It has long legs and yellow-and-black spotted wings. It likes to breed in muddy pools of water, even in a water-filled footprint. And, unlike many mosquitoes, it is long-lived, meaning that once it has picked up the malaria parasite it can spread the protozoan to many others. Gambiae gathers in neighborhoods in the evenings, slips into houses at dusk, bites quietly and efficiently during the night, digests its "blood meal" while resting on the walls of the house, and then slips away in the morning. In epidemiology, there is a concept known as the "basic reproduction number," or BRN, which refers to the number of people one person can infect with a contagious disease. The number for H.I.V., which is relatively difficult to transmit, is just above one. For measles, the BRN is between twelve and fourteen. But with a vector like gambiae in the picture the BRN for malaria can be more than a hundred, meaning that just one malarious person can be solely responsible for making a hundred additional people sick. The short answer to the question of why malaria is such an overwhelming problem in Africa is that gambiae is an African mosquito.
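
  (A hedged illustration of what those reproduction numbers imply. The BRN values are taken from the figures cited above - just above one for H.I.V., the midpoint of twelve to fourteen for measles, a hundred for gambiae-borne malaria - and the assumption that each case seeds BRN new cases per generation is the textbook simplification; the Python sketch and its names are mine, not anything from Spielman's book.)

def cases_after(brn, generations, initial_cases=1):
    # Naive geometric growth: every case infects `brn` others each generation,
    # ignoring immunity, interventions, and the finite size of the population.
    return initial_cases * brn ** generations

for label, brn in [("H.I.V. (just above one)", 1.1),
                   ("measles", 13),
                   ("malaria via gambiae", 100)]:
    print(label, [round(cases_after(brn, g)) for g in (1, 2, 3)])
# H.I.V. barely grows over three generations, measles reaches the thousands,
# and a gambiae-driven outbreak reaches a million cases from a single one.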

  In March, 1930, a Rockefeller Foundation entomologist named Raymond Shannon was walking across tidal flats to the Potengi River, in Natal, Brazil, when he noticed, to his astonishment, two thousand gambiae larvae in a pool of water, thousands of miles from their homeland. Less than a kilometre away was a port where French destroyers brought mail across the Atlantic from Africa, and Shannon guessed that the mosquito larvae had come over, fairly recently, aboard one of the mail ships. He notified Soper, who was his boss, and Soper told Brazilian officials to open the dykes damming the tidal flats, because salt water from the ocean would destroy the gambiae breeding spots. The government refused. Over the next few years, there were a number of small yet worrisome outbreaks of malaria, followed by a few years of drought, which kept the problem in check. Then, in 1938, the worst malaria epidemic in the history of the Americas broke out. Gambiae had spread a hundred and fifty miles along the coast and inland, infecting a hundred thousand people and killing as many as twenty thousand. Soper was called in. This was several years before the arrival of DDT, so he brought with him the only tools malariologists had in those years: diesel oil and an arsenic-based mixture called Paris green, both of which were spread on the pools of water where gambiae larvae bred; and pyrethrum, a natural pesticide made from a variety of chrysanthemum, which was used to fumigate buildings. Four thousand men were put at his disposal. He drew maps and divided up his troops. The men wore uniforms, and carried flags to mark where they were working, and they left detailed written records of their actions, to be reviewed later by supervisors. When Soper discovered twelve gambiae in a car leaving an infected area, he set up thirty de-insectization posts along the roads, spraying the interiors of cars and trucks; seven more posts on the rail lines; and defumigation posts at the ports and airports. In Soper's personal notes, now housed at the National Library of Medicine, in Bethesda, there is a cue card, on which is typed a quotation from a veteran of the Rockefeller Foundation's efforts, in the early twentieth century, to eradicate hookworm. "Experience proved that the best way to popularize a movement so foreign to the customs of the people . . . was to prosecute it as though it were the only thing in the universe left undone." It is not hard to imagine the card tacked above Soper's desk in Rio for inspiration: his goal was not merely to cripple the population of gambiae, since that would simply mean that they would return, to kill again. His goal was to eliminate gambiae from every inch of the region of Brazil that they had colonized - an area covering some eighteen thousand square miles. It was an impossible task. Soper did it in twenty-two months.

  3.

  While DDT was being tested in Orlando, Soper was in North Africa with the United States Typhus Commission, charged with preventing the kind of louse-spread typhus epidemics that were so devastating during the First World War. His tool of choice was a delousing powder called MYL. Lice live in the folds of clothing, and a previous technique had been to treat the clothing after people had disrobed. But that was clearly not feasible in Muslim cities like Cairo and Algiers, nor was it practical for large-scale use. So Soper devised a new technique. He had people tie their garments at the ankles and wrists, and then he put the powder inside a dust gun, of the sort used in gardening, and blew it down the collar, creating a balloon effect. "We were in Algiers, waiting for Patton to get through Sicily," Thomas Aitken, an entomologist who worked with Soper in those years, remembers. "We were dusting people out in the countryside. This particular day, a little old Arab man, only about so high, came along with his donkey and stopped to talk to us. We told him what we were doing, and we dusted him. The next day, he comes by again and says that that had been the first time in his life that he had ever been able to sleep through the night."

  In December of 1943, the typhus team was dispatched to Naples, where in the wake of the departing German Army the beginnings of a typhus epidemic had been detected. The rituals of Cairo were repeated, only this time the typhus fighters, instead of relying on MYL (which easily lost its potency), were using DDT. Men with dusters careered through the narrow cobblestoned streets of the town, amid the wreckage of the war, delousing the apartment buildings of typhus victims. Neapolitans were dusted as they came out of the railway stations in the morning, and dusted in the streets, and dusted in the crowded grottoes that served as bomb shelters beneath the city streets. In the first month, more than 1.3 million people were dusted, saving countless lives.

  Soper's diary records a growing fascination with this new weapon. July 25, 1943: "Lunch with L.L. Williams and Justin Andrews. L.L. reports that he has ordered 10,000 lbs of Neocid [DDT] and that Barber reports it to be far superior to [Paris Green] for mosquitoes." February 25, 1944: "Knipling visits laboratory. Malaria results [for DDT] ARE FANTASTIC." When Rome fell, in mid-1944, Soper declared that he wanted to test DDT in Sardinia, the most malarious part of Italy. In 1947, he got his wish. He pulled out his old organization charts from Brazil. The island - a rocky, mountainous region the size of New Hampshire, with few roads - was mapped and divided up hierarchically, the smallest unit being the area that could be covered by a sprayer in a week. Thirty-three thousand people were hired. More than two hundred and eighty-six tons of DDT were acquired. Three hundred and thirty-seven thousand buildings were sprayed. The target Anopheles was labranchiae, which flourishes not just in open water but also in the thick weeds that surround the streams and ponds and marshes of Sardinia. Vegetation was cut back, and a hundred thousand acres of swampland were drained. Labranchiae larvae were painstakingly collected and counted and shipped to a central laboratory, where precise records were kept of the status of the target vector. In 1946, before the campaign started, there were seventy-five thousand malaria cases on the island. In 1951, after the campaign finished, there were nine.

  "The locals regarded this as the best thing that had ever happened to them," Thomas Aitken says. He had signed on with the Rockefeller Foundation after the war, and was one of the leaders of the Sardinian effort. "The fact that malaria was gone was welcome," he went on. "But also the DDT got rid of the houseflies. Sardinian houses were made of stone. The wires for the lights ran along the walls near the ceiling. And if you looked up at the wires they were black with housefly droppings from over the years. And suddenly the flies disappeared." Five years ago, Aitken says, he was invited back to Sardinia for a celebration to mark the forty-fifth anniversary of malaria's eradication from the island. "There was a big meeting at our hotel. The public was invited, as well as a whole bunch of island and city officials, the mayor of Cagliari, and representatives of the Italian government. We all sat on a dais, at the side of the room, and I gave a speech there, in Italian, and when I finished everybody got up and clapped their hands and was shouting. It was very embarrassing. I started crying. I couldn't help it. Just reminiscing now . . ."

  Aitken is a handsome, courtly man of eighty-eight, lean and patrician in appearance. He lives outside New Haven, in an apartment filled with art and furniture from his time in Sardinia. As he thought back to those years, there were tears in his eyes, and at that moment it was possible to appreciate the excitement that gripped malariologists in the wake of the Second World War. The old-school mosquito men called themselves mud-hen malariologists, because they did their job in swamps and ditches and stagnant pools of water. Paris green and pyrethrum were crude insecticides that had to be applied repeatedly; pyrethrum killed only those mosquitoes that happened to be in the room when you were spraying. But here, seemingly, was a clean, pure, perfectly modern weapon. You could spray a tiny amount on a wall, and that single application would kill virtually every mosquito landing on that surface for the next six months. Who needed a standing army of inspectors anymore? Who needed to slog through swamps? This was an age of heroics in medicine. Sabin and Salk were working on polio vaccines with an eye to driving that disease to extinction. Penicillin was brand new, and so effective that epidemiologists were dreaming of an America without venereal disease. The extinction of smallpox, that oldest of scourges, seemed possible. All the things that we find sinister about DDT today - the fact that it killed everything it touched, and kept on killing everything it touched - were precisely what made it so inspiring at the time. "The public-health service didn't pay us a lot," says McWilson Warren, who spent the early part of his career fighting malaria in the Malaysian jungle. "So why were we there? Because there was something so wonderful about being involved with people who thought they were doing something more important than themselves." In the middle of the war, Soper had gone to Egypt, and warned the government that it had an incipient invasion of gambiae. The government ignored him, and the next year the country was hit with an epidemic that left more than a hundred thousand dead. In his diary, Soper wrote of his subsequent trip to Egypt, "In the afternoon to the Palace where Mr. Jacobs presents me to His Majesty King Faruk. The King says that he is sorry to know that measures I suggested last year were not taken at that time." Soper had triumphed over gambiae in Brazil, driven lice from Cairo and Naples, and had a weapon, DDT, that seemed like a gift from God - and now kings were apologizing to him. Soper started to dream big: Why not try to drive malaria from the entire world?

  4.

  Fred Soper's big idea came to be known as the Global Malaria Eradication Programme. In the early nineteen-fifties, Soper had been instrumental in getting the Brazilian malariologist Marcolino Candau - whom he had hired during the anti-gambiae campaign of the nineteen-thirties - elected as director-general of the World Health Organization, and, in 1955, with Candau's help, Soper pushed through a program calling on all member nations to begin a rigorous assault on any malaria within their borders. Congress was lobbied, and John Kennedy, then a senator, became an enthusiastic backer. Beginning in 1958, the United States government pledged the equivalent of billions in today's dollars for malaria eradication - one of the biggest commitments that a single country has ever made to international health. The appeal of the eradication strategy was its precision. The idea was not to kill every Anopheles mosquito in a given area, as Soper had done with gambiae in Brazil. That was unnecessary. The idea was to use DDT to kill only those mosquitoes which were directly connected to the spread of malaria - only those which had just picked up the malaria parasite from an infected person and were about to fly off and infect someone else. When DDT is used for this purpose, Spielman writes in "Mosquito," "it is applied close to where people sleep, on the inside walls of houses. After biting, the mosquitoes generally fly to the nearest vertical surface and remain standing there for about an hour, anus down, while they drain the water from their gut contents and excrete it in a copious, pink-tinged stream. If the surfaces the mosquitoes repair to are coated by a poison that is soluble in the wax that covers all insects' bodies, the mosquitoes will acquire a lethal dose." Soper pointed out that people who get malaria, and survive, generally clear their bodies of the parasite after three years. If you could use spraying to create a hiatus during which minimal transmission occurred - and during which anyone carrying the parasite had a chance to defeat it - you could potentially eradicate malaria. You could stop spraying and welcome the mosquitoes back, because there would be no more malaria around for them to transmit. Soper was under no illusions about how difficult this task would be. But, according to his calculations, it was technically possible, if he and his team achieved eighty-per-cent coverage - if they sprayed eight out of every ten houses in infected areas.

  Beginning in the late fifties, DDT was shipped out by the ton. Training institutes were opened. In India alone, a hundred and fifty thousand people were hired. By 1960, sixty-six nations had signed up. "What we all had was a handheld pressure sprayer of three-gallon capacity," Jesse Hobbs, who helped run the eradication effort in Jamaica in the early sixties, recalls. "Generally, we used a formulation that was water wettable, meaning you had powder you mixed with water. Then you pressurized the tank. The squad chief would usually have notified the household some days before. The instructions were to take the pictures off the wall, pull everything away from the wall. Take the food and eating utensils out of the house. The spray man would spray with an up-and-down movement - at a certain speed, according to a pattern. You started at a certain point and sprayed the walls and ceiling, then went outside to spray the eaves of the roof. A spray man could cover ten to twelve houses a day. You were using about two hundred milligrams per square foot of DDT, which isn't very much, and it was formulated in a way that you could see where you sprayed. When it dried, it left a deposit, like chalk. It had a bit of a chlorine smell. It's not perfume. It's kind of like swimming-pool water. People were told to wait half an hour for the spray to dry, then they could go back." The results were dramatic. In Taiwan, much of the Caribbean, the Balkans, parts of northern Africa, the northern region of Australia, and a large swath of the South Pacific, malaria was eliminated. Sri Lanka saw its cases drop to about a dozen every year. In India, where malaria infected an estimated seventy-five million and killed eight hundred thousand every year, fatalities had dropped to zero by the early sixties. Between 1945 and 1965, DDT saved millions - even tens of millions - of lives around the world, perhaps more than any other man-made drug or chemical before or since.

  What DDT could not do, however, was eradicate malaria entirely. How could you effectively spray eighty per cent of homes in the Amazonian jungle, where communities are spread over hundreds of thousands of highly treacherous acres? Sub-Saharan Africa, the most malarious place on earth, presented such a daunting logistical challenge that the eradication campaign never really got under way there. And, even in countries that seemed highly amenable to spraying, problems arose. "The rich had houses that they didn't want to be sprayed, and they were giving bribes," says Socrates Litsios, who was a scientist with the W.H.O. for many years and is now a historian of the period. "The inspectors would try to double their spraying in the morning so they wouldn't have to carry around the heavy tanks all day, and as a result houses in the afternoon would get less coverage. And there were many instances of corruption with insecticides, because they were worth so much on the black market. People would apply diluted sprays even when they knew they were worthless." Typical of the logistical difficulties is what happened to the campaign in Malaysia. In Malaysian villages, the roofs of the houses were a thatch of palm fronds called atap. They were expensive to construct, and usually lasted five years. But within two years of DDT spraying the roofs started to fall down. As it happened, the atap is eaten by caterpillar larvae, which in turn are normally kept in check by parasitic wasps. But the DDT repelled the wasps, leaving the larvae free to devour the atap. "Then the Malaysians started to complain about bedbugs, and it turns out what normally happens is that ants like to eat bedbug larvae," McWilson Warren said. "But the ants were being killed by the DDT and the bedbugs weren't - they were pretty resistant to it. So now you had a bedbug problem." He went on, "The DDT spray teams would go into villages, and no one would be at home and the doors would be locked and you couldn't spray the house. And, understand, for that campaign to work almost every house had to be sprayed. You had to have eighty-per-cent coverage. I remember there was a malaria meeting in '62 in Saigon, and the Malaysians were saying that they could not eradicate malaria. It was not possible. And everyone was arguing with them, and they were saying, 'Look, it's not going to work.' And if Malaysia couldn't do it - and Malaysia was one of the most sophisticated places in the region - who could?"

  At the same time, in certain areas DDT began to lose its potency. DDT kills by attacking a mosquito's nervous system, affecting the nerve cells so that they keep firing and the insect goes into a spasm, lurching, shuddering, and twitching before it dies. But in every population of mosquitoes there are a handful with a random genetic mutation that renders DDT nontoxic - that prevents it from binding to nerve endings. When mass spraying starts, those genetic outliers are too rare to matter. But, as time goes on, they are the only mosquitoes still breeding, and entire new generations of insects become resistant. In Greece, in the late nineteen-forties, for example, a malariologist noticed Anopheles sacharovi mosquitoes flying around a room that had been sprayed with DDT. In time, resistance began to emerge in areas where spraying was heaviest. To the malaria warriors, it was a shock. "Why should they have known?" Janet Hemingway, an expert in DDT resistance at the University of Wales in Cardiff, says. "It was the first synthetic insecticide. They just assumed that it would keep on working, and that the insects couldn't do much about it." Soper and the malariologist Paul Russell, who was his great ally, responded by pushing for an all-out war on malaria. We had to use DDT, they argued, or lose it. "If countries, due to lack of funds, have to proceed slowly, resistance is almost certain to appear and eradication will become economically impossible," Russell wrote in a 1956 report. "TIME IS OF THE ESSENCE because DDT resistance has appeared in six or seven years." But, with the administrative and logistical problems posed by the goal of eighty-per-cent coverage, that deadline proved impossible to meet.

  5.

  In 1963, the money from Congress ran out. Countries that had been told they could wipe out malaria in four years - and had diverted much of their health budgets to that effort - grew disillusioned as the years dragged on and eradication never materialized. Soon, they put their money back into areas that seemed equally pressing, like maternal and child health. Spraying programs were scaled back. In those countries where the disease had not been completely eliminated, malaria rates began to inch upward. In 1969, the World Health Organization formally abandoned global eradication, and in the ensuing years it proved impossible to muster any great enthusiasm from donors to fund antimalaria efforts. The W.H.O. now recommends that countries treat the disease largely through the health-care system - through elimination of the parasite - but many anti-malarial drugs are no longer effective. In the past thirty years, there have been outbreaks in India, Sri Lanka, Brazil, and South Korea, among other places. "Our troubles with mosquitoes are getting worse," Spielman concludes in "Mosquito," "making more people sick and claiming more lives, millions of lives, every year."

  For Soper, the unravelling of his dream was pure torture. In 1959, he toured Asia to check on the eradication campaigns of Thailand, the Philippines, Ceylon, and India, and came back appalled at what he had seen. Again and again, he found, countries were executing his strategy improperly. They weren't spraying for long enough. They didn't realize that unless malaria was ground into submission it would come roaring back. But what could he do? He had prevailed against gambiae in Brazil in the nineteen-thirties because he had been in charge; he had worked with the country's dictator to make it illegal to prevent an inspector from entering a house, and illegal to prevent the inspector from treating any open container of water. Jesse Hobbs tells of running into Soper one day in Trinidad, after driving all day in an open jeep through the tropical heat. Soper drove up in a car and asked Hobbs to get in; Hobbs demurred, gesturing at his sweaty shirt. "Son," Soper responded, "we used to go out in a day like this in Brazil and if we found a sector chief whose shirt was not wet we'd fire him." Killing mosquitoes, Soper always said, was not a matter of knowledge and academic understanding; it was a matter of administration and discipline. "He used to say that if you have a democracy you can't have eradication," Litsios says. "When Soper was looking for a job at Johns Hopkins - this would have been '46 - he told a friend that 'they turned me down because they said I was a fascist.'" Johns Hopkins was right, of course: he was a fascist - a disease fascist - because he believed a malaria warrior had to be. But now roofs were falling down in Malaysia, and inspectors were taking bribes, and local health officials did not understand the basic principles of eradication - and his critics had the audacity to blame his ideas, rather than their own weakness.

  It was in this same period that Rachel Carson published "Silent Spring," taking aim at the environmental consequences of DDT. "The world has heard much of the triumphant war against disease through the control of insect vectors of infection," she wrote, alluding to the efforts of men like Soper, "but it has heard little of the other side of the story - the defeats, the short-lived triumphs that now strongly support the alarming view that the insect enemy has been made actually stronger by our efforts." There had already been "warnings," she wrote, of the problems created by pesticides:

  On Nissan Island in the South Pacific, for example, spraying had been carried on intensively during the Second World War, but was stopped when hostilities came to an end. Soon swarms of a malaria-carrying mosquito reinvaded the island. All of its predators had been killed off and there had not been time for new populations to become established. The way was therefore clear for a tremendous population explosion. Marshall Laird, who had described this incident, compares chemical control to a treadmill; once we have set foot on it we are unable to stop for fear of the consequences.

  It is hard to read that passage and not feel the heat of Soper's indignation. He was familiar with "Silent Spring" - everyone in the malaria world was - and what was Carson saying? Of course the mosquitoes came back when DDT spraying stopped. The question was whether the mosquitoes were gone long enough to disrupt the cycle of malaria transmission. The whole point of eradication, to his mind, was that it got you off the treadmill: DDT was so effective that if you used it properly you could stop spraying and not fear the consequences. Hadn't that happened in places like Taiwan and Jamaica and Sardinia?

  "Silent Spring" was concerned principally with the indiscriminate use of DDT for agricultural purposes; in the nineteen-fifties, it was being sprayed like water in the Western countryside, in an attempt to control pests like the gypsy moth and the spruce budworm. Not all of Carson's concerns about the health effects of DDT have stood the test of time - it has yet to be conclusively linked to human illness - but her larger point was justified: DDT was being used without concern for its environmental consequences. It must have galled Soper, however, to see how Carson effectively lumped the malaria warriors with those who used DDT for economic gain. Nowhere in "Silent Spring" did Carson acknowledge that the chemical she was excoriating as a menace had, in the two previous decades, been used by malariologists to save somewhere in the vicinity of ten million lives. Nor did she make it clear how judiciously the public-health community was using the chemical. By the late fifties, health experts weren't drenching fields and streams and poisoning groundwater and killing fish. They were leaving a microscopic film on the inside walls of houses; spraying every house in a country the size of Guyana, for example, requires no more DDT in a year than a large cotton farm does. Carson quoted a housewife from Hinsdale, Illinois, who wrote about the damage left by several years of DDT spraying against bark beetles: "The town is almost devoid of robins and starlings; chickadees have not been on my shelf for two years, and this year the cardinals are gone too; the nesting population in the neighborhood seems to consist of one dove pair and perhaps one catbird family. . . . 'Will they ever come back?' [the children]ask, and I do not have the answer." Carson then quoted a bird-lover from Alabama:"There was not a sound of the song of a bird. It was eerie, terrifying. What was man doing to our perfect and beautiful world?" But to Soper the world was neither perfect nor beautiful, and the question of what man could do to nature was less critical than what nature, unimpeded, could do to man. Here, from a well-thumbed page inserted in Soper's diaries, is a description of a town in Egypt during that country's gambiae invasion of 1943 - a village in the grip of its own, very different, unnatural silence:

  Most houses are without roofs. They are just a square of dirty earth. In those courtyards and behind the doors of these hovels were found whole families lying on the floor; some were just too weakened by illness to get up and others were lying doubled up shaking from head to foot with their teeth chattering and their violently trembling hands trying in vain to draw some dirty rags around them for warmth. They were in the middle of the malaria crisis. There was illness in every house. There was hardly a house which had not had its dead and those who were left were living skeletons, their old clothing in rags, their limbs swollen from undernourishment and too weak to go into the fields to work or even to get food.

  It must have seemed to Soper that the ground had shifted beneath his feet - that the absolutes that governed his life, that countenanced even the most extreme of measures in the fight against disease, had suddenly and bewilderingly been set aside. "I was on several groups who evaluated malaria-eradication programs in some of the Central American countries and elsewhere," Geoffrey Jeffery recalls. "Several times we came back with the answer that with the present technology and effort it wasn't going to work. Well, that didn't suit Soper very much. He harangued us. We shouldn't be saying things like that!" Wilbur Downs, a physician who worked for the Rockefeller Foundation in Mexico in the fifties, used to tell of a meeting with Soper and officials of the Mexican government about the eradication of malaria in that country. Soper had come down from Washington, and amid excited talk of ending malaria forever Downs pointed out that there were serious obstacles to eradication - among them the hastened decomposition and absorption of DDT by the clays forming adobe walls. It was all too much for Soper. This was the kind of talk that was impeding eradication - the doubting, the equivocation, the incompetence, the elevation of songbirds over human life. In the middle of the meeting, Soper - ramrod straight, eyes afire - strode over to Downs, put both his hands around his neck, and began to shake.

  6.

  Fred Soper ran up against the great moral of the late twentieth century - that even the best-intentioned efforts have perverse consequences, that benefits are inevitably offset by risks. This was the lesson of "Silent Spring," and it was the lesson, too, that malariologists would take from the experience with global eradication. DDT, Spielman argues, ought to be used as selectively as possible, to quell major outbreaks. "They should have had a strong rule against spraying the same villages again and again," he says. "But that went against their doctrine. They wanted eighty-per-cent coverage. They wanted eight out of ten houses year after year after year, and that's a sure formula for resistance." Soper and Russell once argued about whether, in addition to house spraying, malaria fighters should continue to drain swamps. Russell said yes; Soper said no, that it would be an unnecessary distraction. Russell was right: it made no sense to use only one weapon against malaria. Spielman points out that malaria transmission in sub-Saharan Africa is powerfully affected by the fact that so many people live in mud huts. The walls of that kind of house need to be constantly replastered, and to do that villagers dig mud holes around their huts. But a mud hole is a prime breeding spot for gambiae. If economic aid were directed at helping villagers build houses out of brick, Spielman argues, malaria could be dealt a blow. Similarly, the Princeton University malariologist Burton Singer says that since the forties it has been well known that mosquito larvae that hatch in rice fields - a major breeding site in southeast Asia - can be killed if the water level in the fields is intermittently drained, a practice that has the additional effect of raising rice yields. Are these perfect measures? No. But, under the right circumstances, they are sustainable. In a speech Soper presented on eradication, he quoted Louis Pasteur: "It is within the power of man to rid himself of every parasitic disease." The key phrase, for Soper, was "within the power." Soper believed that the responsibility of the public-health professional was to make an obligation out of what was possible. He never understood that concessions had to be made to what was practical. "This is the fundamental difference between those of us in public health who have an epidemiological perspective, and people, like Soper, with more of a medical approach," Spielman says. "We deal with populations over time, populations of individuals. They deal with individuals at a moment in time. Their best outcome is total elimination of the condition in the shortest possible period. Our first goal is to cause no outbreaks, no epidemics, to manage, to contain the infection." Bringing the absolutist attitudes of medicine to a malarious village, Spielman says, "is a good way to do a bad thing." The Fred Soper that we needed, in retrospect, was a man of more modest ambitions.

  But, of course, Fred Soper with modest ambitions would not be Fred Soper; his epic achievements arose from his fanaticism, his absolutism, his commitment to saving as many lives as possible in the shortest period of time. For all the talk of his misplaced ambition, there are few people in history to whom so many owe their lives. The Global Malaria Eradication Programme helped eliminate the disease from the developed world, and from many parts of the developing world. In a number of cases where the disease returned, it came back at a lower level than it had been in the prewar years, and even in those places where eradication made little headway the campaign sometimes left in place a public infrastructure that had not existed before. The problem was that Soper had raised expectations too high. He had said that the only acceptable outcome for Global Eradication was global eradication, and when that did not happen he was judged - and, most important, he judged himself - a failure. But isn't the urgency Soper felt just what is lacking in the reasonableness of our contemporary attitude - in our caution and thoughtfulness and restraint? In the wake of the failure of eradication, it was popular to say that truly effective malaria control would have to await the development of a public-health infrastructure in poorer countries. Soper's response was, invariably: What about now? In a letter to a friend, he snapped, "The delay in handling malaria until it can be done by local health units is needlessly sacrificing the generation now living." There is something to admire in that attitude; it is hard to look at the devastation wrought by H.I.V. and malaria and countless other diseases in the Third World and not conclude that what we need, more than anything, is someone who will marshal the troops, send them house to house, monitor their every movement, direct their every success, and, should a day of indifference leave their shirts unsullied, send them packing. Toward the end of his life, Soper, who died in 1975, met with an old colleague, M. A. Farid, with whom he had fought gambiae in Egypt years before. "How do things go?" Soper began. "Bad!" Farid replied, for this was in the years when everyone had turned against Soper's vision. "Who will be our ally?" Soper asked. And Farid said simply, "Malaria," and Soper, he remembered, almost hugged him, because it was clear what Farid meant: Someday, when DDT is dead and buried, and the West wakes up to a world engulfed by malaria, we will think back on Fred Soper and wish we had another to take his place.

  How the fight to make America's highways safer went off course.

  I. BANG

  Every two miles, the average driver makes four hundred observations, forty decisions, and one mistake. Once every five hundred miles, one of those mistakes leads to a near collision, and once every sixty-one thousand miles one of those mistakes leads to a crash. When people drive, in other words, mistakes are endemic and accidents inevitable, and that is the first and simplest explanation for what happened to Robert Day on the morning of Saturday, April 9, 1994. He was driving a 1980 Jeep Wagoneer from his home, outside Philadelphia, to spend a day working on train engines in Winslow Township, New Jersey. He was forty-four years old, and made his living as an editor for the Chilton Book Company. His ten-year-old son was next to him, in the passenger seat. It was a bright, beautiful spring day. Visibility was perfect, and the roadway was dry, although one of the many peculiarities of car crashes is that they happen more often under ideal road conditions than in bad weather. Day's route took him down the Atlantic City Expressway to Fleming Pike, a two-lane country road that winds around a sharp curve and intersects, about a mile later, with Egg Harbor Road. In that final stretch of Fleming Pike, there is a scattering of houses and a fairly thick stand of trees on either side of the road, obscuring all sight lines to the left and right. As he approached the intersection, then, Day could not have seen a blue-and-gray 1993 Ford Aerostar minivan travelling between forty and fifty miles per hour southbound on Egg Harbor, nor a white 1984 Mazda 626 travelling at approximately fifty miles per hour in the other direction. Nor, apparently, did he see the stop sign at the corner, or the sign a tenth of a mile before that, warning of the intersection ahead. Day's son, in the confusing aftermath of the accident, told police that he was certain his father had come to a stop at the corner. But the accident's principal witness says he never saw any brake lights on the Wagoneer, and, besides, there is no way that the Jeep could have done the damage that it did from a standing start. Perhaps Day was distracted. The witness says that Day's turn signal had been on since he left the expressway. Perhaps he was looking away and looked back at the road at the wrong time, since there is an area, a few hundred yards before Egg Harbor Road, just on the near side of a little ridge, where the trees and houses make it look as if Fleming Pike ran without interruption well off into the distance. We will never know, and in any case it does not matter much. Day merely did what all of us do every time we get in a car: he made a mistake. It's just that he was unlucky enough that his mistake led him directly into the path of two other cars.

  The driver of the Ford Aerostar was Stephen Capoferri, then thirty-nine. He worked in the warehouse of Whitehall Laboratories, in southern New Jersey. He had just had breakfast with his parents and was on his way to the bank. The driver of the Mazda was Elizabeth Wolfrum. She was twenty-four. She worked as the manager of a liquor store. Her eighteen-year-old sister, Julie, was in the passenger seat; a two-year-old girl was in the back seat. Because of the vegetation on either side of Fleming Pike, Capoferri did not see Day's vehicle until it was just eighty-five feet from the point of impact, and if we assume that Day was travelling at forty miles per hour, or fifty-nine feet per second, that means that Capoferri had about 1.5 seconds to react. That is scarcely enough time. The average adult needs about that long simply to translate an observation ("That car is going awfully fast") into an action ("I ought to hit my brake"). Capoferri hit Day broadside, at a slight angle, the right passenger side of the Aerostar taking most of the impact. The Jeep was pushed sidewise, but it kept going forward, pulling off the grille and hood of the Aerostar, and sending it into a two-hundred-and-seventy-degree counterclockwise spin. As the Jeep lurched across the intersection, it slammed into the side of Wolfrum's Mazda. The cars slapped together, and then skidded together across the intersection, ending on the grass on the far, southeastern corner. According to documents filed by Elizabeth Wolfrum's lawyers, Wolfrum suffered eighteen injuries, including a ruptured spleen, multiple liver lacerations, brain damage, and fractures to the legs, ribs, ankles, and nose. Julie Wolfrum was partially ejected from the Mazda and her face hit the ground. She subsequently underwent seventeen separate surgical procedures and remained in intensive care for forty-four days. In post-crash photographs, their car looks as if it had been dropped head first from an airplane. Robert Day suffered massive internal injuries and was pronounced dead two hours later, at West Jersey Hospital. His son was bruised and shaken up. Capoferri walked away largely unscathed.

  "Once the impact occurred, I did a spin," he remembers. "I don't recall doing that. I may have blacked out. It couldn't have been for very long. I wanted to get out. I was trying to judge how I was. I was having a little trouble breathing. But I knew I could walk. My senses were gradually coming back to normal. I'm pretty sure I went to Day's vehicle first. I went to the driver's side. He was semi-conscious. He had blood coming out of his mouth. I tried to keep him awake. His son was in the passenger seat. He had no injuries. He said, 'Is my father O.K.?' I seem to remember looking in the Mazda. My first impression was that they were dead, because the driver's side of the vehicle was very badly smashed in. I think they needed the 'jaws of life' to get them out. There was a little girl in the back. She was crying."

  Capoferri has long black hair and a beard and the build of a wrestler. He is a thoughtful man who chooses his words carefully. As he talked, he was driving his Taurus back toward the scene of the accident, and he was apologetic that he could not recall more details of those moments leading up to the accident. But what is there to remember? In the popular imagination - fuelled by the car crashes of Hollywood movies, with their special effects and complicated stunts - an accident is a protracted sequence, played out in slow motion, over many frames. It is not that way in real life. The time that elapsed between Capoferri's collision with Day and Day's collision with Wolfrum was probably no more than twenty-five milliseconds, faster than the blinking of an eye, and the time that elapsed between the moment Capoferri struck Day and the moment his van came to a rest, two hundred and seventy degrees later, was probably no more than a second. Capoferri said that a friend of his, who lived right on the corner where the accident happened, told him later that all the crashing and spinning and skidding sounded like a single, sharp explosion - bang!

  II. THE PASSIVE APPROACH

  In the middle part of the last century, a man named William Haddon changed forever the way Americans think about car accidents. Haddon was, by training, a medical doctor and an epidemiologist and, by temperament, a New Englander - tall and reed-thin, with a crewcut, a starched white shirt, and a bow tie. He was exacting and cerebral, and so sensitive to criticism that it was said of him that he could be "blistered by moonbeams." He would not eat mayonnaise, or anything else subject to bacterial contamination. He hated lawyers, which was ironic, because it was lawyers who became his biggest disciples. Haddon was discovered by Daniel Patrick Moynihan, when Moynihan was working for Averell Harriman, then the Democratic governor of New York State. It was 1958. Moynihan was chairing a meeting on traffic safety, in Albany's old state-executive-office chambers, and a young man at the back of the room kept asking pointed questions. "What's your name?" Moynihan eventually asked, certain he had collared a Republican spy. "Haddon, sir," the young man answered. He was just out of the Harvard School of Public Health, and convinced that what the field of traffic safety needed was the rigor of epidemiology. Haddon asked Moynihan what data he was using. Moynihan shrugged. He wasn't using any data at all.

  Haddon and Moynihan went across the street to Yezzi's, a local watering hole, and Moynihan fell under Haddon's spell. The orthodoxy of that time held that safety was about reducing accidents - educating drivers, training them, making them slow down. To Haddon, this approach made no sense. His goal was to reduce the injuries that accidents caused. In particular, he did not believe in safety measures that depended on changing the behavior of the driver, since he considered the driver unreliable, hard to educate, and prone to error. Haddon believed the best safety measures were passive. "He was a gentle man," Moynihan recalls. "Quiet, without being mum. He never forgot that what we were talking about were children with their heads smashed and broken bodies and dead people."

  Several years later, Moynihan was working for President Johnson in the Department of Labor, and hired a young lawyer out of Harvard named Ralph Nader to work on traffic-safety issues. Nader, too, was a devotee of Haddon's ideas, and he converted a young congressional aide named Joan Claybrook. In 1959, Moynihan wrote an enormously influential article, articulating Haddon's principles, called "Epidemic on the Highways." In 1965, Nader wrote his own homage to the Haddon philosophy, "Unsafe at Any Speed," which became a best-seller, and in 1966 the Haddon crusade swept Washington. In the House and the Senate, there were packed hearings on legislation to create a federal regulatory agency for traffic safety. Moynihan and Haddon testified, as did a liability lawyer from South Carolina, in white shoes and a white suit, and a Teamsters official, Jimmy Hoffa, whom Claybrook remembers as a "fabulous" witness. It used to be that, during a frontal crash, steering columns in cars were pushed back through the passenger compartment, potentially impaling the driver. The advocates argued that columns should collapse inward on impact. Instrument panels ought to be padded, they said, and knobs shouldn't stick out, where they might cause injury. Doors ought to have strengthened side-impact beams. Roofs should be strong enough to withstand a rollover. Seats should have head restraints to protect against neck injuries. Windshields ought to be glazed, so that if you hit them with your head at high speed your face wasn't cut to ribbons. The bill sailed through both houses of Congress, and a regulatory body, which eventually became the National Highway Traffic Safety Administration, was established. Haddon was made its commissioner, Claybrook his special assistant. "I remember a Senate hearing we had with Warren Magnuson," Nader recalls. "He was listening to a pediatrician who was one of our allies, Seymour Charles, from New Jersey, and Charles was showing how there were two cars that collided, and one had a collapsible steering column and one didn't, and one driver walked away, the other was killed. And, just like that, Magnuson caught on. 'You mean,' he said, 'you can have had a crash without an injury?' That's it! A crash without an injury. That idea was very powerful."

  There is no question that the improvements in auto design which Haddon and his disciples pushed for saved countless lives. They changed the way cars were built, and put safety on the national agenda. What they did not do, however, is make American highways the safest in the world. In fact - and this is the puzzling thing about the Haddon crusade - the opposite happened. United States auto-fatality rates were the lowest in the world before Haddon came along. But, since the late nineteen-seventies, just as the original set of N.H.T.S.A. safety standards were having their biggest impact, America's safety record has fallen to eleventh place. According to calculations by Leonard Evans, a longtime General Motors researcher and one of the world's leading experts on traffic safety, if American traffic fatalities had declined at the same rate as Canada's or Australia's between 1979 and 1997, there would have been somewhere in the vicinity of a hundred and sixty thousand fewer traffic deaths in that span.

  This is not to suggest, of course, that Haddon's crusade is responsible for a hundred and sixty thousand highway deaths. Traffic safety is the most complex of phenomena - fatality rates can be measured in many ways, and reflect a hundred different variables - and in this period there were numerous factors that distinguished the United States from places like Canada and Australia, including different trends in drunk driving. Nor is it to say that the Haddonites had anything but the highest motives. Still, Evans's figures raise a number of troubling questions. Haddon and Nader and Claybrook told us, after all, that the best way to combat the epidemic on the highways was to shift attention from the driver to the vehicle. No other country pursued the passive strategy as vigorously, and no other country had such high expectations for its success. But America's slipping record on auto safety suggests that somewhere in the logic of that approach there was a mistake. And, if so, it necessarily changes the way we think about car crashes like the one that happened seven years ago on the corner of Fleming Pike and Egg Harbor Road.

  "I think that the philosophical argument behind the passive approach is a strong one," Evans says. A physicist by training, he is a compact, spry man in his sixties, with a trace in his voice of his native Northern Ireland. On the walls of his office in suburban Detroit is a lifetime of awards and certifications from safety researchers, but, like many technical types, he is embittered by how hard it has been to make his voice heard in the safety debates of the past thirty years. "Either you can persuade people to boil their own water because there is a typhoid epidemic or you can put chlorine in the water," he went on. "And the second, passive solution is obviously preferred to the first, because there is no way you can persuade everyone to act in a prudent way. But starting from that philosophical principle and then ignoring reality is a recipe for disaster. And that's what happened. Why?" Here Evans nearly leaped out of his chair. "Because there isn't any chlorine for traffic crashes."

  III. THE FIRST COLLISION

  Robert Day's crash was not the accident of a young man. He was hit from the side, and adolescents and young adults usually have side-impact crashes when their cars slide off the road into a fixed object like a tree, often at reckless speeds. Older people tend to have side-impact crashes at normal speeds, in intersections, and as the result of error, not negligence. In fact, Day's crash was not merely typical in form; it was the result of a common type of driver error. He didn't see something he was supposed to see.

  His mistake is, on one level, difficult to understand. There was a sign, clearly visible from the roadway, telling him of an intersection ahead, and then another, in bright red, telling him to stop. How could he have missed them both? From what we know of human perception, though, this kind of mistake happens all the time. Imagine, for instance, that you were asked to look at the shape of a cross, briefly displayed on a computer screen, and report on which arm of the cross was longer. After you did this a few times, another object, like a word or a small colored square - what psychologists call a critical stimulus - flashes next to the cross on the screen, right in front of your eyes. Would you see the critical stimulus? Most of us would say yes. Intuitively, we believe that we "see" everything in our field of vision - particularly things right in front of us - and that the difference between the things we pay attention to and the things we don't is simply that the things we focus on are the things we become aware of. But when experiments to test this assumption were conducted recently by Arien Mack, a psychologist at the New School, in New York, she found, to her surprise, that a significant portion of her observers didn't see the second object at all: it was directly in their field of vision, and yet, because their attention was focussed on the cross, they were oblivious of it. Mack calls this phenomenon "inattentional blindness."

  Daniel Simons, a professor of psychology at Harvard, has done a more dramatic set of experiments, following on the same idea. He and a colleague, Christopher Chabris, recently made a video of two teams of basketball players, one team in white shirts and the other in black, each player in constant motion as two basketballs are passed back and forth. Observers were asked to count the number of passes completed by the members of the white team. After about forty-five seconds of passes, a woman in a gorilla suit walks into the middle of the group, stands in front of the camera, beats her chest vigorously, and then walks away. "Fifty per cent of the people missed the gorilla," Simons says. "We got the most striking reactions. We'd ask people, 'Did you see anyone walking across the screen?' They'd say no. Anything at all? No. Eventually, we'd ask them, 'Did you notice the gorilla?' And they'd say, 'The what?'" Simons's experiment is one of those psychological studies which are impossible to believe in the abstract: if you look at the video (called "Gorillas in Our Midst") when you know what's coming, the woman in the gorilla suit is inescapable. How could anyone miss that? But people do. In recent years, there has been much scientific research on the fallibility of memory - on the fact that eyewitnesses, for example, often distort or omit critical details when they recall what they saw. But the new research points to something that is even more troubling: it isn't just that our memory of what we see is selective; it's that seeing itself is selective.

  This is a common problem in driving. Talking on a cell phone and trying to drive, for instance, is not unlike trying to count passes in a basketball game and simultaneously keep track of wandering animals. "When you get into a phone conversation, it's different from the normal way we have evolved to interact," David Strayer, a professor of psychology at the University of Utah, says. "Normally, conversation is face to face. There are all kinds of cues. But when you are on the phone you strip that away. It's virtual reality. You attend to that virtual reality, and shut down processing of the here and now." Strayer has done tests of people who were driving and talking on phones, and found that they remember far fewer things than those driving without phones. Their field of view shrinks. In one experiment, he flashed red and green lights at people while they were driving, and those on the phone missed twice as many lights as the others, and responded far more slowly to those lights they did see. "We tend to find the biggest deficits in unexpected events, a child darting onto the road, a light changing," Strayer says. "Someone going into your lane. That's what you don't see. There is a part of driving that is automatic and routine. There is a second part of driving that is completely unpredictable, and that is the part that requires attention." This is what Simons found with his gorilla, and it is the scariest part of inattentional blindness. People allow themselves to be distracted while driving because they think that they will still be able to pay attention to anomalies. But it is precisely those anomalous things, those deviations from the expected script, which they won't see.

  Marc Green, a psychologist with an accident-consulting firm in Toronto, once worked on a case where a woman hit a bicyclist with her car. "She was pulling into a gas station," Green says. "It was five o'clock in the morning. She'd done that almost every day for a year. She looks to the left, and then she hears a thud. There's a bicyclist on the ground. She'd looked down that sidewalk nearly every day for a year and never seen anybody. She adaptively learned to ignore what was on that sidewalk because it was useless information. She may actually have turned her eyes toward him and failed to see him." Green says that, once you understand why the woman failed to see the bicyclist, the crash comes to seem almost inevitable.

  It's the same conclusion that Haddon reached, and that formed the basis for his conviction that Americans were spending too much time worrying about what happened before an accident and not enough time worrying about what happened during and after an accident. Sometimes crashes happen because people do stupid things that they shouldn't have done - like drink or speed or talk on their cell phone. But sometimes people do stupid things that they cannot help, and it makes no sense to construct a safety program that does not recognize human fallibility. Just imagine, for example, that you're driving down a country road. The radio is playing. You're talking to your son, next to you. There is a highway crossing up ahead, but you can't see it, nor can you see any cars on the roadway, because of a stand of trees on both sides of the road. Maybe you look away from the road, for a moment, to change the dial on the radio, or something catches your eye outside, and when you glance back it happens to be at the very moment when a trick of geography makes it look as if your road stretched without interruption well off into the distance. Suddenly, up ahead, right in front of your eyes looms a bright-red anomalous stop sign - as out of place in the momentary mental universe that you have constructed for yourself as a gorilla in a basketball game - and, precisely because it is so anomalous, it doesn't register. Then - bang! How do you prevent an accident like that?

  IV. THE SECOND COLLISION

  One day in 1968, a group of engineers from the Cleveland-based auto-parts manufacturer Eaton, Yale & Towne went to Washington, D.C., to see William Haddon. They carried with them a secret prototype of what they called the People Saver. It was a nylon air cushion that inflated on impact, and the instant Haddon saw it he was smitten. "Oh, he was ecstatic, just ecstatic," Claybrook recalls. "I think it was one of the most exciting moments of his life."

  The air bag had been invented in the early fifties by a man named John Hetrick, who became convinced, after running his car into a ditch, that drivers and passengers would be much safer if they could be protected by some kind of air cushion. But how could one inflate it in the first few milliseconds of a crash? As he pondered the problem, Hetrick remembered a freak accident that had happened during the war, when he was in the Navy working in a torpedo-maintenance shop. Torpedoes carry a charge of compressed air, and one day a torpedo covered in canvas accidentally released its charge. All at once, Hetrick recalled years later, the canvas "shot up into the air, quicker than you could blink an eye." Thus was the idea for the air bag born.

  In its earliest incarnation, the air bag was a crude device; one preliminary test inadvertently killed a baboon, and there were widespread worries about the safety of detonating what was essentially a small bomb inside a car. (Indeed, as a result of numerous injuries to children and small adults, air bags have now been substantially depowered.) But to Haddon the People Saver was the embodiment of everything he believed in - it was the chlorine in the water, and it solved a problem that had been vexing him for years. The Haddonites had always insisted that what was generally called a crash was actually two separate events. The first collision was the initial contact between two automobiles, and in order to prevent the dangerous intrusion of one car into the passenger compartment of another, they argued, cars ought to be built with a protective metal cage around the front and back seats. The second collision, though, was even more important. That was the collision between the occupants of a car and the inside of their own vehicle. If the driver and his passengers were to survive the abrupt impact of a crash, they needed a second safety system, which carefully and gradually decelerated their bodies. The logical choice for that task was seat belts, but Haddon, with his background in public health, didn't trust safety measures that depended on an individual's active cooperation. "The biggest problem we had back then was that only about twelve per cent of the public used seat belts," Claybrook says. "They were terribly designed, and people didn't use them." With the air bag, there was no decision to make. The Haddonites called it a "technological vaccine," and attacked its doubters in Detroit for showing "an absence of moral and ethical leadership." The air bag, they vowed, was going to replace the seat belt. In "Unsafe at Any Speed," Nader wrote:

  The seat belt should have been introduced in the twenties and rendered obsolete by the early fifties, for it is only the first step toward a more rational passenger restraint system which modern technology could develop and perfect for mass production. Such a system ideally would not rely on the active participation of the passenger to take effect; it would be the superior passive safety design which would come into use only when needed, and without active participation of the occupant. . . . Protection like this could be achieved by a kind of inflatable air bag restraint which would be actuated to envelop a passenger before a crash.

  For the next twenty years, Haddon, Nader, and Claybrook were consumed by the battle to force a reluctant Detroit to make the air bag mandatory equipment. There were lawsuits, and heated debates, and bureaucratic infighting. The automakers, mindful of cost and other concerns, argued that the emphasis ought to be on seat belts. But, to the Haddonites, Detroit was hopelessly in the grip of the old paradigm on auto safety. His opponents, Haddon wrote, with typical hauteur, were like "Malinowski's natives in their approaches to the hazards out the reef which they did not understand." Their attitudes were "redolent of the extranatural, supernatural and the pre-scientific." In 1991, the Haddonites won. That year, a law was passed requiring air bags in every new car by the end of the decade. It sounded like a great victory. But was it?

  V. HADDON'S MISTAKE

  When Stephen Capoferri's Aerostar hit Robert Day's Jeep Wagoneer, Capoferri's seat belt lay loose across his hips and chest. His shoulder belt probably had about two inches of slack. At impact, his car decelerated, but Capoferri's body kept moving forward, and within thirty milliseconds the slack in his seat belts was gone. In the language of engineers, he "loaded" his restraints. Under the force of Capoferri's onrushing weight, his belts began to stretch - the fabric giving by as much as six inches. As his shoulder belt grew taut, it dug into his chest, compressing it by another two inches, and if you had seen Capoferri at the moment of maximum forward trajectory his shoulder belt around his chest would have looked like a rubber band around a balloon. Simultaneously, within those first few milliseconds, his air bag exploded and rose to meet him at more than a hundred miles per hour. Forty to fifty milliseconds after impact, it had enveloped his face, neck, and upper chest. A fraction of a second later, the bag deflated. Capoferri was thrown back against his seat. Total time elapsed: one hundred milliseconds.

  Would Capoferri have lived without an air bag? Probably. He would have stretched his seat belt so far that his head would have hit the steering wheel. But his belts would have slowed him down enough that he might only have broken his nose or cut his forehead or suffered a mild concussion. The other way around, however, with an air bag but not a seat belt, his fate would have been much more uncertain. In the absence of seat belts, air bags work best when one car hits another squarely, so that the driver pitches forward directly into the path of the oncoming bag. But Capoferri hit Day at a slight angle. The front-passenger side of the Aerostar sustained more damage than the driver's side, which means that without his belts holding him in place he would have been thrown away from the air bag off to the side, toward the rearview mirror or perhaps even the front-passenger "A" pillar. Capoferri's air bag protected him only because he was wearing his seat belt. Car-crash statistics show this to be the rule. Wearing a seat belt cuts your chances of dying in an accident by forty-three per cent. If you add the protection of an air bag, your fatality risk is cut by forty-seven per cent. But an air bag by itself reduces the risk of dying in an accident by just thirteen per cent.

  That the effectiveness of an air bag depended on the use of a seat belt was a concept that the Haddonites, in those early days, never properly understood. They wanted the air bag to replace the seat belt when in fact it was capable only of supplementing it, and they clung to that belief, even in the face of mounting evidence to the contrary. Don Huelke, a longtime safety researcher at the University of Michigan, remembers being on an N.H.T.S.A. advisory committee in the early nineteen-seventies, when people at the agency were trying to come up with statistics for the public on the value of air bags. "Their estimates were that something like twenty-eight thousand people a year could be saved by the air bags," he recalls, "and then someone pointed out to them that there weren't that many driver fatalities in frontal crashes in a year. It was kind of like 'Oops.' So the estimates were reduced." In 1977, Claybrook became the head of N.H.T.S.A. and renewed the push for air bags. The agency's estimate now was that air bags would cut a driver's risk of dying in a crash by forty per cent - a more modest but still implausible figure. "In 1973, there was a study in the open literature, performed at G.M., that estimated that the air bag would reduce the fatality risk to an unbelted driver by eighteen per cent," Leonard Evans says. "N.H.T.S.A. had this information and dismissed it. Why? Because it was from the automobile industry."

  The truth is that even today it is seat belts, not air bags, that are providing the most important new safety advances. Had Capoferri been driving a late-model Ford minivan, for example, his seat belt would have had what is called a pretensioner: a tiny explosive device that would have taken the slack out of the belt just after the moment of impact. Without the pretensioner, Stephen Kozak, an engineer at Ford, explains, "you start to accelerate before you hit the belt. You get the clothesline effect." With it, Capoferri's deceleration would have been a bit more gradual. At the same time, belts are now being designed which cut down on chest compression. Capoferri's chest wall was pushed in two inches, and had he been a much older man, with less resilient bones and cartilage, that two-inch compression might have been enough to fracture three or four ribs. So belts now "pay out" extra webbing after a certain point: as Capoferri stretched forward, his belt would have been lengthened by several inches, relieving the pressure on his chest. The next stage in seat-belt design is probably to offer car buyers the option of what is called a four-point belt - two shoulder belts that run down the chest, like suspenders attached to a lap belt. Ford showed a four-point prototype at the auto shows this spring, and early estimates are that it might cut fatality risk by another ten per cent - which would make seat belts roughly five times more effective in saving lives than air bags by themselves. "The best solution is to provide automatic protection, including air bags, as baseline protection for everyone, with seat belts as a supplement for those who will use them," Haddon wrote in 1984. In putting air bags first and seat belts second, he had things backward.

  Robert Day suffered a very different kind of accident from Stephen Capoferri's: he was hit from the side, and the physics of a side-impact crash are not nearly so forgiving. Imagine, for instance, that you punched a brick wall as hard as you could. If your fist was bare, you'd break your hand. If you had a glove with two inches of padding, your hand would sting. If you had a glove with six inches of padding, you might not feel much of anything. The more energy-absorbing material - the more space - you can put between your body and the wall, the better off you are. An automobile accident is no different. Capoferri lived, in part, because he had lots of space between himself and Day's Wagoneer. Cars have steel rails connecting the passenger compartment with the bumper, and each of those rails is engineered with what are called convolutions - accordionlike folds designed to absorb, slowly and evenly, the impact of a collision. Capoferri's van was engineered with twenty-seven inches of crumple room, and at the speed he was travelling he probably used about twenty-one inches of that. But Day had four inches, no more, between his body and the door, and perhaps another five to six inches in the door itself. Capoferri hit the wall with a boxing glove. Day punched it with his bare hand.

  Day's problems were compounded by the fact that he was not wearing his seat belt. The right-front fender of Capoferri's Aerostar struck his Wagoneer squarely on the driver's door, pushing the Jeep sidewise, and if Day had been belted he would have moved with his vehicle, away from the onrushing Aerostar. But he wasn't, and so the Jeep moved out from under him: within fifteen milliseconds, the four inches of space between his body and the side of the Jeep was gone. The impact of the Aerostar slammed the driver's door against his ribs and spleen.

  Day could easily have been ejected from his vehicle at that point. The impact of Capoferri's van shattered the glass in Day's door, and a Wagoneer, like most sports-utility vehicles, has a low belt line - meaning that the side windows are so large that with the glass gone there's a hole big enough for an unrestrained body to fly through. This is what it means to be "thrown clear" of a crash, although when that phrase is used in the popular literature it is sometimes said as if it were a good thing, when of course to be "thrown clear" of a crash is merely to be thrown into some other hard and even more lethal object, like the pavement or a tree or another car. Day, for whatever reason, was not thrown clear, and in that narrow sense he was lucky. This advantage, however, amounted to little. Day's door was driven into him like a sledgehammer.

  Would a front air bag have saved Robert Day? Not at all. He wasn't moving forward into the steering wheel. He was moving sidewise into the door. Some cars now have additional air bags that are intended to protect the head as it hits the top of the door frame in a side-impact crash. But Day didn't die of head injuries. He died of abdominal injuries. Conceivably, a side-impact bag might have offered his abdomen some slight protection. But Day's best chance of surviving the accident would have been to wear his seat belt. It would have held him in place in those first few milliseconds of impact. It would have preserved some part of the space separating him from the door, diminishing the impact of the Aerostar. Day made two mistakes that morning, then, the second of which was not buckling up. But this is a point on which the Haddonites were in error as well, because the companion to their obsession with air bags was the equally false belief that encouraging drivers to wear their seat belts was a largely futile endeavor.

  In the early nineteen-seventies, just at the moment when Haddon and Claybrook were pushing hardest for air bags, the Australian state of Victoria passed the world's first mandatory seat-belt legislation, and the law was an immediate success. With an aggressive public-education campaign, rates of seat-belt use jumped from twenty to eighty per cent. During the next several years, Canada, New Zealand, Germany, France, and others followed suit. But a similar movement in the United States in the early seventies stalled. James Gregory, who headed the N.H.T.S.A. during the Ford years, says that if Nader had advocated mandatory belt laws they might have carried the day. But Nader, then at the height of his fame and influence, didn't think that belt laws would work in this country. "You push mandatory belts, you might get a very adverse reaction," Nader says today of his thinking back then. "Mindless reaction. And how many tickets do you give out a day? What about back seats? At what point do you require a seat belt for small kids? And it's administratively difficult when people cross state lines. That's why I always focussed on the passive. We have a libertarian streak that Europe doesn't have." Richard Peet, a congressional staffer who helped draft legislation in Congress giving states financial incentives to pass belt laws, founded an organization in the early seventies to promote belt-wearing. "After I did that, some of the people who worked for Nader's organization went after me, saying that I was selling out the air-bag movement," Peet recalls. "That pissed me off. I thought the safety movement was the safety movement and we were all working together for common aims." In "Auto Safety," a history of auto-safety regulation, John Graham, of the Harvard School of Public Health, writes of Claybrook's time at the N.H.T.S.A.:

  Her lack of aggressive leadership on safety belt use was a major source of irritation among belt use advocates, auto industry officials, and officials from state safety programs. They saw her pessimistic attitudes as a self-fulfilling prophecy. One of Claybrook's aides at N.H.T.S.A. who worked with state agencies acknowledged: "It is fair to say that Claybrook never made a dedicated effort to get mandatory belt-use laws." Another aide offered the following explanation of her philosophy: "Joan didn't do much on mandatory belt use because her primary interests were in vehicle regulation. She was fond of saying 'it is easier to get twenty auto companies to do something than to get 200 million Americans to do something.' "

  Claybrook says that while at the N.H.T.S.A. she mailed a letter to all the state governors encouraging them to pass mandatory seat-belt legislation, and "not one governor would help us." It is clear that she had low expectations for her efforts. Even as late as 1984, Claybrook was still insisting that trying to encourage seat-belt use was a fool's errand. "It is not likely that mandatory seat belt usage laws will be either enacted or found acceptable to the public in large numbers," Claybrook wrote. "There is massive public resistance to adult safety belt usage." In the very year her words were published, however, a coalition of medical groups finally managed to pass the country's first mandatory seat-belt law, in New York, and the results were dramatic. One state after another soon did likewise, and public opinion about belts underwent what the pollster Gary Lawrence has called "one of the most phenomenal shifts in attitudes ever measured." Americans, it turned out, did not have a cultural aversion to seat belts. They just needed some encouragement. "It's not a big Freudian thing whether you buckle up or not," says B. J. Campbell, a former safety researcher at the University of North Carolina, who was one of the veterans of the seat-belt movement. "It's just a habit, and either you're in the habit of doing it or you're not."

  Today, belt-wearing rates in the United States are just over seventy per cent, and every year they inch up a little more. But if the seat-belt campaign had begun in the nineteen-seventies, instead of the nineteen-eighties, the use rate in this country would be higher right now, and in the intervening years an awful lot of car accidents might have turned out differently, including one at the intersection of Egg Harbor Road and Fleming Pike.

  VI. CRASH TEST

  William Haddon died in 1985, of kidney disease, at the age of fifty-eight. From the time he left government until his death, he headed an influential research group called the Insurance Institute for Highway Safety.

  Joan Claybrook left the N.H.T.S.A. in 1980 and went on to run Ralph Nader's advocacy group Public Citizen, where she has been a powerful voice on auto safety ever since. In an interview this spring, Claybrook listed the things that she would do if she were back as the country's traffic-safety czar. "I'd issue a rollover standard, and have a thirty-miles-per-hour test for air bags," she said. "Upgrade the seating structure. Integrate the head restraint better. Upgrade the tire-safety standard. Provide much more consumer information. And also do more crash testing, whether it's rollover or offset crash testing and rear-crash testing." The most effective way to reduce automobile fatalities, she went on, would be to focus on rollovers - lowering the center of gravity in S.U.V.s, strengthening doors and roofs. In the course of outlining her agenda, Claybrook did not once mention the words "seat belt."

  Ralph Nader, for his part, spends a great deal of time speaking at college campuses about political activism. He remains a distinctive figure, tall and slightly stooped, with a bundle of papers under his arm. His interests have widened in recent years, but he is still passionate about his first crusade. "Haddon was all business - never made a joke, didn't tolerate fools easily," Nader said not long ago, when he was asked about the early days. He has a deep, rumbling press-conference voice, and speaks in sentence fragments, punctuated with long pauses. "Very dedicated. He influenced us all." The auto-safety campaign, he went on, "was a spectacular success of the federal-government mission. When the regulations were allowed, they worked. And it worked because it deals with technology rather than human behavior." Nader had just been speaking in Detroit, at Wayne State University, and was on the plane back to Washington, D.C. He was folded into his seat, his knees butting up against the tray table in front of him, and from time to time he looked enviously over at the people stretching their legs in the exit row. Did he have any regrets? Yes, he said. He wished that back in 1966 he had succeeded in keeping the criminal-penalties provision in the auto-safety bill that Congress passed that summer. "That would have gone right to the executive suite," he said.

  There were things, he admitted, that had puzzled him over the years. He couldn't believe the strides that had been made against drunk driving. "You've got to hand it to MADD. It took me by surprise. The drunk-driving culture is deeply embedded. I thought it was too ingrained." And then there was what had happened with seat belts. "Use rates are up sharply," he said. "They're a lot higher than I thought they would be. I thought it would be very hard to hit fifty per cent. The most unlikely people now buckle up." He shook his head, marvelling. He had always been a belt user, and recommends belts to others, but who knew they would catch on?

  Other safety activists, who had seen what had happened to driver behavior in Europe and Australia in the seventies, weren't so surprised, of course. But Nader was never the kind of activist who had great faith in the people whose lives he was trying to protect. He and the other Haddonites were sworn to a theory that said that the way to prevent typhoid is to chlorinate the water, even though there are clearly instances where chlorine will not do the trick. This is the blindness of ideology. It is what happens when public policy is conducted by those who cannot conceive that human beings will do willingly what is in their own interest. What was the truly poignant thing about Robert Day, after all? Not just that he was a click away from saving his own life but that his son, sitting right next to him, was wearing his seat belt. In the Days' Jeep Wagoneer, a fight that experts assumed was futile was already half won.

  One day this spring, a team of engineers at Ford conducted a crash test on a 2003 Mercury. This was at Ford's test facility in Dearborn, a long, rectangular white steel structure, bisected by a five-hundred-and-fifty-foot runway. Ford crashes as many as two cars a day there, ramming them with specially designed sleds or dragging them down the runway with a cable into a twenty-foot cube of concrete. Along the side of the track were the twisted hulks of previous experiments: a Ford Focus wagon up on blocks; a mangled BMW S.U.V. that had been crashed, out of competitive curiosity, the previous week; a Ford Explorer that looked as though it had been thrown into a blender. In a room at the back, there were fifty or sixty crash-test dummies, propped up on tables and chairs, in a dozen or more configurations - some in Converse sneakers, some in patent-leather shoes, some without feet and legs at all, each one covered with multiple electronic sensors, all designed to measure the kinds of injuries possible in a crash.

  The severity of any accident is measured not by the speed of the car at the moment of impact but by what is known as the delta V - the difference between how fast a car is going at the moment of impact and how fast it is moving after the accident. Capoferri's delta V was about twenty-five miles per hour, seven miles per hour higher than the accident average. The delta V of the Mercury test, though, was to be thirty-five miles per hour, which is the equivalent of hitting an identical parked car at seventy miles per hour. The occupants were two adult-size dummies in orange shorts. Their faces were covered in wet paint, red above the upper jaw and blue below it, to mark where their faces hit on the air bag. The back seat carried a full cargo of computers and video cameras. A series of yellow lights began flashing. An engineer stood to the side, holding an abort button. Then a bank of stage lights came on, directly above the point of impact. Sixteen video cameras began rolling. A voice came over a loudspeaker, counting down: five, four, three . . . There was a blur as the Mercury swept by - then bang, as the car hit the barrier and the dual front air bags exploded. A plastic light bracket skittered across the floor, and the long warehouse was suddenly still.
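  The equivalence between the thirty-five-mile-per-hour barrier test and a seventy-mile-per-hour collision with an identical parked car follows from simple momentum bookkeeping: two cars of equal weight that lock together end up sharing the striking car's speed, so each changes speed by half of it. The sketch below is my own illustration of that arithmetic, not part of Ford's test protocol.

    # Illustration only: delta V of the striking car when two identical cars
    # collide and lock together (a perfectly inelastic collision). Equal
    # masses split the momentum, so both end up at the average of the two speeds.
    def delta_v_striker(striker_mph: float, struck_mph: float = 0.0) -> float:
        final_mph = (striker_mph + struck_mph) / 2
        return striker_mph - final_mph

    print(delta_v_striker(70))  # 35.0 - the same delta V as a 35-m.p.h. barrier crash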

  It was a moment of extraordinary violence, yet it was also strangely compelling. This was performance art, an abstract and ritualized rendering of reality, given in a concrete-and-steel gallery. The front end of the Mercury was perfectly compressed; the car was thirty inches shorter than it had been a moment before. The windshield was untouched. The "A" pillars and roofline were intact. The passenger cabin was whole. In the dead center of the deflated air bags, right where they were supposed to be, were perfect blue-and-red paint imprints of the dummies' faces.

  But it was only a performance, and that was the hard thing to remember. In the real world, people rarely have perfectly square frontal collisions, sitting ramrod straight and ideally positioned; people rarely have accidents that so perfectly showcase the minor talents of the air bag. A crash test is beautiful. In the sequence we have all seen over and over in automobile commercials, the dummy rises magically to meet the swelling cushion, always in slow motion, the bang replaced by Mozart, and on those theatrical terms the dowdy fabric strips of the seat belt cannot compete with the billowing folds of the air bag. This is the image that seduced William Haddon when the men from Eaton, Yale showed him the People Saver so many years ago, and the image that warped auto safety for twenty long years. But real accidents are seldom like this. They are ugly and complicated, shaped by the messy geometries of the everyday world and by the infinite variety of human frailty. A man looks away from the road at the wrong time. He does not see what he ought to see. Another man does not have time to react. The two cars collide, but at a slight angle. There is a two-hundred-and-seventy-degree spin. There is skidding and banging. A belt presses deep into one man's chest - and that saves his life. The other man's unrestrained body smashes against the car door - and that kills him.

  "They left pretty early, about eight, nine in the morning," Susan Day, Robert Day's widow, recalls. "I was at home when the hospital called. I went to see my son first. He was pretty much O.K., had a lot of bruising. Then they came in and said, 'Your husband didn't make it.'"

  Fast food is killing us. Can it be fixed?

  1.

  In 1954, a man named Ray Kroc, who made his living selling the five-spindle Multimixer milkshake machine, began hearing about a hamburger stand in San Bernardino, California. This particular restaurant, he was told, had no fewer than eight of his machines in operation, meaning that it could make forty shakes simultaneously. Kroc was astounded. He flew from Chicago to Los Angeles, and drove to San Bernardino, sixty miles away, where he found a small octagonal building on a corner lot. He sat in his car and watched as the workers showed up for the morning shift. They were in starched white shirts and paper hats, and moved with a purposeful discipline. As lunchtime approached, customers began streaming into the parking lot, lining up for bags of hamburgers. Kroc approached a strawberry blonde in a yellow convertible.

  "How often do you come here?" he asked.

  "Anytime I am in the neighborhood," she replied, and, Kroc would say later, "it was not her sex appeal but the obvious relish with which she devoured the hamburger that made my pulse begin to hammer with excitement." He came back the next morning, and this time set up inside the kitchen, watching the griddle man, the food preparers, and, above all, the French-fry operation, because it was the French fries that truly captured his imagination. They were made from top-quality oblong Idaho russets, eight ounces apiece, deep-fried to a golden brown, and salted with a shaker that, as he put it, kept going like a Salvation Army girl's tambourine. They were crispy on the outside and buttery soft on the inside, and that day Kroc had a vision of a chain of restaurants, just like the one in San Bernardino, selling golden fries from one end of the country to the other. He asked the two brothers who owned the hamburger stand if he could buy their franchise rights. They said yes. Their names were Mac and Dick McDonald.

  Ray Kroc was the great visionary of American fast food, the one who brought the lessons of the manufacturing world to the restaurant business. Before the fifties, it was impossible, in most American towns, to buy fries of consistent quality. Ray Kroc was the man who changed that. "The french fry," he once wrote, "would become almost sacrosanct for me, its preparation a ritual to be followed religiously." A potato that has too great a percentage of water - and potatoes, even the standard Idaho russet burbank, vary widely in their water content - will come out soggy at the end of the frying process. It was Kroc, back in the fifties, who sent out field men, armed with hydrometers, to make sure that all his suppliers were producing potatoes in the optimal solids range of twenty to twenty-three per cent. Freshly harvested potatoes, furthermore, are rich in sugars, and if you slice them up and deep-fry them the sugars will caramelize and brown the outside of the fry long before the inside is cooked. To make a crisp French fry, a potato has to be stored at a warm temperature for several weeks in order to convert those sugars to starch. Here Kroc led the way as well, mastering the art of "curing" potatoes by storing them under a giant fan in the basement of his first restaurant, outside Chicago.

  Perhaps his most enduring achievement, though, was the so-called potato computer - developed for McDonald's by a former electrical engineer for Motorola named Louis Martino - which precisely calibrated the optimal cooking time for a batch of fries. (The key: when a batch of cold raw potatoes is dumped into a vat of cooking oil, the temperature of the fat will drop and then slowly rise. Once the oil has risen three degrees, the fries are ready.) Previously, making high-quality French fries had been an art. The potato computer, the hydrometer, and the curing bins made it a science. By the time Kroc was finished, he had figured out how to turn potatoes into an inexpensive snack that would always be hot, salty, flavorful, and crisp, no matter where or when you bought it.
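  The rule described above is simple enough to write down as a few lines of control logic. Here is a minimal sketch, assuming the rule works exactly as stated - watch the oil temperature fall after the cold potatoes go in, remember the low point, and call the batch done once the oil has climbed three degrees past it. The read_oil_temp function and the polling loop are hypothetical stand-ins, not anything from Martino's actual device.

    # A minimal sketch of the "potato computer" rule as described above.
    # read_oil_temp is a hypothetical callable returning the fryer-oil
    # temperature in degrees; poll_seconds sets how often it is sampled.
    import time

    DONE_RISE_DEGREES = 3.0

    def fry_batch(read_oil_temp, poll_seconds=1.0):
        lowest = read_oil_temp()          # oil temperature starts dropping as cold potatoes go in
        while True:
            t = read_oil_temp()
            lowest = min(lowest, t)       # track the low point of the dip
            if t >= lowest + DONE_RISE_DEGREES:
                return "batch ready"      # oil has recovered three degrees: fries are done
            time.sleep(poll_seconds)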

  This was the first fast-food revolution - the mass production of food that had reliable mass appeal. But today, as the McDonald's franchise approaches its fiftieth anniversary, it is clear that fast food needs a second revolution. As many Americans now die every year from obesity-related illnesses - heart disease and complications of diabetes - as from smoking, and the fast-food toll grows heavier every year. In the fine new book "Fast Food Nation," the journalist Eric Schlosser writes of McDonald's and Burger King in the tone usually reserved for chemical companies, sweatshops, and arms dealers, and, as shocking as that seems at first, it is perfectly appropriate. Ray Kroc's French fries are killing us. Can fast food be fixed?

  2.

  Fast-food French fries are made from a baking potato like an Idaho russet, or any other variety that is mealy, or starchy, rather than waxy. The potatoes are harvested, cured, washed, peeled, sliced, and then blanched - cooked enough so that the insides have a fluffy texture but not so much that the fry gets soft and breaks. Blanching is followed by drying, and drying by a thirty-second deep fry, to give the potatoes a crisp shell. Then the fries are frozen until the moment of service, when they are deep-fried again, this time for somewhere around three minutes. Depending on the fast-food chain involved, there are other steps interspersed in this process. McDonald's fries, for example, are briefly dipped in a sugar solution, which gives them their golden-brown color; Burger King fries are dipped in a starch batter, which is what gives those fries their distinctive hard shell and audible crunch. But the result is similar. The potato that is first harvested in the field is roughly eighty per cent water. The process of creating a French fry consists, essentially, of removing as much of that water as possible - through blanching, drying, and deep-frying - and replacing it with fat.

  Elisabeth Rozin, in her book "The Primal Cheeseburger," points out that the idea of enriching carbohydrates with fat is nothing new. It's a standard part of the cuisine of almost every culture. Bread is buttered; macaroni comes with cheese; dumplings are fried; potatoes are scalloped, baked with milk and cheese, cooked in the dripping of roasting meat, mixed with mayonnaise in a salad, or pan-fried in butterfat as latkes. But, as Rozin argues, deep-frying is in many ways the ideal method of adding fat to carbohydrates. If you put butter on a mashed potato, for instance, the result is texturally unexciting: it simply creates a mush. Pan-frying results in uneven browning and crispness. But when a potato is deep-fried the heat of the oil turns the water inside the potato into steam, which causes the hard granules of starch inside the potato to swell and soften: that's why the inside of the fry is fluffy and light. At the same time, the outward migration of the steam limits the amount of oil that seeps into the interior, preventing the fry from getting greasy and concentrating the oil on the surface, where it turns the outer layer of the potato brown and crisp. "What we have with the french fry," Rozin writes, "is a near perfect enactment of the enriching of a starch food with oil or fat."

  This is the trouble with the French fry. The fact that it is cooked in fat makes it unhealthy. But the contrast that deep-frying creates between its interior and its exterior - between the golden shell and the pillowy whiteness beneath - is what makes it so irresistible. The average American now eats a staggering thirty pounds of French fries a year, up from four pounds when Ray Kroc was first figuring out how to mass-produce a crisp fry. Meanwhile, fries themselves have become less healthful. Ray Kroc, in the early days of McDonald's, was a fan of a hot-dog stand on the North Side of Chicago called Sam's, which used what was then called the Chicago method of cooking fries. Sam's cooked its fries in animal fat, and Kroc followed suit, prescribing for his franchises a specially formulated beef tallow called Formula 47 (in reference to the forty-seven-cent McDonald's "All-American meal" of the era: fifteen-cent hamburger, twelve-cent fries, twenty-cent shake). Among aficionados, there is general agreement that those early McDonald's fries were the finest mass-market fries ever made: the beef tallow gave them an unsurpassed rich, buttery taste. But in 1990, in the face of public concern about the health risks of cholesterol in animal-based cooking oil, McDonald's and the other major fast-food houses switched to vegetable oil. That wasn't an improvement, however. In the course of making vegetable oil suitable for deep frying, it is subjected to a chemical process called hydrogenation, which creates a new substance called a trans unsaturated fat. In the hierarchy of fats, polyunsaturated fats - the kind found in regular vegetable oils - are the good kind; they lower your cholesterol. Saturated fats are the bad kind. But trans fats are worse: they wreak havoc with the body's ability to regulate cholesterol.

  According to a recent study involving some eighty thousand women, for every five-per-cent increase in the amount of saturated fats that a woman consumes, her risk of heart disease increases by seventeen per cent. But only a two-per-cent increase in trans fats will increase her heart-disease risk by ninety-three per cent. Walter Willett, an epidemiologist at Harvard - who helped design the study - estimates that the consumption of trans fats in the United States probably causes about thirty thousand premature deaths a year.
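  Put on a per-percentage-point basis, the gap is even starker. This is a rough reading of the study's own figures - it ignores confidence intervals and treats the risk increases as simple ratios:

    # Rough arithmetic on the figures above (illustrative only).
    saturated_risk_per_point = 17 / 5   # ~3.4% added heart-disease risk per percentage point
    trans_risk_per_point = 93 / 2       # ~46.5% added risk per percentage point
    print(round(trans_risk_per_point / saturated_risk_per_point, 1))  # ~13.7: point for point, roughly fourteen times worse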

  McDonald's and the other fast-food houses aren't the only purveyors of trans fats, of course; trans fats are in crackers and potato chips and cookies and any number of other processed foods. Still, a lot of us get a great deal of our trans fats from French fries, and to read the medical evidence on trans fats is to wonder at the odd selectivity of the outrage that consumers and the legal profession direct at corporate behavior. McDonald's and Burger King and Wendy's have switched to a product, without disclosing its risks, that may cost human lives. What is the difference between this and the kind of thing over which consumers sue companies every day?

  3.

  The French-fry problem ought to have a simple solution: cook fries in oil that isn't so dangerous. Oils that are rich in monounsaturated fats, like canola oil, aren't nearly as bad for you as saturated fats, and are generally stable enough for deep-frying. It's also possible to "fix" animal fats so that they aren't so problematic. For example, K. C. Hayes, a nutritionist at Brandeis University, has helped develop an oil called Appetize. It's largely beef tallow, which gives it a big taste advantage over vegetable shortening, and makes it stable enough for deep-frying. But it has been processed to remove the cholesterol, and has been blended with pure corn oil, in a combination that Hayes says removes much of the heart-disease risk.

  Perhaps the most elegant solution would be for McDonald's and the other chains to cook their fries in something like Olestra, a fat substitute developed by Procter & Gamble. Ordinary fats are built out of a molecular structure known as a triglyceride: it's a microscopic tree, with a trunk made of glycerol and three branches made of fatty acids. Our bodies can't absorb triglycerides, so in the digestive process each of the branches is broken off by enzymes and absorbed separately. In the production of Olestra, the glycerol trunk of a fat is replaced with a sugar, which has room for not three but eight fatty acids. And our enzymes are unable to break down a fat tree with eight branches - so the Olestra molecule can't be absorbed by the body at all. "Olestra" is as much a process as a compound: you can create an "Olestra" version of any given fat. Potato chips, for instance, tend to be fried in cottonseed oil, because of its distinctively clean taste. Frito-Lay's no-fat Wow! chips are made with an Olestra version of cottonseed oil, which behaves just like regular cottonseed oil except that it's never digested. A regular serving of potato chips has a hundred and fifty calories, ninety of which are fat calories from the cooking oil. A serving of Wow! chips has seventy-five calories and no fat. If Procter & Gamble were to seek F.D.A. approval for the use of Olestra in commercial deep-frying (which it has not yet done), it could make an Olestra version of the old McDonald's Formula 47, which would deliver every nuance of the old buttery, meaty tallow at a fraction of the calories.

  Olestra, it must be said, does have some drawbacks - in particular, a reputation for what is delicately called "gastrointestinal distress." The F.D.A. has required all Olestra products to carry a somewhat daunting label saying that they may cause "cramping and loose stools." Not surprisingly, sales have been disappointing, and Olestra has never won the full acceptance of the nutrition community. Most of this concern, however, appears to be overstated. Procter & Gamble has done randomized, double-blind studies - one of which involved more than three thousand people over six weeks - and found that people eating typical amounts of Olestra-based chips don't have significantly more gastrointestinal problems than people eating normal chips. Diarrhea is such a common problem in America - nearly a third of adults have at least one episode each month - that even F.D.A. regulators now appear to be convinced that in many of the complaints they received Olestra was unfairly blamed for a problem that was probably caused by something else. The agency has promised Procter & Gamble that the warning label will be reviewed.

  Perhaps the best way to put the Olestra controversy into perspective is to compare it to fibre. Fibre is vegetable matter that goes right through you: it's not absorbed by the gastrointestinal tract. Nutritionists tell us to eat it because it helps us lose weight and it lowers cholesterol - even though if you eat too many baked beans or too many bowls of oat bran you will suffer the consequences. Do we put warning labels on boxes of oat bran? No, because the benefits of fibre clearly outweigh its drawbacks. Research has suggested that Olestra, like fibre, helps people lose weight and lowers cholesterol; too much Olestra, like too much fibre, may cause problems. (Actually, too much Olestra may not be as troublesome as too much bran. According to Procter & Gamble, eating a large amount of Olestra - forty grams - causes no more problems than eating a small bowl - twenty grams - of wheat bran.) If we had Olestra fries, then, they shouldn't be eaten for breakfast, lunch, and dinner. In fact, fast-food houses probably shouldn't use hundred-per-cent Olestra; they should cook their fries in a blend, using the Olestra to displace the most dangerous trans and saturated fats. But these are minor details. The point is that it is entirely possible, right now, to make a delicious French fry that does not carry with it a death sentence. A French fry can be much more than a delivery vehicle for fat.

  4.

  Is it really that simple, though? Consider the cautionary tale of the efforts of a group of food scientists at Auburn University, in Alabama, more than a decade ago to come up with a better hamburger. The Auburn team wanted to create a leaner beef that tasted as good as regular ground beef. They couldn't just remove the fat, because that would leave the meat dry and mealy. They wanted to replace the fat. "If you look at ground beef, it contains moisture, fat, and protein," says Dale Huffman, one of the scientists who spearheaded the Auburn project. "Protein is relatively constant in all beef, at about twenty per cent. The traditional McDonald's ground beef is around twenty per cent fat. The remainder is water. So you have an inverse ratio of water and fat. If you reduce fat, you need to increase water." The goal of the Auburn scientists was to cut about two-thirds of the fat from normal ground beef, which meant that they needed to find something to add to the beef that would hold an equivalent amount of water - and continue to retain that water even as the beef was being grilled. Their choice? Seaweed, or, more precisely, carrageenan. "It's been in use for centuries," Huffman explains. "It's the stuff that keeps the suspension in chocolate milk - otherwise the chocolate would settle at the bottom. It has tremendous water-holding ability. There's a loose bond between the carrageenan and the moisture." They also selected some basic flavor enhancers, designed to make up for the lost fat "taste." The result was a beef patty that was roughly three-quarters water, twenty per cent protein, five per cent or so fat, and a quarter of a per cent seaweed. They called it AU Lean.

  It didn't take the Auburn scientists long to realize that they had created something special. They installed a test kitchen in their laboratory, got hold of a McDonald's grill, and began doing blind taste comparisons of AU Lean burgers and traditional twenty-per-cent-fat burgers. Time after time, the AU Lean burgers won. Next, they took their invention into the field. They recruited a hundred families and supplied them with three kinds of ground beef for home cooking over consecutive three-week intervals - regular "market" ground beef with twenty per cent fat, ground beef with five per cent fat, and AU Lean. The families were asked to rate the different kinds of beef, without knowing which was which. Again, the AU Lean won hands down - trumping the other two on "likability," "tenderness," "flavorfulness," and "juiciness."

  What the Auburn team showed was that, even though people love the taste and feel of fat - and naturally gravitate toward high-fat food - they can be fooled into thinking that there is a lot of fat in something when there isn't. Adam Drewnowski, a nutritionist at the University of Washington, has found a similar effect with cookies. He did blind taste tests of normal and reduced-calorie brownies, biscotti, and chocolate-chip, oatmeal, and peanut-butter cookies. If you cut the sugar content of any of those cookies by twenty-five per cent, he found, people like the cookies much less. But if you cut the fat by twenty-five per cent they barely notice. "People are very finely attuned to how much sugar there is in a liquid or a solid," Drewnowski says. "For fat, there's no sensory break point. Fat comes in so many guises and so many textures it is very difficult to perceive how much is there." This doesn't mean we are oblivious of fat levels, of course. Huffman says that when his group tried to lower the fat in AU Lean below five per cent, people didn't like it anymore. But, within the relatively broad range of between five and twenty-five per cent, you can add water and some flavoring and most people can't tell the difference.

  What's more, people appear to be more sensitive to the volume of food they consume than to its calorie content. Barbara Rolls, a nutritionist at Penn State, has demonstrated this principle with satiety studies. She feeds one group of people a high-volume snack and another group a low-volume snack. Even though the two snacks have the same calorie count, she finds that people who eat the high-volume snack feel more satisfied. "People tend to eat a constant weight or volume of food in a given day, not a constant portion of calories," she says. Eating AU Lean, in short, isn't going to leave you with a craving for more calories; you'll feel just as full.

  For anyone looking to improve the quality of fast food, all this is heartening news. It means that you should be able to put low-fat cheese and low-fat mayonnaise in a Big Mac without anyone's complaining. It also means that there's no particular reason to use twenty-per-cent-fat ground beef in a fast-food burger. In 1990, using just this argument, the Auburn team suggested to McDonald's that it make a Big Mac out of AU Lean. Shortly thereafter, McDonald's came out with the McLean Deluxe. Other fast-food houses scrambled to follow suit. Nutritionists were delighted. And fast food appeared on the verge of a revolution.

  Only, it wasn't. The McLean was a flop, and four years later it was off the market. What happened? Part of the problem appears to have been that McDonald's rushed the burger to market before many of the production kinks had been worked out. More important, though, was the psychological handicap the burger faced. People liked AU Lean in blind taste tests because they didn't know it was AU Lean; they were fooled into thinking it was regular ground beef. But nobody was fooled when it came to the McLean Deluxe. It was sold as the healthy choice - and who goes to McDonald's for health food?

  Leann Birch, a developmental psychologist at Penn State, has looked at the impact of these sorts of expectations on children. In one experiment, she took a large group of kids and fed them a big lunch. Then she turned them loose in a room with lots of junk food. "What we see is that some kids eat almost nothing," she says. "But other kids really chow down, and one of the things that predicts how much they eat is the extent to which parents have restricted their access to high-fat, high-sugar food in the past: the more the kids have been restricted, the more they eat." Birch explains the results two ways. First, restricting food makes kids think not in terms of their own hunger but in terms of the presence and absence of food. As she puts it, "The kid is essentially saying, 'If the food's here I better get it while I can, whether or not I'm hungry.' We see these five-year-old kids eating as much as four hundred calories." Birch's second finding, though, is more important. Because the children on restricted diets had been told that junk food was bad for them, they clearly thought that it had to taste good. When it comes to junk food, we seem to follow an implicit script that powerfully biases the way we feel about food. We like fries not in spite of the fact that they're unhealthy but because of it.

  That is sobering news for those interested in improving the American diet. For years, the nutrition movement in this country has made transparency one of its principal goals: it has assumed that the best way to help people improve their diets is to tell them precisely what's in their food, to label certain foods good and certain foods bad. But transparency can backfire, because sometimes nothing is more deadly for our taste buds than the knowledge that what we are eating is good for us. McDonald's should never have called its new offering the McLean Deluxe, in other words. They should have called it the Burger Supreme or the Monster Burger, and then buried the news about reduced calories and fat in the tiniest type on the remotest corner of their Web site. And if we were to cook fries in some high-tech, healthful cooking oil - whether Olestrized beef tallow or something else with a minimum of trans and saturated fats - the worst thing we could do would be to market them as healthy fries. They will not taste nearly as good if we do. They have to be marketed as better fries, as Classic Fries, as fries that bring back the rich tallowy taste of the original McDonald's.

  What, after all, was Ray Kroc's biggest triumph? A case could be made for the field men with their hydrometers, or the potato-curing techniques, or the potato computer, which turned the making of French fries from an art into a science. But we should not forget Ronald McDonald, the clown who made the McDonald's name irresistible to legions of small children. Kroc understood that taste comprises not merely the food on our plate but also the associations and assumptions and prejudices we bring to the table - that half the battle in making kids happy with their meal was calling what they were eating a Happy Meal. The marketing of healthful fast food will require the same degree of subtlety and sophistication. The nutrition movement keeps looking for a crusader - someone who will bring about better public education and tougher government regulations. But we need much more than that. We need another Ray Kroc.

  Why your bosses want to turn your new office into Greenwich Village.

  1.

  In the early nineteen-sixties, Jane Jacobs lived on Hudson Street, in Greenwich Village, near the intersection of Eighth Avenue and Bleecker Street. It was then, as now, a charming district of nineteenth-century tenements and town houses, bars and shops, laid out over an irregular grid, and Jacobs loved the neighborhood. In her 1961 masterpiece, "The Death and Life of Great American Cities," she rhapsodized about the White Horse Tavern down the block, home to Irish longshoremen and writers and intellectuals - a place where, on a winter's night, as "the doors open, a solid wave of conversation and animation surges out and hits you." Her Hudson Street had Mr. Slube, at the cigar store, and Mr. Lacey, the locksmith, and Bernie, the candy-store owner, who, in the course of a typical day, supervised the children crossing the street, lent an umbrella or a dollar to a customer, held on to some keys or packages for people in the neighborhood, and "lectured two youngsters who asked for cigarettes." The street had "bundles and packages, zigzagging from the drug store to the fruit stand and back over to the butcher's," and "teenagers, all dressed up, are pausing to ask if their slips show or their collars look right." It was, she said, an urban ballet.

  The miracle of Hudson Street, according to Jacobs, was created by the particular configuration of the streets and buildings of the neighborhood. Jacobs argued that when a neighborhood is oriented toward the street, when sidewalks are used for socializing and play and commerce, the users of that street are transformed by the resulting stimulation: they form relationships and casual contacts they would never have otherwise. The West Village, she pointed out, was blessed with a mixture of houses and apartments and shops and offices and industry, which meant that there were always people "outdoors on different schedules and . . . in the place for different purposes." It had short blocks, and short blocks create the greatest variety in foot traffic. It had lots of old buildings, and old buildings have the low rents that permit individualized and creative uses. And, most of all, it had people, cheek by jowl, from every conceivable walk of life. Sparsely populated suburbs may look appealing, she said, but without an active sidewalk life, without the frequent, serendipitous interactions of many different people, "there is no public acquaintanceship, no foundation of public trust, no cross-connections with the necessary people - and no practice or ease in applying the most ordinary techniques of city public life at lowly levels."

  Jane Jacobs did not win the battle she set out to fight. The West Village remains an anomaly. Most developers did not want to build the kind of community Jacobs talked about, and most Americans didn't want to live in one. To reread "Death and Life" today, however, is to be struck by how the intervening years have given her arguments a new and unexpected relevance. Who, after all, has a direct interest in creating diverse, vital spaces that foster creativity and serendipity? Employers do. On the fortieth anniversary of its publication, "Death and Life" has been reborn as a primer on workplace design.

  The parallels between neighborhoods and offices are striking. There was a time, for instance, when companies put their most valued employees in palatial offices, with potted plants in the corner, and secretaries out front, guarding access. Those offices were suburbs - gated communities, in fact - and many companies came to realize that if their best employees were isolated in suburbs they would be deprived of public acquaintanceship, the foundations of public trust, and cross-connections with the necessary people. In the eighties and early nineties, the fashion in corporate America was to follow what designers called "universal planning" - rows of identical cubicles, which resembled nothing so much as a Levittown. Today, universal planning has fallen out of favor, for the same reason that the postwar suburbs like Levittown did: to thrive, an office space must have a diversity of uses - it must have the workplace equivalent of houses and apartments and shops and industry.

  If you visit the technology companies of Silicon Valley, or the media companies of Manhattan, or any of the firms that self-consciously identify themselves with the New Economy, you'll find that secluded private offices have been replaced by busy public spaces, open-plan areas without walls, executives next to the newest hires. The hush of the traditional office has been supplanted by something much closer to the noisy, bustling ballet of Hudson Street. Forty years ago, people lived in neighborhoods like the West Village and went to work in the equivalent of suburbs. Now, in one of the odd reversals that mark the current economy, they live in suburbs and, increasingly, go to work in the equivalent of the West Village.

  2.

  The office used to be imagined as a place where employees punch clocks and bosses roam the halls like high-school principals, looking for miscreants. But when employees sit chained to their desks, quietly and industriously going about their business, an office is not functioning as it should. That's because innovation - the heart of the knowledge economy - is fundamentally social. Ideas arise as much out of casual conversations as they do out of formal meetings. More precisely, as one study after another has demonstrated, the best ideas in any workplace arise out of casual contacts among different groups within the same company. If you are designing widgets for Acme.com, for instance, it is unlikely that a breakthrough idea is going to come from someone else on the widget team: after all, the other team members are as blinkered by the day-to-day demands of dealing with the existing product as you are. Someone from outside Acme.com - your old engineering professor, or a guy you used to work with at Apex.com - isn't going to be that helpful, either. A person like that doesn't know enough about Acme's widgets to have a truly useful idea. The most useful insights are likely to come from someone in customer service, who hears firsthand what widget customers have to say, or from someone in marketing, who has wrestled with the problem of how to explain widgets to new users, or from someone who used to work on widgets a few years back and whose work on another Acme product has given him a fresh perspective. Innovation comes from the interactions of people at a comfortable distance from one another, neither too close nor too far. This is why - quite apart from the matter of logistics and efficiency - companies have offices to begin with. They go to the trouble of gathering their employees under one roof because they want the widget designers to bump into the people in marketing and the people in customer service and the guy who moved to another department a few years back.

  The catch is that getting people in an office to bump into people from another department is not so easy as it looks. In the sixties and seventies, a researcher at M.I.T. named Thomas Allen conducted a decade-long study of the way in which engineers communicated in research-and-development laboratories. Allen found that the likelihood that any two people will communicate drops off dramatically as the distance between their desks increases: we are four times as likely to communicate with someone who sits six feet away from us as we are with someone who sits sixty feet away. And people seated more than seventy-five feet apart hardly talk at all.

  Allen's second finding was even more disturbing. When the engineers weren't talking to those in their immediate vicinity, many of them spent their time talking to people outside their company - to their old computer-science professor or the guy they used to work with at Apple. He concluded that it was actually easier to make the outside call than to walk across the room. If you constantly ask for advice or guidance from people inside your organization, after all, you risk losing prestige. Your colleagues might think you are incompetent. The people you keep asking for advice might get annoyed at you. Calling an outsider avoids these problems. "The engineer can easily excuse his lack of knowledge by pretending to be an 'expert in something else' who needs some help in 'broadening into this new area,'" Allen wrote. He did his study in the days before E-mail and the Internet, but the advent of digital communication has made these problems worse. Allen's engineers were far too willing to go outside the company for advice and new ideas. E-mail makes it even easier to talk to people outside the company.

  The task of the office, then, is to invite a particular kind of social interaction - the casual, nonthreatening encounter that makes it easy for relative strangers to talk to each other. Offices need the sort of social milieu that Jane Jacobs found on the sidewalks of the West Village. "It is possible in a city street neighborhood to know all kinds of people without unwelcome entanglements, without boredom, necessity for excuses, explanations, fears of giving offense, embarrassments respecting impositions or commitments, and all such paraphernalia of obligations which can accompany less limited relationships," Jacobs wrote. If you substitute "office" for "city street neighborhood," that sentence becomes the perfect statement of what the modern employer wants from the workplace.

  3.

  Imagine a classic big-city office tower, with a floor plate of a hundred and eighty feet by a hundred and eighty feet. The center part of every floor is given over to the guts of the building: elevators, bathrooms, electrical and plumbing systems. Around the core are cubicles and interior offices, for support staff and lower management. And around the edges of the floor, against the windows, are rows of offices for senior staff, each room perhaps two hundred or two hundred and fifty square feet. The best research about office communication tells us that there is almost no worse way to lay out an office. The executive in one corner office will seldom bump into any other executive in a corner office. Indeed, stringing the exterior offices out along the windows guarantees that there will be very few people within the critical sixty-foot radius of those offices. To maximize the amount of contact among employees, you really ought to put the most valuable staff members in the center of the room, where the highest number of people can be within their orbit. Or, even better, put all places where people tend to congregate - the public areas - in the center, so they can draw from as many disparate parts of the company as possible. Is it any wonder that creative firms often prefer loft-style buildings, which have usable centers?
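  The arithmetic behind that claim is easy to check. Using Allen's sixty-foot figure on the hundred-and-eighty-foot floor plate described above, the sketch below - a rough illustration of straight-line distance only, ignoring the building core and the walls - estimates how much of the floor lies within reach of a desk at the center versus one in a corner.

    # Rough illustration: fraction of a 180 x 180 ft floor plate that lies
    # within 60 ft of a given desk, by brute-force grid approximation.
    # Straight-line distance only; walls, core, and corridors are ignored.
    import math

    FLOOR = 180.0   # feet on a side
    RADIUS = 60.0   # Allen's rough cutoff for regular contact

    def coverage(desk_x, desk_y, step=1.0):
        inside = total = 0
        y = step / 2
        while y < FLOOR:
            x = step / 2
            while x < FLOOR:
                total += 1
                if math.hypot(x - desk_x, y - desk_y) <= RADIUS:
                    inside += 1
                x += step
            y += step
        return inside / total

    print(f"center desk: {coverage(90, 90):.0%}")  # roughly 35% of the floor
    print(f"corner desk: {coverage(0, 0):.0%}")    # roughly 9% of the floor

  A desk in the middle of the floor has about four times as much of the floor - and, other things being equal, four times as many colleagues - inside the sixty-foot radius as a corner office does.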

  Another way to increase communication is to have as few private offices as possible. The idea is to exchange private space for public space, just as in the West Village, where residents agree to live in tiny apartments in exchange for a wealth of nearby cafés and stores and bars and parks. The West Village forces its residents outdoors. Few people, for example, have a washer and dryer in their apartment, and so even laundry is necessarily a social event: you have to take your clothes to the laundromat down the street. In the office equivalent, designers force employees to move around, too. They build in "functional inefficiencies"; they put kitchens and copiers and printers and libraries in places that can be reached only by a circuitous journey.

  A more direct approach is to create an office so flexible that the kinds of people who need to spontaneously interact can actually be brought together. For example, the Ford Motor Company, along with a group of researchers from the University of Michigan, recently conducted a pilot project on the effectiveness of "war rooms" in software development. Previously, someone inside the company who needed a new piece of software written would have a series of meetings with the company's programmers, and the client and the programmers would send messages back and forth. In the war-room study, the company moved the client, the programmers, and a manager into a dedicated conference room, and made them stay there until the project was done. Using the war room cut the software-development time by two-thirds, in part because there was far less time wasted on formal meetings or calls outside the building: the people who ought to have been bumping into each other were now sitting next to each other.

  Two years ago, the advertising agency TBWA/Chiat/Day moved into new offices in Los Angeles, out near the airport. In the preceding years, the firm had been engaged in a radical, and in some ways disastrous, experiment with a "nonterritorial" office: no one had a desk or any office equipment of his own. It was a scheme that courted failure by neglecting all the ways in which an office is a sort of neighborhood. By contrast, the new office is an almost perfect embodiment of Jacobsian principles of community. The agency is in a huge old warehouse, three stories high and the size of three football fields. It is informally known as Advertising City, and that's what it is: a kind of artfully constructed urban neighborhood. The floor is bisected by a central corridor called Main Street, and in the center of the room is an open space, with café tables and a stand of ficus trees, called Central Park. There's a basketball court, a game room, and a bar. Most of the employees are in snug workstations known as nests, and the nests are grouped together in neighborhoods that radiate from Main Street like Paris arrondissements. The top executives are situated in the middle of the room. The desk belonging to the chairman and creative director of the company looks out on Central Park. The offices of the chief financial officer and the media director abut the basketball court. Sprinkled throughout the building are meeting rooms and project areas and plenty of nooks where employees can closet themselves when they need to. A small part of the building is elevated above the main floor on a mezzanine, and if you stand there and watch the people wander about with their portable phones, and sit and chat in Central Park, and play basketball in the gym, and you feel on your shoulders the sun from the skylights and listen to the gentle buzz of human activity, it is quite possible to forget that you are looking at an office.

  4.

  In "The Death and Life of Great American Cities," Jacobs wrote of the importance of what she called "public characters" - people who have the social position and skills to orchestrate the movement of information and the creation of bonds of trust:

  A public character is anyone who is in frequent contact with a wide circle of people and who is sufficiently interested to make himself a public character. . . . The director of a settlement on New York's Lower East Side, as an example, makes a regular round of stores. He learns from the cleaner who does his suits about the presence of dope pushers in the neighborhood. He learns from the grocer that the Dragons are working up to something and need attention. He learns from the candy store that two girls are agitating the Sportsmen toward a rumble. One of his most important information spots is an unused breadbox on Rivington Street. . . . A message spoken there for any teen-ager within many blocks will reach his ears unerringly and surprisingly quickly, and the opposite flow along the grapevine similarly brings news quickly in to the breadbox.

  A vital community, in Jacobs's view, required more than the appropriate physical environment. It also required a certain kind of person, who could bind together the varied elements of street life. Offices are no different. In fact, as office designers have attempted to create more vital workplaces, they have become increasingly interested in identifying and encouraging public characters.

  One of the pioneers in this way of analyzing offices is Karen Stephenson, a business-school professor and anthropologist who runs a New York-based consulting company called Netform. Stephenson studies social networks. She goes into a company - her clients include J.P. Morgan, the Los Angeles Police Department, T.R.W., and I.B.M. - and distributes a questionnaire to its employees, asking about which people they have contact with. Whom do you like to spend time with? Whom do you talk to about new ideas? Where do you go to get expert advice? Every name in the company becomes a dot on a graph, and Stephenson draws lines between all those who have regular contact with each other. Stephenson likens her graphs to X-rays, and her role to that of a radiologist. What she's depicting is the firm's invisible inner mechanisms, the relationships and networks and patterns of trust that arise as people work together over time, and that are hidden beneath the organization chart. Once, for example, Stephenson was doing an "X-ray" of a Head Start organization. The agency was mostly female, and when Stephenson analyzed her networks she found that new hires and male staffers were profoundly isolated, communicating with the rest of the organization through only a handful of women. "I looked at tenure in the organization, office ties, demographic data. I couldn't see what tied the women together, and why the men were talking only to these women," Stephenson recalls. "Nor could the president of the organization. She gave me a couple of ideas. She said, 'Sorry I can't figure it out.' Finally, she asked me to read the names again, and I could hear her stop, and she said, 'My God, I know what it is. All those women are smokers.'" The X-ray revealed that the men - locked out of the formal power structure of the organization - were trying to gain access and influence by hanging out in the smoking area with some of the more senior women.

  What Stephenson's X-rays do best, though, is tell you who the public characters are. In every network, there are always one or two people who have connections to many more people than anyone else. Stephenson calls them "hubs," and on her charts lines radiate out from them like spokes on a wheel. (Bernie the candy-store owner, on Jacobs's Hudson Street, was a hub.) A few people are also what Stephenson calls "gatekeepers": they control access to critical people, and link together a strategic few disparate groups. Finally, if you analyze the graphs there are always people who seem to have lots of indirect links to other people - who are part of all sorts of networks without necessarily being in the center of them. Stephenson calls those people "pulsetakers." (In Silicon Valley-speak, the person in a sea of cubicles who pops his or her head up over the partition every time something interesting is going on is called a prairie dog: prairie dogs are pulsetakers.)
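  (A technical aside, and an assumption rather than a description of Stephenson's own toolkit: her three roles map loosely onto standard measures from network analysis - degree centrality for hubs, betweenness centrality for gatekeepers, and closeness centrality for pulsetakers. The short Python sketch below, using the networkx library and a made-up roster of names and contacts, shows how such an "X-ray" could be computed; the role-to-measure mapping and every name in it are illustrative, not her method.)

    # A minimal, illustrative sketch (assumptions, not Stephenson's actual technique):
    # build a toy contact network from survey answers and flag likely "hubs,"
    # "gatekeepers," and "pulsetakers" with standard centrality measures.
    import networkx as nx

    # Hypothetical answers to "Whom do you have regular contact with?"
    contacts = [
        ("Bernie", "Ann"), ("Bernie", "Joe"), ("Bernie", "Maria"),
        ("Bernie", "Lou"), ("Bernie", "Rosa"), ("Ann", "Joe"),
        ("Maria", "Lou"), ("Rosa", "Sam"), ("Lou", "Sam"),
        ("Sam", "Priya"), ("Priya", "Chen"), ("Chen", "Rosa"),
    ]
    G = nx.Graph(contacts)

    degree = nx.degree_centrality(G)        # hubs: many direct ties
    between = nx.betweenness_centrality(G)  # gatekeepers: bridge otherwise separate groups
    closeness = nx.closeness_centrality(G)  # pulsetakers: indirectly close to everyone

    def top(scores, k=3):
        """Return the k highest-scoring names."""
        return sorted(scores, key=scores.get, reverse=True)[:k]

    print("Likely hubs:       ", top(degree))
    print("Likely gatekeepers:", top(between))
    print("Likely pulsetakers:", top(closeness))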

  5.

  In the past year, Stephenson has embarked on a partnership with Steelcase, the world's largest manufacturer of office furniture, in order to use her techniques in the design of offices. Traditionally, office designers would tell a company what furniture should go where. Stephenson and her partners at Steelcase propose to tell a company what people should go where, too. At Steelcase, they call this "floor-casting."

  One of the first projects for the group is the executive level at Steelcase's headquarters, a five-story building in Grand Rapids, Michigan. The executive level, on the fourth floor, is a large, open room filled with small workstations. (Jim Hackett, the head of the company, occupies what Steelcase calls a Personal Harbor, a black, freestanding metal module that may be - at seven feet by eight - the smallest office of a Fortune 500 C.E.O.) One afternoon recently, Stephenson pulled out a laptop and demonstrated how she had mapped the communication networks of the leadership group onto a seating chart of the fourth floor. The dots and swirls are strangely compelling - abstract representations of something real and immediate. One executive, close to Hackett, was inundated with lines from every direction. "He's a hub, a gatekeeper, and a pulsetaker across all sorts of different dimensions," Stephenson said. "What that tells you is that he is very strategic. If there is no succession planning around that person, you have got a huge risk to the knowledge base of this company. If he's in a plane accident, there goes your knowledge." She pointed to another part of the floor plan, with its own thick overlay of lines. "That's sales and marketing. They have a pocket of real innovation here. The guy who runs it is very good, very smart." But then she pointed to the lines connecting that department with other departments. "They're all coming into this one place," she said, and she showed how all the lines coming out of marketing converged on one senior executive. "There's very little path redundancy. In human systems, you need redundancy, you need communication across multiple paths." What concerned Stephenson wasn't just the lack of redundancy but the fact that, in her lingo, many of the paths were "unconfirmed": they went only one way. People in marketing were saying that they communicated with the senior management, but there weren't as many lines going in the other direction. The sales-and-marketing team, she explained, had somehow become isolated from senior management. They couldn't get their voices heard when it came to innovation - and that fact, she said, ought to be a big consideration when it comes time to redo the office. "If you ask the guy who heads sales and marketing who he wants to sit next to, he'll pick out all the people he trusts," she said. "But do you sit him with those people? No. What you want to do is put people who don't trust each other near each other. Not necessarily next to each other, because they get too close. But close enough so that when you pop your head up, you get to see people, they are in your path, and all of a sudden you build an inviting space where they can hang out, kitchens and things like that. Maybe they need to take a hub in an innovation network and place the person with a pulsetaker in an expert network - to get that knowledge indirectly communicated to a lot of people."

  The work of translating Stephenson's insights onto a new floor plan is being done in a small conference room - a war room - on the second floor of Steelcase headquarters. The group consists of a few key people from different parts of the firm, such as human resources, design, technology, and space-planning research. The walls of the room are cluttered with diagrams and pictures and calculations and huge, blown-up versions of Stephenson's X-rays. Team members stress that what they are doing is experimental. They don't know yet how directly they want to translate findings from the communications networks to office plans. After all, you don't want to have to redo the entire office every time someone leaves or joins the company. But it's clear that there are some very simple principles from the study of public characters which ought to drive the design process. "You want to place hubs at the center," Joyce Bromberg, the director of space planning, says. "These are the ones other people go to in order to get information. Give them an environment that allows access. But there are also going to be times that they need to have control - so give them a place where they can get away. Gatekeepers represent the fit between groups. They transmit ideas. They are brokers, so you might want to put them at the perimeter, and give them front porches" - areas adjoining the workspace where you might put little tables and chairs. "Maybe they could have swinging doors with white boards, to better transmit information. As for pulsetakers, they are the roamers. Rather than give them one fixed work location, you might give them a series of touchdown spots - where you want them to stop and talk. You want to enable their meandering."

  One of the other team members was a tall, thoughtful man named Frank Graziano. He had a series of pencil drawings - with circles representing workstations of all the people whose minds, as he put it, he wanted to make "explicit." He said that he had done the plan the night before. "I think we can thread innovation through the floor," he went on, and with a pen drew a red line that wound its way through the maze of desks. It was his Hudson Street.

  6.

  "The Death and Life of Great American Cities" was a controversial book, largely because there was always a whiff of paternalism in Jacobs's vision of what city life ought to be. Chelsea - the neighborhood directly to the north of her beloved West Village - had "mixtures and types of buildings and densities of dwelling units per acre.   almost identical with those of Greenwich Village," she noted. But its long-predicted renaissance would never happen, she maintained, because of the "barriers of long, self-isolating blocks." She hated Chatham Village, a planned "garden city" development in Pittsburgh. It was a picturesque green enclave, but it suffered, in Jacobs's analysis, from a lack of sidewalk life. She wasn't concerned that some people might not want an active street life in their neighborhood; that what she saw as the "self-isolating blocks" of Chelsea others would see as a welcome respite from the bustle of the city, or that Chatham Village would appeal to some people precisely because one did not encounter on its sidewalks a "solid wave of conversation and animation." Jacobs felt that city dwellers belonged in environments like the West Village, whether they realized it or not.

  The new workplace designers are making the same calculation, of course. The point of the new offices is to compel us to behave and socialize in ways that we otherwise would not - to overcome our initial inclination to be office suburbanites. But, in all the studies of the new workplaces, the reservations that employees have about a more social environment tend to diminish once they try it. Human behavior, after all, is shaped by context, but how it is shaped - and whether we'll be happy with the result - we can understand only with experience. Jane Jacobs knew the virtues of the West Village because she lived there. What she couldn't know was that her ideas about community would ultimately make more sense in the workplace. From time to time, social critics have bemoaned the falling rates of community participation in American life, but they have made the same mistake. The reason Americans are content to bowl alone (or, for that matter, not bowl at all) is that, increasingly, they receive all the social support they need - all the serendipitous interactions that serve to make them happy and productive - from nine to five.

  Out of the Frying Pan, Into the Voting Booth

  My parents have an electric stove in their kitchen made by a company called Moffat. It has four burners on top and a raised panel that runs across the back with a set of knobs on it, and the way the panel is laid out has always been a bone of contention in our family. The knobs for the two left-hand burners are on the left side of the back panel, stacked one on top of the other, with the top knob controlling the back burner and the bottom knob controlling the front burner - same thing on the right-hand side. My mother finds this logical. But not my father. Every time he looks at the stove, he gets confused and thinks that the top knob controls the front burner.

  Does this mean that my mother is more rational than my father? I don't think so. It simply means that any time you create a visual guide to an action that isn't intuitive - that requires some kind of interpretation or physical contortion - you're going to baffle some people. From the perspective of "usability" researchers, my father has fallen victim to an ill-designed interface. People who pop the trunk of their car when they mean to pop the gas-tank lid are experiencing the same kind of confusion as the singer John Denver did, apparently, when he died in an airplane crash a few years ago. Denver was flying a new, experimental plane, and may not have realized how little fuel he had, since the fuel gauge wasn't linear, the way you'd expect it to be. When the line on that sort of gauge registers one-quarter, for example, it doesn't mean that the twenty-six-gallon tank is a quarter full; it means that the tank has less than five gallons left.

  Then, there's the question of voting. Susan King Roth, an associate professor of visual communication at Ohio State University, did an experiment recently with voting machines and found that a surprising number of the people in her study didn't vote on the issues section of the ballot. Why? Because the issues proposals were at the top of the ballot, sixty-seven inches from the floor, and the eye height of the average American woman is sixty inches. Some people in the study simply couldn't see the proposals.

  The Florida butterfly ballot may be the textbook example of what can go wrong when design isn't intuitive. The usability expert Kevin Fox has identified three "cognitive paths" that could have led voters to misunderstand the butterfly layout: gestalt grouping, linear visual search, and numeric mapping - all of which point out that the way the butterfly ballot invites itself to be read does not match the way it invites voters to act. In the language of usability studies, there is an incompatibility between input and output. In a sense, it's just like the problem with the Moffat stove. My father hasn't burned down the house yet. But sometimes he puts a pot on the back burner and turns on the front burner. If he's used the stove twenty thousand times in his life, it's a reasonable guess that he's made this mistake maybe a few hundred times. In the grand scheme of things, that's not a very high percentage. Then again, sometimes a few hundred mistakes can turn out to be awfully important.

  Ron Popeil and the conquest of the American kitchen.

  The extraordinary story of the Ronco Showtime Rotisserie & BBQ begins with Nathan Morris, the son of the shoemaker and cantor Kidders Morris, who came over from the Old Country in the eighteen-eighties, and settled in Asbury Park, New Jersey. Nathan Morris was a pitchman. He worked the boardwalk and the five-and-dimes and county fairs up and down the Atlantic coast, selling kitchen gadgets made by Acme Metal, out of Newark. In the early forties, Nathan set up N. K. Morris Manufacturing - turning out the KwiKi-Pi and the Morris Metric Slicer - and perhaps because it was the Depression and job prospects were dim, or perhaps because Nathan Morris made such a compelling case for his new profession, one by one the members of his family followed him into the business. His sons Lester Morris and Arnold (the Knife) Morris became his pitchmen. He set up his brother-in-law Irving Rosenbloom, who was to make a fortune on Long Island in plastic goods, including a hand grater of such excellence that Nathan paid homage to it with his own Dutch Kitchen Shredder Grater. He partnered with his brother Al, whose own sons worked the boardwalk, alongside a gangly Irishman by the name of Ed McMahon. Then, one summer just before the war, Nathan took on as an apprentice his nephew Samuel Jacob Popeil. S.J., as he was known, was so inspired by his uncle Nathan that he went on to found Popeil Brothers, based in Chicago, and brought the world the Dial-O-Matic, the Chop-O-Matic, and the Veg-O-Matic. S. J. Popeil had two sons. The elder was Jerry, who died young. The younger is familiar to anyone who has ever watched an infomercial on late-night television. His name is Ron Popeil.

  In the postwar years, many people made the kitchen their life's work. There were the Klinghoffers of New York (one of whom, Leon, died tragically in 1985, during the Achille Lauro incident, when he was pushed overboard in his wheelchair by Palestinian terrorists). They made the Roto-Broil 400, back in the fifties, an early rotisserie for the home, which was pitched by Lester Morris. There was Lewis Salton, who escaped the Nazis with an English stamp from his father's collection and parlayed it into an appliance factory in the Bronx. He brought the world the Salton Hotray - a sort of precursor to the microwave - and today Salton, Inc., sells the George Foreman Grill.

  But no rival quite matched the Morris-Popeil clan. They were the first family of the American kitchen. They married beautiful women and made fortunes and stole ideas from one another and lay awake at night thinking of a way to chop an onion so that the only tears you shed were tears of joy. They believed that it was a mistake to separate product development from marketing, as most of their contemporaries did, because to them the two were indistinguishable: the object that sold best was the one that sold itself. They were spirited, brilliant men. And Ron Popeil was the most brilliant and spirited of them all. He was the family's Joseph, exiled to the wilderness by his father only to come back and make more money than the rest of the family combined. He was a pioneer in taking the secrets of the boardwalk pitchmen to the television screen. And, of all the kitchen gadgets in the Morris-Popeil pantheon, nothing has ever been quite so ingenious in its design, or so broad in its appeal, or so perfectly representative of the Morris-Popeil belief in the interrelation of the pitch and the object being pitched, as the Ronco Showtime Rotisserie & BBQ, the countertop oven that can be bought for four payments of $39.95 and may be, dollar for dollar, the finest kitchen appliance ever made.

  A Rotisserie Is Born

  Ron Popeil is a handsome man, thick through the chest and shoulders, with a leonine head and striking, over-size features. He is in his mid-sixties, and lives in Beverly Hills, halfway up Coldwater Canyon, in a sprawling bungalow with a stand of avocado trees and a vegetable garden out back. In his habits he is, by Beverly Hills standards, old school. He carries his own bags. He has been known to eat at Denny's. He wears T-shirts and sweatpants. As often as twice a day, he can be found buying poultry or fish or meat at one of the local grocery stores - in particular, Costco, which he favors because the chickens there are ninety-nine cents a pound, as opposed to a dollar forty-nine at standard supermarkets. Whatever he buys, he brings back to his kitchen, a vast room overlooking the canyon, with an array of industrial appliances, a collection of fifteen hundred bottles of olive oil, and, in the corner, an oil painting of him, his fourth wife, Robin (a former Frederick's of Hollywood model), and their baby daughter, Contessa. On paper, Popeil owns a company called Ronco Inventions, which has two hundred employees and a couple of warehouses in Chatsworth, California, but the heart of Ronco is really Ron working out of his house, and many of the key players are really just friends of Ron's who work out of their houses, too, and who gather in Ron's kitchen when, every now and again, Ron cooks a soup and wants to talk things over.

  In the last thirty years, Ron has invented a succession of kitchen gadgets, among them the Ronco Electric Food Dehydrator and the Popeil Automatic Pasta and Sausage Maker, which featured a thrust bearing made of the same material used in bulletproof glass. He works steadily, guided by flashes of inspiration. This past August, for instance, he suddenly realized what product should follow the Showtime Rotisserie. He and his right-hand man, Alan Backus, had been working on a bread-and-batter machine, which would take up to ten pounds of chicken wings or scallops or shrimp or fish fillets and do all the work - combining the eggs, the flour, the breadcrumbs - in a few minutes, without dirtying either the cook's hands or the machine. "Alan goes to Korea, where we have some big orders coming through," Ron explained recently over lunch - a hamburger, medium-well, with fries - in the V.I.P. booth by the door in the Polo Lounge, at the Beverly Hills Hotel. "I call Alan on the phone. I wake him up. It was two in the morning there. And these are my exact words: `Stop. Do not pursue the bread-and-batter machine. I will pick it up later. This other project needs to come first.' " The other project, his inspiration, was a device capable of smoking meats indoors without creating odors that can suffuse the air and permeate furniture. Ron had a version of the indoor smoker on his porch - "a Rube Goldberg kind of thing" that he'd worked on a year earlier - and, on a whim, he cooked a chicken in it. "That chicken was so good that I said to myself" - and with his left hand Ron began to pound on the table - "This is the best chicken sandwich I have ever had in my life." He turned to me: "How many times have you had a smoked-turkey sandwich? Maybe you have a smoked-turkey or a smoked-chicken sandwich once every six months. Once! How many times have you had smoked salmon? Aah. More. I'm going to say you come across smoked salmon as an hors d'oeuvre or an entrée once every three months. Baby-back ribs? Depends on which restaurant you order ribs at. Smoked sausage, same thing. You touch on smoked food" - he leaned in and poked my arm for emphasis - "but I know one thing, Malcolm. You don't have a smoker."

  The idea for the Showtime came about in the same way. Ron was at Costco about four years ago when he suddenly realized that there was a long line of customers waiting to buy chickens from the in-store rotisserie ovens. They touched on rotisserie chicken, but Ron knew one thing: they did not have a rotisserie oven. Ron went home and called Backus. Together, they bought a glass aquarium, a motor, a heating element, a spit rod, and a handful of other spare parts, and began tinkering. Ron wanted something big enough for a fifteen-pound turkey but small enough to fit into the space between the base of an average kitchen cupboard and the countertop. He didn't want a thermostat, because thermostats break, and the constant clicking on and off of the heat prevents the even, crispy browning that he felt was essential. And the spit rod had to rotate on the horizontal axis, not the vertical axis, because if you cooked a chicken or a side of beef on the vertical axis the top would dry out and the juices would drain to the bottom. Roderick Dorman, Ron's patent attorney, says that when he went over to Coldwater Canyon he often saw five or six prototypes on the kitchen counter, lined up in a row. Ron would have a chicken in each of them, so that he could compare the consistency of the flesh and the browning of the skin, and wonder if, say, there was a way to rotate a shish kebab as it approached the heating element so that the inner side of the kebab would get as brown as the outer part. By the time Ron finished, the Showtime prompted no fewer than two dozen patent applications. It was equipped with the most powerful motor in its class. It had a drip tray coated with a nonstick ceramic, which was easily cleaned, and the oven would still work even after it had been dropped on a concrete or stone surface ten times in succession, from a distance of three feet. To Ron, there was no question that it made the best chicken he had ever had in his life.

  It was then that Ron filmed a television infomercial for the Showtime, twenty-eight minutes and thirty seconds in length. It was shot live before a studio audience, and aired for the first time on August 8, 1998. It has run ever since, often in the wee hours of the morning, or on obscure cable stations, alongside the get-rich schemes and the "Three's Company" reruns. The response to it has been such that within the next three years total sales of the Showtime should exceed a billion dollars. Ron Popeil didn't use a single focus group. He had no market researchers, R. & D. teams, public-relations advisers, Madison Avenue advertising companies, or business consultants. He did what the Morrises and the Popeils had been doing for most of the century, and what all the experts said couldn't be done in the modern economy. He dreamed up something new in his kitchen and went out and pitched it himself.

  Pitchmen

  Nathan Morris, Ron Popeil's great-uncle, looked a lot like Cary Grant. He wore a straw boater. He played the ukulele, drove a convertible, and composed melodies for the piano. He ran his business out of a low-slung, whitewashed building on Ridge Avenue, near Asbury Park, with a little annex in the back where he did pioneering work with Teflon. He had certain eccentricities, such as a phobia he developed about travelling beyond Asbury Park without the presence of a doctor. He feuded with his brother Al, who subsequently left in a huff for Atlantic City, and then with his nephew S. J. Popeil, whom Nathan considered insufficiently grateful for the start he had given him in the kitchen-gadget business. That second feud led to a climactic legal showdown over S. J. Popeil's Chop-O-Matic, a food preparer with a pleated, W-shaped blade rotated by a special clutch mechanism. The Chop-O-Matic was ideal for making coleslaw and chopped liver, and when Morris introduced a strikingly similar product, called the Roto-Chop, S. J. Popeil sued his uncle for patent infringement. (As it happened, the Chop-O-Matic itself seemed to have been inspired by the Blitzhacker, from Switzerland, and S.J. later lost a patent judgment to the Swiss.)

  The two squared off in Trenton, in May of 1958, in a courtroom jammed with Morrises and Popeils. When the trial opened, Nathan Morris was on the stand, being cross-examined by his nephew's attorneys, who were out to show him that he was no more than a huckster and a copycat. At a key point in the questioning, the judge suddenly burst in. "He took the index finger of his right hand and he pointed it at Morris," Jack Dominik, Popeil's longtime patent lawyer, recalls, "and as long as I live I will never forget what he said. `I know you! You're a pitchman! I've seen you on the boardwalk!' And Morris pointed his index finger back at the judge and shouted, `No! I'm a manufacturer. I'm a dignified manufacturer, and I work with the most eminent of counsel!' " (Nathan Morris, according to Dominik, was the kind of man who referred to everyone he worked with as eminent.) "At that moment," Dominik goes on, "Uncle Nat's face was getting red and the judge's was getting redder, so a recess was called." What happened later that day is best described in Dominik's unpublished manuscript, "The Inventions of Samuel Joseph Popeil by Jack E. Dominik - His Patent Lawyer." Nathan Morris had a sudden heart attack, and S.J. was guilt-stricken. "Sobbing ensued," Dominik writes. "Remorse set in. The next day, the case was settled. Thereafter, Uncle Nat's recovery from his previous day's heart attack was nothing short of a miracle."

  Nathan Morris was a performer, like so many of his relatives, and pitching was, first and foremost, a performance. It's said that Nathan's nephew Archie (the Pitchman's Pitchman) Morris once sold, over a long afternoon, gadget after gadget to a well-dressed man. At the end of the day, Archie watched the man walk away, stop and peer into his bag, and then dump the whole lot into a nearby garbage can. The Morrises were that good. "My cousins could sell you an empty box," Ron says.

  The last of the Morrises to be active in the pitching business is Arnold (the Knife) Morris, so named because of his extraordinary skill with the Sharpcut, the forerunner of the Ginsu. He is in his early seventies, a cheerful, impish man with a round face and a few wisps of white hair, and a trademark move whereby, after cutting a tomato into neat, regular slices, he deftly lines the pieces up in an even row against the flat edge of the blade. Today, he lives in Ocean Township, a few miles from Asbury Park, with Phyllis, his wife of twenty-nine years, whom he refers to (with the same irresistible conviction that he might use to describe, say, the Feather Touch Knife) as "the prettiest girl in Asbury Park." One morning recently, he sat in his study and launched into a pitch for the Dial-O-Matic, a slicer produced by S. J. Popeil some forty years ago.

  "Come on over, folks. I'm going to show you the most amazing slicing machine you have ever seen in your life," he began. Phyllis, sitting nearby, beamed with pride. He picked up a package of barbecue spices, which Ron Popeil sells alongside his Showtime Rotisserie, and used it as a prop. "Take a look at this!" He held it in the air as if he were holding up a Tiffany vase. He talked about the machine's prowess at cutting potatoes, then onions, then tomatoes. His voice, a marvellous instrument inflected with the rhythms of the Jersey Shore, took on a singsong quality: "How many cut tomatoes like this? You stab it. You jab it. The juices run down your elbow. With the Dial-O-Matic, you do it a little differently. You put it in the machine and you wiggle" - he mimed fixing the tomato to the bed of the machine. "The tomato! Lady! The tomato! The more you wiggle, the more you get. The tomato! Lady! Every slice comes out perfectly, not a seed out of place. But the thing I love my Dial-O-Matic for is coleslaw. My mother-in-law used to take her cabbage and do this." He made a series of wild stabs at an imaginary cabbage. "I thought she was going to commit suicide. Oh, boy, did I pray - that she wouldn't slip! Don't get me wrong. I love my mother-in-law. It's her daughter I can't figure out. You take the cabbage. Cut it in half. Coleslaw, hot slaw. Pot slaw. Liberty slaw. It comes out like shredded wheat . . ."

  It was a vaudeville monologue, except that Arnold wasn't merely entertaining; he was selling. "You can take a pitchman and make a great actor out of him, but you cannot take an actor and always make a great pitchman out of him," he says. The pitchman must make you applaud and take out your money. He must be able to execute what in pitchman's parlance is called "the turn" - the perilous, crucial moment where he goes from entertainer to businessman. If, out of a crowd of fifty, twenty-five people come forward to buy, the true pitchman sells to only twenty of them. To the remaining five, he says, "Wait! There's something else I want to show you!" Then he starts his pitch again, with slight variations, and the remaining four or five become the inner core of the next crowd, hemmed in by the people around them, and so eager to pay their money and be on their way that they start the selling frenzy all over again. The turn requires the management of expectation. That's why Arnold always kept a pineapple tantalizingly perched on his stand. "For forty years, I've been promising to show people how to cut the pineapple, and I've never cut it once," he says. "It got to the point where a pitchman friend of mine went out and bought himself a plastic pineapple. Why would you cut the pineapple? It cost a couple bucks. And if you cut it they'd leave." Arnold says that he once hired some guys to pitch a vegetable slicer for him at a fair in Danbury, Connecticut, and became so annoyed at their lackadaisical attitude that he took over the demonstration himself. They were, he says, waiting for him to fail: he had never worked that particular slicer before and, sure enough, he was massacring the vegetables. Still, in a single pitch he took in two hundred dollars. "Their eyes popped out of their heads," Arnold recalls. "They said, `We don't understand it. You don't even know how to work the damn machine.' I said, `But I know how to do one thing better than you.' They said, `What's that?' I said, `I know how to ask for the money.' And that's the secret to the whole damn business."

  Ron Popeil started pitching his father's kitchen gadgets at the Maxwell Street flea market in Chicago, in the mid-fifties. He was thirteen. Every morning, he would arrive at the market at five and prepare fifty pounds each of onions, cabbages, and carrots, and a hundred pounds of potatoes. He sold from six in the morning until four in the afternoon, bringing in as much as five hundred dollars a day. In his late teens, he started doing the state- and county-fair circuit, and then he scored a prime spot in the Woolworth's at State and Washington, in the Loop, which at the time was the top-grossing Woolworth's store in the country. He was making more than the manager of the store, selling the Chop-O-Matic and the Dial-O-Matic. He dined at the Pump Room and wore a Rolex and rented hundred-and-fifty-dollar-a-night hotel suites. In pictures from the period, he is beautiful, with thick dark hair and blue-green eyes and sensuous lips, and, several years later, when he moved his office to 919 Michigan Avenue, he was called the Paul Newman of the Playboy Building. Mel Korey, a friend of Ron's from college and his first business partner, remembers the time he went to see Ron pitch the Chop-O-Matic at the State Street Woolworth's. "He was mesmerizing," Korey says. "There were secretaries who would take their lunch break at Woolworth's to watch him because he was so good-looking. He would go into the turn, and people would just come running." Several years ago, Ron's friend Steve Wynn, the founder of the Mirage resorts, went to visit Michael Milken in prison. They were near a television, and happened to catch one of Ron's infomercials just as he was doing the countdown, a routine taken straight from the boardwalk, where he says, "You're not going to spend two hundred dollars, not a hundred and eighty dollars, not one-seventy, not one-sixty . . ." It's a standard pitchman's gimmick: it sounds dramatic only because the starting price is set way up high. But something about the way Ron did it was irresistible. As he got lower and lower, Wynn and Milken - who probably know as much about profit margins as anyone in America - cried out in unison, "Stop, Ron! Stop!"

  Was Ron the best? The only attempt to settle the question definitively was made some forty years ago, when Ron and Arnold were working a knife set at the Eastern States Exposition, in West Springfield, Massachusetts. A third man, Frosty Wishon, who was a legend in his own right, was there, too. "Frosty was a well-dressed, articulate individual and a good salesman," Ron says. "But he thought he was the best. So I said, `Well, guys, we've got a ten-day show, eleven, maybe twelve hours a day. We'll each do a rotation, and we'll compare how much we sell.' " In Morris-Popeil lore, this is known as "the shoot-out," and no one has ever forgotten the outcome. Ron beat Arnold, but only by a whisker - no more than a few hundred dollars. Frosty Wishon, meanwhile, sold only half as much as either of his rivals. "You have no idea the pressure Frosty was under," Ron continues. "He came up to me at the end of the show and said, `Ron, I will never work with you again as long as I live.' "

  No doubt Frosty Wishon was a charming and persuasive person, but he assumed that this was enough - that the rules of pitching were the same as the rules of celebrity endorsement. When Michael Jordan pitches McDonald's hamburgers, Michael Jordan is the star. But when Ron Popeil or Arnold Morris pitched, say, the Chop-O-Matic, his gift was to make the Chop-O-Matic the star. It was, after all, an innovation. It represented a different way of dicing onions and chopping liver: it required consumers to rethink the way they went about their business in the kitchen. Like most great innovations, it was disruptive. And how do you persuade people to disrupt their lives? Not merely by ingratiation or sincerity, and not by being famous or beautiful. You have to explain the invention to customers - not once or twice but three or four times, with a different twist each time. You have to show them exactly how it works and why it works, and make them follow your hands as you chop liver with it, and then tell them precisely how it fits into their routine, and, finally, sell them on the paradoxical fact that, revolutionary as the gadget is, it's not at all hard to use.

  Thirty years ago, the videocassette recorder came on the market, and it was a disruptive product, too: it was supposed to make it possible to tape a television show so that no one would ever again be chained to the prime-time schedule. Yet, as ubiquitous as the VCR became, it was seldom put to that purpose. That's because the VCR was never pitched: no one ever explained the gadget to American consumers - not once or twice but three or four times - and no one showed them exactly how it worked or how it would fit into their routine, and no pair of hands guided them through every step of the process. All the VCR-makers did was hand over the box with a smile and a pat on the back, tossing in an instruction manual for good measure. Any pitchman could have told you that wasn't going to do it.

  Once, when I was over at Ron's house in Coldwater Canyon, sitting on one of the high stools in his kitchen, he showed me what real pitching is all about. He was talking about how he had just had dinner with the actor Ron Silver, who is playing Ron's friend Robert Shapiro in a new movie about the O. J. Simpson trial. "They shave the back of Ron Silver's head so that he's got a bald spot, because, you know, Bob Shapiro's got a bald spot back there, too," Ron said. "So I say to him, `You've gotta get GLH.' " GLH, one of Ron's earlier products, is an aerosol spray designed to thicken the hair and cover up bald spots. "I told him, `It will make you look good. When you've got to do the scene, you shampoo it out.' "

  At this point, the average salesman would have stopped. The story was an aside, no more. We had been discussing the Showtime Rotisserie, and on the counter behind us was a Showtime cooking a chicken and next to it a Showtime cooking baby-back ribs, and on the table in front of him Ron's pasta maker was working, and he was frying some garlic so that we could have a little lunch. But now that he had told me about GLH it was unthinkable that he would not also show me its wonders. He walked quickly over to a table at the other side of the room, talking as he went. "People always ask me, `Ron, where did you get that name GLH?' I made it up. Great-Looking Hair." He picked up a can. "We make it in nine different colors. This is silver-black." He picked up a hand mirror and angled it above his head so that he could see his bald spot. "Now, the first thing I'll do is spray it where I don't need it." He shook the can and began spraying the crown of his head, talking all the while. "Then I'll go to the area itself." He pointed to his bald spot. "Right here. O.K. Now I'll let that dry. Brushing is fifty per cent of the way it's going to look." He began brushing vigorously, and suddenly Ron Popeil had what looked like a complete head of hair. "Wow," I said. Ron glowed. "And you tell me `Wow.' That's what everyone says. `Wow.' That's what people say who use it. `Wow.' If you go outside" - he grabbed me by the arm and pulled me out onto the deck - "if you are in bright sunlight or daylight, you cannot tell that I have a big bald spot in the back of my head. It really looks like hair, but it's not hair. It's quite a product. It's incredible. Any shampoo will take it out. You know who would be a great candidate for this? Al Gore. You want to see how it feels?" Ron inclined the back of his head toward me. I had said, "Wow," and had looked at his hair inside and outside, but the pitchman in Ron Popeil wasn't satisfied. I had to feel the back of his head. I did. It felt just like real hair.

  The Tinkerer

  Ron Popeil inherited more than the pitching tradition of Nathan Morris. He was very much the son of S. J. Popeil, and that fact, too, goes a long way toward explaining the success of the Showtime Rotisserie. S.J. had a ten-room apartment high in the Drake Towers, near the top of Chicago's Magnificent Mile. He had a chauffeured Cadillac limousine with a car phone, a rarity in those days, which he delighted in showing off (as in "I'm calling you from the car"). He wore three-piece suits and loved to play the piano. He smoked cigars and scowled a lot and made funny little grunting noises as he talked. He kept his money in T-bills. His philosophy was expressed in a series of epigrams: To his attorney, "If they push you far enough, sue"; to his son, "It's not how much you spend, it's how much you make." And, to a designer who expressed doubts about the utility of one of his greatest hits, the Pocket Fisherman, "It's not for using; it's for giving." In 1974, S.J.'s second wife, Eloise, decided to have him killed, so she hired two hit men - one of whom, aptly, went by the name of Mr. Peeler. At the time, she was living at the Popeil estate in Newport Beach with her two daughters and her boyfriend, a thirty-seven-year-old machinist. When, at Eloise's trial, S.J. was questioned about the machinist, he replied, "I was kind of happy to have him take her off my hands." That was vintage S.J. But eleven months later, after Eloise got out of prison, S.J. married her again. That was vintage S.J., too. As a former colleague of his puts it, "He was a strange bird."

  S. J. Popeil was a tinkerer. In the middle of the night, he would wake up and make frantic sketches on a pad he kept on his bedside table. He would disappear into his kitchen for hours and make a huge mess, and come out with a faraway look on his face. He loved standing behind his machinists, peering over their shoulders while they were assembling one of his prototypes. In the late forties and early fifties, he worked almost exclusively in plastic, reinterpreting kitchen basics with a subtle, modernist flair. "Popeil Brothers made these beautiful plastic flour sifters," Tim Samuelson, a curator at the Chicago Historical Society and a leading authority on the Popeil legacy, says. "They would use contrasting colors, or a combination of opaque plastic with a translucent swirl plastic." Samuelson became fascinated with all things Popeil after he acquired an original Popeil Brothers doughnut maker, in red-and-white plastic, which he felt "had beautiful lines"; to this day, in the kitchen of his Hyde Park high-rise, he uses the Chop-O-Matic in the preparation of salad ingredients. "There was always a little twist to what he did," Samuelson goes on. "Take the Popeil automatic egg turner. It looks like a regular spatula, but if you squeeze the handle the blade turns just enough to flip a fried egg."

  Walter Herbst, a designer whose firm worked with Popeil Brothers for many years, says that S.J.'s modus operandi was to "come up with a holistic theme. He'd arrive in the morning with it. It would be something like" - Herbst assumes S.J.'s gruff voice - " 'We need a better way to shred cabbage.' It was a passion, an absolute goddam passion. One morning, he must have been eating grapefruit, because he comes to work and calls me and says, 'We need a better way to cut grapefruit!' " The idea they came up with was a double-bladed paring knife, with the blades separated by a fraction of an inch so that both sides of the grapefruit membrane could be cut simultaneously. "There was a little grocery store a few blocks away," Herbst says. "So S.J. sends the chauffeur out for grapefruit. How many? Six. Well, over the period of a couple of weeks, six turns to twelve and twelve turns to twenty, until we were cutting thirty to forty grapefruits a day. I don't know if that little grocery store ever knew what happened."

  S. J. Popeil's finest invention was undoubtedly the Veg-O-Matic, which came on the market in 1960 and was essentially a food processor, a Cuisinart without the motor. The heart of the gadget was a series of slender, sharp blades strung like guitar strings across two Teflon-coated metal rings, which were made in Woodstock, Illinois, from 364 Alcoa, a special grade of aluminum. When the rings were aligned on top of each other so that the blades ran parallel, a potato or an onion pushed through would come out in perfect slices. If the top ring was rotated, the blades formed a crosshatch, and a potato or an onion pushed through would come out diced. The rings were housed in a handsome plastic assembly, with a plunger to push the vegetables through the blades. Technically, the Veg-O-Matic was a triumph: the method of creating blades strong enough to withstand the assault of vegetables received a U.S. patent. But from a marketing perspective it posed a problem. S.J.'s products had hitherto been sold by pitchmen armed with a mound of vegetables meant to carry them through a day's worth of demonstrations. But the Veg-O-Matic was too good. In a single minute, according to the calculations of Popeil Brothers, it could produce a hundred and twenty egg wedges, three hundred cucumber slices, eleven hundred and fifty potato shoestrings, or three thousand onion dices. It could go through what used to be a day's worth of vegetables in a matter of minutes. The pitchman could no longer afford to pitch to just a hundred people at a time; he had to pitch to a hundred thousand. The Veg-O-Matic needed to be sold on television, and one of the very first pitchmen to grasp this fact was Ron Popeil.

  In the summer of 1964, just after the Veg-O-Matic was introduced, Mel Korey joined forces with Ron Popeil in a company called Ronco. They shot a commercial for the Veg-O-Matic for five hundred dollars, a straightforward pitch shrunk to two minutes, and set out from Chicago for the surrounding towns of the Midwest. They cold-called local department stores and persuaded them to carry the Veg-O-Matic on guaranteed sale, which meant that whatever the stores didn't sell could be returned. Then they visited the local television station and bought a two- or three-week run of the cheapest airtime they could find, praying that it would be enough to drive traffic to the store. "We got Veg-O-Matics wholesale for $3.42," Korey says. "They retailed for $9.95, and we sold them to the stores for $7.46, which meant that we had four dollars to play with. If I spent a hundred dollars on television, I had to sell twenty-five Veg-O-Matics to break even." It was clear, in those days, that you could use television to sell kitchen products if you were Procter & Gamble. It wasn't so clear that this would work if you were Mel Korey and Ron Popeil, two pitchmen barely out of their teens selling a combination slicer-dicer that no one had ever heard of. They were taking a wild gamble, and, to their amazement, it paid off. "They had a store in Butte, Montana - Hennessy's," Korey goes on, thinking back to those first improbable years. "Back then, people there were still wearing peacoats. The city was mostly bars. It had just a few three-story buildings. There were twenty-seven thousand people, and one TV station. I had the Veg-O-Matic, and I go to the store, and they said, 'We'll take a case. We don't have a lot of traffic here.' I go to the TV station and the place is a dump. The only salesperson was going blind and deaf. So I do a schedule. For five weeks, I spend three hundred and fifty dollars. I figure if I sell a hundred and seventy-four machines - six cases - I'm happy. I go back to Chicago, and I walk into the office one morning and the phone is ringing. They said, 'We sold out. You've got to fly us another six cases of Veg-O-Matics.' The next week, on Monday, the phone rings. It's Butte again: 'We've got a hundred and fifty oversold.' I fly him another six cases. Every few days after that, whenever the phone rang we'd look at each other and say, 'Butte, Montana.' " Even today, thirty years later, Korey can scarcely believe it. "How many homes in total in that town? Maybe several thousand? We ended up selling two thousand five hundred Veg-O-Matics in five weeks!"

  Why did the Veg-O-Matic sell so well? Doubtless, Americans were eager for a better way of slicing vegetables. But it was more than that: the Veg-O-Matic represented a perfect marriage between the medium (television) and the message (the gadget). The Veg-O-Matic was, in the relevant sense, utterly transparent. You took the potato and you pushed it through the Teflon-coated rings and - voilà! - you had French fries. There were no buttons being pressed, no hidden and intimidating gears: you could show-and-tell the Veg-O-Matic in a two-minute spot and allay everyone's fears about a daunting new technology. More specifically, you could train the camera on the machine and compel viewers to pay total attention to the product you were selling. TV allowed you to do even more effectively what the best pitchmen strove to do in live demonstrations - make the product the star.

  This was a lesson Ron Popeil never forgot. In his infomercial for the Showtime Rotisserie, he opens not with himself but with a series of shots of meat and poultry, glistening almost obscenely as they rotate in the Showtime. A voice-over describes each shot: a "delicious six-pound chicken," a "succulent whole duckling," a "mouthwatering pork-loin roast . . ." Only then do we meet Ron, in a sports coat and jeans. He explains the problems of conventional barbecues, how messy and unpleasant they are. He bangs a hammer against the door of the Showtime, to demonstrate its strength. He deftly trusses a chicken, impales it on the patented two-pronged Showtime spit rod, and puts it into the oven. Then he repeats the process with a pair of chickens, salmon steaks garnished with lemon and dill, and a rib roast. All the time, the camera is on his hands, which are in constant motion, manipulating the Showtime apparatus gracefully, with his calming voice leading viewers through every step: "All I'm going to do here is slide it through like this. It goes in very easily. I'll match it up over here. What I'd like to do is take some herbs and spices here. All I'll do is slide it back. Raise up my glass door here. I'll turn it to a little over an hour. . . . Just set it and forget it."

  Why does this work so well? Because the Showtime - like the Veg-O-Matic before it - was designed to be the star. From the very beginning, Ron insisted that the entire door be a clear pane of glass, and that it slant back to let in the maximum amount of light, so that the chicken or the turkey or the baby-back ribs turning inside would be visible at all times. Alan Backus says that after the first version of the Showtime came out Ron began obsessing over the quality and evenness of the browning and became convinced that the rotation speed of the spit wasn't quite right. The original machine moved at four revolutions per minute. Ron set up a comparison test in his kitchen, cooking chicken after chicken at varying speeds until he determined that the optimal speed of rotation was actually six r.p.m. One can imagine a bright-eyed M.B.A. clutching a sheaf of focus-group reports and arguing that Ronco was really selling convenience and healthful living, and that it was foolish to spend hundreds of thousands of dollars retooling production in search of a more even golden brown. But Ron understood that the perfect brown is important for the same reason that the slanted glass door is important: because in every respect the design of the product must support the transparency and effectiveness of its performance during a demonstration - the better it looks onstage, the easier it is for the pitchman to go into the turn and ask for the money.

  If Ron had been the one to introduce the VCR, in other words, he would not simply have sold it in an infomercial. He would also have changed the VCR itself, so that it made sense in an infomercial. The clock, for example, wouldn't be digital. (The haplessly blinking unset clock has, of course, become a symbol of frustration.) The tape wouldn't be inserted behind a hidden door - it would be out in plain view, just like the chicken in the rotisserie, so that if it was recording you could see the spools turn. The controls wouldn't be discreet buttons; they would be large, and they would make a reassuring click as they were pushed up and down, and each step of the taping process would be identified with a big, obvious numeral so that you could set it and forget it. And would it be a slender black, low-profile box? Of course not. Ours is a culture in which the term "black box" is synonymous with incomprehensibility. Ron's VCR would be in red-and-white plastic, both opaque and translucent swirl, or maybe 364 Alcoa aluminum, painted in some bold primary color, and it would sit on top of the television, not below it, so that when your neighbor or your friend came over he would spot it immediately and say, "Wow, you have one of those Ronco Tape-O-Matics!"

  A Real Piece of Work

  Ron Popeil did not have a happy childhood. "I remember baking a potato. It must have been when I was four or five years old," he told me. We were in his kitchen, and had just sampled some baby-back ribs from the Showtime. It had taken some time to draw the memories out of him, because he is not one to dwell on the past. "I couldn't get that baked potato into my stomach fast enough, because I was so hungry." Ron is normally in constant motion, moving his hands, chopping food, bustling back and forth. But now he was still. His parents split up when he was very young. S.J. went off to Chicago. His mother disappeared. He and his older brother, Jerry, were shipped off to a boarding school in upstate New York. "I remember seeing my mother on one occasion. I don't remember seeing my father, ever, until I moved to Chicago, at thirteen. When I was in the boarding school, the thing I remember was a Sunday when the parents visited the children, and my parents never came. Even knowing that they weren't going to show up, I walked out to the perimeter and looked out over the farmland, and there was this road." He made an undulating motion with his hand to suggest a road stretching off into the distance. "I remember standing on the road crying, looking for the movement of a car miles away, hoping that it was my mother and father. And they never came. That's all I remember about boarding school." Ron remained perfectly still. "I don't remember ever having a birthday party in my life. I remember that my grandparents took us out and we moved to Florida. My grandfather used to tie me down in bed - my hands, my wrists, and my feet. Why? Because I had a habit of turning over on my stomach and bumping my head either up and down or side to side. Why? How? I don't know the answers. But I was spread-eagle, on my back, and if I was able to twist over and do it my grandfather would wake up at night and come in and beat the hell out of me." Ron stopped, and then added, "I never liked him. I never knew my mother or her parents or any of that family. That's it. Not an awful lot to remember. Obviously, other things took place. But they have been erased."

  When Ron came to Chicago, at thirteen, with his grandparents, he was put to work in the Popeil Brothers factory - but only on the weekends, when his father wasn't there. "Canned salmon and white bread for lunch, that was the diet," he recalls. "Did I live with my father? Never. I lived with my grandparents." When he became a pitchman, his father gave him just one advantage: he extended his son credit. Mel Korey says that he once drove Ron home from college and dropped him off at his father's apartment. "He had a key to the apartment, and when he walked in his dad was in bed already. His dad said, 'Is that you, Ron?' And Ron said, 'Yeah.' And his dad never came out. And by the next morning Ron still hadn't seen him." Later, when Ron went into business for himself, he was persona non grata around Popeil Brothers. "Ronnie was never allowed in the place after that," one of S.J.'s former associates recalls. "He was never let in the front door. He was never allowed to be part of anything." "My father," Ron says simply, "was all business. I didn't know him personally."

  Here is a man who constructed his life in the image of his father - who went into the same business, who applied the same relentless attention to the workings of the kitchen, who got his start by selling his father's own products - and where was his father? "You know, they could have done wonders together," Korey says, shaking his head. "I remember one time we talked with K-tel about joining forces, and they said that we would be a war machine - that was their word. Well, Ron and his dad, they could have been a war machine." For all that, it is hard to find in Ron even a trace of bitterness. Once, I asked him, "Who are your inspirations?" The first name came easily: his good friend Steve Wynn. He was silent for a moment, and then he added, "My father." Despite everything, Ron clearly found in his father's example a tradition of irresistible value. And what did Ron do with that tradition? He transcended it. He created the Showtime, which is indisputably a better gadget, dollar for dollar, than the Morris Metric Slicer, the Dutch Kitchen Shredder Grater, the Chop-O-Matic, and the Veg-O-Matic combined.

  When I was in Ocean Township, visiting Arnold Morris, he took me to the local Jewish cemetery, Chesed Shel Ames, on a small hilltop just outside town. We drove slowly through the town's poorer sections in Arnold's white Mercedes. It was a rainy day. At the cemetery, a man stood out front in an undershirt, drinking a beer. We entered through a little rusty gate. "This is where it all starts," Arnold said, by which he meant that everyone - the whole spirited, squabbling clan - was buried here. We walked up and down the rows until we found, off in a corner, the Morris headstones. There was Nathan Morris, of the straw boater and the opportune heart attack, and next to him his wife, Betty. A few rows over was the family patriarch, Kidders Morris, and his wife, and a few rows from there Irving Rosenbloom, who made a fortune in plastic goods out on Long Island. Then all the Popeils, in tidy rows: Ron's grandfather Isadore, who was as mean as a snake, and his wife, Mary; S.J., who turned a cold shoulder to his own son; Ron's brother, Jerry, who died young. Ron was from them, but he was not of them. Arnold walked slowly among the tombstones, the rain dancing off his baseball cap, and then he said something that seemed perfectly right. "You know, I'll bet you you'll never find Ronnie here."

  On the Air

  One Saturday night a few weeks ago, Ron Popeil arrived at the headquarters of the television shopping network QVC, a vast gleaming complex nestled in the woods of suburban Philadelphia. Ron is a regular on QVC. He supplements his infomercials with occasional appearances on the network, and, for twenty-four hours beginning that midnight, QVC had granted him eight live slots, starting with a special "Ronco" hour between midnight and 1 a.m. Ron was travelling with his daughter Shannon, who had got her start in the business selling the Ronco Electric Food Dehydrator on the fair circuit, and the plan was that the two of them would alternate throughout the day. They were pitching a Digital Jog Dial version of the Showtime, in black, available for one day only, at a "special value" of $129.72.

  In the studio, Ron had set up eighteen Digital Jog Dial Showtimes on five wood-panelled gurneys. From Los Angeles, he had sent, via Federal Express, dozens of Styrofoam containers with enough meat for each of the day's airings: eight fifteen-pound turkeys, seventy-two hamburgers, eight legs of lamb, eight ducks, thirty-odd chickens, two dozen or so Rock Cornish game hens, and on and on, supplementing them with garnishes, trout, and some sausage bought that morning at three Philadelphia-area supermarkets. QVC's target was thirty-seven thousand machines, meaning that it hoped to gross about $4.5 million during the twenty-four hours - a huge day, even by the network's standards. Ron seemed tense. He barked at the team of QVC producers and cameramen bustling around the room. He fussed over the hero plates - the ready-made dinners that he would use to showcase meat taken straight from the oven. "Guys, this is impossible," he said, peering at a tray of mashed potatoes and gravy. "The level of gravy must be higher." He was limping a little. "You know, there's a lot of pressure on you," he said wearily. " 'How did Ron do? Is he still the best?' "

  With just a few minutes to go, Ron ducked into the greenroom next to the studio to put GLH in his hair: a few aerosol bursts, followed by vigorous brushing. "Where is God right now?" his co-host, Rick Domeier, yelled out, looking around theatrically for his guest star. "Is God backstage?" Ron then appeared, resplendent in a chef's coat, and the cameras began to roll. He sliced open a leg of lamb. He played with the dial of the new digital Showtime. He admired the crispy, succulent skin of the duck. He discussed the virtues of the new food-warming feature - where the machine would rotate at low heat for up to four hours after the meat was cooked in order to keep the juices moving - and, all the while, bantered so convincingly with viewers calling in on the testimonial line that it was as if he were back mesmerizing the secretaries in the Woolworth's at State and Washington.

  In the greenroom, there were two computer monitors. The first displayed a line graph charting the number of calls that came in at any given second. The second was an electronic ledger showing the total sales up to that point. As Ron took flight, one by one, people left the studio to gather around the computers. Shannon Popeil came first. It was 12:40 a.m. In the studio, Ron was slicing onions with one of his father's Dial-O-Matics. She looked at the second monitor and gave a little gasp. Forty minutes in, and Ron had already passed seven hundred thousand dollars. A QVC manager walked in. It was 12:48 a.m., and Ron was roaring on: $837,650. "It can't be!" he cried out. "That's unbelievable!" Two QVC producers came over. One of them pointed at the first monitor, which was graphing the call volume. "Jump," he called out. "Jump!" There were only a few minutes left. Ron was extolling the virtues of the oven one final time, and, sure enough, the line began to take a sharp turn upward, as all over America viewers took out their wallets. The numbers on the second screen began to change in a blur of recalculation - rising in increments of $129.72 plus shipping and taxes. "You know, we're going to hit a million dollars, just on the first hour," one of the QVC guys said, and there was awe in his voice. It was one thing to talk about how Ron was the best there ever was, after all, but quite another to see proof of it, before your very eyes. At that moment, on the other side of the room, the door opened, and a man appeared, stooped and drawn but with a smile on his face. It was Ron Popeil, who invented a better rotisserie in his kitchen and went out and pitched it himself. There was a hush, and then the whole room stood up and cheered.

  Why some people choke and others panic.

  There was a moment, in the third and deciding set of the 1993 Wimbledon final, when Jana Novotna seemed invincible. She was leading 4-1 and serving at 40-30, meaning that she was one point from winning the game, and just five points from the most coveted championship in tennis. She had just hit a backhand to her opponent, Steffi Graf, that skimmed the net and landed so abruptly on the far side of the court that Graf could only watch, in flat-footed frustration. The stands at Centre Court were packed. The Duke and Duchess of Kent were in their customary place in the royal box. Novotna was in white, poised and confident, her blond hair held back with a headband - and then something happened. She served the ball straight into the net. She stopped and steadied herself for the second serve - the toss, the arch of the back - but this time it was worse. Her swing seemed halfhearted, all arm and no legs and torso. Double fault. On the next point, she was slow to react to a high shot by Graf, and badly missed on a forehand volley. At game point, she hit an overhead straight into the net. Instead of 5-1, it was now 4-2. Graf to serve: an easy victory, 4-3. Novotna to serve. She wasn't tossing the ball high enough. Her head was down. Her movements had slowed markedly. She double-faulted once, twice, three times. Pulled wide by a Graf forehand, Novotna inexplicably hit a low, flat shot directly at Graf, instead of a high crosscourt forehand that would have given her time to get back into position: 4-4. Did she suddenly realize how terrifyingly close she was to victory? Did she remember that she had never won a major tournament before? Did she look across the net and see Steffi Graf - Steffi Graf! - the greatest player of her generation?

  On the baseline, awaiting Graf's serve, Novotna was now visibly agitated, rocking back and forth, jumping up and down. She talked to herself under her breath. Her eyes darted around the court. Graf took the game at love; Novotna, moving as if in slow motion, did not win a single point: 5-4, Graf. On the sidelines, Novotna wiped her racquet and her face with a towel, and then each finger individually. It was her turn to serve. She missed a routine volley wide, shook her head, talked to herself. She missed her first serve, made the second, then, in the resulting rally, mis-hit a backhand so badly that it sailed off her racquet as if launched into flight. Novotna was unrecognizable, not an élite tennis player but a beginner again. She was crumbling under pressure, but exactly why was as baffling to her as it was to all those looking on. Isn't pressure supposed to bring out the best in us? We try harder. We concentrate harder. We get a boost of adrenaline. We care more about how well we perform. So what was happening to her?

  At championship point, Novotna hit a low, cautious, and shallow lob to Graf. Graf answered with an unreturnable overhead smash, and, mercifully, it was over. Stunned, Novotna moved to the net. Graf kissed her twice. At the awards ceremony, the Duchess of Kent handed Novotna the runner-up's trophy, a small silver plate, and whispered something in her ear, and what Novotna had done finally caught up with her. There she was, sweaty and exhausted, looming over the delicate white-haired Duchess in her pearl necklace. The Duchess reached up and pulled her head down onto her shoulder, and Novotna started to sob.

  Human beings sometimes falter under pressure. Pilots crash and divers drown. Under the glare of competition, basketball players cannot find the basket and golfers cannot find the pin. When that happens, we say variously that people have "panicked" or, to use the sports colloquialism, "choked." But what do those words mean? Both are pejoratives. To choke or panic is considered to be as bad as to quit. But are all forms of failure equal? And what do the forms in which we fail say about who we are and how we think?

  We live in an age obsessed with success, with documenting the myriad ways by which talented people overcome challenges and obstacles. There is as much to be learned, though, from documenting the myriad ways in which talented people sometimes fail.

  "Choking" sounds like a vague and all-encompassing term, yet it describes a very specific kind of failure. For example, psychologists often use a primitive video game to test motor skills. They'll sit you in front of a computer with a screen that shows four boxes in a row, and a keyboard that has four corresponding buttons in a row. One at a time, x's start to appear in the boxes on the screen, and you are told that every time this happens you are to push the key corresponding to the box. According to Daniel Willingham, a psychologist at the University of Virginia, if you're told ahead of time about the pattern in which those x's will appear, your reaction time in hitting the right key will improve dramatically. You'll play the game very carefully for a few rounds, until you've learned the sequence, and then you'll get faster and faster. Willingham calls this "explicit learning." But suppose you're not told that the x's appear in a regular sequence, and even after playing the game for a while you're not aware that there is a pattern. You'll still get faster: you'll learn the sequence unconsciously. Willingham calls that "implicit learning" - learning that takes place outside of awareness. These two learning systems are quite separate, based in different parts of the brain. Willingham says that when you are first taught something - say, how to hit a backhand or an overhead forehand - you think it through in a very deliberate, mechanical manner. But as you get better the implicit system takes over: you start to hit a backhand fluidly, without thinking. The basal ganglia, where implicit learning partially resides, are concerned with force and timing, and when that system kicks in you begin to develop touch and accuracy, the ability to hit a drop shot or place a serve at a hundred miles per hour. "This is something that is going to happen gradually," Willingham says. "You hit several thousand forehands, after a while you may still be attending to it. But not very much. In the end, you don't really notice what your hand is doing at all."

  Under conditions of stress, however, the explicit system sometimes takes over. That's what it means to choke. When Jana Novotna faltered at Wimbledon, it was because she began thinking about her shots again. She lost her fluidity, her touch. She double-faulted on her serves and mis-hit her overheads, the shots that demand the greatest sensitivity in force and timing. She seemed like a different person - playing with the slow, cautious deliberation of a beginner - because, in a sense, she was a beginner again: she was relying on a learning system that she hadn't used to hit serves and overhead forehands and volleys since she was first taught tennis, as a child. The same thing has happened to Chuck Knoblauch, the New York Yankees' second baseman, who inexplicably has had trouble throwing the ball to first base. Under the stress of playing in front of forty thousand fans at Yankee Stadium, Knoblauch finds himself reverting to explicit mode, throwing like a Little Leaguer again.

  Panic is something else altogether. Consider the following account of a scuba-diving accident, recounted to me by Ephimia Morphew, a human-factors specialist at NASA: "It was an open-water certification dive, Monterey Bay, California, about ten years ago. I was nineteen. I'd been diving for two weeks. This was my first time in the open ocean without the instructor. Just my buddy and I. We had to go about forty feet down, to the bottom of the ocean, and do an exercise where we took our regulators out of our mouth, picked up a spare one that we had on our vest, and practiced breathing out of the spare. My buddy did hers. Then it was my turn. I removed my regulator. I lifted up my secondary regulator. I put it in my mouth, exhaled, to clear the lines, and then I inhaled, and, to my surprise, it was water. I inhaled water. Then the hose that connected that mouthpiece to my tank, my air source, came unlatched and air from the hose came exploding into my face.

  "Right away, my hand reached out for my partner's air supply, as if I was going to rip it out. It was without thought. It was a physiological response. My eyes are seeing my hand do something irresponsible. I'm fighting with myself. Don't do it. Then I searched my mind for what I could do. And nothing came to mind. All I could remember was one thing: If you can't take care of yourself, let your buddy take care of you. I let my hand fall back to my side, and I just stood there."

  This is a textbook example of panic. In that moment, Morphew stopped thinking. She forgot that she had another source of air, one that worked perfectly well and that, moments before, she had taken out of her mouth. She forgot that her partner had a working air supply as well, which could easily be shared, and she forgot that grabbing her partner's regulator would imperil both of them. All she had was her most basic instinct: get air. Stress wipes out short-term memory. People with lots of experience tend not to panic, because when the stress suppresses their short-term memory they still have some residue of experience to draw on. But what did a novice like Morphew have? I searched my mind for what I could do. And nothing came to mind.

  Panic also causes what psychologists call perceptual narrowing. In one study, from the early seventies, a group of subjects were asked to perform a visual-acuity task while undergoing what they thought was a sixty-foot dive in a pressure chamber. At the same time, they were asked to push a button whenever they saw a small light flash on and off in their peripheral vision. The subjects in the pressure chamber had much higher heart rates than the control group, indicating that they were under stress. That stress didn't affect their accuracy at the visual-acuity task, but they were only half as good as the control group at picking up the peripheral light. "You tend to focus or obsess on one thing," Morphew says. "There's a famous airplane example, where the landing light went off, and the pilots had no way of knowing if the landing gear was down. The pilots were so focussed on that light that no one noticed the autopilot had been disengaged, and they crashed the plane." Morphew reached for her buddy's air supply because it was the only air supply she could see.

  Panic, in this sense, is the opposite of choking. Choking is about thinking too much. Panic is about thinking too little. Choking is about loss of instinct. Panic is reversion to instinct. They may look the same, but they are worlds apart.

  Why does this distinction matter? In some instances, it doesn't much. If you lose a close tennis match, it's of little moment whether you choked or panicked; either way, you lost. But there are clearly cases when how failure happens is central to understanding why failure happens.

  Take the plane crash in which John F. Kennedy, Jr., was killed last summer. The details of the flight are well known. On a Friday evening last July, Kennedy took off with his wife and sister-in-law for Martha's Vineyard. The night was hazy, and Kennedy flew along the Connecticut coastline, using the trail of lights below him as a guide. At Westerly, Rhode Island, he left the shoreline, heading straight out over Rhode Island Sound, and at that point, apparently disoriented by the darkness and haze, he began a series of curious maneuvers: He banked his plane to the right, farther out into the ocean, and then to the left. He climbed and descended. He sped up and slowed down. Just a few miles from his destination, Kennedy lost control of the plane, and it crashed into the ocean.

  Kennedy's mistake, in technical terms, was that he failed to keep his wings level. That was critical, because when a plane banks to one side it begins to turn and its wings lose some of their vertical lift. Left unchecked, this process accelerates. The angle of the bank increases, the turn gets sharper and sharper, and the plane starts to dive toward the ground in an ever-narrowing corkscrew. Pilots call this the graveyard spiral. And why didn't Kennedy stop the dive? Because, in times of low visibility and high stress, keeping your wings level - indeed, even knowing whether you are in a graveyard spiral - turns out to be surprisingly difficult. Kennedy failed under pressure.

  Had Kennedy been flying during the day or with a clear moon, he would have been fine. If you are the pilot, looking straight ahead from the cockpit, the angle of your wings will be obvious from the straight line of the horizon in front of you. But when it's dark outside the horizon disappears. There is no external measure of the plane's bank. On the ground, we know whether we are level even when it's dark, because of the motion-sensing mechanisms in the inner ear. In a spiral dive, though, the effect of the plane's G-force on the inner ear means that the pilot feels perfectly level even if his plane is not. Similarly, when you are in a jetliner that is banking at thirty degrees after takeoff, the book on your neighbor's lap does not slide into your lap, nor will a pen on the floor roll toward the "down" side of the plane. The physics of flying is such that an airplane in the midst of a turn always feels perfectly level to someone inside the cabin.

  This is a difficult notion, and to understand it I went flying with William Langewiesche, the author of a superb book on flying, "Inside the Sky." We met at San Jose Airport, in the jet center where the Silicon Valley billionaires keep their private planes. Langewiesche is a rugged man in his forties, deeply tanned, and handsome in the way that pilots (at least since the movie "The Right Stuff") are supposed to be. We took off at dusk, heading out toward Monterey Bay, until we had left the lights of the coast behind and night had erased the horizon. Langewiesche let the plane bank gently to the left. He took his hands off the stick. The sky told me nothing now, so I concentrated on the instruments. The nose of the plane was dropping. The gyroscope told me that we were banking, first fifteen, then thirty, then forty-five degrees. "We're in a spiral dive," Langewiesche said calmly. Our airspeed was steadily accelerating, from a hundred and eighty to a hundred and ninety to two hundred knots. The needle on the altimeter was moving down. The plane was dropping like a stone, at three thousand feet per minute. I could hear, faintly, a slight increase in the hum of the engine, and the wind noise as we picked up speed. But if Langewiesche and I had been talking I would have caught none of that. Had the cabin been unpressurized, my ears might have popped, particularly as we went into the steep part of the dive. But beyond that? Nothing at all. In a spiral dive, the G-load - the force of inertia - is normal. As Langewiesche puts it, the plane likes to spiral-dive. The total time elapsed since we started diving was no more than six or seven seconds. Suddenly, Langewiesche straightened the wings and pulled back on the stick to get the nose of the plane up, breaking out of the dive. Only now did I feel the full force of the G-load, pushing me back in my seat. "You feel no G-load in a bank," Langewiesche said. "There's nothing more confusing for the uninitiated."

  I asked Langewiesche how much longer we could have fallen. "Within five seconds, we would have exceeded the limits of the airplane," he replied, by which he meant that the force of trying to pull out of the dive would have broken the plane into pieces. I looked away from the instruments and asked Langewiesche to spiral-dive again, this time without telling me. I sat and waited. I was about to tell Langewiesche that he could start diving anytime, when, suddenly, I was thrown back in my chair. "We just lost a thousand feet," he said.

  This inability to sense, experientially, what your plane is doing is what makes night flying so stressful. And this was the stress that Kennedy must have felt when he turned out across the water at Westerly, leaving the guiding lights of the Connecticut coastline behind him. A pilot who flew into Nantucket that night told the National Transportation Safety Board that when he descended over Martha's Vineyard he looked down and there was "nothing to see. There was no horizon and no light. . . . I thought the island might [have] suffered a power failure." Kennedy was now blind, in every sense, and he must have known the danger he was in. He had very little experience in flying strictly by instruments. Most of the time when he had flown up to the Vineyard the horizon or lights had still been visible. That strange, final sequence of maneuvers was Kennedy's frantic search for a clearing in the haze. He was trying to pick up the lights of Martha's Vineyard, to restore the lost horizon. Between the lines of the National Transportation Safety Board's report on the crash, you can almost feel his desperation:

About 2138 the target began a right turn in a southerly direction. About 30 seconds later, the target stopped its descent at 2200 feet and began a climb that lasted another 30 seconds. During this period of time, the target stopped the turn, and the airspeed decreased to about 153 KIAS. About 2139, the target leveled off at 2500 feet and flew in a southeasterly direction. About 50 seconds later, the target entered a left turn and climbed to 2600 feet. As the target continued in the left turn, it began a descent that reached a rate of about 900 fpm.

  But was he choking or panicking? Here the distinction between those two states is critical. Had he choked, he would have reverted to the mode of explicit learning. His movements in the cockpit would have become markedly slower and less fluid. He would have gone back to the mechanical, self-conscious application of the lessons he had first received as a pilot - and that might have been a good thing. Kennedy needed to think, to concentrate on his instruments, to break away from the instinctive flying that served him when he had a visible horizon.

  But instead, from all appearances, he panicked. At the moment when he needed to remember the lessons he had been taught about instrument flying, his mind - like Morphew's when she was underwater - must have gone blank. Instead of reviewing the instruments, he seems to have been focussed on one question: Where are the lights of Martha's Vineyard? His gyroscope and his other instruments may well have become as invisible as the peripheral lights in the underwater-panic experiments. He had fallen back on his instincts - on the way the plane felt - and in the dark, of course, instinct can tell you nothing. The N.T.S.B. report says that the last time the Piper's wings were level was seven seconds past 9:40, and the plane hit the water at about 9:41, so the critical period here was less than sixty seconds. At twenty-five seconds past the minute, the plane was tilted at an angle greater than forty-five degrees. Inside the cockpit it would have felt normal. At some point, Kennedy must have heard the rising wind outside, or the roar of the engine as it picked up speed. Again, relying on instinct, he might have pulled back on the stick, trying to raise the nose of the plane. But pulling back on the stick without first levelling the wings only makes the spiral tighter and the problem worse. It's also possible that Kennedy did nothing at all, and that he was frozen at the controls, still frantically searching for the lights of the Vineyard, when his plane hit the water. Sometimes pilots don't even try to make it out of a spiral dive. Langewiesche calls that "one G all the way down."

  What happened to Kennedy that night illustrates a second major difference between panicking and choking. Panicking is conventional failure, of the sort we tacitly understand. Kennedy panicked because he didn't know enough about instrument flying. If he'd had another year in the air, he might not have panicked, and that fits with what we believe - that performance ought to improve with experience, and that pressure is an obstacle that the diligent can overcome. But choking makes little intuitive sense. Novotna's problem wasn't lack of diligence; she was as superbly conditioned and schooled as anyone on the tennis tour. And what did experience do for her? In 1995, in the third round of the French Open, Novotna choked even more spectacularly than she had against Graf, losing to Chanda Rubin after surrendering a 5-0 lead in the third set. There seems little doubt that part of the reason for her collapse against Rubin was her collapse against Graf - that the second failure built on the first, making it possible for her to be up 5-0 in the third set and yet entertain the thought I can still lose. If panicking is conventional failure, choking is paradoxical failure.

  Claude Steele, a psychologist at Stanford University, and his colleagues have done a number of experiments in recent years looking at how certain groups perform under pressure, and their findings go to the heart of what is so strange about choking. Steele and Joshua Aronson found that when they gave a group of Stanford undergraduates a standardized test and told them that it was a measure of their intellectual ability, the white students did much better than their black counterparts. But when the same test was presented simply as an abstract laboratory tool, with no relevance to ability, the scores of blacks and whites were virtually identical. Steele and Aronson attribute this disparity to what they call "stereotype threat": when black students are put into a situation where they are directly confronted with a stereotype about their group - in this case, one having to do with intelligence - the resulting pressure causes their performance to suffer.

  Steele and others have found stereotype threat at work in any situation where groups are depicted in negative ways. Give a group of qualified women a math test and tell them it will measure their quantitative ability and they'll do much worse than equally skilled men will; present the same test simply as a research tool and they'll do just as well as the men. Or consider a handful of experiments conducted by one of Steele's former graduate students, Julio Garcia, a professor at Tufts University. Garcia gathered together a group of white, athletic students and had a white instructor lead them through a series of physical tests: to jump as high as they could, to do a standing broad jump, and to see how many pushups they could do in twenty seconds. The instructor then asked them to do the tests a second time, and, as you'd expect, Garcia found that the students did a little better on each of the tasks the second time around. Then Garcia ran a second group of students through the tests, this time replacing the instructor between the first and second trials with an African-American. Now the white students ceased to improve on their vertical leaps. He did the experiment again, only this time he replaced the white instructor with a black instructor who was much taller and heavier than the previous black instructor. In this trial, the white students actually jumped less high than they had the first time around. Their performance on the pushups, though, was unchanged in each of the conditions. There is no stereotype, after all, that suggests that whites can't do as many pushups as blacks. The task that was affected was the vertical leap, because of what our culture says: white men can't jump.

  It doesn't come as news, of course, that black students aren't as good at test-taking as white students, or that white students aren't as good at jumping as black students. The problem is that we've always assumed that this kind of failure under pressure is panic. What is it we tell underperforming athletes and students? The same thing we tell novice pilots or scuba divers: to work harder, to buckle down, to take the tests of their ability more seriously. But Steele says that when you look at the way black or female students perform under stereotype threat you don't see the wild guessing of a panicked test taker. "What you tend to see is carefulness and second-guessing," he explains. "When you go and interview them, you have the sense that when they are in the stereotype-threat condition they say to themselves, 'Look, I'm going to be careful here. I'm not going to mess things up.' Then, after having decided to take that strategy, they calm down and go through the test. But that's not the way to succeed on a standardized test. The more you do that, the more you will get away from the intuitions that help you, the quick processing. They think they did well, and they are trying to do well. But they are not." This is choking, not panicking. Garcia's athletes and Steele's students are like Novotna, not Kennedy. They failed because they were good at what they did: only those who care about how well they perform ever feel the pressure of stereotype threat. The usual prescription for failure - to work harder and take the test more seriously - would only make their problems worse.

  That is a hard lesson to grasp, but harder still is the fact that choking requires us to concern ourselves less with the performer and more with the situation in which the performance occurs. Novotna herself could do nothing to prevent her collapse against Graf. The only thing that could have saved her is if - at that critical moment in the third set - the television cameras had been turned off, the Duke and Duchess had gone home, and the spectators had been told to wait outside. In sports, of course, you can't do that. Choking is a central part of the drama of athletic competition, because the spectators have to be there - and the ability to overcome the pressure of the spectators is part of what it means to be a champion. But the same ruthless inflexibility need not govern the rest of our lives. We have to learn that sometimes a poor performance reflects not the innate ability of the performer but the complexion of the audience; and that sometimes a poor test score is the sign not of a poor student but of a good one.

  Through the first three rounds of the 1996 Masters golf tournament, Greg Norman held a seemingly insurmountable lead over his nearest rival, the Englishman Nick Faldo. He was the best player in the world. His nickname was the Shark. He didn't saunter down the fairways; he stalked the course, blond and broad-shouldered, his caddy behind him, struggling to keep up. But then came the ninth hole on the tournament's final day. Norman was paired with Faldo, and the two hit their first shots well. They were now facing the green. In front of the pin, there was a steep slope, so that any ball hit short would come rolling back down the hill into oblivion. Faldo shot first, and the ball landed safely long, well past the cup.

  Norman was next. He stood over the ball. "The one thing you guard against here is short," the announcer said, stating the obvious. Norman swung and then froze, his club in midair, following the ball in flight. It was short. Norman watched, stone-faced, as the ball rolled thirty yards back down the hill, and with that error something inside of him broke.

  At the tenth hole, he hooked the ball to the left, hit his third shot well past the cup, and missed a makable putt. At eleven, Norman had a three-and-a-half-foot putt for par - the kind he had been making all week. He shook out his hands and legs before grasping the club, trying to relax. He missed: his third straight bogey. At twelve, Norman hit the ball straight into the water. At thirteen, he hit it into a patch of pine needles. At sixteen, his movements were so mechanical and out of synch that, when he swung, his hips spun out ahead of his body and the ball sailed into another pond. At that, he took his club and made a frustrated scythelike motion through the grass, because what had been obvious for twenty minutes was now official: he had fumbled away the chance of a lifetime.

  Faldo had begun the day six strokes behind Norman. By the time the two started their slow walk to the eighteenth hole, through the throng of spectators, Faldo had a four-stroke lead. But he took those final steps quietly, giving only the smallest of nods, keeping his head low. He understood what had happened on the greens and fairways that day. And he was bound by the particular etiquette of choking, the understanding that what he had earned was something less than a victory and what Norman had suffered was something less than a defeat.

  When it was all over, Faldo wrapped his arms around Norman. "I don't know what to say - I just want to give you a hug," he whispered, and then he said the only thing you can say to a choker: "I feel horrible about what happened. I'm so sorry." With that, the two men began to cry.

  Every now and again in politics, there is a moment that captures the temper of the times, and our moment may have come this budget season in Washington. The Centers for Disease Control asked Congress if, for an extra fifteen million dollars in C.D.C. funding, it would like to wipe out syphilis from the United States by 2005. And Congress said no.

  The request was not a political ploy to get a bigger budget. Syphilis is an epidemic that, for reasons no one quite understands, runs in cycles, and, after peaking in 1990, the disease is now at its lowest level in United States history. It has retreated to a handful of areas across the country: just twenty-five counties account for half of all cases. In other words, syphilis is very close to that critical point faced by many epidemics, when even the slightest push could tip them into oblivion. That's why the C.D.C. has asked for the extra fifteen million dollars - to supply that final push.

  This was all patiently explained to Congress last year as the epidemic first neared its lowest ebb. The C.D.C. proposed the most prosaic and straightforward of public-health efforts - an aggressive regimen of free diagnosis and treatment. The drug of choice? Penicillin, the same antibiotic that has been so successful in fighting syphilis for the past half century. Congress wasn't interested. This year, the C.D.C. made its case again, and again the public-health budgets that emerged from the House and the Senate left the agency well short of the necessary funding. Next year, unfortunately, the moment when syphilis can be easily eliminated will have passed. The disease will have begun its cyclical return, moving out of the familiar, well-defined neighborhoods where it is now sequestered, and presenting a much more formidable target for public-health officials. "If you miss the timing, there is a point when it is no longer feasible to move to elimination," says Judy Wasserheit, who is the head of the C.D.C.'s syphilis-prevention effort. "We're already pushing the limits of that time frame."

  Exactly why, in a period of fiscal plenty, Congress cannot find the money for an anti-syphilis campaign is a bit puzzling. The disease plays a major role in the transmission of H.I.V., increasing infection rates between two- and five-fold. It often irreparably harms children born to those who are infected. And it is extremely expensive. Even with the rates as low as they are now, syphilis costs the country two hundred and fourteen million dollars a year. Congress has the opportunity to make history by eliminating a disease that has plagued the West for centuries. Why isn't it taking it?

  The truth is, this is the price we pay for the ways in which disease has become steadily politicized. The great insight of the AIDS movement - later picked up by groups concerned about breast cancer and prostate cancer - was that a community afflicted with a specific medical problem could take its case directly to Capitol Hill, bypassing the medical establishment entirely. This has dramatically increased the resources available for medical research. But it has also given Congress an excuse to treat public health as another form of interest-group politics, in which the most deserving constituencies are those which shout the loudest. In fact, when it comes to illness and disease the most deserving constituencies are often those who cannot shout at all. That syphilis is a sexually transmitted disease primarily affecting very poor African-Americans only makes things worse - sex, race, and poverty being words that the present Congress has difficulty pronouncing individually, let alone in combination.

  The last time America came so tantalizingly close to the elimination of syphilis was during the mid-fifties, after the introduction of penicillin. "Are Venereal Diseases disappearing?" the American Journal of Syphilis asked in 1951; four years later, the journal itself had disappeared. Such was the certainty that the era of syphilis was ending that the big debate in the public-health field was ethical rather than medical - namely, how the removal of the threat of venereal disease would affect sexual behavior.

  As Dr. John Stokes, one of the leading experts of his day on sexually transmitted diseases, wrote, "It is a reasonable question, whether by eliminating disease, without commensurate attention to the development of human idealism, self-control, and responsibility in the sexual life, we are not bringing mankind to its fall instead of fulfillment." Stokes assumed that syphilis would soon vanish, and that we ought to worry about the morality of those who could have got the disease but now wouldn't. As it turns out, he had it backward. Syphilis is still with us. And we ought to worry instead about the morality of those who could have eliminated the disease but chose not to.

  What do job interviews really tell us?

  1.

  Nolan Myers grew up in Houston, the elder of two boys in a middle-class family. He went to Houston's High School for the Performing and Visual Arts and then Harvard, where he intended to major in History and Science. After discovering the joys of writing code, though, he switched to computer science. "Programming is one of those things you get involved in, and you just can't stop until you finish," Myers says. "You get involved in it, and all of a sudden you look at your watch and it's four in the morning! I love the elegance of it." Myers is short and slightly stocky and has pale-blue eyes. He smiles easily, and when he speaks he moves his hands and torso for emphasis. He plays in a klezmer band called the Charvard Chai Notes. He talks to his parents a lot. He gets B's and B-pluses.

  This spring, in the last stretch of his senior year, Myers spent a lot of time interviewing for jobs with technology companies. He talked to a company named Trilogy, down in Texas, but he didn't think he would fit in. "One of Trilogy's subsidiaries put ads out in the paper saying that they were looking for the top tech students, and that they'd give them two hundred thousand dollars and a BMW," Myers said, shaking his head in disbelief. In another of his interviews, a recruiter asked him to solve a programming problem, and he made a stupid mistake and the recruiter pushed the answer back across the table to him, saying that his "solution" accomplished nothing. As he remembers the moment, Myers blushes. "I was so nervous. I thought, Hmm, that sucks!" The way he says that, though, makes it hard to believe that he really was nervous, or maybe what Nolan Myers calls nervous the rest of us call a tiny flutter in the stomach. Myers doesn't seem like the sort to get flustered. He's the kind of person you would call the night before the big test in seventh grade, when nothing made sense and you had begun to panic.

  I like Nolan Myers. He will, I am convinced, be very good at whatever career he chooses. I say those two things even though I have spent no more than ninety minutes in his presence. We met only once, on a sunny afternoon in April at the Au Bon Pain in Harvard Square. He was wearing sneakers and khakis and a polo shirt, in a dark-green pattern. He had a big backpack, which he plopped on the floor beneath the table. I bought him an orange juice. He fished around in his wallet and came up with a dollar to try and repay me, which I refused. We sat by the window. Previously, we had talked for perhaps three minutes on the phone, setting up the interview. Then I E-mailed him, asking him how I would recognize him at Au Bon Pain. He sent me the following message, with what I'm convinced--again, on the basis of almost no evidence--to be typical Myers panache: "22ish, five foot seven, straight brown hair, very good-looking. :)." I have never talked to his father, his mother, or his little brother, or any of his professors. I have never seen him ecstatic or angry or depressed. I know nothing of his personal habits, his tastes, or his quirks. I cannot even tell you why I feel the way I do about him. He's good-looking and smart and articulate and funny, but not so good-looking and smart and articulate and funny that there is some obvious explanation for the conclusions I've drawn about him. I just like him, and I'm impressed by him, and if I were an employer looking for bright young college graduates, I'd hire him in a heartbeat.

  I heard about Nolan Myers from Hadi Partovi, an executive with Tellme, a highly touted Silicon Valley startup offering Internet access through the telephone. If you were a computer-science major at M.I.T., Harvard, Stanford, Caltech, or the University of Waterloo this spring, looking for a job in software, Tellme was probably at the top of your list. Partovi and I talked in the conference room at Tellme's offices, just off the soaring, open floor where all the firm's programmers and marketers and executives sit, some of them with bunk beds built over their desks. (Tellme recently moved into an old printing plant--a low-slung office building with a huge warehouse attached--and, in accordance with new-economy logic, promptly turned the old offices into a warehouse and the old warehouse into offices.) Partovi is a handsome man of twenty-seven, with olive skin and short curly black hair, and throughout our entire interview he sat with his chair tilted precariously at a forty-five-degree angle. At the end of a long riff about how hard it is to find high-quality people, he blurted out one name: Nolan Myers. Then, from memory, he rattled off Myers's telephone number. He very much wanted Myers to come to Tellme.

  Partovi had met Myers in January, during a recruiting trip to Harvard. "It was a heinous day," Partovi remembers. "I started at seven and went until nine. I'd walk one person out and walk the other in." The first fifteen minutes of every interview he spent talking about Tellme--its strategy, its goals, and its business. Then he gave everyone a short programming puzzle. For the rest of the hour-long meeting, Partovi asked questions. He remembers that Myers did well on the programming test, and after talking to him for thirty to forty minutes he became convinced that Myers had, as he puts it, "the right stuff." Partovi spent even less time with Myers than I did. He didn't talk to Myers's family, or see him ecstatic or angry or depressed, either. He knew that Myers had spent last summer as an intern at Microsoft and was about to graduate from an Ivy League school. But virtually everyone recruited by a place like Tellme has graduated from an élite university, and the Microsoft summer-internship program has more than six hundred people in it. Partovi didn't even know why he liked Myers so much. He just did. "It was very much a gut call," he says.

  This wasn't so very different from the experience Nolan Myers had with Steve Ballmer, the C.E.O. of Microsoft. Earlier this year, Myers attended a party for former Microsoft interns called Gradbash. Ballmer gave a speech there, and at the end of his remarks Myers raised his hand. "He was talking a lot about aligning the company in certain directions," Myers told me, "and I asked him about how that influences his ability to make bets on other directions. Are they still going to make small bets?" Afterward, a Microsoft recruiter came up to Myers and said, "Steve wants your E-mail address." Myers gave it to him, and soon he and Ballmer were E-mailing. Ballmer, it seems, badly wanted Myers to come to Microsoft. "He did research on me," Myers says. "He knew which group I was interviewing with, and knew a lot about me personally. He sent me an E-mail saying that he'd love to have me come to Microsoft, and if I had any questions I should contact him. So I sent him a response, saying thank you. After I visited Tellme, I sent him an E-mail saying I was interested in Tellme, here were the reasons, that I wasn't sure yet, and if he had anything to say I said I'd love to talk to him. I gave him my number. So he called, and after playing phone tag we talked--about career trajectory, how Microsoft would influence my career, what he thought of Tellme. I was extremely impressed with him, and he seemed very genuinely interested in me."

  What convinced Ballmer he wanted Myers? A glimpse! He caught a little slice of Nolan Myers in action and--just like that--the C.E.O. of a four-hundred-billion-dollar company was calling a college senior in his dorm room. Ballmer somehow knew he liked Myers, the same way Hadi Partovi knew, and the same way I knew after our little chat at Au Bon Pain. But what did we know? What could we know? By any reasonable measure, surely none of us knew Nolan Myers at all.

  It is a truism of the new economy that the ultimate success of any enterprise lies with the quality of the people it hires. At many technology companies, employees are asked to all but live at the office, in conditions of intimacy that would have been unthinkable a generation ago. The artifacts of the prototypical Silicon Valley office--the videogames, the espresso bar, the bunk beds, the basketball hoops--are the elements of the rec room, not the workplace. And in the rec room you want to play only with your friends. But how do you find out who your friends are?

  Today, recruiters canvass the country for résumés. They analyze employment histories and their competitors' staff listings. They call references, and then do what I did with Nolan Myers: sit down with a perfect stranger for an hour and a half and attempt to draw conclusions about that stranger's intelligence and personality. The job interview has become one of the central conventions of the modern economy. But what, exactly, can you know about a stranger after sitting down and talking with him for an hour?

  2.

  Some years ago, an experimental psychologist at Harvard University, Nalini Ambady, together with Robert Rosenthal, set out to examine the nonverbal aspects of good teaching. As the basis of her research, she used videotapes of teaching fellows which had been made during a training program at Harvard. Her plan was to have outside observers look at the tapes with the sound off and rate the effectiveness of the teachers by their expressions and physical cues. Ambady wanted to have at least a minute of film to work with. When she looked at the tapes, though, there was really only about ten seconds when the teachers were shown apart from the students. "I didn't want students in the frame, because obviously it would bias the ratings," Ambady says. "So I went to my adviser, and I said, 'This isn't going to work.'"

  But it did. The observers, presented with a ten-second silent video clip, had no difficulty rating the teachers on a fifteen-item checklist of personality traits. In fact, when Ambady cut the clips back to five seconds, the ratings were the same. They were even the same when she showed her raters just two seconds of videotape. That sounds unbelievable unless you actually watch Ambady's teacher clips, as I did, and realize that the eight seconds that distinguish the longest clips from the shortest are superfluous: anything beyond the first flash of insight is unnecessary. When we make a snap judgment, it is made in a snap. It's also, very clearly, a judgment: we get a feeling that we have no difficulty articulating.

  Ambady's next step led to an even more remarkable conclusion. She compared those snap judgments of teacher effectiveness with evaluations made, after a full semester of classes, by students of the same teachers. The correlation between the two, she found, was astoundingly high. A person watching a two-second silent video clip of a teacher he has never met will reach conclusions about how good that teacher is that are very similar to those of a student who sits in the teacher's class for an entire semester.

  Recently, a comparable experiment was conducted by Frank Bernieri, a psychologist at the University of Toledo. Bernieri, working with one of his graduate students, Neha Gada-Jain, selected two people to act as interviewers, and trained them for six weeks in the proper procedures and techniques of giving an effective job interview. The two then interviewed ninety-eight volunteers, of various ages and backgrounds. The interviews lasted between fifteen and twenty minutes, and afterward each interviewer filled out a six-page, five-part evaluation of the person he'd just talked to. Originally, the intention of the study was to find out whether applicants who had been coached in certain nonverbal behaviors designed to ingratiate themselves with their interviewers--like mimicking the interviewers' physical gestures or posture--would get better ratings than applicants who behaved normally. As it turns out, they didn't. But then another of Bernieri's students, an undergraduate named Tricia Prickett, decided that she wanted to use the interview videotapes and the evaluations that had been collected to test out the adage that "the handshake is everything."

  "She took fifteen seconds of videotape showing the applicant as he or she knocks on the door, comes in, shakes the hand of the interviewer, sits down, and the interviewer welcomes the person," Bernieri explained. Then, like Ambady, Prickett got a series of strangers to rate the applicants based on the handshake clip, using the same criteria that the interviewers had used. Once more, against all expectations, the ratings were very similar to those of the interviewers. "On nine out of the eleven traits the applicants were being judged on, the observers significantly predicted the outcome of the interview," Bernieri says. "The strength of the correlations was extraordinary."

  This research takes Ambady's conclusions one step further. In the Toledo experiment, the interviewers were trained in the art of interviewing. They weren't dashing off a teacher evaluation on their way out the door. They were filling out a formal, detailed questionnaire, of the sort designed to give the most thorough and unbiased account of an interview. And still their ratings weren't all that different from those of people off the street who saw just the greeting.

  This is why Hadi Partovi, Steve Ballmer, and I all agreed on Nolan Myers. Apparently, human beings don't need to know someone in order to believe that they know someone. Nor does it make that much difference, apparently, that Partovi reached his conclusion after putting Myers through the wringer for an hour, I reached mine after ninety minutes of amiable conversation at Au Bon Pain, and Ballmer reached his after watching and listening as Myers asked a question.

  Bernieri and Ambady believe that the power of first impressions suggests that human beings have a particular kind of prerational ability for making searching judgments about others. In Ambady's teacher experiments, when she asked her observers to perform a potentially distracting cognitive task--like memorizing a set of numbers--while watching the tapes, their judgments of teacher effectiveness were unchanged. But when she instructed her observers to think hard about their ratings before they made them, their accuracy suffered substantially. Thinking only gets in the way. "The brain structures that are involved here are very primitive," Ambady speculates. "All of these affective reactions are probably governed by the lower brain structures." What we are picking up in that first instant would seem to be something quite basic about a person's character, because what we conclude after two seconds is pretty much the same as what we conclude after twenty minutes or, indeed, an entire semester. "Maybe you can tell immediately whether someone is extroverted, or gauge the person's ability to communicate," Bernieri says. "Maybe these clues or cues are immediately accessible and apparent." Bernieri and Ambady are talking about the existence of a powerful form of human intuition. In a way, that's comforting, because it suggests that we can meet a perfect stranger and immediately pick up on something important about him. It means that I shouldn't be concerned that I can't explain why I like Nolan Myers, because, if such judgments are made without thinking, then surely they defy explanation.

  But there's a troubling suggestion here as well. I believe that Nolan Myers is an accomplished and likable person. But I have no idea from our brief encounter how honest he is, or whether he is self-centered, or whether he works best by himself or in a group, or any number of other fundamental traits. That people who simply see the handshake arrive at the same conclusions as people who conduct a full interview also implies, perhaps, that those initial impressions matter too much--that they color all the other impressions that we gather over time.

  For example, I asked Myers if he felt nervous about the prospect of leaving school for the workplace, which seemed like a reasonable question, since I remember how anxious I was before my first job. Would the hours scare him? Oh no, he replied, he was already working between eighty and a hundred hours a week at school. "Are there things that you think you aren't good at, which make you worry?" I continued.

  His reply was sharp: "Are there things that I'm not good at, or things that I can't learn? I think that's the real question. There are a lot of things I don't know anything about, but I feel comfortable that given the right environment and the right encouragement I can do well at." In my notes, next to that reply, I wrote "Great answer!" and I can remember at the time feeling the little thrill you experience as an interviewer when someone's behavior conforms with your expectations. Because I had decided, right off, that I liked him, what I heard in his answer was toughness and confidence. Had I decided early on that I didn't like Nolan Myers, I would have heard in that reply arrogance and bluster. The first impression becomes a self-fulfilling prophecy: we hear what we expect to hear. The interview is hopelessly biased in favor of the nice.

  3.

  When Ballmer and Partovi and I met Nolan Myers, we made a prediction. We looked at the way he behaved in our presence--at the way he talked and acted and seemed to think--and drew conclusions about how he would behave in other situations. I had decided, remember, that Myers was the kind of person you called the night before the big test in seventh grade. Was I right to make that kind of generalization?

  This is a question that social psychologists have looked at closely. In the late nineteen-twenties, in a famous study, the psychologist Theodore Newcomb analyzed extroversion among adolescent boys at a summer camp. He found that how talkative a boy was in one setting--say, lunch--was highly predictive of how talkative that boy would be in the same setting in the future. A boy who was curious at lunch on Monday was likely to be curious at lunch on Tuesday. But his behavior in one setting told you almost nothing about how he would behave in a different setting: from how someone behaved at lunch, you couldn't predict how he would behave during, say, afternoon playtime. In a more recent study, of conscientiousness among students at Carleton College, the researchers Walter Mischel, Neil Lutsky, and Philip K. Peake showed that how neat a student's assignments were or how punctual he was told you almost nothing about how often he attended class or how neat his room or his personal appearance was. How we behave at any one time, evidently, has less to do with some immutable inner compass than with the particulars of our situation.

  This conclusion, obviously, is at odds with our intuition. Most of the time, we assume that people display the same character traits in different situations. We habitually underestimate the large role that context plays in people's behavior. In the Newcomb summer-camp experiment, for example, the results showing how little consistency there was from one setting to another in talkativeness, curiosity, and gregariousness were tabulated from observations made and recorded by camp counsellors on the spot. But when, at the end of the summer, those same counsellors were asked to give their final impressions of the kids, they remembered the children's behavior as being highly consistent.

  "The basis of the illusion is that we are somehow confident that we are getting what is there, that we are able to read off a person's disposition," Richard Nisbett, a psychologist at the University of Michigan, says. "When you have an interview with someone and have an hour with them, you don't conceptualize that as taking a sample of a person's behavior, let alone a possibly biased sample, which is what it is. What you think is that you are seeing a hologram, a small and fuzzy image but still the whole person."

  Then Nisbett mentioned his frequent collaborator, Lee Ross, who teaches psychology at Stanford. "There was one term when he was teaching statistics and one term he was teaching a course with a lot of humanistic psychology. He gets his teacher evaluations. The first referred to him as cold, rigid, remote, finicky, and uptight. And the second described this wonderful warmhearted guy who was so deeply concerned with questions of community and getting students to grow. It was Jekyll and Hyde. In both cases, the students thought they were seeing the real Lee Ross."

  Psychologists call this tendency--to fixate on supposedly stable character traits and overlook the influence of context--the Fundamental Attribution Error, and if you combine this error with what we know about snap judgments the interview becomes an even more problematic encounter. Not only had I let my first impressions color the information I gathered about Myers, but I had also assumed that the way he behaved with me in an interview setting was indicative of the way he would always behave. It isn't that the interview is useless; what I learned about Myers--that he and I get along well--is something I could never have got from a résumé or by talking to his references. It's just that our conversation turns out to have been less useful, and potentially more misleading, than I had supposed. That most basic of human rituals--the conversation with a stranger--turns out to be a minefield.

  4.

  Not long after I met with Nolan Myers, I talked with a human-resources consultant from Pasadena named Justin Menkes. Menkes's job is to figure out how to extract meaning from face-to-face encounters, and with that in mind he agreed to spend an hour interviewing me the way he thinks interviewing ought to be done. It felt, going in, not unlike a visit to a shrink, except that instead of having months, if not years, to work things out, Menkes was set on stripping away my secrets in one session. Consider, he told me, a commonly asked question like "Describe a few situations in which your work was criticized. How did you handle the criticism?" The problem, Menkes said, is that it's much too obvious what the interviewee is supposed to say. "There was a situation where I was working on a project, and I didn't do as well as I could have," he said, adopting a mock-sincere singsong. "My boss gave me some constructive criticism. And I redid the project. It hurt. Yet we worked it out." The same is true of the question "What would your friends say about you?"--to which the correct answer (preferably preceded by a pause, as if to suggest that it had never dawned on you that someone would ask such a question) is "My guess is that they would call me a people person--either that or a hard worker."

  Myers and I had talked about obvious questions, too. "What is your greatest weakness?" I asked him. He answered, "I tried to work on a project my freshman year, a children's festival. I was trying to start a festival as a benefit here in Boston. And I had a number of guys working with me. I started getting concerned with the scope of the project we were working on--how much responsibility we had, getting things done. I really put the brakes on, but in retrospect I really think we could have done it and done a great job."

  Then Myers grinned and said, as an aside, "Do I truly think that is a fault? Honestly, no." And, of course, he's right. All I'd really asked him was whether he could describe a personal strength as if it were a weakness, and, in answering as he did, he had merely demonstrated his knowledge of the unwritten rules of the interview.

  But, Menkes said, what if those questions were rephrased so that the answers weren't obvious? For example: "At your weekly team meetings, your boss unexpectedly begins aggressively critiquing your performance on a current project. What do you do?"

  I felt a twinge of anxiety. What would I do? I remembered a terrible boss I'd had years ago. "I'd probably be upset," I said. "But I doubt I'd say anything. I'd probably just walk away." Menkes gave no indication whether he was concerned or pleased by that answer. He simply pointed out that another person might well have said something like "I'd go and see my boss later in private, and confront him about why he embarrassed me in front of my team." I was saying that I would probably handle criticism--even inappropriate criticism--from a superior with stoicism; in the second case, the applicant was saying he or she would adopt a more confrontational style. Or, at least, we were telling the interviewer that the workplace demands either stoicism or confrontation--and to Menkes these are revealing and pertinent pieces of information.

  Menkes moved on to another area--handling stress. A typical question in this area is something like "Tell me about a time when you had to do several things at once. How did you handle the situation? How did you decide what to do first?" Menkes says this is also too easy. "I just had to be very organized," he began again in his mock-sincere singsong. "I had to multitask. I had to prioritize and delegate appropriately. I checked in frequently with my boss." Here's how Menkes rephrased it: "You're in a situation where you have two very important responsibilities that both have a deadline that is impossible to meet. You cannot accomplish both. How do you handle that situation?"

  "Well," I said, "I would look at the two and decide what I was best at, and then go to my boss and say, 'It's better that I do one well than both poorly,' and we'd figure out who else could do the other task."

  Menkes immediately seized on a telling detail in my answer. I was interested in what job I would do best. But isn't the key issue what job the company most needed to have done? With that comment, I had revealed something valuable: that in a time of work-related crisis I start from a self-centered consideration. "Perhaps you are a bit of a solo practitioner," Menkes said diplomatically. "That's an essential bit of information."

  Menkes deliberately wasn't drawing any broad conclusions. If we are not people who are shy or talkative or outspoken but people who are shy in some contexts, talkative in other situations, and outspoken in still other areas, then what it means to know someone is to catalogue and appreciate all those variations. Menkes was trying to begin that process of cataloguing. This interviewing technique is known as "structured interviewing," and in studies by industrial psychologists it has been shown to be the only kind of interviewing that has any success at all in predicting performance in the workplace. In the structured interviews, the format is fairly rigid. Each applicant is treated in precisely the same manner. The questions are scripted. The interviewers are carefully trained, and each applicant is rated on a series of predetermined scales.

  What is interesting about the structured interview is how narrow its objectives are. When I interviewed Nolan Myers I was groping for some kind of global sense of who he was; Menkes seemed entirely uninterested in arriving at that same general sense of me--he seemed to realize how foolish that expectation was for an hour-long interview. The structured interview works precisely because it isn't really an interview; it isn't about getting to know someone, in a traditional sense. It's as much concerned with rejecting information as it is with collecting it.

  Not surprisingly, interview specialists have found it extraordinarily difficult to persuade most employers to adopt the structured interview. It just doesn't feel right. For most of us, hiring someone is essentially a romantic process, in which the job interview functions as a desexualized version of a date. We are looking for someone with whom we have a certain chemistry, even if the coupling that results ends in tears and the pursuer and the pursued turn out to have nothing in common. We want the unlimited promise of a love affair. The structured interview, by contrast, seems to offer only the dry logic and practicality of an arranged marriage.

  5.

  Nolan Myers agonized over which job to take. He spent half an hour on the phone with Steve Ballmer, and Ballmer was very persuasive. "He gave me very, very good advice," Myers says of his conversations with the Microsoft C.E.O. "He felt that I should go to the place that excited me the most and that I thought would be best for my career. He offered to be my mentor." Myers says he talked to his parents every day about what to do. In February, he flew out to California and spent a Saturday going from one Tellme executive to another, asking and answering questions. "Basically, I had three things I was looking for. One was long-term goals for the company. Where did they see themselves in five years? Second, what position would I be playing in the company?" He stopped and burst out laughing. "And I forget what the third one is." In March, Myers committed to Tellme.

  Will Nolan Myers succeed at Tellme? I think so, although I honestly have no idea. It's a harder question to answer now than it would have been thirty or forty years ago. If this were 1965, Nolan Myers would have gone to work at I.B.M. and worn a blue suit and sat in a small office and kept his head down, and the particulars of his personality would not have mattered so much. It was not so important that I.B.M. understood who you were before it hired you, because you understood what I.B.M. was. If you walked through the door at Armonk or at a branch office in Illinois, you knew what you had to be and how you were supposed to act. But to walk through the soaring, open offices of Tellme, with the bunk beds over the desks, is to be struck by how much more demanding the culture of Silicon Valley is. Nolan Myers will not be provided with a social script, that blue suit and organization chart. Tellme, like any technology startup these days, wants its employees to be part of a fluid team, to be flexible and innovative, to work with shifting groups in the absence of hierarchy and bureaucracy, and in that environment, where the workplace doubles as the rec room, the particulars of your personality matter a great deal.

  This is part of the new economy's appeal, because Tellme's soaring warehouse is a more productive and enjoyable place to work than the little office boxes of the old I.B.M. But the danger here is that we will be led astray in judging these newly important particulars of character. If we let personability--some indefinable, prerational intuition, magnified by the Fundamental Attribution Error--bias the hiring process today, then all we will have done is replace the old-boy network, where you hired your nephew, with the new-boy network, where you hire whoever impressed you most when you shook his hand. Social progress, unless we're careful, can merely be the means by which we replace the obviously arbitrary with the not so obviously arbitrary.

  Myers has spent much of the past year helping to teach Introduction to Computer Science. He realized, he says, that one of the reasons that students were taking the course was that they wanted to get jobs in the software industry. "I decided that, having gone through all this interviewing, I had developed some expertise, and I would like to share that. There is a real skill and art in presenting yourself to potential employers. And so what we did in this class was talk about the kinds of things that employers are looking for--what are they looking for in terms of personality. One of the most important things is that you have to come across as being confident in what you are doing and in who you are. How do you do that? Speak clearly and smile." As he said that, Nolan Myers smiled. "For a lot of people, that's a very hard skill to learn. But for some reason I seem to understand it intuitively."

  The T-shirt trade becomes a calling.

  1.

  Dov Charney started his T-shirt business, American Apparel, on the corner of Santa Fe Avenue and the 10 Freeway, a mile or so from downtown Los Angeles. Actually, his factory was built directly underneath the eastbound and westbound lanes, and the roof over the room where the cutters and sewers work was basically the freeway itself, so that the clicking and clacking of sewing machines mixed with the rumble of tractor trailers. It was not, as Dov was the first to admit, an ideal location, with the possible exception that it was just two blocks from the Playpen, the neighborhood strip bar, which made it awfully convenient whenever he decided to conduct a fitting. "Big companies tend to hire fitting models at a hundred bucks an hour," Dov explained recently as he headed over to the Playpen to test some of his new T-shirts. "But they only give you one look. At a strip bar, you get a cross-section of chicks. You've got big chicks, little chicks, big-assed chicks, little-assed chicks, chicks with big tits, and chicks with little tits. You couldn't ask for a better place to fit a shirt."

  He had three of his staff with him, and half a dozen samples of his breakthrough Classic Girl line of "baby T"s, in this case shirts with ribbed raglan three-quarter sleeves in lilac and pink. He walked quickly, leaning forward slightly, as if to improve his aerodynamics. Dov is thirty-one years old and has thick black hair and blue-tinted aviator glasses, and tends to dress in khakis and knit vintage shirts, with one of his own T-shirts as an undergarment. In front of the Playpen, Dov waved to the owner, a middle-aged Lebanese man in a red guayabera, and ushered his group into the gloom of the bar. At this hour - two o'clock in the afternoon - the Playpen was almost empty; just one girl gyrated for a customer, to what sounded like the music from "Ali Baba and the Forty Thieves." The situation was ideal, because it meant the rest of the girls had time to model.

  The first to come over was Diana, dark-haired and buxom. She slipped out of a yellow mesh dress and pulled on one of Dov's baby T's. Dov examined her critically. He was concerned about the collar. The Classic Girl is supposed to have a snug fit, with none of the torquing and bowing that plague lesser shirts. But the prototype was bunching around the neck. Dov gestured to one of his colleagues. "Olin, look what's going on here. I think there's too much binding going into the machine." Diana turned around, and wiggled her behind playfully. Dov pulled the T-shirt tight. "I think it could be a little longer here," he said, pursing his lips. Baby T's, in their earlier incarnation, were short, in some cases above the belly button - something that Dov considers a mistake. The music was now deafening, and over a loudspeaker a "lap-dance promo" was being announced. Dov, oblivious, turned his attention to Mandy, a svelte, long-legged blonde in a black bikini. On her, Dov observed, the shirt did not fit so "emphatically" around the chest as it had on Diana. Dov looked Mandy up and down, tugging and pulling to get the shirt just right. "When you're doing a fitting, often the more oddly shaped girl will tell you a lot more," he said. By now, a crowd of strippers was gathering around him, presumably attracted by the novelty of being asked by a customer to put clothes on. But Dov had seen all he needed to. His life's great cause - which is to produce the world's finest T-shirt for between three and four dollars wholesale - had advanced another step. "What did I learn today?" he asked, as he strode out the door. "I learned that my sleeves are perfect. But I see a quality problem with the collar." He thought for a moment. "And I definitely have to add an inch to the garment."

  2.

  There is a town in upstate New York, just north and west of Albany, called Gloversville, so named because in the late nineteenth century and the early part of the twentieth century ninety-five per cent of the fine gloves sold in the United States were manufactured there. At one time, there were a hundred and sixteen glove factories in the town, employing twelve thousand people and turning out fifteen million dollars' worth of gloves a year. New glove start-ups appeared all the time, whenever some glove entrepreneur - some ambitious handschumacher - had a better idea about how to make a glove. A trade journal, Glovers Review, covered the industry's every step. Local firms - such as Jacob Adler & Co. and Louis Meyers & Sons and Elite Glove Co. - became nationally known brands. When the pogroms of Eastern Europe intensified, in the eighteen-eighties, the Jewish glove cutters of Warsaw - the finest leather artisans of nineteenth-century Europe - moved en masse to Gloversville, because Gloversville was where you went in those days if you cared about gloves.

  It's hard to imagine anyone caring so deeply about gloves, and had we visited Gloversville in its prime most of us would have found it a narrow and provincial place. But if you truly know gloves and think about them and dream about them and, more important, if you are surrounded every day by a community of people who know and think and dream about gloves, a glove becomes more than a glove. In Gloversville, there was an elaborate social hierarchy. The handschumacher considered himself socially and intellectually superior to the schuster and the schneider - the shoemaker and the tailor. To cover the hands, after all, was the highest calling. (As the glover's joke goes, "Did you ever see anyone talk using his boots?") Within the glove world, in turn, the "makers" - the silkers, the closers, and the fourchetters, who sewed the gloves - were inferior to the "cutters," who first confronted the hide, and who advertised their status by going to work wearing white shirts and collars, bow ties or cravats, tigereye cufflinks, and carefully pressed suits. A skilled cutter could glance at a glove and see in it the answers to a hundred questions. Is the leather mocha, the most pliable of all skins, taken from the hide of long-black-haired Arabian sheep? Or is it South African capeskin, the easiest to handle? Is it kid from Spain, peccary from the wild pigs of Brazil and Mexico, chamois from Europe, or cabretta, from a Brazilian hairy sheep? Is the finish "grained" - showing the outside of the hide - or "velvet," meaning that the leather has been buffed? Is it sewn in a full-piqué stitch or a half-piqué, an osann or an overseam? Do the color and texture of the fourchette - the strip of leather that forms the sides of the fingers - match the adjoining leather? The lesson of Gloversville is that behind every ordinary object is a group of people to whom that object is anything but ordinary.

  Dov Charney lives in his own, modern-day version of Gloversville. He is part of a world that cares about T-shirts every bit as much as the handschumachers cared about peccary and cabretta. It is impossible to talk about Dov, for example, without talking about his best friend, Rick Klotz, who runs a clothing company named Fresh Jive, about a mile and a half from Dov's factory. Rick, who is thirty-two, designs short-sleeve shirts and baggy pants and pullovers and vests and printed T-shirts with exquisite graphics (featuring everything from an obscure typographical scheme to the Black Panthers). In the eighties, Rick was a punker, at least until everyone else got short hair, at which point he grew his hair long. Later, in his Ted Nugent-and-TransAm phase, he had, he says, a "big, filthy mustache, like Cheech." Now he is perfectly bald, and drives a black custom-made late-model Cadillac Fleetwood Limited, with a VCR in the back, and, because he sits very low in the seat, and bobs up and down to very loud hip-hop as he drives, the effect, from the street, is slightly comic, like that of a Ping-Pong ball in choppy water. When Dov first came to Los Angeles, a few years ago, he crashed at Rick's apartment in Hollywood, and the two grew so close that Rick believes he and Dov were "separated at birth."

  "If it wasn't for Rick, I wouldn't have been able to make it," Dov says. "I slept on his couch. I checked in for a few days, stayed for a year." This was after an initial foray that Dov had made into the T-shirt business, in South Carolina in the early nineties, failed. "When he lived with me, he was on the brink," Rick added. "Every day was the same. Go to sleep at two with the phone. Then wake up at six to call back East. One time, he was just crying and losing it. It was just so heavy. I was, like, 'Dude, what are you doing?'"

  What do Rick and Dov have in common? It isn't a matter of personality. Dov says that sometimes when he's out with Rick he'll spot one of Rick's T-shirts, and he'll shout, "There's one of your T-shirts!" Rick will look down and away, embarrassed, because he's so acutely aware of how uncool that sounds. Dov couldn't care less. When he spots his own work, he can hardly contain himself. "I always say, 'Hey' " - Dov put on the accent of his native Montreal - "'where did you get that shirt?' Like, if I'm on the subway in New York City. I say, 'You want some more?' I take my bag and give them out for free. I'm excited about it. I could be watching TV at night, or I could be watching a porno, and, boom, there is my T-shirt. I've made millions of them. I always know it!"

  What the two of them share is a certain sensibility. Rick grew up in the Valley and Dov grew up in Montreal, but it's as if they were born and raised in the same small town, where the T-shirt was something that you lived and died for. At dinner one recent night in L.A., Rick talked about how he met Dov, several years ago, at a big trade show in Las Vegas. "I'm at this party sitting out on the balcony. I see this guy dancing and he's - what's the word?" And here Rick did a kind of spastic gyration in his seat. "Imbecilic. He didn't care what anybody thought. And he catches me looking and goes like this." Rick made two pistols out of his fingers, and fired one hand after another. "I was, like, in love."

  Dov seemed touched. "You know, I knew of Rick long before I ever met him. His T-shirt graphics are some of the most respected T-shirt graphics in the world. I swear to God."

  But Rick was being modest again. "No, they're not."

  "If you mention Fresh Jive in most industrialized countries to people that know what good graphics are on T-shirts, they're, like . . . " Dov made an appreciative noise. "I swear, it's like a connoisseur's wine."

  "Maybe at one time," Rick murmured.

  "He is an artist!" Dov went on, his voice rising. "His canvas is fabric!"

  3.

  On the day that he made his foray to the Playpen, Dov met with a fortyish man named Jhean. In the garment-manufacturing business in Los Angeles, the up-and-coming entrepreneurs are Persian and Korean. (Dov has a partner who is Korean.) The occasional throwback, like Dov, is Jewish. Jhean, however, is Haitian. He used to work in government, but now he is in the garment business, a career change of which Dov heartily approved. Jhean was wearing tight black pants, a red silk shirt open to mid-chest, and a gold chain. Dov put his arm around him affectionately. "Jhean is a crazy man," he announced, to no one in particular. "He was going to be one of my partners. We were going to get this whole Montreal Jewish-Korean-Haitian thing going." Jhean turned away, and Dov lowered his voice to a whisper. "Jhean has it in his blood, you know," he said, meaning a feel for T-shirts.

  Dov led Jhean outside, and they sat on a bench, the sun peeking through at them between the off-ramp and the freeway lanes. Jhean handed Dov a men's Fruit of the Loom undershirt, size medium. It was the reason for Jhean's visit. "Who can do this for me?" he asked.

  Dov took the shirt and unfolded it slowly. He held it up in front of his eyes, as a mother might hold a baby, and let out a soft whistle. "This is an unbelievable garment," he said. "Nobody has the machines to make it, except for two parties that I'm aware of. Fruit of the Loom. And Hanes. The shirt is a two-by-one rib. They've taken out one or two of the needles. It's a coarse yarn. And it's tubular, so there is no waste. This is one of the most efficient garments in the world. It comes off the tube like a sock."

  Some T-shirts have two seams down each side: they are made with "open width" fabric, by sewing together the front and the back of the T-shirt. This T-shirt had no seams. It was cut from cotton fabric that had been knitted into a T-shirt-size tube, which is a trickier procedure but means less wasted fabric, lower sewing costs, and less of the twisting that can distort a garment.

  Dov began to run his fingers along the bottom of the shirt, which had been not hemmed but overlocked - with a stitch - to save even more fabric. "This costs, with the right equipment, maybe a dollar. My cost is a dollar-thirty, a dollar-fifty. The finest stuff is two-fifty, two-sixty. If you can make this shirt, you can make millions. But you can't make this shirt. Hanes actually does this even better than Fruit of the Loom. They've got this dialled down." Jhean wondered if he could side-seam it, but Dov just shook his head. "If you side-seam it, you lose the whole energy."

  You could tell that Dov was speaking as much to himself as to Jhean. He was saying that he couldn't reproduce a masterpiece like that undershirt, either. But there was no defeat in his voice, because he knew enough about T-shirts to realize that there is more than one way to make a perfect garment. Dov likes to point out that the average American owns twenty-five T-shirts - twenty-five! - and, even if you reckon, as he does, that of those only between four and seven are in regular rotation, that's still an enormous market.

  The garment in question was either eighteen- or twenty-singles yarn, which is standard for T-shirts. But what if a T-shirt maker were to use thirty-singles yarn, knitted on a fine-gauge machine, which produces a thinner, more "fashion-forward" fabric? The Fruit of the Loom piece was open-end cotton, and open-end is coarse. Dov likes "ring-spun combed" yarn, which is much softer, and costs an extra eighty cents a pound. Softness also comes from the way the fabric is processed before cutting, and Dov is a stickler for that kind of detail. "I have a lot of secret ingredients," he says. "Just like K.F.C. There is the amount of yarn in one revolution, which determines the tightness. There's the spacing of the needle. Then there's the finishing. What kind of chemicals are you using in the finishing? We think this through. We've developed a neurosis about this." In his teens, Dov hooked up with a friend who was selling printed T's outside the Montreal Forum, and Dov's contribution was to provide American Hanes instead of the Canadian poly-cotton-blend Penmans. The Hanes, Dov says, was "creamier," and he contended that the Canadian T-shirt consumer deserved that extra creaminess. When he's inspecting rolls of fabric, Dov will sometimes break into the plastic package wrap and run his hand over the cotton, palm flat, and if you look behind his tinted aviators you'll see that his eyes have closed slightly. Once, he held two white swatches up to the light, in order to demonstrate how one had "erections" - little fibres that stood up straight on the fabric - and the other did not, and then he ran his hand ever so slightly across the surface of the swatch he liked, letting the fibres tickle his palm. "I'm particular," Dov explained. "Like in my underwear. I'm very committed to Hanes thirty-two. I've been wearing it for twelve years. I sleep in it. And if Hanes makes any adjustments I'm picking it up. I watch. They change their labels, they use different countries to make their shit, I know."

  Dov was back inside his factory now, going from the room where all the sewers sit, stitching up T-shirts, to a passageway lined with big rolls of fabric. The fact that Jhean's Fruit of the Loom undershirt was of rib fabric launched him on one of his favorite topics, which was the fabric he personally helped rediscover - baby rib. Baby rib is rib in which the ridges are so close together and the cotton is so fine that it looks like standard T-shirt jersey, and Dov's breakthrough was to realize that because of the way it stretches and supports and feels it was perfect for girls. "See this, that's conventional rib." He pulled on a piece of white fabric, exposing wide ridges of cotton. "It's knitted on larger machines. And it's a larger, bulkier yarn. It's poor-quality cotton. But girls want softness. So, rather than take the cheap road, I've taken the higher road." Dov's baby rib uses finer cotton and tighter stitching, and the fit is tighter across the chest and shoulders, the way he believes a T-shirt ought to look. "There were a few influences," he said, reflecting on the creative process that brought him to baby rib. "I'm not sure which girlfriend, but we can name some." He ticked them off on his fingers. "There's Marcella, from Argentina. I met her in South Beach. She wore these little tops made in South America. And they were finer than the tops that girls were wearing in the States. I got such a boner looking at her in that T-shirt that I thought, This is doing something for me. We've got to explore this opportunity. This was four, five years ago. O.K., I broke up with her, and I started going out with this stripper, Julie, from South Carolina. She had a gorgeous body. She was all-American. And, you know, Julie looked so great in those little T-shirts. She put one on and it meant something."

  Dov pulled out a single typewritten page, a draft of a "mission statement" he was preparing for the industry. This was for a new line of Standard American T-shirts he wanted to start making - thirty-singles, ring-spun, tubular shirts knit on custom-made Asian equipment. "Dear Client," it began:

  During the last ten years major T-shirt makers such as Hanes and Fruit of the Loom have focused on being "heavier" and generously cut. Innovation and style have been put aside, and there has been a perpetual price war during the last four years. The issues are who can be cheaper, bigger or heavier. . . . Concerns about fit or issues of softness or stretch have been the last priority and have been barely considered. In order to create leadership we have reconstructed the T-shirt and have made a deviation from the traditional "Beefy-T" styled garment. We have redone the typical pattern. It is slightly more fitted - especially in the sleeve and armhole opening. . . . Yes the fabric is lighter, and we think that is a positive aspect of the garment. The garment has a stretch that is reminiscent of T-shirts from decades ago.

  Dov was peering over my shoulder as I read. "We're going to kick everybody's ass," he announced. "The finest T-shirts are six dollars a piece wholesale. The shittiest shirts are like two dollars. We're going to come in at three and have the right stuff. I'm making the perfect fit. I'm going to manufacture this like gasoline."

  If you ask Dov why he's going to these lengths, he'll tell you that it matters to him that Americans can buy an affordable and high-quality T-shirt. That's an admirable notion, but, of course, most of us don't really know what constitutes a high-quality T-shirt: we don't run our hands over a swatch of cotton and let the little fibres tickle our palm, or ruminate on the difference between side-seaming and tubularity. For that matter, few people who bought dress gloves in 1900 knew the difference between a full-piqué or a half-piqué stitch, between high-grade or merely medium-grade peccary. Producers, the economics textbooks tell us, are disciplined by the scrutiny of the marketplace. Yet what of commonplace articles such as T-shirts and gloves, about which most customers don't know enough or care enough to make fine discriminations? Discipline really comes not from customers but from other producers. And here again the economics textbooks steer us wrong, because they place too much emphasis on the role of formal competitors, the Gap or Hanes or the other big glove-maker in your niche. To be sure, Dov can occasionally be inspired by a truly exceptional garment like, say, a two-by-one ribbed undershirt from Fruit of the Loom. But in Gloversville the critical person is not so much the distant rival as the neighbor who is also a contractor, or the guy at the bar downtown who used to be in the business, or the friend at synagogue who is also an expert glove-maker - all of whom can look at your work with a practiced eye and shame you if it isn't right. Dov is motivated to produce a high-quality T-shirt at three dollars because that would mean something to Jhean and to Olin and, most of all, to Rick, whose T-shirt graphics are respected around the world. In Gloversville, the market is not an economic mechanism but - and this is the real power of a place like that - a social one.

  "Everybody got so technically obsessed with reduced shrinkage," Dov went on, and by "everyone" he meant a group of people you could count on the fingers of one hand. "That was a big mistake for the industry because they took away the natural stretch property of a lot of the jersey. If you look at vintage shirts, they had a lot of stretch. Today, they don't. They are like these print boards. They are practically woven in comparison. I say fuck the shrinkage. I have a theory on width shrinkage on rib: I don't care. In fact, you put it on, it will come back." He was pacing back and forth and talking even more rapidly than usual. "I'm concerned about linear shrinkage. But, if it doesn't have any width shrinkage at all, I become concerned, too. I have a fabric I'm working on with a T-shirt engineer. It keeps having zero width shrinkage. That's not desirable!"

  Dov stopped. He had spotted something out of the corner of his eye. It was one of his workers, a young man with a mustache and a goatee and slicked-back hair. He was wearing a black custom T, with two white stripes down the arms. Dov started walking toward him. "Oh, my God. You want to see something?" He reached out and flipped up the tag at the back of the cutter's shirt. "It's a Fresh Jive piece. I made it for Rick five years ago. Somehow this shirt just trickled back here." The sweet serendipity of it all brought a smile to his face.

  4.

  While Dov was perfecting his baby T's, Rick was holding a fashion shoot for his elegant new women's-wear line, Fresh Jive Domestics, which had been conceived by a young designer named Jessica. The shoot was at Rick's friend Deidre's house, a right-angled, white-stuccoed, shag-rugged modernist masterpiece under the Hollywood sign. Deidre rents it from the drummer of the seventies supergroup Bread. Madonna's old house is several hundred yards to the west of Deidre's place, and Aldous Huxley used to live a few hundred yards in the other direction, with the result that her block functions as a kind of architectural enactment of postwar Los Angeles intellectual life. For Rick's purposes, though, the house's main points of attraction were its fabulous details, like the little white Star Trek seats around the kitchen counter and the white baby grand in the window with the autographed Hugh Hefner photo and the feisty brown-haired spitz-collie named Sage barricaded in the kitchen. Rick had a box of disposable cameras, and as he shot the models other people joined in with the disposables, so that in the end Rick would be able to combine both sets of pictures in a brag book. It made for a slightly chaotic atmosphere - particularly since there were at least seven highly active cell phones in the room, each with a different ring, competing with the hip-hop on the stereo - and in the midst of it all Rick walked over to the baby grand and, with a mischievous look on his face, played the opening chords of Beethoven's "Pathétique" sonata.

  Rick was talking about his plans to open a Fresh Jive store in Los Angeles. But he kept saying that it couldn't be on Melrose Avenue, where all the street-wear stores are. "Maybe that would be good for sales," he said. Then he shook his head. "No way."

  Deidre, who was lounging next to the baby grand, started laughing. "You know what, Rick?" she said. "I think it's all about a Fresh Jive store without any Fresh Jive stuff in it."

  It was a joke, but in some way not a joke, because that's the sort of thing that Rick might actually do. He's terrified by the conventional. At dinner the previous evening, for example, he and Dov had talked about a particular piece - the sports-style V-necked raglan custom T with stripes that Dov had spotted on the cutter. Rick introduced it years ago and then stopped making it when everyone else started making it, too.

  "One of our biggest retailers takes me into this room last year," Rick explained. "It's full of custom T-shirts. He said, 'You started this, and everybody else took advantage of it. But you didn't go with it.' He was pissed off at me."

  The businessman in Rick knew that he shouldn't have given up on the shirt so quickly, that he could have made a lot more money had he stayed and exploited the custom-T market. But he couldn't do that, because if he had been in that room with all the other custom T's he risked being known in his world as the guy who started the custom-T trend and then ran out of new ideas. Retail chains like J.C. Penney and Millers Outpost sometimes come to Rick and ask if they can carry Fresh Jive, or ask if he will sell them a big run of a popular piece, and he usually says no. He will allow his clothes to appear only in certain street-wear boutiques. His ambition is to grow three times as big as he is now - to maybe a thirty-million-dollar company - but no larger.

  This is the sensibility of the artisan, and it isn't supposed to play much of a role anymore. We live in the age of the entrepreneur, who responds rationally to global pressures and customer demands in order to maximize profit. To the extent that we still talk of Gloversville - and the glove-making business there has long since faded away - we talk of it as a place that people need to leave behind. There was Lucius N. Littauer, for example, who, having made his fortune with Littauer Brothers Glove Co., in downtown Gloversville, went on to Congress, became a confidant of Presidents McKinley and Roosevelt, and then put up the money for what is now the Kennedy School of Government, at Harvard University. There was Samuel Goldwyn, the motion-picture magnate, who began his career as a cutter with Gloversville's Elite Glove Co. In 1912, he jumped into the movie business. He went to Hollywood. He rode horses and learned to play tennis and croquet. Like so many immigrant Jews in the movie industry, he enacted through his films a very public process of assimilation. This is the oldest of American stories: the heroic young man who leaves the small town to play on the big stage - who wants to be an entrepreneur, not an artisan. But the truth is that we always get the story wrong. It isn't that Littauer and Goldwyn left Gloversville to find the real culture, because the real culture comes from Gloversville, too; places like Washington and Hollywood persist and renew themselves only because Littauers and Goldwyns arrive from time to time, bringing with them a little piece of the real thing.

  "The one paranoia Rick has is that, God forbid, he makes something that another company has," Dov said, at dinner with Rick that night.

  Rick nodded. "In my personal life. Ask Dov. Every piece of clothing I own. Nobody else can have it."

  Rick was wearing a pair of jeans and a plain white T-shirt, but if you looked closely you noticed that it wasn't just any jeans-and-T-shirt ensemble. The pants were an unusual denim chino, from Rick's Beggars Banquet collection. And the shirt?

  "That is a very well-thought-out item," Dov said, gesturing toward Rick. "It's a purple-label BVD. It's no longer available. Size medium. Of all the shirts I've studied, this one has a phenomenal fit."He reached across the table and ran his fingers around the lower edge of the sleeve. Dov is a believer in a T-shirt that is snug on the biceps. "It's not the greatest fabric. But it shrinks perfectly. I actually gave him that shirt. I came back from one of my customers in New York City, on Grand Street, that happens to resell that particular garment."

  It's all of a piece, in the end: the purple-label BVD, the custom-T that he designed but now won't touch. If in Dov's world the true competitive pressures are not economic but social, Rick's version of Gloversville is driven not by the marketplace but by personality - the particular, restless truculence of the sort of person who will give up almost anything and go to any lengths not to be like anyone else.

  "We're doing this line of casual shoes," Rick said, during a rare lull in one of Dov's T-shirt soliloquies. "One is the Crip Slip. It's that corduroy slipper that the gang kids would always wear. The other is the Wino, which is that really cheap canvas slipper that you can buy at K mart for seven dollars and that the winos wear when they're, like, really hung over." His big new idea, Rick explained, was to bring out a line of complementary twelve-inch dolls in those characters. "We could have a guy with baggy pants and a pushcart," he went on. "You know, you pull down his pants and there's skid marks. And we have a full gangster for the Crip Slip."

  Rick was so excited about the idea that he was still talking about it the next day at work. He was with a Fresh Jive designer named Jupiter - a skateboarder from Las Vegas of German, Welsh, Irish, French, Chinese, and Spanish extraction - and a guy named Matt, who wore on his chest a gold-plated, diamond-encrusted Star of David the size of a Peppermint Pattie. "The idea is that the doll would pump the shoe, and the shoe would pump the doll," Rick said. "The doll for the Crip Slip would be totally gangster. The handkerchief. The plaid shirt or the wife beater. A forty in his hand. Flashing signs. Wouldn't that be crazy?" And then Rick caught himself. "Omigod. The doll for the Crip Slip will have interchangeable hands, with different gang signs!"

  Matt looked awestruck: "Ohhh, that'll be sick!"

  "Wooooow." Jupiter dragged the word out, and shook his head slowly. "That's crazy!"

  5.

  A few days later, Dov drove down to San Diego for Action Sports Retail, a big trade show in the street-wear world. Dov makes the rounds of A.S.R. twice a year, walking up and down through the cavernous conference center, stopping at the booths of hundreds of T-shirt companies and persuading people to buy his shirts wholesale for their lines. This year, he was busy scouting locations for American Apparel's new factory, and so he arrived a day late, clutching a motorized mini-scooter. To his great irritation, he wasn't allowed to carry it in. "This is the most uncool show," he announced, after haggling fruitlessly with the guard at the gate.

  But his mood lifted quickly. How could it not? This was A.S.R., and everyone was wearing T-shirts or selling T-shirts, and because this was a place where people knew their T-shirts a lot of those T-shirts were Dov's. He started down one of the aisles. He pointed to a booth on the left. "They use my T-shirts." Next to that booth was another small company. "They use my T-shirts, too." He was wearing khakis and New Balance sneakers and one of his men's T-shirts in baby rib (a controversial piece, because the binding on the collar was a mere half inch). On his back he had a huge orange pack full of catalogues and samples, and every time he spotted a potential customer he would pull the backpack off and rummage through it, and the contents would spill on the floor.

  Dov spotted a young woman walking toward him in a baby T. "That's a competitor's shirt. I can tell right away. The spacing of the needle. The fabric is not baby rib." He high-fived someone in another booth. Another young woman, in another T-shirt booth, loomed up ahead. "That's my shirt right there. In the green. I even know the stock number." He turned to her: "You're the girl in the olive forty-three, sixty-six sleeveless V with one-inch binding."

  She laughed, but Dov was already off again, plunging back into the fray. "I always have an insecurity that I can be crushed by a bigger business," he said. "Like, Fruit of the Loom decided to do baby T's, and I got a little scared. But then I saw their shirt, and I laughed, because they missed it." Do the suits over at Fruit of the Loom have the same feel for a shirt that Dov does? Were they inspired by Marcella of Argentina and Julie from South Carolina? Those guys were off somewhere in a suburban office park. They weren't in Gloversville. "It was horribly designed," Dov went on. "It was thick, open-end, eighteen-singles coarse rib. It's not the luxury that I offer. See the rib on that collar?" He pulled up the binding on the T-shirt of a friend standing next to him. "Look how thick and spacey it is. That's what they did. They missed the point." Somewhere a cell phone was ringing. A young woman walked past. "Hey!" Dov called out. "That's my T-shirt!"

  What the co-inventor of the Pill didn't know about menstruation can endanger women's health.

  1.

  John Rock was christened in 1890 at the Church of the Immaculate Conception in Marlborough, Massachusetts, and married by Cardinal William O'Connell, of Boston. He had five children and nineteen grandchildren. A crucifix hung above his desk, and nearly every day of his adult life he attended the 7 a.m. Mass at St. Mary's in Brookline. Rock, his friends would say, was in love with his church. He was also one of the inventors of the birth-control pill, and it was his conviction that his faith and his vocation were perfectly compatible. To anyone who disagreed he would simply repeat the words spoken to him as a child by his home-town priest: "John, always stick to your conscience. Never let anyone else keep it for you. And I mean anyone else." Even when Monsignor Francis W. Carney, of Cleveland, called him a "moral rapist," and when Frederick Good, the longtime head of obstetrics at Boston City Hospital, went to Boston's Cardinal Richard Cushing to have Rock excommunicated, Rock was unmoved. "You should be afraid to meet your Maker," one angry woman wrote to him, soon after the Pill was approved. "My dear madam," Rock wrote back, "in my faith, we are taught that the Lord is with us always. When my time comes, there will be no need for introductions."

  In the years immediately after the Pill was approved by the F.D.A., in 1960, Rock was everywhere. He appeared in interviews and documentaries on CBS and NBC, in Time, Newsweek, Life, The Saturday Evening Post. He toured the country tirelessly. He wrote a widely discussed book, "The Time Has Come: A Catholic Doctor's Proposals to End the Battle Over Birth Control," which was translated into French, German, and Dutch. Rock was six feet three and rail-thin, with impeccable manners; he held doors open for his patients and addressed them as "Mrs." or "Miss." His mere association with the Pill helped make it seem respectable. "He was a man of great dignity," Dr. Sheldon J. Segal, of the Population Council, recalls. "Even if the occasion called for an open collar, you'd never find him without an ascot. He had the shock of white hair to go along with that. And posture, straight as an arrow, even to his last year." At Harvard Medical School, he was a giant, teaching obstetrics for more than three decades. He was a pioneer in in-vitro fertilization and the freezing of sperm cells, and was the first to extract an intact fertilized egg. The Pill was his crowning achievement. His two collaborators, Gregory Pincus and Min-Chueh Chang, worked out the mechanism. He shepherded the drug through its clinical trials. "It was his name and his reputation that gave ultimate validity to the claims that the pill would protect women against unwanted pregnancy," Loretta McLaughlin writes in her marvellous 1982 biography of Rock. Not long before the Pill's approval, Rock travelled to Washington to testify before the F.D.A. about the drug's safety. The agency examiner, Pasquale DeFelice, was a Catholic obstetrician from Georgetown University, and at one point, the story goes, DeFelice suggested the unthinkable - that the Catholic Church would never approve of the birth-control pill. "I can still see Rock standing there, his face composed, his eyes riveted on DeFelice," a colleague recalled years later, "and then, in a voice that would congeal your soul, he said, 'Young man, don't you sell my church short.' "

  In the end, of course, John Rock's church disappointed him. In 1968, in the encyclical "Humanae Vitae," Pope Paul VI outlawed oral contraceptives and all other "artificial" methods of birth control. The passion and urgency that animated the birth-control debates of the sixties are now a memory. John Rock still matters, though, for the simple reason that in the course of reconciling his church and his work he made an error. It was not a deliberate error. It became manifest only after his death, and through scientific advances he could not have anticipated. But because that mistake shaped the way he thought about the Pill - about what it was, and how it worked, and most of all what it meant - and because John Rock was one of those responsible for the way the Pill came into the world, his error has colored the way people have thought about contraception ever since.

  John Rock believed that the Pill was a "natural" method of birth control. By that he didn't mean that it felt natural, because it obviously didn't for many women, particularly not in its earliest days, when the doses of hormone were many times as high as they are today. He meant that it worked by natural means. Women can get pregnant only during a certain interval each month, because after ovulation their bodies produce a surge of the hormone progesterone. Progesterone - one of a class of hormones known as progestin - prepares the uterus for implantation and stops the ovaries from releasing new eggs; it favors gestation. "It is progesterone, in the healthy woman, that prevents ovulation and establishes the pre- and post-menstrual 'safe' period," Rock wrote. When a woman is pregnant, her body produces a stream of progestin in part for the same reason, so that another egg can't be released and threaten the pregnancy already under way. Progestin, in other words, is nature's contraceptive. And what was the Pill? Progestin in tablet form. When a woman was on the Pill, of course, these hormones weren't coming in a sudden surge after ovulation and weren't limited to certain times in her cycle. They were being given in a steady dose, so that ovulation was permanently shut down. They were also being given with an additional dose of estrogen, which holds the endometrium together and - as we've come to learn - helps maintain other tissues as well. But to Rock, the timing and combination of hormones wasn't the issue. The key fact was that the Pill's ingredients duplicated what could be found in the body naturally. And in that naturalness he saw enormous theological significance.

  In 1951, for example, Pope Pius XII had sanctioned the rhythm method for Catholics because he deemed it a "natural" method of regulating procreation: it didn't kill the sperm, like a spermicide, or frustrate the normal process of procreation, like a diaphragm, or mutilate the organs, like sterilization. Rock knew all about the rhythm method. In the nineteen-thirties, at the Free Hospital for Women, in Brookline, he had started the country's first rhythm clinic for educating Catholic couples in natural contraception. But how did the rhythm method work? It worked by limiting sex to the safe period that progestin created. And how did the Pill work? It worked by using progestin to extend the safe period to the entire month. It didn't mutilate the reproductive organs, or damage any natural process. "Indeed," Rock wrote, oral contraceptives "may be characterized as a 'pill-established safe period,' and would seem to carry the same moral implications" as the rhythm method. The Pill was, to Rock, no more than "an adjunct to nature."

  In 1958, Pope Pius XII approved the Pill for Catholics, so long as its contraceptive effects were "indirect" - that is, so long as it was intended only as a remedy for conditions like painful menses or "a disease of the uterus." That ruling emboldened Rock still further. Short-term use of the Pill, he knew, could regulate the cycle of women whose periods had previously been unpredictable. Since a regular menstrual cycle was necessary for the successful use of the rhythm method - and since the rhythm method was sanctioned by the Church - shouldn't it be permissible for women with an irregular menstrual cycle to use the Pill in order to facilitate the use of rhythm? And if that was true why not take the logic one step further? As the federal judge John T. Noonan writes in "Contraception," his history of the Catholic position on birth control:

  If it was lawful to suppress ovulation to achieve a regularity necessary for successfully sterile intercourse, why was it not lawful to suppress ovulation without appeal to rhythm? If pregnancy could be prevented by pill plus rhythm, why not by pill alone? In each case suppression of ovulation was used as a means. How was a moral difference made by the addition of rhythm?

  These arguments, as arcane as they may seem, were central to the development of oral contraception. It was John Rock and Gregory Pincus who decided that the Pill ought to be taken over a four-week cycle - a woman would spend three weeks on the Pill and the fourth week off the drug (or on a placebo), to allow for menstruation. There was and is no medical reason for this. A typical woman of childbearing age has a menstrual cycle of around twenty-eight days, determined by the cascades of hormones released by her ovaries. As first estrogen and then a combination of estrogen and progestin flood the uterus, its lining becomes thick and swollen, preparing for the implantation of a fertilized egg. If the egg is not fertilized, hormone levels plunge and cause the lining - the endometrium - to be sloughed off in a menstrual bleed. When a woman is on the Pill, however, no egg is released, because the Pill suppresses ovulation. The fluxes of estrogen and progestin that cause the lining of the uterus to grow are dramatically reduced, because the Pill slows down the ovaries. Pincus and Rock knew that the effect of the Pill's hormones on the endometrium was so modest that women could conceivably go for months without having to menstruate. "In view of the ability of this compound to prevent menstrual bleeding as long as it is taken," Pincus acknowledged in 1958, "a cycle of any desired length could presumably be produced." But he and Rock decided to cut the hormones off after three weeks and trigger a menstrual period because they believed that women would find the continuation of their monthly bleeding reassuring. More to the point, if Rock wanted to demonstrate that the Pill was no more than a natural variant of the rhythm method, he couldn't very well do away with the monthly menses. Rhythm required "regularity," and so the Pill had to produce regularity as well.

  It has often been said of the Pill that no other drug has ever been so instantly recognizable by its packaging: that small, round plastic dial pack. But what was the dial pack if not the physical embodiment of the twenty-eight-day cycle? It was, in the words of its inventor, meant to fit into a case "indistinguishable" from a woman's cosmetics compact, so that it might be carried "without giving a visual clue as to matters which are of no concern to others." Today, the Pill is still often sold in dial packs and taken in twenty-eight-day cycles. It remains, in other words, a drug shaped by the dictates of the Catholic Church - by John Rock's desire to make this new method of birth control seem as natural as possible. This was John Rock's error. He was consumed by the idea of the natural. But what he thought was natural wasn't so natural after all, and the Pill he ushered into the world turned out to be something other than what he thought it was. In John Rock's mind the dictates of religion and the principles of science got mixed up, and only now are we beginning to untangle them.

  2.

  In 1986, a young scientist named Beverly Strassmann travelled to Africa to live with the Dogon tribe of Mali. Her research site was the village of Sangui in the Sahel, about a hundred and twenty miles south of Timbuktu. The Sahel is thorn savannah, green in the rainy season and semi-arid the rest of the year. The Dogon grow millet, sorghum, and onions, raise livestock, and live in adobe houses on the Bandiagara escarpment. They use no contraception. Many of them have held on to their ancestral customs and religious beliefs. Dogon farmers, in many respects, live much as people of that region have lived since antiquity. Strassmann wanted to construct a precise reproductive profile of the women in the tribe, in order to understand what female biology might have been like in the millennia that preceded the modern age. In a way, Strassmann was trying to answer the same question about female biology that John Rock and the Catholic Church had struggled with in the early sixties: what is natural? Only, her sense of "natural" was not theological but evolutionary. In the era during which natural selection established the basic patterns of human biology - the natural history of our species - how often did women have children? How often did they menstruate? When did they reach puberty and menopause? What impact did breast-feeding have on ovulation? These questions had been studied before, but never so thoroughly that anthropologists felt they knew the answers with any certainty.

  Strassmann, who teaches at the University of Michigan at Ann Arbor, is a slender, soft-spoken woman with red hair, and she recalls her time in Mali with a certain wry humor. The house she stayed in while in Sangui had been used as a shelter for sheep before she came and was turned into a pigsty after she left. A small brown snake lived in her latrine, and would curl up in a camouflaged coil on the seat she sat on while bathing. The villagers, she says, were of two minds: was it a deadly snake - Kere me jongolo, literally, "My bite cannot be healed" - or a harmless mouse snake? (It turned out to be the latter.) Once, one of her neighbors and best friends in the tribe roasted her a rat as a special treat. "I told him that white people aren't allowed to eat rat because rat is our totem," Strassmann says. "I can still see it. Bloated and charred. Stretched by its paws. Whiskers singed. To say nothing of the tail." Strassmann meant to live in Sangui for eighteen months, but her experiences there were so profound and exhilarating that she stayed for two and a half years. "I felt incredibly privileged," she says. "I just couldn't tear myself away."

  Part of Strassmann's work focussed on the Dogon's practice of segregating menstruating women in special huts on the fringes of the village. In Sangui, there were two menstrual huts - dark, cramped, one-room adobe structures, with boards for beds. Each accommodated three women, and when the rooms were full, latecomers were forced to stay outside on the rocks. "It's not a place where people kick back and enjoy themselves," Strassmann says. "It's simply a nighttime hangout. They get there at dusk, and get up early in the morning and draw their water." Strassmann took urine samples from the women using the hut, to confirm that they were menstruating. Then she made a list of all the women in the village, and for her entire time in Mali - seven hundred and thirty-six consecutive nights - she kept track of everyone who visited the hut. Among the Dogon, she found, a woman, on average, has her first period at the age of sixteen and gives birth eight or nine times. From menarche, the onset of menstruation, to the age of twenty, she averages seven periods a year. Over the next decade and a half, from the age of twenty to the age of thirty-four, she spends so much time either pregnant or breast-feeding (which, among the Dogon, suppresses ovulation for an average of twenty months) that she averages only slightly more than one period per year. Then, from the age of thirty-five until menopause, at around fifty, as her fertility rapidly declines, she averages four menses a year. All told, Dogon women menstruate about a hundred times in their lives. (Those who survive early childhood typically live into their seventh or eighth decade.) By contrast, the average for contemporary Western women is somewhere between three hundred and fifty and four hundred times.
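
  Those per-year averages do add up to roughly a hundred. A minimal back-of-the-envelope tally in Python, assuming menarche at sixteen and menopause at around fifty and using the rates quoted above - the breakdown is only an illustration of the arithmetic, not Strassmann's own calculation:

      # Rough tally of lifetime menses for a Dogon woman, using the averages
      # quoted in the text; the ages and rates are approximations.
      menses = 7 * (20 - 16)       # about seven periods a year from menarche at sixteen to twenty
      menses += 1.2 * (34 - 20)    # "slightly more than one" a year from twenty to thirty-four
      menses += 4 * (50 - 35)      # about four a year from thirty-five to menopause at around fifty
      print(round(menses))         # ~105, consistent with "about a hundred" lifetime menses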

  Strassmann's office is in the basement of a converted stable next to the Natural History Museum on the University of Michigan campus. Behind her desk is a row of battered filing cabinets, and as she was talking she turned and pulled out a series of yellowed charts. Each page listed, on the left, the first names and identification numbers of the Sangui women. Across the top was a time line, broken into thirty-day blocks. Every menses of every woman was marked with an X. In the village, Strassmann explained, there were two women who were sterile, and, because they couldn't get pregnant, they were regulars at the menstrual hut. She flipped through the pages until she found them. "Look, she had twenty-nine menses over two years, and the other had twenty-three." Next to each of their names was a solid line of x's. "Here's a woman approaching menopause," Strassmann went on, running her finger down the page. "She's cycling but is a little bit erratic. Here's another woman of prime childbearing age. Two periods. Then pregnant. I never saw her again at the menstrual hut. This woman here didn't go to the menstrual hut for twenty months after giving birth, because she was breast-feeding. Two periods. Got pregnant. Then she miscarried, had a few periods, then got pregnant again. This woman had three menses in the study period." There weren't a lot of x's on Strassmann's sheets. Most of the boxes were blank. She flipped back through her sheets to the two anomalous women who were menstruating every month. "If this were a menstrual chart of undergraduates here at the University of Michigan, all the rows would be like this."

  Strassmann does not claim that her statistics apply to every preindustrial society. But she believes - and other anthropological work backs her up - that the number of lifetime menses isn't greatly affected by differences in diet or climate or method of subsistence (foraging versus agriculture, say). The more significant factors, Strassmann says, are things like the prevalence of wet-nursing or sterility. But over all she believes that the basic pattern of late menarche, many pregnancies, and long menstrual-free stretches caused by intensive breast-feeding was virtually universal up until the "demographic transition" of a hundred years ago from high to low fertility. In other words, what we think of as normal - frequent menses - is in evolutionary terms abnormal. "It's a pity that gynecologists think that women have to menstruate every month," Strassmann went on. "They just don't understand the real biology of menstruation."

  To Strassmann and others in the field of evolutionary medicine, this shift from a hundred to four hundred lifetime menses is enormously significant. It means that women's bodies are being subjected to changes and stresses that they were not necessarily designed by evolution to handle. In a brilliant and provocative book, "Is Menstruation Obsolete?," Drs. Elsimar Coutinho and Sheldon S. Segal, two of the world's most prominent contraceptive researchers, argue that this recent move to what they call "incessant ovulation" has become a serious problem for women's health. It doesn't mean that women are always better off the less they menstruate. There are times - particularly in the context of certain medical conditions - when women ought to be concerned if they aren't menstruating. In obese women, a failure to menstruate can signal an increased risk of uterine cancer. In female athletes, a failure to menstruate can signal an increased risk of osteoporosis. But for most women, Coutinho and Segal say, incessant ovulation serves no purpose except to increase the occurrence of abdominal pain, mood shifts, migraines, endometriosis, fibroids, and anemia - the last of which, they point out, is "one of the most serious health problems in the world."

  Most serious of all is the greatly increased risk of some cancers. Cancer, after all, occurs because as cells divide and reproduce they sometimes make mistakes that cripple the cells' defenses against runaway growth. That's one of the reasons that our risk of cancer generally increases as we age: our cells have more time to make mistakes. But this also means that any change promoting cell division has the potential to increase cancer risk, and ovulation appears to be one of those changes. Whenever a woman ovulates, an egg literally bursts through the walls of her ovaries. To heal that puncture, the cells of the ovary wall have to divide and reproduce. Every time a woman gets pregnant and bears a child, her lifetime risk of ovarian cancer drops ten per cent. Why? Possibly because, between nine months of pregnancy and the suppression of ovulation associated with breast-feeding, she stops ovulating for twelve months - and saves her ovarian walls from twelve bouts of cell division. The argument is similar for endometrial cancer. When a woman is menstruating, the estrogen that flows through her uterus stimulates the growth of the uterine lining, causing a flurry of potentially dangerous cell division. Women who do not menstruate frequently spare the endometrium that risk. Ovarian and endometrial cancer are characteristically modern diseases, consequences, in part, of a century in which women have come to menstruate four hundred times in a lifetime.

  In this sense, the Pill really does have a "natural" effect. By blocking the release of new eggs, the progestin in oral contraceptives reduces the rounds of ovarian cell division. Progestin also counters the surges of estrogen in the endometrium, restraining cell division there. A woman who takes the Pill for ten years cuts her ovarian-cancer risk by around seventy per cent and her endometrial-cancer risk by around sixty per cent. But here "natural" means something different from what Rock meant. He assumed that the Pill was natural because it was an unobtrusive variant of the body's own processes. In fact, as more recent research suggests, the Pill is really only natural in so far as it's radical - rescuing the ovaries and endometrium from modernity. That Rock insisted on a twenty-eight-day cycle for his pill is evidence of just how deep his misunderstanding was: the real promise of the Pill was not that it could preserve the menstrual rhythms of the twentieth century but that it could disrupt them.

  Today, a growing movement of reproductive specialists has begun to campaign loudly against the standard twenty-eight-day pill regimen. The drug company Organon has come out with a new oral contraceptive, called Mircette, that cuts the seven-day placebo interval to two days. Patricia Sulak, a medical researcher at Texas A.& M. University, has shown that most women can probably stay on the Pill, straight through, for six to twelve weeks before they experience breakthrough bleeding or spotting. More recently, Sulak has documented precisely what the cost of the Pill's monthly "off" week is. In a paper in the February issue of the journal Obstetrics and Gynecology, she and her colleagues documented something that will come as no surprise to most women on the Pill: during the placebo week, the number of users experiencing pelvic pain, bloating, and swelling more than triples, breast tenderness more than doubles, and headaches increase by almost fifty per cent. In other words, some women on the Pill continue to experience the kinds of side effects associated with normal menstruation. Sulak's paper is a short, dry, academic work, of the sort intended for a narrow professional audience. But it is impossible to read it without being struck by the consequences of John Rock's desire to please his church. In the past forty years, millions of women around the world have been given the Pill in such a way as to maximize their pain and suffering. And to what end? To pretend that the Pill was no more than a pharmaceutical version of the rhythm method?

  3.

  In 1980 and 1981, Malcolm Pike, a medical statistician at the University of Southern California, travelled to Japan for six months to study at the Atomic Bomb Casualty Commission. Pike wasn't interested in the effects of the bomb. He wanted to examine the medical records that the commission had been painstakingly assembling on the survivors of Hiroshima and Nagasaki. He was investigating a question that would ultimately do as much to complicate our understanding of the Pill as Strassmann's research would a decade later: why did Japanese women have breast-cancer rates six times lower than American women?

  In the late forties, the World Health Organization began to collect and publish comparative health statistics from around the world, and the breast-cancer disparity between Japan and America had come to obsess cancer specialists. The obvious answer - that Japanese women were somehow genetically protected against breast cancer - didn't make sense, because once Japanese women moved to the United States they began to get breast cancer almost as often as American women did. As a result, many experts at the time assumed that the culprit had to be some unknown toxic chemical or virus unique to the West. Brian Henderson, a colleague of Pike's at U.S.C. and his regular collaborator, says that when he entered the field, in 1970, "the whole viral- and chemical-carcinogenesis idea was huge - it dominated the literature." As he recalls, "Breast cancer fell into this large, unknown box that said it was something to do with the environment - and that word 'environment' meant a lot of different things to a lot of different people. They might be talking about diet or smoking or pesticides."

  Henderson and Pike, however, became fascinated by a number of statistical peculiarities. For one thing, the rate of increase in breast-cancer risk rises sharply throughout women's thirties and forties and then, at menopause, it starts to slow down. If a cancer is caused by some toxic outside agent, you'd expect that rate to rise steadily with each advancing year, as the number of mutations and genetic mistakes steadily accumulates. Breast cancer, by contrast, looked as if it were being driven by something specific to a woman's reproductive years. What was more, younger women who had had their ovaries removed had a markedly lower risk of breast cancer; when their bodies weren't producing estrogen and progestin every month, they got far fewer tumors. Pike and Henderson became convinced that breast cancer was linked to a process of cell division similar to that of ovarian and endometrial cancer. The female breast, after all, is just as sensitive to the level of hormones in a woman's body as the reproductive system. When the breast is exposed to estrogen, the cells of the terminal-duct lobular unit - where most breast cancer arises - undergo a flurry of division. And during the mid-to-late stage of the menstrual cycle, when the ovaries start producing large amounts of progestin, the pace of cell division in that region doubles.

  It made intuitive sense, then, that a woman's risk of breast cancer would be linked to the amount of estrogen and progestin her breasts have been exposed to during her lifetime. How old a woman is at menarche should make a big difference, because the beginning of puberty results in a hormonal surge through a woman's body, and the breast cells of an adolescent appear to be highly susceptible to the errors that result in cancer. (For more complicated reasons, bearing children turns out to be protective against breast cancer, perhaps because in the last two trimesters of pregnancy the cells of the breast mature and become much more resistant to mutations.) How old a woman is at menopause should matter, and so should how much estrogen and progestin her ovaries actually produce, and even how much she weighs after menopause, because fat cells turn other hormones into estrogen.

  Pike went to Hiroshima to test the cell-division theory. With other researchers at the medical archive, he looked first at the age when Japanese women got their period. A Japanese woman born at the turn of the century had her first period at sixteen and a half. American women born at the same time had their first period at fourteen. That difference alone, by their calculation, was sufficient to explain forty per cent of the gap between American and Japanese breast-cancer rates. "They had collected amazing records from the women of that area," Pike said. "You could follow precisely the change in age of menarche over the century. You could even see the effects of the Second World War. The age of menarche of Japanese girls went up right at that point because of poor nutrition and other hardships. And then it started to go back down after the war. That's what convinced me that the data were wonderful."

  Pike, Henderson, and their colleagues then folded in the other risk factors. Age at menopause, age at first pregnancy, and number of children weren't sufficiently different between the two countries to matter. But weight was. The average post-menopausal Japanese woman weighed a hundred pounds; the average American woman weighed a hundred and forty-five pounds. That fact explained another twenty-five per cent of the difference. Finally, the researchers analyzed blood samples from women in rural Japan and China, and found that their ovaries - possibly because of their extremely low-fat diet - were producing about seventy-five per cent of the amount of estrogen that American women were producing. Those three factors, added together, seemed to explain the breast-cancer gap. They also appeared to explain why the rates of breast cancer among Asian women began to increase when they came to America: on an American diet, they started to menstruate earlier, gained more weight, and produced more estrogen. The talk of chemicals and toxins and power lines and smog was set aside. "When people say that what we understand about breast cancer explains only a small amount of the problem, that it is somehow a mystery, it's absolute nonsense," Pike says flatly. He is a South African in his sixties, with graying hair and a salt-and-pepper beard. Along with Henderson, he is an eminent figure in cancer research, but no one would ever accuse him of being tentative in his pronouncements. "We understand breast cancer extraordinarily well. We understand it as well as we understand cigarettes and lung cancer."

  What Pike discovered in Japan led him to think about the Pill, because a tablet that suppressed ovulation - and the monthly tides of estrogen and progestin that come with it - obviously had the potential to be a powerful anti-breast-cancer drug. But the breast was a little different from the reproductive organs. Progestin prevented ovarian cancer because it suppressed ovulation. It was good for preventing endometrial cancer because it countered the stimulating effects of estrogen. But in breast cells, Pike believed, progestin wasn't the solution; it was one of the hormones that caused cell division. This is one explanation for why, after years of studying the Pill, researchers have concluded that it has no effect one way or the other on breast cancer: whatever beneficial effect results from what the Pill does is cancelled out by how it does it. John Rock touted the fact that the Pill used progestin, because progestin was the body's own contraceptive. But Pike saw nothing "natural" about subjecting the breast to that heavy a dose of progestin. In his view, the amount of progestin and estrogen needed to make an effective contraceptive was much greater than the amount needed to keep the reproductive system healthy - and that excess was unnecessarily raising the risk of breast cancer. A truly natural Pill might be one that found a way to suppress ovulation without using progestin. Throughout the nineteen-eighties, Pike recalls, this was his obsession. "We were all trying to work out how the hell we could fix the Pill. We thought about it day and night."

  4.

  Pike's proposed solution is a class of drugs known as GnRHAs, which has been around for many years. GnRHAs disrupt the signals that the pituitary gland sends when it is attempting to order the manufacture of sex hormones. It's a circuit breaker. "We've got substantial experience with this drug," Pike says. Men suffering from prostate cancer are sometimes given a GnRHA to temporarily halt the production of testosterone, which can exacerbate their tumors. Girls suffering from what's called precocious puberty - puberty at seven or eight, or even younger - are sometimes given the drug to forestall sexual maturity. If you give GnRHA to women of childbearing age, it stops their ovaries from producing estrogen and progestin. If the conventional Pill works by convincing the body that it is, well, a little bit pregnant, Pike's pill would work by convincing the body that it was menopausal.

  In the form Pike wants to use it, GnRHA will come in a clear glass bottle the size of a saltshaker, with a white plastic mister on top. It will be inhaled nasally. It breaks down in the body very quickly. A morning dose simply makes a woman menopausal for a while. Menopause, of course, has its risks. Women need estrogen to keep their hearts and bones strong. They also need progestin to keep the uterus healthy. So Pike intends to add back just enough of each hormone to solve these problems, but much less than women now receive on the Pill. Ideally, Pike says, the estrogen dose would be adjustable: women would try various levels until they found one that suited them. The progestin would come in four twelve-day stretches a year. When someone on Pike's regimen stopped the progestin, she would have one of four annual menses.

  Pike and an oncologist named Darcy Spicer have joined forces with another oncologist, John Daniels, in a startup called Balance Pharmaceuticals. The firm operates out of a small white industrial strip mall next to the freeway in Santa Monica. One of the tenants is a paint store, another looks like some sort of export company. Balance's offices are housed in an oversized garage with a big overhead door and concrete floors. There is a tiny reception area, a little coffee table and a couch, and a warren of desks, bookshelves, filing cabinets, and computers. Balance is testing its formulation on a small group of women at high risk for breast cancer, and if the results continue to be encouraging, it will one day file for F.D.A. approval.

  "When I met Darcy Spicer a couple of years ago," Pike said recently, as he sat at a conference table deep in the Balance garage, "he said, 'Why don't we just try it out? By taking mammograms, we should be able to see changes in the breasts of women on this drug, even if we add back a little estrogen to avoid side effects.' So we did a study, and we found that there were huge changes." Pike pulled out a paper he and Spicer had published in the Journal of the National Cancer Institute, showing breast X-rays of three young women. "These are the mammograms of the women before they start," he said. Amid the grainy black outlines of the breast were large white fibrous clumps - clumps that Pike and Spicer believe are indicators of the kind of relentless cell division that increases breast-cancer risk. Next to those x-rays were three mammograms of the same women taken after a year on the GnRHA regimen. The clumps were almost entirely gone. "This to us represents that we have actually stopped the activity inside the breasts," Pike went on. "White is a proxy for cell proliferation. We're slowing down the breast."

  Pike stood up from the table and turned to a sketch pad on an easel behind him. He quickly wrote a series of numbers on the paper. "Suppose a woman reaches menarche at fifteen and menopause at fifty. That's thirty-five years of stimulating the breast. If you cut that time in half, you will change her risk not by half but by half raised to the power of 4.5." He was working with a statistical model he had developed to calculate breast-cancer risk. "That's one-twenty-third. Your risk of breast cancer will be one-twenty-third of what it would be otherwise. It won't be zero. You can't get to zero. If you use this for ten years, your risk will be cut by at least half. If you use it for five years, your risk will be cut by at least a third. It's as if your breast were to be five years younger, or ten years younger - forever." The regimen, he says, should also provide protection against ovarian cancer.
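
  Pike's quick arithmetic checks out. A minimal sketch in Python, assuming only the exponent of 4.5 and the menarche and menopause ages he quotes - the power-law model is his, the code is simply an illustration of the calculation:

      # Relative breast-cancer risk under Pike's stated model: risk scales with
      # years of hormonal stimulation raised to the power 4.5. The ages and the
      # exponent are the ones he quotes; everything else is illustrative.
      full_exposure = 50 - 15              # menarche at fifteen, menopause at fifty
      halved_exposure = full_exposure / 2
      relative_risk = (halved_exposure / full_exposure) ** 4.5
      print(relative_risk)                 # ~0.044 - roughly one-twenty-third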

  Pike gave the sense that he had made this little speech many times before, to colleagues, to his family and friends - and to investors. He knew by now how strange and unbelievable what he was saying sounded. Here he was, in a cold, cramped garage in the industrial section of Santa Monica, arguing that he knew how to save the lives of hundreds of thousands of women around the world. And he wanted to do that by making young women menopausal through a chemical regimen sniffed every morning out of a bottle. This was, to say the least, a bold idea. Could he strike the right balance between the hormone levels women need to stay healthy and those that ultimately make them sick? Was progestin really so important in breast cancer? There are cancer specialists who remain skeptical. And, most of all, what would women think? John Rock, at least, had lent the cause of birth control his Old World manners and distinguished white hair and appeals from theology; he took pains to make the Pill seem like the least radical of interventions - nature's contraceptive, something that could be slipped inside a woman's purse and pass without notice. Pike was going to take the whole forty-year mythology of "natural" and sweep it aside. "Women are going to think, I'm being manipulated here. And it's a perfectly reasonable thing to think." Pike's South African accent gets a little stronger as he becomes more animated. "But the modern way of living represents an extraordinary change in female biology. Women are going out and becoming lawyers, doctors, presidents of countries. They need to understand that what we are trying to do isn't abnormal. It's just as normal as when someone hundreds of years ago had menarche at seventeen and had five babies and had three hundred fewer menstrual cycles than most women have today. The world is not the world it was. And some of the risks that go with the benefits of a woman getting educated and not getting pregnant all the time are breast cancer and ovarian cancer, and we need to deal with it. I have three daughters. The earliest grandchild I had was when one of them was thirty-one. That's the way many women are now. They ovulate from twelve or thirteen until their early thirties. Twenty years of uninterrupted ovulation before their first child! That's a brand-new phenomenon!"

  5.

  John Rock's long battle on behalf of his birth-control pill forced the Church to take notice. In the spring of 1963, just after Rock's book was published, a meeting was held at the Vatican between high officials of the Catholic Church and Donald B. Straus, the chairman of Planned Parenthood. That summit was followed by another, on the campus of the University of Notre Dame. In the summer of 1964, on the eve of the feast of St. John the Baptist, Pope Paul VI announced that he would ask a committee of church officials to reëxamine the Vatican's position on contraception. The group met first at the Collegio San Jose, in Rome, and it was clear that a majority of the committee were in favor of approving the Pill. Committee reports leaked to the National Catholic Reporter confirmed that Rock's case appeared to be winning. Rock was elated. Newsweek put him on its cover, and ran a picture of the Pope inside. "Not since the Copernicans suggested in the sixteenth century that the sun was the center of the planetary system has the Roman Catholic Church found itself on such a perilous collision course with a new body of knowledge," the article concluded. Paul VI, however, was unmoved. He stalled, delaying a verdict for months, and then years. Some said he fell under the sway of conservative elements within the Vatican. In the interim, theologians began exposing the holes in Rock's arguments. The rhythm method "'prevents' conception by abstinence, that is, by the non-performance of the conjugal act during the fertile period," the Catholic journal America concluded in a 1964 editorial. "The pill prevents conception by suppressing ovulation and by thus abolishing the fertile period. No amount of word juggling can make abstinence from sexual relations and the suppression of ovulation one and the same thing." On July 29, 1968, in the "Humanae Vitae" encyclical, the Pope broke his silence, declaring all "artificial" methods of contraception to be against the teachings of the Church.

  In hindsight, it is possible to see the opportunity that Rock missed. If he had known what we know now and had talked about the Pill not as a contraceptive but as a cancer drug - not as a drug to prevent life but as one that would save life - the church might well have said yes. Hadn't Pius XII already approved the Pill for therapeutic purposes? Rock would only have had to think of the Pill as Pike thinks of it: as a drug whose contraceptive aspects are merely a means of attracting users, of getting, as Pike put it, "people who are young to take a lot of stuff they wouldn't otherwise take."

  But Rock did not live long enough to understand how things might have been. What he witnessed, instead, was the terrible time at the end of the sixties when the Pill suddenly stood accused - wrongly - of causing blood clots, strokes, and heart attacks. Between the mid-seventies and the early eighties, the number of women in the United States using the Pill fell by half. Harvard Medical School, meanwhile, took over Rock's Reproductive Clinic and pushed him out. His Harvard pension paid him only seventy-five dollars a year. He had almost no money in the bank and had to sell his house in Brookline. In 1971, Rock left Boston and retreated to a farmhouse in the hills of New Hampshire. He swam in the stream behind the house. He listened to John Philip Sousa marches. In the evening, he would sit in the living room with a pitcher of martinis. In 1983, he gave his last public interview, and it was as if the memory of his achievements was now so painful that he had blotted it out.

  He was asked what the most gratifying time of his life was. "Right now," the inventor of the Pill answered, incredibly. He was sitting by the fire in a crisp white shirt and tie, reading "The Origin," Irving Stone's fictional account of the life of Darwin. "It frequently occurs to me, gosh, what a lucky guy I am. I have no responsibilities, and I have everything I want. I take a dose of equanimity every twenty minutes. I will not be disturbed about things."

  Once, John Rock had gone to seven-o'clock Mass every morning and kept a crucifix above his desk. His interviewer, the writer Sara Davidson, moved her chair closer to his and asked him whether he still believed in an afterlife.

  "Of course I don't," Rock answered abruptly. Though he didn't explain why, his reasons aren't hard to imagine. The church could not square the requirements of its faith with the results of his science, and if the church couldn't reconcile them how could Rock be expected to? John Rock always stuck to his conscience, and in the end his conscience forced him away from the thing he loved most. This was not John Rock's error. Nor was it his church's. It was the fault of the haphazard nature of science, which all too often produces progress in advance of understanding. If the order of events in the discovery of what was natural had been reversed, his world, and our world, too, would have been a different place.

  "Heaven and Hell, Rome, all the Church stuff - that's for the solace of the multitude," Rock said. He had only a year to live. "I was an ardent practicing Catholic for a long time, and I really believed it all then, you see."

  Do our first three years of life determine how we'll turn out?

  1.

  In April of 1997, Hillary Clinton was the host of a daylong conference at the White House entitled "What New Research on the Brain Tells Us About Our Youngest Children." In her opening remarks, which were beamed live by satellite to nearly a hundred hospitals, universities, and schools, in thirty-seven states, Mrs. Clinton said, "Fifteen years ago, we thought that a baby's brain structure was virtually complete at birth." She went on:

  Now we understand that it is a work in progress, and that everything we do with a child has some kind of potential physical influence on that rapidly forming brain. A child's earliest experiences - their relationships with parents and caregivers, the sights and sounds and smells and feelings they encounter, the challenges they meet - determine how their brains are wired. . . . These experiences can determine whether children will grow up to be peaceful or violent citizens, focussed or undisciplined workers, attentive or detached parents themselves.

  At the afternoon session of the conference, the keynote speech was given by the director turned children's advocate Rob Reiner. His goal, Reiner told the assembled, was to get the public to "look through the prism" of the first three years of life "in terms of problem solving at every level of society":

  If we want to have a real significant impact, not only on children's success in school and later on in life, healthy relationships, but also an impact on reduction in crime, teen pregnancy, drug abuse, child abuse, welfare, homelessness, and a variety of other social ills, we are going to have to address the first three years of life. There is no getting around it. All roads lead to Rome.

  The message of the conference was at once hopeful and a little alarming. On the one hand, it suggested that the right kind of parenting during those first three years could have a lasting effect on a child's life; on the other hand, it implied that if we missed this opportunity the resulting damage might well be permanent. Today, there is a zero-to-three movement, made up of advocacy groups and policymakers like Hillary Clinton, which uses the promise and the threat of this new brain science to push for better pediatric care, early childhood education, and day care. Reiner has started something called the I Am Your Child Foundation, devoted to this cause, and has enlisted the support of, among others, Tom Hanks, Robin Williams, Billy Crystal, Charlton Heston, and Rosie O'Donnell. Some lawmakers now wonder whether programs like Head Start ought to be drastically retooled, to focus on babies and toddlers rather than on preschoolers. The state of California recently approved a fifty-cent-per-pack tax on cigarettes to fund programs aimed at improving care for babies and toddlers up to the age of five. The state governments of Georgia and Tennessee send classical-music CDs home from the hospital with every baby, and Florida requires that day-care centers play classical music every day - all in the belief that Mozart will help babies build their minds in this critical window of development. "During the first part of the twentieth century, science built a strong foundation for the physical health of our children," Mrs. Clinton said in her speech that morning. "The last years of this century are yielding similar breakthroughs for the brain. We are . . . coming closer to the day when we should be able to insure the well-being of children in every domain - physical, social, intellectual, and emotional."

  The First Lady took pains not to make the day's message sound too extreme. "I hope that this does not create the impression that, once a child's third birthday rolls around, the important work is over," she said, adding that much of the brain's emotional wiring isn't completed until adolescence, and that children never stop needing the love and care of their parents. Still, there was something odd about the proceedings. This was supposed to be a meeting devoted to new findings in brain science, but hardly any of the brain science that was discussed was new. In fact, only a modest amount of brain science was discussed at all. Many of the speakers were from the worlds of education and policy. Then, there was Mrs. Clinton's claim that the experiences of our first few years could "determine" whether we grow up to be peaceful or violent, focussed or undisciplined. We tend to think that the environmental influences upon the way we turn out are the sum of a lifetime of experiences - that someone is disciplined because he spent four years in the Marines, or because he got up every morning as a teen-ager to train with the swim team. But Hillary Clinton was proposing that we direct our attention instead to what happens to children in a very brief window early in life. The First Lady, now a candidate for the United States Senate, is associating herself with a curious theory of human development. Where did this idea come from? And is it true?

  2.

  John Bruer tackles both these questions in his new book, "The Myth of the First Three Years" (Free Press; $25). From its title, Bruer's work sounds like a rant. It isn't. Noting the cultural clout of the zero-to-three idea, Bruer, who heads a medical-research foundation in St. Louis, sets out to compare what people like Rob Reiner and Hillary Clinton are saying to what neuroscientists have actually concluded. The result is a superb book, clear and engaging, that serves as both popular science and intellectual history.

  Mrs. Clinton and her allies, Bruer writes, are correct in their premise: the brain at birth is a work in progress. Relatively few connections among its billions of cells have yet been established. In the first few years of life, the brain begins to wire itself up at a furious pace, forming hundreds of thousands, even millions, of new synapses every second. Infants produce so many new neural connections, so quickly, that the brain of a two-year-old is actually far more dense with neural connections than the brain of an adult. After three, that burst of activity seems to slow down, and our brain begins the long task of rationalizing its communications network, finding those connections which seem to be the most important and getting rid of the rest.

  During this brief initial period of synaptic "exuberance," the brain is especially sensitive to its environment. David Hubel and Torsten Wiesel, in a famous experiment, sewed one of the eyes of a kitten shut for the first three months of its life, and when they opened it back up they found that the animal was permanently blind in that eye. There are critical periods early in life, then, when the brain will not develop properly unless it receives a certain amount of outside stimulation. In another series of experiments, begun in the early seventies, William Greenough, a psychologist at the University of Illinois, showed that a rat reared in a large, toy-filled cage with other rats ended up with a substantially more developed visual cortex than a rat that spent its first month alone in a small, barren cage: the brain, to use the word favored by neuroscientists, is plastic - that is, modifiable by experience. In other words, Hillary Clinton's violent citizens and unfocussed workers might seem to be the human equivalents of kittens who've had an eye sewed shut, or rats who've been reared in a barren cage. If in the critical first three years of synapse formation we could give people the equivalent of a big cage full of toys, she was saying, we could make them healthier and smarter.

  Put this way, these ideas sound quite reasonable, and it's easy to see why they have attracted such excitement. But Bruer's contribution is to show how, on several critical points, this account of child development exaggerates or misinterprets the available evidence.

  Consider, he says, the matter of synapse formation. The zero-to-three activists are convinced that the number of synapses we form in our earliest years plays a big role in determining our mental capacity. But do we know that to be true? People with a form of mental retardation known as fragile-X syndrome, Bruer notes, have higher numbers of synapses in their brain than the rest of us. More important, the period in which humans gain real intellectual maturity is late adolescence, by which time the brain is aggressively pruning the number of connections. Is intelligence associated with how many synapses you have or with how efficiently you manage to sort out and make sense of those connections later in life? Nor do we know how dependent the initial burst of synapse formation is on environmental stimulation. Bruer writes of an experiment where the right hand of a monkey was restrained in a leather mitten from birth to four months, effectively limiting all sensory stimulation. That's the same period when young monkeys form enormous numbers of connections in the somatosensory cortex, the area of the monkey brain responsible for size and texture discriminations, so you'd think that the restrained hand would be impaired. But it wasn't: within a short time, it was functioning normally, which suggests that there is a lot more flexibility and resilience in some aspects of brain development than we might have imagined.

  Bruer also takes up the question of early childhood as a developmental window. It makes sense that if children don't hear language by the age of eleven or twelve they aren't going to speak, and that children who are seriously neglected throughout their upbringing will suffer permanent emotional injury. But why, Bruer asks, did advocates arrive at three years of age as a cutoff point? Different parts of the brain develop at different speeds. The rate of synapse formation in our visual cortex peaks at around three or four months. The synapses in our prefrontal cortex - the parts of our brain involved in the most sophisticated cognitive tasks - peak perhaps as late as three years, and aren't pruned back until middle-to-late adolescence. How can the same cutoff apply to both regions?

  Greenough's rat experiments are used to support the critical-window idea, because he showed that he could affect brain development in those early years by altering the environment of his animals. The implications of the experiment aren't so straightforward, though. The experiments began when the rats were about three weeks old, which is already past rat "infancy," and continued until they were fifty-five days old, which put them past puberty. So the experiment showed the neurological consequences of deprivation not during some critical window of infancy but during the creature's entire period of maturation. In fact, when Greenough repeated his experiment with rats that were four hundred and fifty days old - well past middle age - he found that those kept in complex environments once again had significantly denser neural connections than those kept in isolation.

  Even the meaning of the kitten with its eye sewn shut turns out to be far from obvious. When that work was repeated on monkeys, researchers found that if they deprived both eyes of early stimulation - rearing a monkey in darkness for its first six months - the animal could see (although not perfectly), and the binocularity of its vision, the ability of its left and right eyes to coördinate images, was normal. The experiment doesn't show that more stimulation is better than less for binocular vision. It just suggests that whatever stimulation there is should be balanced, which is why closing one eye tilts the developmental process in favor of the open eye.

  To say that the brain is plastic, then, is not to say that the brain is dependent on certain narrow windows of stimulation. Neuroscientists say instead that infant brains have "experience-expectant plasticity" - which means that they need only something that approximates a normal environment. Bruer writes:

  The odds that our children will end up with appropriately fine-tuned brains are incredibly favorable, because the stimuli the brain expects during critical periods are the kinds of stimuli that occur everywhere and all the time within the normal developmental environment for our species. It is only when there are severe genetic or environmental aberrations from the normal that nature's expectations are frustrated and neural development goes awry.

  In the case of monkeys, the only way to destroy their binocular vision is to sew one eye shut for six months - an entirely contrived act that would almost never happen in the wild. Greenough points out that the "complex" environment he created for his rats - a large cage full of toys and other animals - is actually the closest equivalent of the environment that a rat would encounter naturally. When he created a super-enriched environment for his rats, one with even more stimulation than they would normally encounter, the rats weren't any better off. The only way he could affect the neurological development of the animals was to put them in a barren cage by themselves - again, a situation that an animal would never encounter in the wild. Bruer quotes Steve Petersen, a neuroscientist at Washington University, in St. Louis, as saying that neurological development so badly wants to happen that his only advice to parents would be "Don't raise your children in a closet, starve them, or hit them in the head with a frying pan." Petersen was, of course, being flip. But the general conclusion of researchers seems to be that we human beings enjoy a fairly significant margin of error in our first few years of life. Studies done of Romanian orphans who spent their first year under conditions of severe deprivation suggest that most (but not all) can recover if adopted into a nurturing home. In another study, psychologists examined children from an overcrowded orphanage who had been badly neglected as infants and subsequently adopted into loving homes. Within two years of their adoption, one psychologist involved in their rehabilitation had concluded:

  We had not anticipated the older children who had suffered deprivations for periods of 2½ to 4 years to show swift response to treatment. That they did so amazed us. These inarticulate, underdeveloped youngsters who had formed no relationships in their lives, who were aimless and without a capacity to concentrate on anything, had resembled a pack of animals more than a group of human beings. . . . As we worked with the children, it became apparent that their inadequacy was not the result of damage but, rather, was due to a dearth of normal experiences without which development of human qualities is impossible. After a year of treatment, many of these older children were showing a trusting dependency toward the staff of volunteers and . . . self-reliance in play and routines.

  3.

  Some years ago, the Berkeley psychology professor Alison Gopnik and one of her students, Betty Repacholi, conducted an experiment with a series of fourteen-month-old toddlers. Repacholi showed the babies two bowls of food, one filled with Goldfish crackers and one filled with raw broccoli. All the babies, naturally, preferred the crackers. Repacholi then tasted the two foods, saying "Yuck" and making a disgusted face at one and saying "Yum" and making a delighted face at the other. Then she pushed both bowls toward the babies, stretched out her hand, and said, "Could you give me some?"

  When she liked the crackers, the babies gave her crackers. No surprise there. But when Repacholi liked the broccoli and hated the crackers, the babies were presented with a difficult philosophical issue - that different people may have different, even conflicting, desires. The fourteen-month-olds couldn't grasp that. They thought that if they liked crackers everyone liked crackers, and so they gave Repacholi the crackers, despite her expressed preferences. Four months later, the babies had, by and large, figured this principle out, and when Repacholi made a face at the crackers they knew enough to give her the broccoli. "The Scientist in the Crib" (Morrow; $24), a fascinating new book that Gopnik has written with Patricia Kuhl and Andrew Meltzoff, both at the University of Washington, argues that the discovery of this principle - that different people have different desires - is the source of the so-called terrible twos. "What makes the terrible twos so terrible is not that the babies do things you don't want them to do - one-year-olds are plenty good at that - but that they do things because you don't want them to," the authors write. And why is that? Not, as is commonly thought, because toddlers want to test parental authority, or because they're just contrary. Instead, the book argues, the terrible twos represent a rational and engaged exploration of what is to two-year-olds a brand-new idea - a generalization of the insight that the fact that they hate broccoli and like crackers doesn't mean that everyone hates broccoli and likes crackers. "Toddlers are systematically testing the dimensions on which their desires and the desires of others may be in conflict," the authors write. Infancy is an experimental research program, in which "the child is the budding psychologist; we parents are the laboratory rats."

  These ideas about child development are, when you think about it, oddly complementary to the neurological arguments of John Bruer. The paradox of the zero-to-three movement is that, for all its emphasis on how alive children's brains are during their early years, it views babies as profoundly passive - as hostage to the quality of the experiences provided for them by their parents and caregivers. "The Scientist in the Crib" shows us something quite different. Children are scientists, who develop theories and interpret evidence from the world around them in accordance with those theories. And when evidence starts to mount suggesting that the existing theory isn't correct - wait a minute, just because I like crackers doesn't mean Mommy likes crackers - they create a new theory to explain the world, just as a physicist would if confronted with new evidence on the relation of energy and matter. Gopnik, Meltzoff, and Kuhl play with this idea at some length. Science, they suggest, is actually a kind of institutionalized childhood, an attempt to harness abilities that evolved to be used by babies or young children. Ultimately, the argument suggests that child development is a rational process directed and propelled by the child himself. How does the child learn about different desires? By systematically and repeatedly provoking a response from adults. In the broccoli experiment, the adult provided the fourteen-month-old with the information ("I hate Goldfish crackers") necessary to make the right decision. But the child ignored that information until he himself had developed a theory to interpret it. When "The Scientist in the Crib" describes children as budding psychologists and adults as laboratory rats, it's more than a clever turn of phrase. Gopnik, Meltzoff, and Kuhl observe that our influence on infants "seems to work in concert with children's own learning abilities." Newborns will "imitate facial expressions" but not "complex actions they don't understand themselves." And the authors conclude, "Children won't take in what you tell them until it makes sense to them. Other people don't simply shape what children do; parents aren't the programmers. Instead, they seem designed to provide just the right sort of information."

  It isn't until you read "The Scientist in the Crib" alongside more conventional child-development books that you begin to appreciate the full implications of its argument. Here, for example, is a passage from "What's Going On in There? How the Brain and Mind Develop in the First Five Years of Life," by Lise Eliot, who teaches at the Chicago Medical School: "It's important to avoid the kind of muddled baby-talk that turns a sentence like 'Is she the cutest little baby in the world?' into 'Uz see da cooest wiwo baby inna wowud?' Caregivers should try to enunciate clearly when speaking to babies and young children, giving them the cleanest, simplest model of speech possible." Gopnik, Meltzoff, and Kuhl see things a little differently. First, they point out, by six or seven months babies are already highly adept at decoding the sounds they hear around them, using the same skills we do when we talk to someone with a thick foreign accent or a bad cold. If you say "Uz see da cooest wiwo baby inna wowud?" they hear something like "Is she the cutest little baby in the world?" Perhaps more important, this sort of Motherese - with its elongated vowels and repetitions and overpronounced syllables - is just the thing for babies to develop their language skills. And Motherese, the authors point out, seems to be innate. It's found in every culture in the world, and anyone who speaks to a baby uses it, automatically, even without realizing it. Babies want Motherese, so they manage to elicit it from the rest of us. That's a long way from the passive baby who thrives only because of the specialized, high-end parenting skills of the caregiver. "One thing that science tells us is that nature has designed us to teach babies, as much as it has designed babies to learn," Gopnik, Meltzoff, and Kuhl write. "Almost all of the adult actions we've described" - actions that are critical for the cognitive development of babies - "are swift, spontaneous, automatic and unpremeditated."

  4.

  Does it matter that Mrs. Clinton and her allies have misread the evidence on child development? In one sense, it doesn't. The First Lady does not claim to be a neuroscientist. She is a politician, and she is interested in the brains of children only to further an entirely worthy agenda: improved day care, pediatric care, and early-childhood education. Sooner or later, however, bad justifications for social policy can start to make for bad social policy, and that is the real danger of the zero-to-three movement.

  In Lise Eliot's book, for instance, there's a short passage in which she writes of the extraordinary powers of imitation that infants possess. A fifteen-month-old who watches an experimenter lean over and touch his forehead to the top of a box will, when presented with that same box four months later, do exactly the same thing. "The fact that these memories last so long is truly remarkable - and a little bit frightening," Eliot writes, and she continues:

  It goes a long way toward explaining why children, even decades later, are so prone to replicating their parents' behavior. If toddlers can repeat, even several months later, actions they've seen only once or twice, just imagine how watching their parents' daily activities must affect them. Everything they see and hear over time - work, play, fighting, smoking, drinking, reading, hitting, laughing, words, phrases, and gestures - is stored in ways that shape their later actions, and the more they see of a particular behavior, the likelier it is to reappear in their own conduct.

  There is something to this. Why we act the way we do is obviously the result of all kinds of influences and experiences, including those cues we pick up unconsciously as babies. But this doesn't mean, as Eliot seems to think it does, that you can draw a straight line between a concrete adult behavior and what little Suzie, at six months, saw her mother do. As far as we can tell, for instance, infant imitation has nothing to do with smoking. As the behavioral geneticist David Rowe has demonstrated, the children of smokers are more likely than others to take up the habit because of genetics: they have inherited the same genes that made their parents like, and be easily addicted to, nicotine. Once you account for heredity, there is little evidence that parental smoking habits influence children; the adopted children of smokers, for instance, are no more likely to smoke than the children of non-smokers. To the extent that social imitation is a factor in smoking, the psychologist Judith Rich Harris has observed, it is imitation that occurs in adolescence between a teen-ager and his or her peers. So if you were to use Eliot's ideas to design an anti-smoking campaign you'd direct your efforts to stop parents from smoking around their children, and miss the social roots of smoking entirely.

  This point - the distance between infant experience and grownup behavior - is made even more powerfully in Jerome Kagan's marvellous new book, "Three Seductive Ideas" (Oxford; $27.50). Kagan, a professor of psychology at Harvard, offers a devastating critique of what he calls "infant determinism," arguing that many of the truly critical moments of socialization - the moments that social policy properly concerns itself with - occur well after the age of three. As Kagan puts it, a person's level of "anxiety, depression, apathy and anger" is linked to his or her "symbolic constructions of experience" - how the bare facts of any experience are combined with the context of that event, attitudes toward those involved, expectations and memories of past experience. "The Palestinian youths who throw stones at Israeli soldiers believe that the Israeli government has oppressed them unjustly," Kagan writes. He goes on:

  The causes of their violent actions are not traceable to the parental treatment they received in their first few years. Similarly, no happy African-American two-year-old knows about the pockets of racism in American society or the history of oppression blacks have suffered. The realization that there is prejudice will not take form until that child is five or six years old.

  Infant determinism doesn't just encourage the wrong kind of policy. Ultimately, it undermines the basis of social policy. Why bother spending money trying to help older children or adults if the patterns of a lifetime are already, irremediably, in place? Inevitably, some people will interpret the zero-to-three dogma to mean that our obligations to the disadvantaged expire by the time they reach the age of three. Kagan writes of a famous Hawaiian study of child development, in which almost seven hundred children, from a variety of ethnic and economic backgrounds, were followed from birth to adulthood. The best predictor of who would develop serious academic or behavioral problems in adolescence, he writes, was social class: more than eighty per cent of the children who got in trouble came from the poorest segment of the sample. This is the harsh reality of child development, from which the zero-to-three movement offers a convenient escape. Kagan writes, "It is considerably more expensive to improve the quality of housing, education and health of the approximately one million children living in poverty in America today than to urge their mothers to kiss, talk to, and play with them more consistently." In his view, "to suggest to poor parents that playing with and talking to their infant will protect the child from future academic failure and guarantee life success" is an act of dishonesty. But that does not go far enough. It is also an unwitting act of reproach: it implies to disadvantaged parents that if their children do not turn out the way children of privilege do it is their fault - that they are likely to blame for the flawed wiring of their children's brains.

  5.

  In 1973, when Hillary Clinton - then, of course, known as Hillary Rodham - was a young woman just out of law school, she wrote an essay for the Harvard Educational Review entitled "Children Under the Law." The courts, she wrote, ought to reverse their long-standing presumption that children are legally incompetent. She urged, instead, that children's interests be considered independently from those of their parents. Children ought to be deemed capable of making their own decisions and voicing their own interests, unless evidence could be found to the contrary. To her, the presumption of incompetence gave the courts too much discretion in deciding what was in the child's best interests, and that discretion was most often abused in cases of children from poor minority families. "Children of these families," she wrote, "are perceived as bearers of the sins and disabilities of their fathers."

  This is a liberal argument, because a central tenet of liberalism is that social mobility requires a release not merely from burdens imposed by poverty but also from those imposed by family - that absent or indifferent or incompetent parents should not be permitted to destroy a child's prospects. What else was the classic Horatio Alger story about? In "Ragged Dick," the most famous of Alger's novels, Dick's father runs off before his son's birth, and his mother dies destitute while Dick is still a baby. He becomes a street urchin, before rising to the middle class through a combination of hard work, honesty, and luck. What made such tales so powerful was, in part, the hopeful notion that the circumstances of your birth need not be your destiny; and the modern liberal state has been an attempt to make good on that promise.

  But Mrs. Clinton is now promoting a movement with a different message - that who you are and what you are capable of could be the result of how successful your mother and father were in rearing you. In her book "It Takes a Village," she criticizes the harsh genetic determinism of "The Bell Curve." But an ideology that holds that your future is largely decided at birth by your parents' genes is no more dispiriting than one that holds that your future might be decided at three by your parents' behavior. The unintended consequence of the zero-to-three movement is that, once again, it makes disadvantaged children the bearers of the sins and disabilities of their parents.

  The truth is that the traditional aims of the liberal agenda find ample support in the arguments of John Bruer, of Jerome Kagan, of Judith Rich Harris, and of Gopnik, Meltzoff, and Kuhl. All of them offer considerable evidence that what the middle class perceives as inadequate parenting need not condemn a baby for life, and that institutions and interventions to help children as they approach maturity can make a big difference in how they turn out. It is, surely, a sad irony that, at the very moment when science has provided the intellectual reinforcement for modern liberalism, liberals themselves are giving up the fight.

  Don't believe the Internet hype: the real E-commerce revolution happened off-line.

  1.

  At the turn of this century, a Missouri farmer named D. Ward King invented a device that came to be known, in his honor, as the King Road Drag. It consisted of two wooden rails that lay side by side about three feet apart, attached by a series of wooden braces. If you pulled the King Drag along a muddy road, it had the almost magical effect of smoothing out the ruts and molding the dirt into a slight crown, so that the next time it rained the water would drain off to the sides. In 1906, when King demonstrated his device to a group of farmers in Wellsville, Kansas, the locals went out and built a hundred King Drags of their own within the week, which makes sense, because if you had asked a farmer at the turn of the century what single invention could make his life easier he would probably have wanted something that improved the roads. They were, in the late nineteenth century, a disaster: of the country's two million miles of roads, fewer than a hundred and fifty thousand had been upgraded with gravel or oil. The rest were dirt. They turned into rivers of mud when it was raining, and hardened into an impassable sea of ruts when it was not. A trip to church or to the store was an exhausting ordeal for many farmers. At one point in the early part of this century, economists estimated that it cost more to haul a bushel of wheat along ten miles of American dirt road than it did to ship it across the ocean from New York to Liverpool.

  The King Road Drag was a simple invention that had the effect of reducing the isolation of the American farmer, and soon that simple invention led to all kinds of dramatic changes. Ever since the Post Office was established, for example, farmers had to make the difficult trek into town to pick up their mail. In the eighteen-nineties, Congress pledged that mail would be delivered free to every farmer's home, but only so long as rural communities could demonstrate that their roads were good enough for a mailman to pass by every day - which was a Catch-22 neatly resolved by the King Road Drag. And once you had rural free delivery and good roads, something like parcel post became inevitable. Through the beginning of the century, all packages that weighed more than four pounds were carried by private-express services, which were unreliable and expensive and would, outside big cities, deliver only to a set of depots. But if the mail was being delivered every day to rural dwellers, why not have the mailman deliver packages, too? In 1912, Congress agreed, and with that the age of the mail-order house began: now a farmer could look through a catalogue that contained many thousands of products and have them delivered right to his door. Smaller companies, with limited resources, had a way to bypass the middleman and reach customers all over the country. You no longer needed to sell to the consumer through actual stores made of bricks and mortar. You could build a virtual store!

  In the first fifteen years of this century, in other words, America underwent something of a revolution. Before rural free delivery, if you didn't live in a town - and most Americans didn't - it wasn't really practical to get a daily newspaper. It was only after daily delivery that the country became "wired," in the sense that if something happened in Washington or France or the Congo one evening, everyone would know about it by the next morning. In 1898, mailmen were delivering about eighteen thousand pieces of mail per rural route. Within five years, that number had more than doubled, and by 1929 it had topped a hundred thousand.

  Here was the dawn of the modern consumer economy - an economy in which information moved freely around the country, in which retailers and consumers, buyers and sellers became truly connected for the first time. "You may go to an average store, spend valuable time and select from a limited stock at retail prices," the fall 1915 Sears, Roebuck catalogue boasted, "or have our Big Store of World Wide Stocks at Economy Prices come to you in this catalog - the Modern Way." By the turn of the century, the Sears catalogue had run to over a thousand pages, listing tens of thousands of items in twenty-four departments: music, buggies, stoves, carriage hardware, drugs, vehicles, shoes, notions, sewing machines, cloaks, sporting goods, dry goods, hardware, groceries, furniture and baby carriages, jewelry, optical goods, books, stereopticons, men's clothing, men's furnishings, bicycles, gramophones, and harnesses. Each page was a distinct site, offering a reader in-depth explanations and descriptions well beyond what he would expect if he went to a store, talked to a sales clerk, and personally examined a product. To find all those products, the company employed scores of human search engines - "missionaries" who, the historians Boris Emmet and John Jeuck write, were "said to travel constantly, inspecting the stocks of virtually all retail establishments in the country, conversing with the public at large to discover their needs and desires, and buying goods 'of all kinds and descriptions'" in order to post them on the World Wide Stock.

  The catalogue, as economists have argued, represented a radical transformation in the marketing and distribution of consumer goods. But, of course, that transformation would not have been possible unless you had parcel post, and you couldn't have had parcel post unless you had rural free delivery, and you could not have had rural free delivery without good roads, and you would not have had good roads without D. Ward King. So what was the genuine revolution? Was it the World Wide Stock or was it the King Road Drag?

  2.

  We are now, it is said, in the midst of another business revolution. "This new economy represents a tectonic upheaval in our commonwealth, a far more turbulent reordering than mere digital hardware has produced," Kevin Kelly, a former executive editor of Wired, writes in his book "New Rules for the New Economy." In "Cyber Rules," the software entrepreneurs Thomas M. Siebel and Pat House compare the advent of the Internet to the invention of writing, the appearance of a metal currency in the eastern Mediterranean several thousand years ago, and the adoption of the Arabic zero. "Business," Bill Gates states flatly in the opening sentence of "Business @ the Speed of Thought," "is going to change more in the next ten years than it has in the last fifty."

  The revolution of today, however, turns out to be as difficult to define as the revolution of a hundred years ago. Kelly, for example, writes that because of the Internet "the new economy is about communication, deep and wide." Communication, he maintains, "is not just a sector of the economy. Communication is the economy." But which is really key - how we communicate, or what we communicate? Gates, meanwhile, is preoccupied with the speed of interaction in the new economy. Going digital, he writes, will "shatter the old way of doing business" because it will permit almost instant communication. Yet why is the critical factor how quickly I communicate some decision or message to you - as opposed to how long it takes me to make that decision, or how long it takes you to act on it? Gates called his book "Business @ the Speed of Thought," but thought is a slow and messy thing. Computers do nothing to speed up our thought process; they only make it a lot faster to communicate our thoughts once we've had them. Gates should have called his book "Business @ the Speed of Typing." In "Growing Up Digital," Don Tapscott even goes so far as to claim that the rise of the Internet has created an entirely new personality among the young. N-Geners, as Tapscott dubs the generation, have a different set of assumptions about work than their parents have:

  They thrive on collaboration, and many find the notion of a boss somewhat bizarre. . . . They are driven to innovate and have a mindset of immediacy requiring fast results. They love hard work because working, learning, and playing are the same thing to them. They are creative in ways their parents could only imagine. . . . Corporations who hire them should be prepared to have their windows and walls shaken.

  Let's leave aside the fact that the qualities Tapscott ascribes to the Net Generation - energy, a "mindset of immediacy," creativity, a resistance to authority, and (of all things) sharp differences in outlook from their parents - could safely have been ascribed to every upcoming generation in history. What's interesting here is the blithe assumption, which runs through so much of the thinking and talking about the Internet, that this new way of exchanging information must be at the root of all changes now sweeping through our economy and culture. In these last few weeks before Christmas, as the country's magazines and airwaves become crowded with advertisements for the fledgling class of dot coms, we may be tempted to concur. But is it possible that, once again, we've been dazzled by the catalogues and forgotten the roads?

  3.

  The world's largest on-line apparel retailer is Lands' End, in Wisconsin. Lands' End began in 1963 as a traditional mail-order company. It mailed you its catalogue, and you mailed back your order along with a check. Then, in the mid-nineteen-eighties, Lands' End, like the rest of the industry, reinvented itself. It mailed you its catalogue, and you telephoned an 800 number with your order and paid with a credit card. Now Lands' End has moved on line. In the first half of this year, E-commerce sales accounted for ten per cent of Lands' End's total business, up two hundred and fifty per cent from last year. What has this move to the Web meant?

  Lands' End has its headquarters in the tiny farming town of Dodgeville, about an hour's drive west of Madison, through the rolling Midwestern countryside. The main Lands' End campus is composed of half a dozen modern, low-slung buildings, clustered around a giant parking lot. In one of those buildings, there is a huge open room filled with hundreds of people sitting in front of computer terminals and wearing headsets. These are the people who take your orders. Since the bulk of Lands' End's business is still driven by the catalogue and the 800 number, most of those people are simply talking on the phone to telephone customers. But a growing percentage of the reps are now part of the company's Internet team, serving people who use the Lands' End Live feature on the company's Web site. Lands' End Live allows customers, with the click of a mouse, to start a live chat with a Lands' End representative or get a rep to call them at home, immediately.

  On a recent fall day, a Lands' End Live user - let's call her Betty - was talking to one of the company's customer-service reps, a tall, red-haired woman named Darcia. Betty was on the Lands' End Web site to buy a pair of sweatpants for her young daughter, and had phoned to ask a few questions.

  "What size did I order last year?" Betty asked. "I think I need one size bigger." Darcia looked up the record of Betty's purchase. Last year, she told Betty, she bought the same pants in big- kid's small.

  "I'm thinking medium or large," Betty said. She couldn't decide.

  "The medium is a ten or a twelve, really closer to a twelve," Darcia told her. "I'm thinking if you go to a large, it will throw you up to a sixteen, which is really big."

  Betty agreed. She wanted the medium. But now she had a question about delivery. It was Thursday morning, and she needed the pants by Tuesday. Darcia told her that the order would go out on Friday morning, and with U.P.S. second-day air she would almost certainly get it by Tuesday. They briefly discussed spending an extra six dollars for the premium, next-day service, but Darcia talked Betty out of it. It was only an eighteen-dollar order, after all.

  Betty hung up, her decision made, and completed her order on the Internet. Darcia started an on-line chat with a woman from the East Coast. Let's call her Carol. Carol wanted to buy the forty-nine-dollar attaché case but couldn't decide on a color. Darcia was partial to the dark olive, which she said was "a professional alternative to black." Carol seemed convinced, but she wanted the case monogrammed and there were eleven monogramming styles on the Web-site page.

  "Can I have a personal suggestion?" she wrote.

  "Sure," Darcia typed back. "Who is the case for?"

  "A conservative psychiatrist," Carol replied.

  Darcia suggested block initials, in black. Carol agreed, and sent the order in herself on the Internet. "All right," Darcia said, as she ended the chat. "She feels better." The exchange had taken twenty-three minutes.

  Notice that in each case the customer filled out the actual order herself and sent it in to the Lands' End computer electronically - which is, of course, the great promise of E-commerce. But that didn't make some human element irrelevant. The customers still needed Darcia for advice on colors, and styles, or for reassurance that their daughter was a medium and not a large. In each case, the sale was closed because that human interaction allayed the last-minute anxieties and doubts that so many of us have at the point of purchase. It's a mistake, in other words, to think that E-commerce will entirely automate the retail process. It just turns reps from order-takers into sales advisers.

  "One of the big fallacies when the Internet came along was that you could get these huge savings by eliminating customer- service costs," Bill Bass, the head of E-commerce for Lands' End, says. "People thought the Internet was self-service, like a gas station. But there are some things that you cannot program a computer to provide. People will still have questions, and what you get are much higher-level questions. Like, 'Can you help me come up with a gift?' And they take longer."

  Meanwhile, it turns out, Internet customers at Lands' End aren't much different from 800-number customers. Both groups average around a hundred dollars an order, and they have the same rate of returns. Call volume on the 800 numbers is highest on Mondays and Tuesdays, from ten in the morning until one in the afternoon. So is E-commerce volume. In the long term, of course, the hope is that the Web site will reduce dependence on the catalogue, and that would be a huge efficiency. Given that last year the company mailed two hundred and fifty million catalogues, costing about a dollar each, the potential savings could be enormous. And yet customers' orders on the Internet spike just after a new catalogue arrives at people's homes in exactly the same way that the 800-number business spikes just after the catalogue arrives. E-commerce users, it seems, need the same kind of visual, tangible prompting to use Lands' End as traditional customers. If Lands' End did all its business over the Internet, it would still have to send out something in the mail - a postcard or a bunch of fabric swatches or a slimmed-down catalogue. "We thought going into E-commerce it would be a different business," Tracy Schmit, an Internet analyst at the company, says. "But it's the same business, the same patterns, the same contacts. It's an extension of what we already do."

  4.

  Now consider what happens on what retailers call the "back end" - the customer-fulfillment side - of Lands' End's operations. Say you go to the company's Web site one afternoon and order a blue 32-16 oxford-cloth button-down shirt and a pair of size-9 Top-Siders. At midnight, the computer at Lands' End combines your order with all the other orders for the day: it lumps your shirt order with the hundred other orders, say, that came in for 32-16 blue oxford-cloth button-downs, and lumps your shoe order with the fifty other size-9 Top-Sider orders of the day. It then prints bar codes for every item, so each of those hundred shirts is assigned a sticker listing the location of blue oxford 32-16 shirts in the warehouse, the order that it belongs to, shipping information, and instructions for things like monogramming.

  The next morning, someone known as a "picker" finds the hundred oxford-cloth shirts in that size, yours among them, and puts a sticker on each one, as does another picker in the shoe area with the fifty size-9 Top-Siders. Each piece of merchandise is placed on a yellow plastic tray along an extensive conveyor belt, and as the belt passes underneath a bar-code scanner the computer reads the label and assembles your order. The tray with your shirt on it circles the room until it is directly above a bin that has been temporarily assigned to you, and then tilts, sending the package sliding downward. Later, when your shoes come gliding along on the belt, the computer reads the bar code on the box and sends the shoe box tumbling into the same bin. Then the merchandise is packed and placed on another conveyor belt, and a bar-code scanner sorts the packages once again, sending the New York-bound packages to the New York-bound U.P.S. truck, the Detroit packages to the Detroit truck, and so on.
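
  To make the mechanics of that overnight batching step concrete, here is a minimal sketch in Python. It is not Lands' End's actual software - the order numbers, SKU codes, and bin labels below are invented for illustration - but it shows the basic bookkeeping: the day's orders are regrouped by item so that a picker handles all identical items at once, and the bar-code label stuck on each unit lets the conveyor's scanners reunite the original orders, bin by bin, at the end.

from collections import defaultdict

# A minimal, hypothetical sketch of the overnight batching described above.
# The order numbers, SKU codes, and bin labels are invented for illustration.

def batch_orders(orders):
    """orders maps an order number to the list of items (SKUs) it contains.
    Returns one pick list per SKU, plus the temporary bin assigned to each order."""
    pick_lists = defaultdict(list)   # SKU -> labels for the picker to sticker on
    bins = {}                        # order number -> temporary warehouse bin
    for order_id, skus in orders.items():
        bins[order_id] = "BIN-%04d" % (len(bins) + 1)
        for sku in skus:
            # Each unit gets a bar-code label tying it back to its order and bin.
            pick_lists[sku].append({"sku": sku, "order": order_id, "bin": bins[order_id]})
    return pick_lists, bins

def route_item(label, bin_contents):
    """The conveyor scanner reads a label and tips the item into its order's bin."""
    bin_contents.setdefault(label["bin"], []).append(label["sku"])

# The day's orders: your shirt-and-shoes order is lumped in with everyone else's.
orders = {
    "A101": ["OXFORD-32-16-BLUE", "TOPSIDER-9"],
    "A102": ["OXFORD-32-16-BLUE"],
    "A103": ["TOPSIDER-9", "CHINO-34-32"],
}
pick_lists, bins = batch_orders(orders)
bin_contents = {}
for sku in pick_lists:                   # pickers work one SKU at a time
    for label in pick_lists[sku]:
        route_item(label, bin_contents)  # the scanner reunites each order
print(bin_contents)                      # every bin ends up holding one complete order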

  It's an extraordinary operation. When you stand in the middle of the Lands' End warehouse - while shirts and pants and sweaters and ties roll by at a rate that, at Christmas, can reach twenty-five thousand items an hour - you feel as if you're in Willy Wonka's chocolate factory. The warehouses are enormous buildings - as big, in all, as sixteen football fields - and the conveyor belts hang from the ceiling like giant pieces of industrial sculpture. Every so often, a belt lurches to a halt, and a little black scanner box reads the bar code and sends the package off again, directing it left or right or up or down, onto any number of separate sidings and overpasses. In the middle of one of the buildings, there is another huge room where thousands of pants, dangling from a jumbo-sized railing like a dry cleaner's rack, are sorted by color (so sewers don't have to change thread as often) and by style, then hemmed, pressed, bagged, and returned to the order-fulfillment chain - all within a day.

  This system isn't unique to Lands' End. If you went to L. L. Bean or J.Crew or, for that matter, a housewares-catalogue company like Pottery Barn, you'd find the same kind of system. It's what all modern, automated warehouses look like, and it is as much a part of E-commerce as a Web site. In fact, it is the more difficult part of E-commerce. Consider the problem of the Christmas rush. Lands' End records something like thirty per cent of its sales during November and December. A well-supported Web site can easily handle those extra hits, but for the rest of the operation that surge in business represents a considerable strain. Lands' End, for example, aims to respond to every phone call or Lands' End Live query within twenty seconds, and to ship out every order within twenty-four hours of its receipt. In August, those goals are easily met. But, to maintain that level of service in November and December, Lands' End must hire an extra twenty-six hundred people, increasing its normal payroll by more than fifty per cent. Since unemployment in the Madison area is hovering around one per cent, this requires elaborate planning: the company charters buses to bring in students from a nearby college, and has made a deal in the past with a local cheese factory to borrow its workforce for the rush. Employees from other parts of the company are conscripted to help out as pickers, while others act as "runners" in the customer-service department, walking up and down the aisles and jumping into any seat made vacant by someone taking a break. Even the structure of the warehouse is driven, in large part, by the demands of the holiday season. Before the popularization of the bar code, in the early nineteen-eighties, Lands' End used what is called an "order picking" method. That meant that the picker got your ticket, then went to the shirt room and got your shirt, and the shoe room and got your shoes, then put your order together. If another shoe-and-shirt order came over next, she would have to go back to the shirts and back to the shoes all over again. A good picker under the old system could pick between a hundred and fifty and a hundred and seventy-five pieces an hour. The new technique, known as "batch picking," is so much more efficient that a good picker can now retrieve between six hundred and seven hundred pieces an hour. Without bar codes, if you placed an order in mid-December, you'd be hard pressed to get it by Christmas.

  None of this is to minimize the significance of the Internet. Lands' End has a feature on its Web site which allows you to try clothes on a virtual image of yourself - a feature that is obviously not possible with a catalogue. The Web site can list all the company's merchandise, whereas a catalogue has space to list only a portion of the inventory. But how big a role does the Internet ultimately play in E-commerce? It doesn't much affect the cost of running a customer-service department. It reduces catalogue costs, but it doesn't eliminate traditional marketing, because you still have to remind people of your Web site. You still need to master batch picking. You still need the Willy Wonka warehouse. You still need dozens of sewers in the inseaming department, and deals with the local cheese factory, and buses to ship in students every November and December. The head of operations for Lands' End is a genial man in his fifties named Phil Schaecher, who works out of a panelled office decorated with paintings of ducks which overlooks the warehouse floor. When asked what he would do if he had to choose between the two great innovations of the past twenty years - the bar code, which has transformed the back end of his business, and the Internet, which is transforming the front end - Schaecher paused, for what seemed a long time. "I'd take the Internet," he said finally, toeing the line that all retailers follow these days. Then he smiled. "But of course if we lost bar codes I'd retire the next day."

  5.

  On a recent fall morning, a young woman named Charlene got a call from a shipping agent at a firm in Oak Creek, Wisconsin. Charlene is a dispatcher with a trucking company in Akron, Ohio, called Roberts Express. She sits in front of a computer with a telephone headset on, in a large crowded room filled with people in front of computers wearing headsets, not unlike the large crowded room at Lands' End. The shipping agent told Charlene that she had to get seven drums of paint to Muskegon, Michigan, as soon as possible. It was 11:25 a.m. Charlene told the agent she would call her back, and immediately typed those details into her computer, which relayed the message to the two-way-communications satellite that serves as the backbone for the Roberts transportation network. The Roberts satellite, in turn, "pinged" the fifteen hundred independent truckers that Roberts works with, and calculated how far each available vehicle was from the customer in Oak Creek. Those data were then analyzed by proprietary software, which sorted out the cost of the job and the distance between Muskegon and Oak Creek, and sifted through more than fifteen variables governing the optimal distribution of the fleet.

  This much - the satellite relay and the probability calculation - took a matter of seconds. The trip, Charlene's screen told her, was two hundred and seventy-four miles and would cost seven hundred and twenty-six dollars. The computer also gave her twenty-three candidates for the run, ranked in order of preference. The first, Charlene realized, was ineligible, because federal regulations limit the number of hours drivers can spend on the road. The second, she found out, was being held for another job. The third, according to the satellite, was fifty miles away, which was too far. But the fourth, a husband-and-wife team named Jerry and Ann Love, seemed ideal. They were just nineteen miles from Oak Creek. "I've worked with them before," Charlene said. "They're really nice people." At eleven-twenty-seven, Charlene sent the Loves an E-mail message, via satellite, that would show up instantly on the computer screens Roberts installs in the cabs of all its contractors. According to Roberts' rules, they had ten minutes to respond. "I'm going to give them a minute or two," Charlene said. There was no answer, so she called the Loves on their cell phone. Ann Love answered. "We'll do that," she said. Charlene chatted with her for a moment and then, as an afterthought, E-mailed the Loves again: "Thank you!" It was eleven-thirty.
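
  A rough sense of what the dispatch software is doing in those few seconds can be captured in a short sketch. This is a guess at the shape of the calculation, not Roberts' actual code - the real system reportedly weighs more than fifteen variables, while this one filters on just two constraints (remaining driving hours and jobs on hold) and then ranks the surviving trucks by their distance from the pickup. The field names, the fifty-mile cutoff, and the truck data are assumptions.

import math

# Hypothetical sketch of a Roberts-style dispatch ranking. The field names,
# the fifty-mile cutoff, and the data below are illustrative assumptions.
MAX_PICKUP_MILES = 50

def distance_miles(a, b):
    """Great-circle distance between two (latitude, longitude) points, in miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(h))

def rank_candidates(trucks, pickup):
    """Drop ineligible trucks, then rank the rest by how close they are to the pickup."""
    eligible = []
    for truck in trucks:
        if truck["hours_remaining"] <= 0:      # federal limit on driving hours
            continue
        if truck["held_for_other_job"]:
            continue
        dist = distance_miles(truck["position"], pickup)
        if dist > MAX_PICKUP_MILES:            # too far to reach the shipper quickly
            continue
        eligible.append((dist, truck["id"]))
    return [truck_id for dist, truck_id in sorted(eligible)]

# Oak Creek, Wisconsin, and three invented trucks "pinged" by the satellite.
pickup = (42.88, -87.86)
trucks = [
    {"id": "T-07", "position": (43.04, -87.91), "hours_remaining": 0, "held_for_other_job": False},
    {"id": "T-21", "position": (42.72, -87.95), "hours_remaining": 6, "held_for_other_job": True},
    {"id": "T-33", "position": (43.00, -88.00), "hours_remaining": 9, "held_for_other_job": False},
]
print(rank_candidates(trucks, pickup))  # -> ['T-33']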

  Trucking companies didn't work this way twenty years ago. But Roberts uses its state-of-the-art communications and computer deployment to give the shipping business a new level of precision. If your pickup location is within twenty-five miles of one of the company's express centers - and Roberts has express centers in most major North American cities - Roberts will pick up a package of almost any size within ninety minutes, and it will do so twenty-four hours a day, seven days a week. If the cargo is located between twenty-six and fifty miles from an express center, it will be picked up within two hours. More than half of those deliveries will be made by midnight of the same day. Another twenty-five per cent will be made by eight o'clock the next morning. Ninety-six per cent of all Roberts deliveries are made within fifteen minutes of the delivery time promised when the order is placed. Because of its satellite system, the company knows precisely, within yards, where your order is at all times. The minute the computer tells her your truck is running fifteen minutes behind, Charlene or one of her colleagues will call you to work out some kind of solution. Roberts has been known to charter planes or send in Huey helicopters to rescue time-sensitive cargo stranded in traffic or in a truck that has broken down. The result is a truck-based system so efficient that Roberts estimates it can outperform air freight at distances of up to seven hundred or eight hundred miles.

  Roberts, of course, isn't the only company to reinvent the delivery business over the past twenty years. In the same period, Federal Express has put together, from scratch, a network of six hundred and forty-three planes, forty-three thousand five hundred vehicles, fourteen hundred service centers, thirty-four thousand drop boxes, and a hundred and forty-eight thousand employees - all coordinated by satellite links and organized around a series of huge, automated, bar-code-driven Willy Wonka warehouses. Federal Express was even a pioneer in the development of aircraft antifog navigational equipment: if it absolutely, positively has to get there overnight, the weather can't be allowed to get in the way.

  E-commerce would be impossible without this extraordinary infrastructure. Would you care that you could order a new wardrobe with a few clicks of a mouse if the package took a couple of weeks to get to you? Lands' End has undergone three major changes over the past couple of decades. The first was the introduction of an 800 number, in 1978; the second was express delivery, in 1994; and the third was the introduction of a Web site, in 1995. The first two innovations cut the average transaction time - the time between the moment of ordering and the moment the goods are received - from three weeks to four days. The third innovation has cut the transaction time from four days to, well, four days.

  It isn't just that E-commerce depends on express mail; there's a sense in which E-commerce is express mail. Right now, billions of dollars are being spent around the country on so-called "last-mile delivery systems." Companies such as Webvan, in San Francisco, or Kozmo.com, in New York, are putting together networks of trucks and delivery personnel which can reach almost any home in their area within an hour. What if Webvan or Kozmo were somehow integrated into a huge, national, Roberts-style network of connected trucks? And what if that network were in turn integrated into the operations of a direct merchant like Lands' End? There may soon come a time when a customer from Northampton could order some shirts on LandsEnd.com at the height of the Christmas rush, knowing that the retailer's computer could survey its stock, assess its warehouse capabilities, "ping" a network of thousands of trucks it has at its disposal, look up how many other orders are going to his neck of the woods, check in with his local Kozmo or Webvan, and tell him, right then and there, precisely what time it could deliver those shirts to him that evening or the next morning. It's not hard to imagine, under such a system, that Lands' End's sales would soar; the gap between the instant gratification of a real store and the delayed gratification of a virtual store would narrow even further. It would be a revolution of sorts, a revolution of satellites, probability models, people in headsets, cell phones, truckers, logistics experts, bar codes, deals with the local cheese factory, and - oh yes, the Internet.

  The interesting question, of course, is why we persist in identifying the E-commerce boom as an Internet revolution. Part of the reason, perhaps, is simply the convenience of the word "Internet" as a shorthand for all the technological wizardry of the last few decades. But surely whom and what we choose to celebrate in any period of radical change says something about the things we value. This fall, for example, the Goodyear Tire & Rubber Company - a firm with sales of more than thirteen billion dollars - was dropped from the Dow Jones industrial average. After all, Goodyear runs factories, not Web sites. It is based in Akron, not in Silicon Valley. It is part of the highway highway, not the information highway. The manufacturing economy of the early twentieth century, from which Goodyear emerged, belonged to trade unions and blue-collar men. But ours is the first economic revolution in history that the educated classes have sought to claim as wholly their own, a revolution of Kevin Kelly's "communication" and Bill Gates's "thought" - the two activities for which the Net-Geners believe themselves to be uniquely qualified. Today's talkers and thinkers value the conception of ideas, not their fulfillment. They give credit to the catalogue, but not to the postman who delivered it, or to the road he travelled on. The new economy was supposed to erase all hierarchies. Instead, it has devised another one. On the front end, there are visionaries. On the back end, there are drones.

  6.

  One of the very first packages ever delivered by parcel post, in 1913, was an eight-pound crate of apples sent from New Jersey to President Wilson at the White House. The symbolism of that early delivery was deliberate. When the parcel post was established, the assumption was that it would be used by farmers as a way of sending their goods cheaply and directly to customers in the city. "Let us imagine that the Gotham family," one journalist wrote at the time,

  immured in the city by the demands of Father Gotham's business, knew that twice a week during the summer they could get from Farmer Ruralis, forty miles out in the country, a hamper of fresh-killed poultry, green peas, string beans, asparagus, strawberries, lettuce, cherries, summer squash, and what not; that the "sass" would be only a day from garden to table; that prices would be lower than market prices; that the cost of transportation would be only thirty-five cents in and, say, eleven cents for the empty hamper back again. Would the Gotham family be interested?

  The Post Office told rural mailmen to gather the names and addresses of all those farmers along their routes who wanted to sell their produce by mail. Those lists were given to city mailmen, who delivered them along their routes, so interested customers could get in contact with interested farmers directly. Because customers wanted to know what kind of produce each farmer had to sell, local postmasters began including merchandise information on their lists, essentially creating a farm-produce mail-order catalogue. A California merchant named David Lubin proposed a scheme whereby a farmer would pick up colored cards from the post office - white for eggs, pink for chickens, yellow for butter - mark each card with his prices, and mail the cards back. If he had three chickens that week for a dollar each, he would mail three pink cards to the post office. There they would be put in a pigeonhole with all the other pink cards. Customers could come by and comparison shop, pick out the cards they liked, write their address on these cards, and have the postal clerk mail them back to the farmer. It was a pre-digital eBay. The scheme was adopted in and around Sacramento, and Congress appropriated ten thousand dollars to try a similar version of it on a large scale.

  At about the same time, an assistant Postmaster General, James Blakslee, had the bright idea of putting together a fleet of parcel-post trucks, which would pick up farm produce from designated spots along the main roads and ship it directly to town. Blakslee laid out four thousand miles of produce routes around the country, to be covered by fifteen hundred parcel-post trucks. In 1918, in the system's inaugural run, four thousand day-old chicks, two hundred pounds of honey, five hundred pounds of smoked sausage, five hundred pounds of butter, and eighteen thousand eggs were carried from Lancaster, Pennsylvania, to New York City, all for $31.60 in postage. New York's Secretary of State called it "an epoch in the history of the United States and the world."

  Only, it wasn't. The Post Office had devised a wonderful way of communicating between farmer and customer. But there is more to a revolution than communication, and within a few years the farm-to-table movement, which started out with such high hopes, was dead. The problem was that Blakslee's trucks began to break down, which meant that the food on board spoiled. Eggs proved hard to package, and so they often arrived damaged. Butter went rancid. In the winter of 1919-20, Blakslee collected a huge number of orders for potatoes, but, as Wayne Fuller writes in his wonderful history of the era, "RFD: The Changing Face of Rural America," the potatoes that year were scarce, and good ones even scarcer, and when Blakslee's men were able to buy them and attempted delivery, nothing but trouble followed. Some of the potatoes were spoiled to begin with; some froze in transit; prices varied, deliveries went astray, and customers complained loudly enough for Congress to hear. One harried official wrote Blakslee that he could "fill the mails with complaints from people who have ordered potatoes from October to December." Some people had been waiting over four months, either to have the potatoes delivered or their money refunded.

  Parcel post, in the end, turned out to be something entirely different from what was originally envisioned - a means not to move farm goods from country to town but to move consumer goods from town to country. That is the first lesson from the revolution of a hundred years ago, and it's one that should give pause to all those eager to pronounce on the significance of the Internet age: the nature of revolutions is such that you never really know what they mean until they are over. The other lesson, of course, is that coming up with a new way of connecting buyers and sellers is a very fine thing, but what we care about most of all is getting our potatoes.

  The President came to Manhattan last Tuesday to speak at the United Nations, and, for the sake of his motorcade, the police closed down First Avenue between Thirty-fourth and Fifty-first Streets, and almost all the eastbound streets off Second Avenue in the Thirties and Forties - bringing life above ground on the East Side of midtown to a halt.

  In traffic terms, this was not as bad as the gridlock that occurred in June, when the President came to Times Square to see a special performance of "The Iceman Cometh" and the police barricaded the area around Forty-second Street and Eighth Avenue for most of the afternoon, and parts of Forty-seventh Street for the whole day. That particular trip immobilized the busiest and most crowded section of the nation's busiest and most crowded city at one of the busiest and most crowded times of the year, and created a truly historic traffic jam that rippled uptown, downtown, and crosstown, and was still untangling itself long after the President flew home. But a gridlock purist would probably argue that the Times Square visit was sui generis, a once-in-a-Presidency kind of thing. And, besides, one imagines that most New Yorkers didn't mind giving up a few hours of their day in the cause of a beleaguered President's R. and R. (particularly given the parlous state of Washington, D.C., theatre).

  Last week's U.N. visit is more typical and a bit more problematic. It seems that, once a month now, Mr. Clinton descends on the East Side, like a Chinese emperor, commanding all life surrounding him to cease. New Yorkers are famous for their impatience with the slightest inconvenience. Why on earth do they put up with this?

  The conventional answer is that the President is the President and his safety is paramount. But there is nothing particularly safe about the current system. Any terrorist with a missile launcher or a bomb can tell what route the President is taking by reading about the street closings in the morning papers, and can tell exactly when the President is coming by watching for the enormous parade of police motorcycles, limousines, and black Chevy Suburbans that make up his motorcade. If the President really wanted to be safe, he'd take advantage of the anonymity and spontaneity offered by the city; he'd slip into town by some circuitous route, in a hired black Lincoln with tinted windows. Then he'd be invisible. But that's just it: when the President drags his motorcade into Manhattan, he's not using an urban model of safety; he's using a suburban model. In the suburbs, you cannot lose yourself in a crowd of black Lincolns or disappear in a maze of side streets. In the suburbs, safety lies in surrounding yourself with police escorts and bulletproof S.U.V.s for the dash down the freeway.

  The tension between things urban and things suburban is a fixture of the race debate, of arguments over taxes and schools and transportation policy. But in some ways it matters most on this very practical and prosaic level, because when suburban ideas (about safety or, for that matter, about anything) are imported into the city they have the capacity to make the everyday lives of lots of people acutely miserable.

  On the same day last week that the President visited the U.N., for example, it rained in New York, and on the sidewalks of the city, here and there, were people carrying what are known in the accessories business as golf umbrellas: those huge umbrellas that range anywhere from three to almost five feet in diameter. A golf umbrella is, like the Presidential motorcade, a quintessentially suburban notion: it is supposed to be large enough to cover you and your clubs on the golf course, or to keep you and your children dry as you run across the mall parking lot to your station wagon. It is not meant for crowded city streets, and if you had followed all those people with golf umbrellas through midtown last Tuesday you would have seen their umbrellas getting tangled up with other people's umbrellas. You would have seen annoyed pedestrians stepping sharply out of the way to avoid being impaled by giant umbrella ribs, and, as a result, bumping into other people with umbrellas, until whole stretches of sidewalk - like the East Side in the grip of a White House visit - had turned into a sorry, congealed mess.

  Mr. Clinton, as both a golfer and a soon-to-be suburbanite, presumably has a few golf umbrellas of his own. One only hopes that when he comes into Manhattan as a private citizen he remembers where he is and, this time, at least, leaves that bit of the suburbs at home.

  How the Information Age could blow away the blockbuster.

  1.

  In 1992, a sometime actress named Rebecca Wells published a novel called "Little Altars Everywhere" with a small, now defunct press in Seattle. Wells was an unknown author, and the press had no money for publicity. She had a friend, however, who spent that Thanksgiving with a friend who was a producer of National Public Radio's "All Things Considered." The producer read the book and passed it on to Linda Wertheimer, a host of the show, and she liked it so much that she put Wells on her program. That interview, in turn, was heard by a man who was listening to the radio in Blytheville, Arkansas, and whose wife, Mary Gay Shipley, ran the town bookstore. He bought the book and gave it to her; she loved it, and, with that, the strange and improbable rise of Rebecca Wells, best-selling author, began.

  Blytheville is a sleepy little town about an hour or so up the Mississippi from Memphis, and Mary Gay Shipley's bookstore - That Bookstore in Blytheville - sits between the Red Ball Barber Shop and Westbrook's shoe store on a meandering stretch of Main Street. The store is just one long room in a slightly shabby storefront, with creaky floors and big overhead fans and subject headings on the shelves marked with Post-it notes. Shipley's fiction section takes up about as much shelf space as a typical Barnes & Noble devotes to, say, homeopathic medicine. That's because Shipley thinks that a book buyer ought to be able to browse and read the jacket flap of everything that might catch her eye, without being overwhelmed by thousands of choices. Mostly, though, people come to Mary Gay Shipley's store in order to find out what Mary Gay thinks they ought to be reading, and in 1993 Mary Gay Shipley thought people ought to be reading "Little Altars Everywhere." She began ordering it by the dozen, which, Shipley says, "for us, is huge." She put it in the little rack out front where she lists her current favorites. She wrote about it in the newsletter she sends to her regular customers. "We could tell it was going to have a lot of word of mouth," she says. "It was the kind of book where you could say, 'You'll love it. Take it home.' " The No. 1 author at That Bookstore in Blytheville in 1993 was John Grisham, as was the case in nearly every bookstore in the country. But No. 2 was Rebecca Wells.

  "Little Altars Everywhere" was not a best-seller. But there were pockets of devotees around the country - in Blytheville; at the Garden District Book Shop, in New Orleans; at Parkplace books, in Kirkland, Washington - and those pockets created a buzz that eventually reached Diane Reverand, an editor in New York. Reverand published Wells's next book, "Divine Secrets of the Ya-Ya Sisterhood," and when it hit the bookshelves the readers and booksellers of Blytheville, the Garden District, and Kirkland were ready. "When 'The Ya-Ya Sisterhood' came out, I met with an in-store sales rep from HarperCollins," Shipley said. She is a tall woman with graying hair and a quiet, dignified bearing. "I'm not real sure he knew what a hot book this was. When he came in the store, I just turned the page of the catalogue and said, 'I want one hundred copies,' and his jaw fell to the table, because I usually order four or two or one. And I said, 'I want her to come here! And if you go anywhere, tell people this woman sells in Blytheville!'"

  Wells made the trip to Arkansas and read in the back of Shipley's store; the house was packed, and the women in the front row wore placards saying "Ya-Ya." She toured the country, and the crowds grew steadily bigger. "Before the numbers really showed it, I'd be signing books and there would be groups of women who would come together, six or seven, and they would have me sign anywhere between three and ten books," Wells recalls. "And then, after that, I started noticing mothers and daughters coming. Then I noticed that the crowds started to be three-generational - there would be teen-agers and sixth graders." "Ya-Ya" sold fifteen thousand copies in hardcover. The paperback sold thirty thousand copies in its first two months. Diane Reverand took out a single-column ad next to the contents page of The New Yorker - the first dollar she'd spent on advertising for the paperback - and sales doubled to sixty thousand in a month. It sold and sold, and by February of 1998, almost two years after the book was published, it reached the best-seller lists. There are now nearly three million copies in print. Rebecca Wells, needless to say, has a warm spot in her heart for people like Mary Gay Shipley. "Mary Gay is a legend," she says. "She just kept putting my books in people's hands."

  2.

  In the book business, as in the movie business, there are two kinds of hits: sleepers and blockbusters. John Grisham and Tom Clancy and Danielle Steel write blockbusters. Their books are announced with huge publicity campaigns. Within days of publication, they leap onto the best-seller lists. Sales start high - hundreds of thousands of copies in the first few weeks - and then taper off. People who buy or watch blockbusters have a clear sense of what they are going to get: a Danielle Steel novel is always - well, a Danielle Steel novel. Sleepers, on the other hand, are often unknown quantities. Sales start slowly and gradually build; publicity, at least early on, is often nonexistent. Sleepers come to your attention by a slow, serendipitous path: a friend who runs into a friend who sets up the interview that just happens to be heard by a guy married to a bookseller. Sleepers tend to emerge from the world of independent bookstores, because independent bookstores are the kinds of places where readers go to ask the question that launches all sleeper hits: Can you recommend a book to me? Shipley was plugging Terry Kay's "To Dance with the White Dog" long before it became a best-seller. She had Melinda Haynes lined up to do a reading at her store before Oprah tapped "Mother of Pearl" as one of her recommended books and it shot onto the best-seller lists. She read David Guterson's "Snow Falling on Cedars" in manuscript and went crazy for it. "I called the publisher, and they said, 'We think it's a regional book.' And I said, 'Write it down. "M.G.S. says this is an important book."'" All this makes it sound as if she has a sixth sense for books that will be successful, but that's not quite right. People like Mary Gay Shipley don't merely predict sleeper hits; they create sleeper hits.

  Most of us, of course, don't have someone like Mary Gay Shipley in our lives, and with the decline of the independent bookstore in recent years the number of Shipleys out there creating sleeper hits has declined as well. The big chain bookstores that have taken over the bookselling business are blockbuster factories, since the sheer number of titles they offer can make browsing an intimidating proposition. As David Gernert, who is John Grisham's agent and editor, explains, "If you walk into a superstore, that's where being a brand makes so much more of a difference. There is so much more choice it's overwhelming. You see walls and walls of books. In that kind of environment, the reader is drawn to the known commodity. The brand-name author is now a safe haven." Between 1986 and 1996, the share of book sales represented by the thirty top-selling hardcover books in America nearly doubled.

  The new dominance of the blockbuster is part of a familiar pattern. The same thing has happened in the movie business, where a handful of heavily promoted films featuring "bankable" stars now command the lion's share of the annual box-office. We live, as the economists Robert Frank and Philip Cook have argued, in a "winner-take-all society," which is another way of saying that we live in the age of the blockbuster. But what if there were a way around the blockbuster? What if there were a simple way to build your very own Mary Gay Shipley? This is the promise of a new technology called collaborative filtering, one of the most intriguing developments to come out of the Internet age.

  3.

  If you want a recommendation about what product to buy, you might want to consult an expert in the field. That's a function that magazines like Car and Driver and Sound & Vision perform. Another approach is to poll users or consumers of a particular product or service and tabulate their opinions. That's what the Zagat restaurant guides and consumer-ratings services like J. D. Power and Associates do. It's very helpful to hear what an "expert" audiophile has to say about the newest DVD player, or what the thousands of owners of the new Volkswagen Passat have to say about reliability and manufacturing defects. But when it comes to books or movies - what might be called "taste products" - these kinds of recommendations aren't nearly as useful. Few moviegoers, for example, rely on the advice of a single movie reviewer. Most of us gather opinions from a variety of sources - from reviewers whom we have agreed with in the past, from friends who have already seen the movie, or from the presence of certain actors or directors whom we already like - and do a kind of calculation in our heads. It's an imperfect procedure. You can find out a great deal about what various critics have to say. But they're strangers, and, to predict correctly whether you'll like something, the person making the recommendation really has to know something about you.

  That's why Shipley is such a powerful force in touting new books. She has lived in Blytheville all her life and has run the bookstore there for twenty-three years, and so her customers know who she is. They trust her recommendations. At the same time, she knows who they are, so she knows how to match up the right book with the right person. For example, she really likes David Guterson's new novel, "East of the Mountains," but she's not about to recommend it to just anyone. It's about a doctor who has cancer and plans his own death and, she says, "there are some people dealing with a death in their family for whom this is not the book to read right now." She had similar reservations about Charles Frazier's "Cold Mountain." "There were people I know who I didn't think would like it," Shipley said. "And I'd tell them that. It's a journey story. It's not what happens at the end that matters, and there are some people for whom that's just not satisfying. I don't want them to take it home, try to read it, not like it, then not go back to that writer." Shipley knows what her customers will like because she knows who they are.

  Collaborative filtering is an attempt to approximate this kind of insider knowledge. It works as a kind of doppelgänger search engine. All of us have had the experience of meeting people and discovering that they appear to have the very same tastes we do - that they really love the same obscure foreign films that we love, or that they are fans of the same little-known novelist whom we are obsessed with. If such a person recommended a book to you, you'd take that recommendation seriously, because cultural tastes seem to run in patterns. If you and your doppelgänger love the same ten books, chances are you'll also like the eleventh book he likes. Collaborative filtering is simply a system that sifts through the opinions and preferences of thousands of people and systematically finds your doppelgänger - and then tells you what your doppelgänger's eleventh favorite book is.
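
  The mechanics of such a system can be sketched in a few lines of code. What follows is only an illustration, not Riedl's actual MovieLens algorithm: the raters, the ratings, and the choice of cosine similarity are assumptions made for the sake of the example. The idea is simply to score every other rater by how closely his past opinions track yours, and then let your nearest matches vote, in proportion to that closeness, on the titles you haven't seen.

```python
# A minimal sketch of user-based collaborative filtering, in the spirit of
# MovieLens: find the raters whose opinions most resemble yours, then
# recommend what they liked that you haven't seen. The raters and the
# numbers below are invented for illustration.
from math import sqrt

# Each user's ratings on a 1-to-5 scale (1 = "awful", 5 = "must see").
ratings = {
    "me":    {"Rushmore": 5, "Summer of Sam": 1, "Election": 4, "Star Wars": 1},
    "alice": {"Rushmore": 5, "Summer of Sam": 1, "Election": 4,
              "Shall We Dance": 5, "Titanic": 2},
    "bob":   {"Rushmore": 2, "Summer of Sam": 4, "Star Wars": 5, "Titanic": 5},
}

def similarity(a, b):
    """Cosine similarity over the movies two raters have in common."""
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    dot = sum(ratings[a][m] * ratings[b][m] for m in shared)
    norm_a = sqrt(sum(ratings[a][m] ** 2 for m in shared))
    norm_b = sqrt(sum(ratings[b][m] ** 2 for m in shared))
    return dot / (norm_a * norm_b)

def recommend(user, top_n=1):
    """Predict ratings for unseen movies from similarity-weighted neighbors."""
    scores, weights = {}, {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for movie, rating in ratings[other].items():
            if movie in ratings[user]:
                continue  # only score movies the user hasn't rated yet
            scores[movie] = scores.get(movie, 0.0) + sim * rating
            weights[movie] = weights.get(movie, 0.0) + sim
    predicted = {m: scores[m] / weights[m] for m in scores if weights[m] > 0}
    return sorted(predicted.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(recommend("me"))  # the doppelgänger's favorite that "me" hasn't rated yet
```

  On this toy data, the sketch recommends "Shall We Dance" ahead of "Titanic," because the rater whose past opinions most closely match yours is the one who loved it - the doppelgänger logic in miniature.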

  John Riedl, a University of Minnesota computer scientist who is one of the pioneers of this technology, has set up a Web site called MovieLens, which is a very elegant example of collaborative filtering at work. Everyone who logs on - and tens of thousands of people have already done so - is asked to rate a series of movies on a scale of 1 to 5, where 5 means "must see" and 1 means "awful." For example, I rated "Rushmore" as a 5, which meant that I was put into the group of people who loved "Rushmore." I then rated "Summer of Sam" as a 1, which put me into the somewhat smaller and more select group that both loved "Rushmore" and hated "Summer of Sam." Collaborative-filtering systems don't work all that well at first, because, obviously, in order to find someone's cultural counterparts you need to know a lot more about them than how they felt about two movies. Even after I had given the system seven opinions (including "Election," 4; "Notting Hill," 2; "The Sting," 4; and "Star Wars," 1), it was making mistakes. It thought I would love "Titanic" and "Zero Effect," and I disliked them both. But after I had plugged in about fifteen opinions - which Riedl says is probably the minimum - I began to notice that the rating that MovieLens predicted I would give a movie and the rating I actually gave it were nearly always, almost eerily, the same. The system had found a small group of people who feel exactly the same way I do about a wide range of popular movies.

  What makes this collaborative-filtering system different from those you may have encountered on Amazon.com or Barnesandnoble.com? In order to work well, collaborative filtering requires a fairly representative sample of your interests or purchases. But most of us use retailers like Amazon only for a small percentage of our purchases. For example, I buy the fiction I read at the Barnes & Noble around the corner from where I live. I buy most of my nonfiction in secondhand bookstores, and I use Amazon for gifts and for occasional work-related books that I need immediately, often for a specific and temporary purpose. That's why, bizarrely, Amazon currently recommends that I buy a number of books by the radical theorist Richard Bandler, none of which I have any desire to read. But if I were to buy a much bigger share of my books on-line, or if I "educated" the filter - as Amazon allows every customer to do - and told it what I think of its recommendations, it's easy to see how, over time, it could turn out to be a powerful tool.

  In a new book, "Net Worth," John Hagel, an E-commerce consultant with McKinsey & Company, and his co-author, Marc Singer, suggest that we may soon see the rise of what they call "infomediaries," which are essentially brokers who will handle our preference information. Imagine, for example, that I had set up a company that collected and analyzed all your credit-card transactions. That information could be run through a collaborative filter, and the recommendations could be sold to retailers in exchange for discounts. Steve Larsen, the senior vice-president of marketing for Net Perceptions - a firm specializing in collaborative filtering which was started by Riedl and the former Microsoft executive Steven Snyder, among others - says that someday there might be a kiosk at your local video store where you could rate a dozen or so movies and have the computer generate recommendations for you from the movies the store has in stock. "Better yet, when I go there with my wife we put in my card and her card and say, 'Find us a movie we both like,'" he elaborates. "Or, even better yet, when we go with my fifteen-year-old daughter, 'Find us a movie all three of us like.'" Among marketers, the hope is that such computerized recommendations will increase demand. Right now, for example, thirty-five per cent of all people who enter a video store leave empty-handed, because they can't figure out what they want; the point of putting kiosks in those stores would be to lower that percentage. "It means that people might read more, or listen to music more, or watch videos more, because of the availability of an accurate and dependable and reliable method for them to learn about things that they might like," Snyder says.

  One of Net Perceptions' clients is SkyMall, which is a company that gathers selections from dozens of mail-order catalogues - from Hammacher Schlemmer and L. L. Bean to the Wine Enthusiast - and advertises them in the magazines that you see in the seat pockets of airplanes. SkyMall licensed the system both for their Web site and for their 800-number call center, where the software looks for your doppelgänger while you are calling in with your order, and a few additional recommendations pop up on the operator's screen. SkyMall's system is still in its infancy, but, in a test, the company found that it increased total sales per customer by somewhere between fifteen and twenty-five per cent. What's remarkable about the SkyMall system is that it links products from many different categories. It's one thing, after all, to surmise that if someone likes "The Remains of the Day" he is also going to like "A Room with a View." But it's quite another to infer that if you liked a particular item from the Orvis catalogue there's a certain item from Reliable Home Office that you'll also be interested in. "Their experience has been absolutely hilarious," Larsen says. "One of the very first recommendations that came out of the engine was for a gentleman who was ordering a blue cloth shirt, a twenty-eight-dollar shirt. Our engine recommended a hundred-and-thirty-five-dollar cigar humidor - and he bought it! I don't think anybody put those two together before."

  The really transformative potential of collaborative filtering, however, has to do with the way taste products - books, plays, movies, and the rest - can be marketed. Marketers now play an elaborate game of stereotyping. They create fixed sets of groups - middle-class-suburban, young-urban-professional, inner-city-working-class, rural-religious, and so on - and then find out enough about us to fit us into one of those groups. The collaborative-filtering process, on the other hand, starts with who we are, then derives our cultural "neighborhood" from those facts. And these groups aren't permanent. They change as we change. I have never seen a film by Luis Buñuel, and I have no plans to. I don't put myself in the group of people who like Buñuel. But if I were to see "That Obscure Object of Desire" tomorrow and love it, and enter my preference on MovieLens, the group of people it defines as "just like me" would immediately and subtly change.

  A group at Berkeley headed by the computer scientist Ken Goldberg has, for instance, developed a collaborative-filtering system for jokes. If you log on to the site, known as Jester, you are given ten jokes to rate. (Q.: Did you hear about the dyslexic devil worshipper? A.: He sold his soul to Santa.) These jokes aren't meant to be especially funny; they're jokes that reliably differentiate one "sense of humor" from another. On the basis of the humor neighborhood you fall into, Jester gives you additional jokes that it thinks you'll like. Goldberg has found that when he analyzes the data from the site - and thirty-six thousand people so far have visited Jester - the resulting neighborhoods are strikingly amorphous. In other words, you don't find those thirty-six thousand people congregating into seven or eight basic humor groups - off-color, say, or juvenile, or literary. "What we'd like to see is nice little clusters," Goldberg says. "But, when you look at the results, what you see is something like a cloud with sort of bunches, and nothing that is nicely defined. It's kind of like looking into the night sky. It's very hard to identify the constellations." The better you understand someone's particular taste pattern - the deeper you probe into what he finds interesting or funny - the less predictable and orderly his preferences become.

  Collaborative filtering underscores a lesson that, for the better part of history, humans have been stubbornly resistant to learning: if you want to understand what one person thinks or feels or likes or does it isn't enough to draw inferences from the general social or demographic category to which he belongs. You cannot tell, with any reasonable degree of certainty, whether someone will like "The Girl's Guide to Hunting and Fishing" by knowing that the person is a single twenty-eight-year-old woman who lives in Manhattan, any more than you can tell whether somebody will commit a crime knowing only that he's a twenty-eight-year-old African-American male who lives in the Bronx. Riedl has taken demographic data from the people who log on to MovieLens - such as their age and occupation and sex - but he has found that it hardly makes his predictions any more accurate. "What you tell us about what you like is far more predictive of what you will like in the future than anything else we've tried," he says. "It seems almost dumb to say it, but you tell that to marketers sometimes and they look at you puzzled."

  None of this means that standard demographic data is useless. If you were trying to figure out how to market a coming-of-age movie, you'd be most interested in collaborative-filtering data from people below, say, the age of twenty-eight. Facts such as age and sex and place of residence are useful in sorting the kinds of information you get from a recommendation engine. But the central claim of the collaborative-filtering movement is that, head to head, the old demographic and "psychographic" data cannot compete with preference data. This is a potentially revolutionary argument. Traditionally, there has been almost no limit to the amount of information marketers have wanted about their customers: academic records, work experience, marital status, age, sex, race, Zip Code, credit records, focus-group sessions - everything has been relevant, because in trying to answer the question of what we want marketers have taken the long way around and tried to find out first who we are. Collaborative filtering shows that, in predicting consumer preferences, none of this information is all that important. In order to know what someone wants, what you really need to know is what they've wanted.

  4.

  How will this affect the so-called blockbuster complex? When a bookstore's sales are heavily driven by the recommendations of a particular person - a Mary Gay Shipley - sleepers, relatively speaking, do better and blockbusters do worse. If you were going to read only Clancy and Grisham and Steel, after all, why would you need to ask Shipley what to read? This is what David Gernert, Grisham's agent, meant when he said that in a Barnes & Noble superstore a brand like Grisham enjoys a "safe haven." It's a book you read when there is no one, like Shipley, with the credibility to tell you what else you ought to read. Gernert says that at this point in Grisham's career each of his novels follows the same general sales pattern. It rides high on the best-seller lists for the first few months, of course, but, after that, "his sales pick up at very specific times - notably, Father's Day and Mother's Day, and then it will sell well again for Christmas." That description makes it clear that Grisham's books are frequently bought as gifts. And that's because gifts are the trickiest of all purchases. They require a guess about what somebody else likes, and in conditions of uncertainty the logical decision is to buy the blockbuster, the known quantity.

  Collaborative filtering is, in effect, anti-blockbuster. The more information the system has about you, the more narrow and exclusive its recommendations become. It's just like Shipley: it uses its knowledge about you to steer you toward choices you wouldn't normally know about. I gave MovieLens my opinions on fifteen very mainstream American movies. I'm a timid and unsophisticated moviegoer. I rarely see anything but very commercial Hollywood releases. It told me, in return, that I would love "C'est Arrivé Près de Chez Vous," an obscure 1992 Belgian comedy, and "Shall We Dance," the 1937 Fred and Ginger vehicle. In other words, among my moviegoing soul mates are a number of people who share my views on mainstream fare but who also have much greater familiarity with foreign and classic films. The system essentially put me in touch with people who share my tastes but who happen to know a good deal more about movies. Collaborative filtering gives voice to the expert in every preference neighborhood. A world where such customized recommendations were available would allow Shipley's well-read opinions to be known not just in Blytheville but wherever there are people who share her taste in books.

  Collaborative filtering, in short, has the ability to reshape the book market. When customized recommendations are available, choices become more heterogeneous. Big bookstores lose their blockbuster bias, because customers now have a way of narrowing down their choices to the point where browsing becomes easy again. Of the top hundred best-selling books of the nineteen-nineties, there are only a handful that can accurately be termed sleepers - Robert James Waller's "The Bridges of Madison County," James Redfield's "The Celestine Prophecy," John Berendt's "Midnight in the Garden of Good and Evil," Charles Frazier's "Cold Mountain." Just six authors - John Grisham, Tom Clancy, Stephen King, Michael Crichton, Dean Koontz, and Danielle Steel - account for sixty-three of the books on the list. In a world more dependent on collaborative filtering, Grisham, Clancy, King, and Steel would still sell a lot of books. But you'd expect to see many more books like "Divine Secrets of the Ya-Ya Sisterhood" - many more new writers - make their way onto the best-seller list. And the gap between the very best-selling books and those in the middle would narrow. Collaborative filtering, Hagel says, "favors the smaller, the more talented, more quality products that may have a hard time getting visibility because they are not particularly good at marketing."

  5.

  In recent years, That Bookstore in Blytheville has become a mecca for fiction in the South. Prominent writers drop by all the time to give readings in the back, by the potbellied stove. John Grisham himself has been there nine times, beginning with his tour for "The Firm," which was the hit that turned him into a blockbuster author. Melinda Haynes, Bobbie Ann Mason, Roy Blount, Jr., Mary Higgins Clark, Billie Letts, Sandra Brown, Jill Conner Browne, and countless others have recently made the drive up from Memphis. Sometimes Shipley will host a supper for them after the reading, and send the proceeds from the event to a local literacy program.

  There seems, in this era of mega-stores, something almost impossibly quaint about That Bookstore in Blytheville. The truth is, though, that the kind of personalized recommendation offered by Mary Gay Shipley represents the future of marketing, not its past. The phenomenal success in recent years of Oprah Winfrey's book club - which created one best-seller after another on the strength of its nominations - suggests that, in this age of virtually infinite choice, readers are starved for real advice, desperate for a recommendation from someone they know and who they feel knows them. "Certain people don't want to waste their time experimenting with new books, and the function we provide here is a filter," Shipley says, and as she speaks you can almost hear the makings of another sleeper on the horizon. "If we like something, we get behind it. I'm reading a book right now called 'Nissa's Place,' by Alexandria LaFaye. She's a woman I think we're going to be hearing more from."

  What do Wayne Gretzky, Yo-Yo Ma, and a brain surgeon named Charlie Wilson have in common?

  1.

  Early one recent morning, while the San Francisco fog was lifting from the surrounding hills, Charlie Wilson performed his two thousand nine hundred and eighty-seventh transsphenoidal resection of a pituitary tumor. The patient was a man in his sixties who had complained of impotence and obscured vision. Diagnostic imaging revealed a growth, eighteen millimetres in diameter, that had enveloped his pituitary gland and was compressing his optic nerve. He was anesthetized and covered in blue surgical drapes, and one of Wilson's neurosurgery residents - a tall, slender woman in her final year of training - "opened" the case, making a small incision in his upper gum, directly underneath his nose. She then tunnelled back through his nasal passages until she reached the pituitary, creating a cavity several inches deep and about one and a half centimetres in diameter.

  Wilson entered the operating room quickly, walking stiffly, bent slightly at the waist. He is sixty-nine - a small, wiry man with heavily muscled arms. His hair is cut very close to his scalp, so that, as residents over the years have joked, he might better empathize with the shaved heads of his patients. He is part Cherokee Indian and has high, broad cheekbones and large ears, which stick out at almost forty-five-degree angles. He was wearing Nike cross-trainers, and surgical scrubs marked with the logo of the medical center he has dominated for the past thirty years - Moffitt Hospital, at the University of California, San Francisco. When he was busiest, in the nineteen-eighties, he would routinely do seven or eight brain surgeries in a row, starting at dawn and ending at dusk, lining up patients in adjoining operating rooms and striding from one to the other like a conquering general. On this particular day, he would do five, of which the transsphenoidal was the first, but the rituals would be the same. Wilson believes that neurosurgery is best conducted in silence, with a scrub nurse who can anticipate his every step, and a resident who does not have to be told what to do, only shown. There was no music in the O.R. To guard against unanticipated disturbances, the door was locked. Pagers were set to "buzz," not beep. The phone was put on "Do Not Disturb."

  Wilson sat by the patient in what looked like a barber's chair, manipulating a surgical microscope with a foot pedal. In his left hand he wielded a tiny suction tube, which removed excess blood. In his right he held a series of instruments in steady alternation: Cloward elevator, Penfield No. 2, Cloward rongeur, Fulton rongeur, conchatome, Hardy dissector, Kurze scissors, and so on. He worked quickly, with no wasted motion. Through the microscope, the tumor looked like a piece of lobster flesh, white and fibrous. He removed the middle of it, exposing the pituitary underneath. Then he took a ring curette - a long instrument with a circular scalpel perpendicular to the handle - and ran it lightly across the surface of the gland, peeling the tumor away as he did so.

  It was, he would say later, like running a squeegee across a windshield, except that in this case the windshield was a surgical field one centimetre in diameter, flanked on either side by the carotid arteries, the principal sources of blood to the brain. If Wilson were to wander too far to the right or to the left and nick either artery, the patient might, in the neurosurgical shorthand, "stroke." If he were to push too far to the rear, he might damage any number of critical nerves. If he were not to probe aggressively, though, he might miss a bit of tumor and defeat the purpose of the procedure entirely. It was a delicate operation, which called for caution and confidence and the ability to distinguish between what was supposed to be there and what wasn't. Wilson never wavered. At one point, there was bleeding from the right side of the pituitary, which signalled to Wilson that a small piece of tumor was still just outside his field of vision, and so he gently slid the ring curette over, feeling with the instrument as if by his fingertips, navigating around the carotid, lifting out the remaining bit of tumor. In the hands of an ordinary neurosurgeon, the operation - down to that last bit of blindfolded acrobatics - might have taken several hours. It took Charlie Wilson twenty-five minutes.

  Neurosurgery is generally thought to attract the most gifted and driven of medical-school graduates. Even in that rarefied world, however, there are surgeons who are superstars and surgeons who are merely very good. Charlie Wilson is one of the superstars. Those who have trained with him say that if you showed them a dozen videotapes of different neurosurgeons in action - with the camera focussed just on the hands of the surgeon and the movements of the instruments - they could pick Wilson out in an instant, the same way an old baseball hand could look at a dozen batters in silhouette and tell you which one was Willie Mays. Wilson has a distinctive fluidity and grace.

  One of the most difficult of all neurosurgical procedures is aneurysm repair, where the surgeon sets out to seal, with a tiny titanium clip, a bulge in the side of an artery caused by the weakening of its wall. If the aneurysm bursts in the process - because the clip is applied incorrectly, or the surgeon inadvertently punctures one of the tributary vessels or doesn't see something critical on the underside of the aneurysm - the patient stands a good chance of dying. Aneurysm repair is bomb disposal. Wilson made it look easy. "After he'd dissected the whole aneurysm out, and when he had control of all the feeding vessels, I'd see him grasp it and flip it back and forth, because he somehow knew that if it popped he would still be able to clip it," says Michon Morita, who trained with Wilson at U.C.S.F. in the early nineties and now practices in Honolulu. "Most people are afraid of aneurysms. He wasn't afraid of them at all. He was like a cat playing with a mouse."

  2.

  There are thousands of people who have played in the National Hockey League over the years, but there has been only one Wayne Gretzky. Thousands of cellists play professionally all over the world, but very few will ever earn comparison with Yo-Yo Ma. People like Gretzky or Ma or Charlie Wilson all have an affinity for translating thought into action. They're what we might call physical geniuses. But what makes them so good at what they do?

  The temptation is to treat physical genius in the same way that we treat intellectual genius - to think of it as something that can be ascribed to a single factor, a physical version of I.Q. When professional football coaches assess the year's crop of college prospects, they put them through drills designed to measure what they regard as athleticism: How high can you jump? How many pounds can you bench press? How fast can you sprint? The sum of the scores on these tests is considered predictive of athletic performance, and every year some college player's stock shoots up before draft day because it is suddenly discovered that he can run, say, 4.4 seconds in the forty-yard dash as opposed to 4.6 seconds. This much seems like common sense. The puzzling thing about physical genius, however, is that the closer you look at it the less it can be described by such cut-and-dried measures of athleticism.

  Consider, for example, Tony Gwynn, who has been one of the best hitters in baseball over the past fifteen years. We would call him extraordinarily coordinated, by which we mean that in the course of several hundred milliseconds he can execute a series of perfectly synchronized muscular actions - the rotation of the shoulder, the movement of the arms, the shift of the hips - and can regulate the outcome of those actions so that his bat hits the ball with exactly the desired degree of force. These are abilities governed by specific neurological mechanisms. Timing, for example, appears to be controlled by the cerebellum. Richard Ivry, a psychologist at the University of California at Berkeley, has looked at patients who suffered cerebellar damage as a result of a stroke. He had them pronounce the sounds "bah," "pah," and "dah." The difference between the "b" sound and the "p" sound is primarily a matter of timing. "To make the 'b' sound, you put your lips together and as you open them you immediately vibrate the vocal cords," Ivry said. "For 'p' you open the lips thirty to forty milliseconds before the vocal cords vibrate." Stroke patients with cerebellar damage, Ivry found, make lots of "b"-"p" mistakes: "baby" comes out "paby." Their timing is off. But they don't have trouble with "b" and "d," because the timing of lips and vocal cords for those two sounds is exactly the same. The difference is simply in the configuration of your tongue. "You never hear them say 'dady' instead of 'baby,'" Ivry said.

  Force regulation appears to be controlled by another area of the brain entirely, the basal ganglia. "I like to think of the basal ganglia as a gate to the motor system," Ivry said, although he cautioned that the work on force regulation is still a good deal more speculative than the work on timing. "At any point in time, I have a few actions that I'm thinking about, and the basal ganglia are monitoring all the potential ones, then choosing one. The question is: How quickly does that gate open up?" He devised a study in which subjects were asked to press on a lever with their index finger over and over again, with the same degree of force each time. Patients with Parkinson's disease, which is a degenerative condition affecting the basal ganglia, had relatively little trouble with the timing of that movement, but they had terrible difficulty controlling the force of the tapping. At one moment they were pressing too hard, and the next they weren't pressing hard enough. Their "gate" wasn't working properly.

  Stroke victims and Parkinson's patients, of course, are people who have actually suffered neurological impairment. But Ivry and Steven Keele, of the University of Oregon, suggest that in healthy people, too, there is probably a natural variation in the efficiency of these motor-control functions. They have done work on clumsy children, for example, that shows that what looks like a general lack of coordination can, in some cases, be broken down into either a basal-ganglia problem or a cerebellum problem. Clumsy kids are at one end of the coördination bell curve. "Maybe their neural connections or their branching isn't as well developed, or they don't have as many synaptic connections," Ivry suggests. And at the other end of the curve? That's where you find people like Tony Gwynn.

  But being wonderfully coordinated isn't all there is to hitting. A ball thrown at eighty-nine miles per hour (which is a typical speed in the major leagues) takes roughly four hundred and sixty milliseconds to go from the pitcher's hand to home plate. Someone like Tony Gwynn, with all his finely tuned physiological hardware, takes about a hundred and sixty milliseconds to swing a bat. The decision about how to swing the bat, however, will take Gwynn between a hundred and ninety and four hundred and fifty milliseconds, depending on what the situation is and what he intends to do with the pitch. "Very good hitters base their decisions on past experience with certain pitchers, with the count, with the probabilities of certain types of pitches, with their own skills, and use very early cues in the pitcher's delivery to begin the swing," Janet Starkes, a professor of kinesiology at McMaster University, in Ontario, says.

  What sets physical geniuses apart from other people, then, is not merely being able to do something but knowing what to do - their capacity to pick up on subtle patterns that others generally miss. This is what we mean when we say that great athletes have a "feel" for the game, or that they "see" the court or the field or the ice in a special way. Wayne Gretzky, in a 1981 game against the St. Louis Blues, stood behind the St. Louis goal, laid the puck across the blade of his stick, then bounced it off the back of the goalie in front of him and into the net. Gretzky's genius at that moment lay in seeing a scoring possibility where no one had seen one before. "People talk about skating, puck-handling, and shooting," Gretzky told an interviewer some years later, "but the whole sport is angles and caroms, forgetting the straight direction the puck is going, calculating where it will be diverted, factoring in all the interruptions." Neurosurgeons say that when the very best surgeons operate they always know where they are going, and they mean that the Charlie Wilsons of this world possess that same special feel - an ability to calculate the diversions and to factor in the interruptions when faced with a confusing mass of blood and tissue.

  When Charlie Wilson came to U.C. San Francisco, in July of 1968, his first case concerned a woman who had just had a pituitary operation. The previous surgeon had done the one thing that surgeons are not supposed to do in pituitary surgery - tear one of the carotid arteries. Wilson was so dismayed by the outcome that he resolved he would teach himself how to do the transsphenoidal, which was then a relatively uncommon procedure. He carefully read the medical literature. He practiced on a few cadavers. He called a friend in Los Angeles who was an expert at the procedure, and had him come to San Francisco and perform two operations while Wilson watched. He flew to Paris to observe Gerard Guiot, who was one of the great transsphenoidal surgeons at the time. Then he flew home. It was the equivalent of someone preparing for a major-league tryout by watching the Yankees on television and hitting balls in an amusement-arcade batting cage. "Charlie went slowly," recalls Ernest Bates, a Bay-area neurosurgeon who scrubbed with Wilson on his first transsphenoidal, "but he knew the anatomy and, boom, he was there. I thought, My God, this was the first? You'd have thought he had done a hundred. Charlie has a skill that the rest of us just don't have."

  This is the hard part about understanding physical genius, because the source of that special skill - that "feel" - is still something of a mystery. "Sometimes during the course of an operation, there'll be several possible ways of doing something, and I'll size them up and, without having any conscious reason, I'll just do one of them," Wilson told me. He speaks with a soft, slow drawl, a remnant of Neosho, Missouri, the little town where he grew up, and where his father was a pharmacist, who kept his store open from 7 a.m. to 11 p.m., seven days a week. Wilson has a plainspoken, unpretentious quality. When he talks about his extraordinary success as a surgeon, he gives the impression that he is talking about some abstract trait that he is neither responsible for nor completely able to understand. "It's sort of an invisible hand," he went on. "It begins almost to seem mystical. Sometimes a resident asks, 'Why did you do that?' and I say" - here Wilson gave a little shrug - "'Well, it just seemed like the right thing.'"

  3.

  There is a neurosurgeon at Columbia Presbyterian Center, in Manhattan, by the name of Don Quest, who served two tours in Vietnam flying A-1s off the U.S.S. Kitty Hawk. Quest sounds like the kind of person who bungee jumps on the weekend and has personalized license plates that read "Ace." In fact, he is a thoughtful, dapper man with a carefully trimmed mustache, who plays the trombone in his spare time and quite cheerfully describes himself as compulsive. "When I read the New York Times, I don't speed-read it," Quest told me. "I read it carefully. I read everything. It drives my wife crazy." He was wearing a spotless physician's coat and a bow tie. "When I'm reading a novel - and there are so many novels I want to read - even if it's not very good I can't throw it away. I stick with it. It's quite frustrating, because I don't really have time for garbage." Quest talked about what it was like to repair a particularly tricky aneurysm compared to what it was like to land at night in rough seas and a heavy fog when you are running out of fuel and the lights are off on the carrier's landing strip, because the skies are full of enemy aircraft. "I think they are similar," he said, after some thought, and what he meant was that they were both exercises in a certain kind of exhaustive and meticulous preparation. "There is a checklist, before you take off, and this was drilled into us," Quest said. "It's on the dashboard with all the things you need to do. People forget to put the hook down, and you can't land on an aircraft carrier if the hook isn't down. Or they don't put the wheels down. One of my friends, my roommate, landed at night on the aircraft carrier with the wheels up. Thank God, the hook caught, because his engine stopped. He would have gone in the water." Quest did not seem like the kind of person who would forget to put the wheels down. "Some people are much more compulsive than others, and it shows," he went on to say. "It shows in how well they do their landing on the aircraft carrier, how many times they screw up, or are on the wrong radio frequency, or get lost, or their ordnance isn't accurate in terms of dropping a bomb. The ones who are the best are the ones who are always very careful."

  Quest isn't saying that fine motor ability is irrelevant. One would expect him to perform extremely well on tests of the sort Ivry and Keele might devise. And, like Tony Gwynn, he's probably an adept and swift decision maker. But these abilities, Quest is saying, are of little use if you don't have the right sort of personality. Charles Bosk, a sociologist at the University of Pennsylvania, once conducted a set of interviews with young doctors who had either resigned or been fired from neurosurgery-training programs, in an effort to figure out what separated the unsuccessful surgeons from their successful counterparts. He concluded that, far more than technical skills or intelligence, what was necessary for success was the sort of attitude that Quest has - a practical-minded obsession with the possibility and the consequences of failure. "When I interviewed the surgeons who were fired, I used to leave the interview shaking," Bosk said. "I would hear these horrible stories about what they did wrong, but the thing was that they didn't know that what they did was wrong. In my interviewing, I began to develop what I thought was an indicator of whether someone was going to be a good surgeon or not. It was a couple of simple questions: Have you ever made a mistake? And, if so, what was your worst mistake? The people who said, 'Gee, I haven't really had one,' or, 'I've had a couple of bad outcomes but they were due to things outside my control' - invariably those were the worst candidates. And the residents who said, 'I make mistakes all the time. There was this horrible thing that happened just yesterday and here's what it was.' They were the best. They had the ability to rethink everything that they'd done and imagine how they might have done it differently."

  What this attitude drives you to do is practice over and over again, until even the smallest imperfections are ironed out. After doing poorly in a tournament just prior to this year's Wimbledon, Greg Rusedski, who is one of the top tennis players in the world, told reporters that he was going home to hit a thousand practice serves. One of the things that set Rusedski apart from lesser players, in other words, is that he is the kind of person who is willing to stand out in the summer sun, repeating the same physical movement again and again, in single-minded pursuit of some fractional improvement in his performance. Wayne Gretzky was the same way. He would frequently stay behind after practice, long after everyone had left, flipping pucks to a specific spot in the crease, or aiming shot after shot at the crossbar or the goal post.

  And Charlie Wilson? In his first few years as a professor at U.C.S.F., he would disappear at the end of the day into a special laboratory to practice his craft on rats: isolating, cutting, and then sewing up their tiny blood vessels, and sometimes operating on a single rat two or three times. He would construct an artificial aneurysm using a vein graft on the side of a rat artery, then manipulate the aneurysm the same way he would in a human being, toughening its base with a gentle coagulating current - and return two or three days later to see how successful his work had been. Wilson sees surgery as akin to a military campaign. Training with him is like boot camp. He goes to bed somewhere around eleven at night and rises at 4:30 a.m. For years, he ran upward of eighty miles a week, competing in marathons and hundred-mile ultra-marathons. He quit only after he had a hip replacement and two knee surgeries and found himself operating in a cast. Then he took up rowing. On his days in the operating room, at the height of his career, Wilson would run his morning ten or twelve miles, conduct medical rounds, operate steadily until six or seven in the evening, and, in between, see patients, attend meetings, and work on what now totals six hundred academic articles. One of his former residents says, with a laugh, that when he was on Wilson's rotation he developed a persistent spasm of his upper eyelid and it did not go away until he moved on to train with someone else. Julian Hoff, the chairman of neurosurgery at the University of Michigan and a longtime friend of Wilson's, says, "The way he would communicate with people in the office is that he would have a little piece of paper and he would put your name with an arrow next to it, and two words saying what he wanted you to do." Once, when a new head of nursing at U.C.S.F. wanted to start rotating nursing teams in neurosurgery, instead of letting Wilson work with the same team every day, he stopped operating for a week in protest. New nurses, he explained, would mean more mistakes - not fatal mistakes but irregularities in the flow of his operating room, such as someone's handing him the wrong instrument, or handing him an instrument with the blade up instead of down, or even just a certain hesitation, because to Wilson the perfect operation requires a particular grace and rhythm. "In every way, it is analogous to the routine of a concert pianist," he says. "If you were going to do a concert and you didn't practice for a week, someone would notice that, just as I notice if one of my scrub nurses has been off for a week. There is that fraction-of-a-second difference in the way she reacts."

  "Wilson has a certain way of positioning the arm of the retractor blade" - an instrument used to hold brain tissue in place - "so that the back end of the retractor doesn't stick up at all and he won't accidentally bump into it," Michon Morita told me. "Every once in a while, though, I'd see him when he didn't quite put it in the position he wanted to, and bumped it, which caused a little bit of hemorrhage on the brain surface. It wasn't harming the patient, and it was nothing he couldn't handle. But I'd hear 'That was stupid,' and I'd immediately ask myself, What did I do wrong? Then I'd realize he was chastising himself. Most people would say that if there was no harm done to the patient it was no big deal. But he wants to be perfect in everything, and when that perfection is broken he gets frustrated."

  4.

  This kind of obsessive preparation does two things. It creates consistency. Practice is what enables Greg Rusedski to hit a serve at a hundred and twenty-five miles per hour again and again. It's what enables a pianist to play Chopin's double-thirds Étude at full speed, striking every key with precisely calibrated force. More important, practice changes the way a task is perceived. A chess master, for example, can look at a game in progress for a few seconds and then perfectly reconstruct that same position on a blank chessboard. That's not because chess masters have great memories (they don't have the same knack when faced with a random arrangement of pieces) but because hours and hours of chess playing have enabled them to do what psychologists call "chunking." Chunking is based on the fact that we store familiar sequences - like our telephone number or our bank-machine password - in long-term memory as a single unit, or chunk. If I told you a number you'd never heard before, though, you would be able to store it only in short-term memory, one digit at a time, and if I asked you to repeat it back to me you might be able to remember only a few of those digits - maybe the first two or the last three. By contrast, when the chess masters see the board from a real game, they are able to break the board down into a handful of chunks - two or three clusters of pieces in positions that they have encountered before.

  In "The Game of Our Lives," a classic account of the 1980-81 season of the Edmonton Oilers hockey team, Peter Gzowski argues that one of the principal explanations for the particular genius of Wayne Gretzky was that he was hockey's greatest chunker. Gretzky, who holds nearly every scoring record in professional hockey, baffled many observers because he seemed to reverse the normal laws of hockey. Most great offensive players prefer to keep the rest of the action on the ice behind them - to try to make the act of scoring be just about themselves and the goalie. Gretzky liked to keep the action in front of him. He would set up by the side of the rink, or behind the opposing team's net, so that the eleven other players on the ice were in full view, and then slide the perfect pass to the perfect spot. He made hockey look easy, even as he was playing in a way that made it more complicated. Gzowski says that Gretzky could do that because, like master chess players, he wasn't seeing all eleven other players individually; he was seeing only chunks. Here is Gzowski's conclusion after talking to Gretzky about a game he once played against the Montreal Canadiens. It could as easily serve as an explanation for Charlie Wilson's twenty-five-minute transsphenoidal resection:

  What Gretzky perceives on a hockey rink is, in a curious way, more simple than what a less accomplished player perceives. He sees not so much a set of moving players as a number of situations. . . . Moving in on the Montreal blueline, as he was able to recall while he watched a videotape of himself, he was aware of the position of all the other players on the ice. The pattern they formed was, to him, one fact, and he reacted to that fact. When he sends a pass to what to the rest of us appears an empty space on the ice, and when a teammate magically appears in that space to collect the puck, he has in reality simply summoned up from his bank account of knowledge the fact that in a particular situation, someone is likely to be in a particular spot, and if he is not there now he will be there presently.

  5.

  For a time, early in his career, Charlie Wilson became obsessed with tennis. He took lessons from Rod Laver. He joined three tennis clubs, so he could be absolutely assured of having court time whenever he wanted it. He had his own ball machine, and would go out early in the morning, before anyone else was on the court, and hit bucket after bucket of balls. He was in great shape. He could play any number of sets. He had a serve that he says was a beauty, a great backhand, and - as he put it - "a very expensive" forehand. But Wilson never turned into the kind of tennis player he wanted to be. Julian Hoff recalls, "There was this guy in the neurosurgery department, John Adams, who was a former tennis champion. An older guy. Arthritic. Rickety. Looked terrible. Charlie decided that he had to beat John Adams. But he never could. It drove him crazy."

  It is easy to understand Wilson's frustration. He was a superb athlete - as a teen-ager he had been an excellent basketball player, and he attended college on a football scholarship - and a surgeon who could make life-or-death decisions in a split second. And yet, for all his focus and determination, he could not respond effectively to an old man shuffling toward the ball twenty feet across the net from him. "A good player knows where the ball is going," Wilson says. "He anticipates it. He is there. I just wasn't." What Wilson is describing is a failure not of skill or of resolve but of the least understood element of physical genius - imagination. For some reason, he could not make the game come alive in his mind.

  When psychologists study people who are expert at motor tasks, they find that almost all of them use their imaginations in a very particular and sophisticated way. Jack Nicklaus, for instance, has said that he has never taken a swing that he didn't first mentally rehearse, frame by frame. Yo-Yo Ma told me that he remembers riding on a bus, at the age of seven, and solving a difficult musical problem by visualizing himself playing the piece on the cello. Robert Spetzler, who trained with Wilson and is widely considered to be the heir to Wilson's mantle, says that when he gets into uncharted territory in an operation he feels himself transferring his mental image of what ought to happen onto the surgical field. Charlie Wilson talks about going running in the morning and reviewing each of the day's operations in his head - visualizing the entire procedure and each potential outcome in advance. "It was a virtual rehearsal," he says, "so when I was actually doing the operation, it was as if I were doing it for the second time." Once, he says, he had finished a case and taken off his gloves and was walking down the hall away from the operating room when he suddenly stopped, because he realized that the tape he had been playing in his head didn't match the operation that had unfolded before his eyes. "I was correlating everything - what I saw, what I expected, what the X-rays said. And I just realized that I had not pursued one particular thing. So I turned around, scrubbed, and went back in, and, sure enough, there was a little remnant of tumor that was just around the corner. It would have been a disaster."

  The Harvard University psychologist Stephen Kosslyn has shown that this power to visualize consists of at least four separate abilities, working in combination. The first is the ability to generate an image - to take something out of long-term memory and reconstruct it on demand. The second is what he calls "image inspection," which is the ability to take that mental picture and draw inferences from it. The third is "image maintenance," the ability to hold that picture steady. And the fourth is "image transformation," which is the ability to take that image and manipulate it. If I asked you whether a frog had a tail, for example, you would summon up a picture of a frog from your long-term memory (image generation), hold it steady in your mind (image maintenance), rotate the frog around until you see his backside (image transformation), and then look to see if there was a tail there (image inspection). These four abilities are highly variable. Kosslyn once gave a group of people a list of thirteen tasks, each designed to test a different aspect of visualization, and the results were all over the map. You could be very good at generation and maintenance, for example, without being good at transformation, or you could be good at transformation without necessarily being adept at inspection and maintenance. Some of the correlations, in fact, were negative, meaning that sometimes being good at one of those four things meant that you were likely to be bad at another. Bennett Stein, a former chairman of neurosurgery at Columbia Presbyterian Center, says that one of the reasons some neurosurgery residents fail in their training is that they are incapable of making the transition between the way a particular problem is depicted in an X-ray or an M.R.I., and how the problem looks when they encounter it in real life. These are people whose capacities for mental imaging simply do not match what's required for dealing with the complexities of brain surgery. Perhaps these people can generate an image but are unable to transform it in precisely the way that is necessary to be a great surgeon; or perhaps they can transform the image but they cannot maintain it. The same may have been true for Charlie Wilson and tennis. Somehow, his particular configuration of imaging abilities did not fit with the demands of the sport. When he stopped playing the game, he says, he didn't miss it, and that's not surprising. Tennis never quite got inside his head. Neurosurgery, of course, is another matter.

  "Certain aneurysms at the base of the brain are surrounded by very important blood vessels and nerves, and the typical neurosurgeon will make that dissection with a set of micro-instruments that are curved, each with a blunt end," Craig Yorke, who trained with Wilson and now practices neurosurgery in Topeka, recalls. "The neurosurgeon will sneak up on them. Charlie would call for a No. 11 blade, which is a thin, very low-profile scalpel, and would just cut down to where the aneurysm was. He would be there in a quarter of the time." The speed and the audacity of Wilson's maneuvers, Yorke said, would sometimes leave him breathless. "Do you know about Gestalt psychology?" he continued. "If I look at a particular field - tumor or aneurysm - I will see the gestalt after I've worked on it for a little while. He would just glance at it and see it. It's a conceptual, a spatial thing. His use of the No. 11 blade depended on his ability to construct a gestalt of the surgical field first. If just anybody had held up the eleven blade in that way it might have been a catastrophe. He could do it because he had the picture of the whole anatomy in his head when he picked up the instrument."

  6.

  If you think of physical genius as a pyramid, with, at the bottom, the raw components of coördination, and, above that, the practice that perfects those particular movements, then this faculty of imagination is the top layer. This is what separates the physical genius from those who are merely very good. Michael Jordan and Karl Malone, his longtime rival, did not differ so much in their athletic ability or in how obsessively they practiced. The difference between them is that Jordan could always generate a million different scenarios by which his team could win, some of which were chunks stored in long-term memory, others of which were flights of fancy that came to him, figuratively and literally, in midair. Jordan twice won championships in the face of unexpected adversity: once, a case of the flu, and, the second time, a back injury to his teammate Scottie Pippen, and he seemed to thrive on these obstacles, in a way Karl Malone never could.

  Yo-Yo Ma says that only once, early in his career, did he try for a technically perfect performance. "I was seventeen," he told me. "I spent a year working on it. I was playing a Brahms sonata at the 92nd Street Y. I remember working really hard at it, and in the middle of the performance I thought, I'm bored. It would have been nothing for me to get up from the stage and walk away. That's when I decided I would always opt for expression over perfection." It isn't that Ma doesn't achieve perfection; it's that he finds striving for perfection to be banal. He says that he sometimes welcomes it when he breaks a string, because that is precisely the kind of thing (like illness or an injury to a teammate) that you cannot prepare for - that you haven't chunked and, like some robot, stored neatly in long-term memory. The most successful performers improvise. They create, in Ma's words, "something living." Ma says he spends ninety per cent of his time "looking at the score, figuring it out - who's saying this, who wrote this and why," letting his mind wander, and only ten per cent on the instrument itself. Like Jordan, his genius originates principally in his imagination. If he spent less time dreaming and more time playing, he would be Karl Malone.

  Here is the source of the physical genius's motivation. After all, what is this sensation - this feeling of having what you do fit perfectly into the dimensions of your imagination - but the purest form of pleasure? Tony Gwynn and Wayne Gretzky and Charlie Wilson and all the other physical geniuses are driven to greatness because they have found something so compelling that they cannot put it aside. Perhaps this explains why a great many top neurosurgeons are also highly musical. Robert Spetzler, Wilson's protégé, seriously considered a career as a concert pianist. Craig Yorke made his début as a violinist at sixteen with the Boston Pops. Quest, of course, plays the trombone. As for Wilson, he is a cellist and, when he was a student in New Orleans, he would play jazz piano at Pat O'Brien's, in the French Quarter. Music is one of the few vocations that offer a kind of sensory and cognitive immersion similar to surgery: the engagement of hand and eye, the challenge of sustained performance, the combination of mind and motion - all of it animated by the full force of the imagination. Once, in an E-mail describing his special training sessions on rats, Wilson wrote that he worked on them for two years and "then trailed off when I finally figured that I was doing it for fun, not for practice." For fun! When someone chooses to end a twelve-hour day alone in a laboratory, inducing aneurysms in the arteries of rats, we might call that behavior obsessive. But that is an uncharitable word. A better explanation is that, for some mysterious and wonderful reason, Wilson finds the act of surgery irresistible, in the way that musicians find pleasure in the sounds they produce on their instruments, or in the way Tony Gwynn gets a thrill every time he strokes a ball cleanly through the infield. Before he was two years old, it is said, Wayne Gretzky watched hockey games on television, enraptured, and slid his stockinged feet on the linoleum in imitation of the players, then cried when the game was over, because he could not understand how something so sublime should have to come to an end. This was long before Gretzky was any good at the game itself, or was skilled in any of its aspects, or could create even the smallest of chunks. But what he had was what the physical genius must have before any of the other layers of expertise fall into place: he had stumbled onto the one thing that, on some profound aesthetic level, made him happy.

  Charlie Wilson says that only once in his career has he allowed himself to become emotionally attached to a patient - attached to the point where the patient's death felt like that of a family member. "It was this beautiful girl who had a spinal tumor," he told me. "She was always bringing me cookies. It was a malignant tumor. She became a paraplegic, and then she got married." Wilson was talking softly and slowly. "It just tore me up. I couldn't help myself. I remember operating on her and crying, right there in the O.R." Charlie Wilson is a man who, when he operates, does not permit music or extraneous talking or the noise of beepers or phones, who is attuned to even the slightest hesitation on the part of his scrub nurse, who admits, in his entire life, to just one day of depression, and who has the audacity and the control to take a No. 11 blade and slice down - just like that - to the basilar artery. But she was young, and it was tragic, and there was nothing he could do, and he has a daughter, too, so perhaps it touched a chord. He was sitting, as he talked, in his office at Moffitt Hospital with his Nike cross-trainers and surgical scrubs, thinking back to a moment when all certitude and composure escaped him. His performance on the day he operated on the girl's spinal tumor must have been compromised by his grief, he admitted. But what did it matter? This was not a procedure that required great judgment or technical mastery. "It was an ugly operation," he said, pronouncing the word "ugly" with a special distaste. "Maybe that was part of it." Of course, it was. Charlie Wilson is one of the world's great neurosurgeons because he can find some beauty in what he does - even in the midst of terrible illness. There was nothing beautiful there. "This lovely, lovely girl." He looked away. "Such a heart."
GO TO TOP MENU

  Is the Belgian Coca-Cola hysteria the real thing?

  The wave of illness among Belgian children last month had the look and feel - in the beginning, at least - of an utterly typical food poisoning outbreak. First, forty-two children in the Belgian town of Bornem became mysteriously ill after drinking Coca-Cola and had to be hospitalized. Two days later, eight more schoolchildren fell sick in Bruges, followed by thirteen in Harelbeke the next day and forty-two in Lochristi three days after that - and on and on in a widening spiral that, in the end, sent more than one hundred children to the hospital complaining of nausea, dizziness, and headaches, and forced Coca-Cola into the biggest product recall in its hundred-and-thirteen-year history. Upon investigation, an apparent culprit was found. In the Coca-Cola plant in Antwerp, contaminated carbon dioxide had been used to carbonate a batch of the soda's famous syrup. With analysts predicting that the scare would make a dent in Coca-Cola's quarterly earnings, the soft-drink giant apologized to the Belgian people, and the world received a sobering reminder of the fragility of food safety.

  The case isn't as simple as it seems, though. A scientific study ordered by Coca-Cola found that the contaminants in the carbon dioxide were sulfur compounds left over from the production process. In the tainted bottles of Coke, these residues were present at between five and seventeen parts per billion. These sulfides can cause illness, however, only at levels about a thousand times greater than that. At seventeen parts per billion, they simply leave a bad smell - like rotten eggs - which means that Belgium should have experienced nothing more than a minor epidemic of nose-wrinkling. More puzzling is the fact that, in four of the five schools where the bad Coke allegedly struck, half of the kids who got sick hadn't drunk any Coke that day. Whatever went on in Belgium, in other words, probably wasn't Coca-Cola poisoning. So what was it? Maybe nothing at all.

  "You know, when this business started I bet two of my friends a bottle of champagne each that I knew the cause," Simon Wessely, a psychiatrist who teaches at the King's College School of Medicine in London, said.

  "It's quite simple. It's just mass hysteria. These things usually are."

  Wessely has been collecting reports of this kind of hysteria for about ten years and now has hundreds of examples, dating back as far as 1787, when millworkers in Lancashire suddenly took ill after they became persuaded that they were being poisoned by tainted cotton. According to Wessely, almost all cases fit a pattern. Someone sees a neighbor fall ill and becomes convinced that he is being contaminated by some unseen evil - in the past it was demons and spirits; nowadays it tends to be toxins and gases - and his fear makes him anxious. His anxiety makes him dizzy and nauseous. He begins to hyperventilate. He collapses. Other people hear the same allegation, see the "victim" faint, and they begin to get anxious themselves. They feel nauseous. They hyperventilate. They collapse, and before you know it everyone in the room is hyperventilating and collapsing. These symptoms, Wessely stresses, are perfectly genuine. It's just that they are manifestations of a threat that is wholly imagined. "This kind of thing is extremely common," he says, "and it's almost normal. It doesn't mean that you are mentally ill or crazy."

  Mass hysteria comes in several forms. Mass motor hysteria, for example, involves specific physical movements: shaking, tremors, and convulsions. According to the sociologist Robert Bartholomew, motor hysteria often occurs in environments of strict emotional repression; it was common in medieval nunneries and in nineteenth-century European schools, and it is seen today in some Islamic cultures. What happened in Belgium, he says, is a fairly typical example of a more standard form of contagious anxiety, possibly heightened by the recent Belgian scare over dioxin-contaminated animal feed. The students' alarm over the rotten-egg odor of their Cokes, for example, is straight out of the hysteria textbooks. "The vast majority of these events are triggered by some abnormal but benign smell," Wessely said. "Something strange, like a weird odor coming from the air conditioning."

  The fact that the outbreaks occurred in schools is also typical of hysteria cases. "The classic ones always involve schoolchildren," Wessely continued. "There is a famous British case involving hundreds of schoolgirls who collapsed during a 1980 Nottinghamshire jazz festival. They blamed it on a local farmer spraying pesticides." Bartholomew has just published a paper on a hundred and fifteen documented hysteria cases in schools over the past three hundred years. As anyone who has ever been to a rock concert knows, large numbers of adolescents in confined spaces seem to be particularly susceptible to mass hysteria. Those intent on pointing the finger at Coca-Cola in this sorry business ought to remember that. "We let the people of Belgium down," Douglas Ivester, the company's chairman, said in the midst of the crisis. Or perhaps it was the other way around.
GO TO TOP MENU

  "Melrose Place," 1992-1999, R.I.P.

  During the 1995-96 season of "Melrose Place" - unquestionably the finest in the seven-year run of the prime-time soap, which comes to an end this week - the winsome redhead known as Dr. Kimberly Shaw experienced a sudden breakthrough in her therapy with Dr. Peter Burns, whom, according to the convolutions of the Melrose narrative, she happened to be living with at the moment. Burns was also acting as her lawyer and guardian, in addition to being her lover and therapist, although those last two descriptions are not quite accurate, since Kimberly and Dr. Burns weren't sleeping together at the time of her therapy breakthrough and, what's more, Burns wasn't really a therapist. From all appearances, he was actually a surgeon, or - since he also treated the show's central figure (and his future lover), Amanda, when she had her cancer scare - an oncologist, or, at the very least, a hunky guy with a stethoscope and a pager, which, in the Melrose universe, is all you really need to be to pass your medical boards.

  In any case, in the first or second session between Kimberly and Dr. Peter Burns - her lawyer, suitor, guardian, non-therapist therapist, landlord, and room-mate - Kimberly realized that the reason she had been exhibiting strong homicidal tendencies was that she had been suppressing the childhood memory of having killed a very evil man who bore a distinct resemblance to a ferret. In a daring plot twist, Michael and Sydney - Kimberly's ex-lover and her romantic rival, respectively - got hold of a sketch she had made of her ferret-faced tormentor and hired an actor to impersonate him in an effort to make Kimberly think that she was still as crazy as ever. And that's exactly what happened, until Kimberly, toward the end of one episode, confronted the actor playing the ferret man and peeled off his prosthetic makeup, vanquishing her personal demon once and for all.

  If you talk to aficionados of "Melrose Place," they will tell you that the ferret-man moment, more than any other, captured what was truly important about the series: here was an actor playing a doctor, in therapy with another actor playing a doctor who was himself impersonating a therapist, confronting an actor playing an actor playing her own personal demon, and when she unmasked him she found . . . that he was just another actor! Or something like that. The wonderful thing about "Melrose Place" was that just when you thought that the show was about to make some self-consciously postmodern commentary on, say, the relationship between art and life, it had the courage to take the easy way out and go for the laugh.

  "Melrose Place" was often, mistakenly, lumped with its sister show on Fox, "Beverly Hills, 90210," which, like "Melrose," was an Aaron Spelling Production. At one point, Fox even ran the two shows back to back on Wednesday nights. But they were worlds apart. "90210" was the most conventional kind of television. It played to the universal desire of adolescents to be grownups, and it presented the world inside West Beverly High as one driven by the same social and ethical and political issues as the real world. "90210" was all about teens behaving like adults. "Melrose" was the opposite. It started with a group of adults - doctors, advertising executives, fashion designers - and dared to have them behave as foolishly and as naively as adolescents. Most of them lived in the same apartment building, where they fought and drank and wore really tight outfits and slept together in every conceivable permutation. They were all dumb, and the higher they rose in the outside world the dumber they got when they came home to Melrose Place.

  In the mid-nineteen-nineties, when a generation of Americans reached adulthood and suddenly realized that they didn't want to be there, the inverted world of Melrose was a wonderfully soothing place. Here, after all, was a show that ostensibly depicted sophisticated grownup society, and every viewer was smarter than the people on the screen. Could anyone believe, for example, that when Kimberly came back from her breakthrough session with Peter Burns and went home to make dinner for Peter Burns, and Peter Burns sidled up to kiss her as she was slicing carrots, bra-less, he never stopped to think that here was his client and patient and tenant and analysand - a woman who had just tried to kill all kinds of people - and she was in his kitchen holding a knife? Peter! You moron! Watch the knife!
GO TO TOP MENU

  Hair dye and the hidden history of postwar America.

  1.

  During the Depression - long before she became one of the most famous copywriters of her day - Shirley Polykoff met a man named George Halperin. He was the son of an Orthodox rabbi from Reading, Pennsylvania, and soon after they began courting he took her home for Passover to meet his family. They ate roast chicken, tzimmes, and sponge cake, and Polykoff hit it off with Rabbi Halperin, who was warm and funny. George's mother was another story. She was Old World Orthodox, with severe, tightly pulled back hair; no one was good enough for her son.

"How'd I do, George?" Shirley asked as soon as they got in the car for the drive home. "Did your mother like me?"

He was evasive. "My sister Mildred thought you were great."

"That's nice, George," she said. "But what did your mother say?"

There was a pause. "She says you paint your hair." Another pause. "Well, do you?"

  Shirley Polykoff was humiliated. In her mind she could hear her future mother-in-law: Fahrbt zi der huer? Oder fahrbt zi nisht? Does she color her hair? Or doesn't she?

  The answer, of course, was that she did. Shirley Polykoff always dyed her hair, even in the days when the only women who went blond were chorus girls and hookers. At home in Brooklyn, starting when she was fifteen, she would go to Mr. Nicholas's beauty salon, one flight up, and he would "lighten the back" until all traces of her natural brown were gone. She thought she ought to be a blonde-or, to be more precise, she thought that the decision about whether she could be a blonde was rightfully hers, and not God's. Shirley dressed in deep oranges and deep reds and creamy beiges and royal hues. She wore purple suede and aqua silk, and was the kind of person who might take a couture jacket home and embroider some new detail on it. Once, in the days when she had her own advertising agency, she was on her way to Memphis to make a presentation to Maybelline and her taxi broke down in the middle of the expressway. She jumped out and flagged down a Pepsi-Cola truck, and the truck driver told her he had picked her up because he'd never seen anyone quite like her before. "Shirley would wear three outfits, all at once, and each one of them would look great," Dick Huebner, who was her creative director, says. She was flamboyant and brilliant and vain in an irresistible way, and it was her conviction that none of those qualities went with brown hair. The kind of person she spent her life turning herself into did not go with brown hair. Shirley's parents were Hyman Polykoff, small-time necktie merchant, and Rose Polykoff, housewife and mother, of East New York and Flatbush, by way of the Ukraine. Shirley ended up on Park Avenue at Eighty-second. "If you asked my mother 'Are you proud to be Jewish?' she would have said yes," her daughter, Alix Nelson Frick, says. "She wasn't trying to pass. But she believed in the dream, and the dream was that you could acquire all the accouterments of the established affluent class, which included a certain breeding and a certain kind of look. Her idea was that you should be whatever you want to be, including being a blonde."

  In 1956, when Shirley Polykoff was a junior copywriter at Foote, Cone & Belding, she was given the Clairol account. The product the company was launching was Miss Clairol, the first hair-color bath that made it possible to lighten, tint, condition, and shampoo at home, in a single step-to take, say, Topaz (for a champagne blond) or Moon Gold (for a medium ash), apply it in a peroxide solution directly to the hair, and get results in twenty minutes. When the Clairol sales team demonstrated their new product at the International Beauty Show, in the old Statler Hotel, across from Madison Square Garden, thousands of assembled beauticians jammed the hall and watched, openmouthed, demonstration after demonstration. "They were astonished," recalls Bruce Gelb, who ran Clairol for years, along with his father, Lawrence, and his brother Richard. "This was to the world of hair color what computers were to the world of adding machines. The sales guys had to bring buckets of water and do the rinsing off in front of everyone, because the hairdressers in the crowd were convinced we were doing something to the models behind the scenes."

  Miss Clairol gave American women the ability, for the first time, to color their hair quickly and easily at home. But there was still the stigma-the prospect of the disapproving mother-in-law. Shirley Polykoff knew immediately what she wanted to say, because if she believed that a woman had a right to be a blonde she also believed that a woman ought to be able to exercise that right with discretion. "Does she or doesn't she?" she wrote, translating from the Yiddish to the English. "Only her hairdresser knows for sure." Clairol bought thirteen ad pages in Life in the fall of 1956, and Miss Clairol took off like a bird. That was the beginning. For Nice 'n Easy, Clairol's breakthrough shampoo-in hair color, she wrote, "The closer he gets, the better you look." For Lady Clairol, the cream-and-bleach combination that brought silver and platinum shades to Middle America, she wrote, "Is it true blondes have more fun?" and then, even more memorably, "If I've only one life, let me live it as a blonde!" (In the summer of 1962, just before "The Feminine Mystique" was published, Betty Friedan was, in the words of her biographer, so "bewitched" by that phrase that she bleached her hair.) Shirley Polykoff wrote the lines; Clairol perfected the product. And from the fifties to the seventies, when Polykoff gave up the account, the number of American women coloring their hair rose from seven per cent to more than forty per cent.

  Today, when women go from brown to blond to red to black and back again without blinking, we think of hair-color products the way we think of lipstick. On drugstore shelves there are bottles and bottles of hair-color products with names like Hydrience and Excellence and Preference and Natural Instincts and Loving Care and Nice 'n Easy, and so on, each in dozens of different shades. Feria, the new, youth-oriented brand from L'Oreal, comes in Chocolate Cherry and Champagne Cocktail - colors that don't ask "Does she or doesn't she?" but blithely assume "Yes, she does." Hair dye is now a billion-dollar-a-year commodity.

  Yet there was a time, not so long ago - between, roughly speaking, the start of Eisenhower's Administration and the end of Carter's - when hair color meant something. Lines like "Does she or doesn't she?" or the famous 1973 slogan for L'Oreal's Preference - "Because I'm worth it" - were as instantly memorable as "Winston tastes good like a cigarette should" or "Things go better with Coke." They lingered long after advertising usually does and entered the language; they somehow managed to take on meanings well outside their stated intention. Between the fifties and the seventies, women entered the workplace, fought for social emancipation, got the Pill, and changed what they did with their hair. To examine the hair-color campaigns of the period is to see, quite unexpectedly, all these things as bound up together, the profound with the seemingly trivial. In writing the history of women in the postwar era, did we forget something important? Did we leave out hair?

  2.

  When the "Does she or doesn't she?" campaign first ran, in 1956, most advertisements that were aimed at women tended to be high glamour-"cherries in the snow, fire and ice," as Bruce Gelb puts it. But Shirley Polykoff insisted that the models for the Miss Clairol campaign be more like the girl next door-"Shirtwaist types instead of glamour gowns," she wrote in her original memo to Clairol. "Cashmere-sweater-over-the-shoulder types. Like larger-than-life portraits of the proverbial girl on the block who's a little prettier than your wife and lives in a house slightly nicer than yours." The model had to be a Doris Day type-not a Jayne Mansfield-because the idea was to make hair color as respectable and mainstream as possible. One of the earliest "Does she or doesn't she?" television commercials featured a housewife, in the kitchen preparing hors d'ouvres for a party. She is slender and pretty and wearing a black cocktail dress and an apron. Her husband comes in, kisses her on the lips, approvingly pats her very blond hair, then holds the kitchen door for her as she takes the tray of hors d'ouvres out for her guests. It is an exquisitely choreographed domestic tableau, down to the little dip the housewife performs as she hits the kitchen light switch with her elbow on her way out the door. In one of the early print ads-which were shot by Richard Avedon and then by Irving Penn-a woman with strawberry-blond hair is lying on the grass, holding a dandelion between her fingers, and lying next to her is a girl of about eight or nine. What's striking is that the little girl's hair is the same shade of blond as her mother's. The "Does she or doesn't she?" print ads always included a child with the mother to undercut the sexual undertones of the slogan-to make it clear that mothers were using Miss Clairol, and not just "fast" women-and, most of all, to provide a precise color match. Who could ever guess, given the comparison, that Mom's shade came out of a bottle?

  The Polykoff campaigns were a sensation. Letters poured in to Clairol. "Thank you for changing my life," read one, which was circulated around the company and used as the theme for a national sales meeting. "My boyfriend, Harold, and I were keeping company for five years but he never wanted to set a date. This made me very nervous. I am twenty-eight and my mother kept saying soon it would be too late for me." Then, the letter writer said, she saw a Clairol ad in the subway. She dyed her hair blond, and "that is how I am in Bermuda now on my honeymoon with Harold." Polykoff was sent a copy with a memo: "It's almost too good to be true!" With her sentimental idyll of blond mother and child, Shirley Polykoff had created something iconic.

  "My mother wanted to be that woman in the picture," Polykoff's daughter, Frick, says. "She was wedded to the notion of that suburban, tastefully dressed, well-coddled matron who was an adornment to her husband, a loving mother, a long-suffering wife, a person who never overshadowed him.She wanted the blond child. In fact, I was blond as a kid, but when I was about thirteen my hair got darker and my mother started bleaching it." Of course-and this is the contradiction central to those early Clairol campaigns-Shirley Polykoff wasn't really that kind of woman at all. She always had a career. She never moved to the suburbs. "She maintained that women were supposed to be feminine, and not too dogmatic and not overshadow their husband, but she greatly overshadowed my father, who was a very pure, unaggressive, intellectual type," Frick says. "She was very flamboyant, very emotional, very dominating."

  One of the stories Polykoff told about herself repeatedly- and that even appeared after her death last year, in her Times obituary-was that she felt that a woman never ought to make more than her husband, and that only after George's death, in the early sixties, would she let Foote, Cone & Belding raise her salary to its deserved level. "That's part of the legend, but it isn't the truth," Frick says. "The ideal was always as vividly real to her as whatever actual parallel reality she might be living. She never wavered in her belief in that dream, even if you would point out to her some of the fallacies of that dream, or the weaknesses, or the internal contradictions, or the fact that she herself didn't really live her life that way." For Shirley Polykoff, the color of her hair was a kind of useful fiction, a way of bridging the contradiction between the kind of woman she was and the kind of woman she felt she ought to be. It was a way of having it all. She wanted to look and feel like Doris Day without having to be Doris Day. In twenty-seven years of marriage, during which she bore two children, she spent exactly two weeks as a housewife, every day of which was a domestic and culinary disaster. "Listen, sweetie," an exasperated George finally told her. "You make a lousy little woman in the kitchen." She went back to work the following Monday.

  This notion of the useful fiction-of looking the part without being the part-had a particular resonance for the America of Shirley Polykoff's generation. As a teen-ager, Shirley Polykoff tried to get a position as a clerk at an insurance agency and failed. Then she tried again, at another firm, applying as Shirley Miller. This time, she got the job. Her husband, George, also knew the value of appearances. The week Polykoff first met him, she was dazzled by his worldly sophistication, his knowledge of out-of-the-way places in Europe, his exquisite taste in fine food and wine. The second week, she learned that his expertise was all show, derived from reading the Times. The truth was that George had started his career loading boxes in the basement of Macy's by day and studying law at night. He was a faker, just as, in a certain sense, she was, because to be Jewish-or Irish or Italian or African-American or, for that matter, a woman of the fifties caught up in the first faint stirrings of feminism - was to be compelled to fake it in a thousand small ways, to pass as one thing when, deep inside, you were something else. "That's the kind of pressure that comes from the immigrants' arriving and thinking that they don't look right, that they are kind of funny-looking and maybe shorter than everyone else, and their clothes aren't expensive," Frick says. "That's why many of them began to sew, so they could imitate the patterns of the day. You were making yourself over. You were turning yourself into an American." Frick, who is also in advertising (she's the chairman of Spier NY), is a forcefully intelligent woman, who speaks of her mother with honesty and affection. "There were all those phrases that came to fruition at that time-you know, 'clothes make the man' and 'first impressions count.'" So the question "Does she or doesn't she?" wasn't just about how no one could ever really know what you were doing. It was about how no one could ever really know who you were. It really meant not "Does she?" but "Is she?" It really meant "Is she a contented homemaker or a feminist, a Jew or a Gentile - or isn't she?"

  3. I am Ilon Specht, hear me roar

  In 1973, Ilon Specht was working as a copywriter at the McCann-Erickson advertising agency, in New York. She was a twenty-three-year-old college dropout from California. She was rebellious, unconventional, and independent, and she had come East to work on Madison Avenue, because that's where people like that went to work back then. "It was a different business in those days," Susan Schermer, a long-time friend of Specht's, says. "It was the seventies. People were wearing feathers to work." At her previous agency, while she was still in her teens, Specht had written a famous television commercial for the Peace Corps. (Single shot. No cuts. A young couple lying on the beach. "It's a big, wide wonderful world" is playing on a radio. Voice-over recites a series of horrible facts about less fortunate parts of the world: in the Middle East half the children die before their sixth birthday, and so forth. A news broadcast is announced as the song ends, and the woman on the beach changes the station.)

  "Ilon? Omigod! She was one of the craziest people I ever worked with," Ira Madris, another colleague from those years, recalls, using the word "crazy" as the highest of compliments. "And brilliant. And dogmatic. And highly creative. We all believed back then that having a certain degree of neurosis made you interesting. Ilon had a degree of neurosis that made her very interesting."

  At McCann, Ilon Specht was working with L'Oreal, a French company that was trying to challenge Clairol's dominance in the American hair-color market. L'Oreal had originally wanted to do a series of comparison spots, presenting research proving that their new product-Preference-was technologically superior to Nice 'n Easy, because it delivered a more natural, translucent color. But at the last minute the campaign was killed because the research hadn't been done in the United States. At McCann, there was panic. "We were four weeks before air date and we had nothing-nada," Michael Sennott, a staffer who was also working on the account, says. The creative team locked itself away: Specht, Madris-who was the art director on the account-and a handful of others. "We were sitting in this big office," Specht recalls. "And everyone was discussing what the ad should be. They wanted to do something with a woman sitting by a window, and the wind blowing through the curtains. You know, one of those fake places with big, glamorous curtains. The woman was a complete object. I don't think she even spoke. They just didn't get it. We were in there for hours."

  Ilon Specht is now the executive creative director of Jordan, McGrath, Case & Partners, in the Flatiron district, with a big office overlooking Fifth Avenue. She has long, thick black hair, held in a loose knot at the top of her head, and lipstick the color of maraschino cherries. She talks fast and loud, and swivels in her chair as she speaks, and when people walk by her office they sometimes bang on her door, as if the best way to get her attention is to be as loud and emphatic as she is. Reminiscing not long ago about the seventies, she spoke about the strangeness of corporate clients in shiny suits who would say that all the women in the office looked like models. She spoke about what it meant to be young in a business dominated by older men, and about what it felt like to write a line of copy that used the word "woman" and have someone cross it out and write "girl."

  "I was a twenty-three-year-old girl-a woman," she said. "What would my state of mind have been? I could just see that they had this traditional view of women, and my feeling was that I'm not writing an ad about looking good for men, which is what it seems to me that they were doing. I just thought, Fuck you. I sat down and did it, in five minutes. It was very personal. I can recite to you the whole commercial, because I was so angry when I wrote it."

  Specht sat stock still and lowered her voice: "I use the most expensive hair color in the world. Preference, by L'Oreal. It's not that I care about money. It's that I care about my hair. It's not just the color. I expect great color. What's worth more to me is the way my hair feels. Smooth and silky but with body. It feels good against my neck. Actually, I don't mind spending more for L'Oreal. Because I'm" - and here Specht took her fist and struck her chest - "worth it."

  The power of the commercial was originally thought to lie in its subtle justification of the fact that Preference cost ten cents more than Nice 'n Easy. But it quickly became obvious that the last line was the one that counted. On the strength of "Because I'm worth it," Preference began stealing market share from Clairol. In the nineteen-eighties, Preference surpassed Nice 'n Easy as the leading hair-color brand in the country, and two years ago L'Oreal took the phrase and made it the slogan for the whole company. An astonishing seventy-one per cent of American women can now identify that phrase as the L'Oreal signature, which, for a slogan-as opposed to a brand name-is almost without precedent.

  4.

  From the very beginning, the Preference campaign was unusual. Polykoff's Clairol spots had male voice-overs. In the L'Oreal ads, the model herself spoke, directly and personally. Polykoff's commercials were "other-directed" -they were about what the group was saying ("Does she or doesn't she?") or what a husband might think ("The closer he gets, the better you look"). Specht's line was what a woman says to herself. Even in the choice of models, the two campaigns diverged. Polykoff wanted fresh, girl-next-door types. McCann and L'Oreal wanted models who somehow embodied the complicated mixture of strength and vulnerability implied by "Because I'm worth it." In the late seventies, Meredith Baxter Birney was the brand spokeswoman. At that time, she was playing a recently divorced mom going to law school on the TV drama "Family." McCann scheduled her spots during "Dallas" and other shows featuring so-called "silk blouse" women - women of strength and independence. Then came Cybill Shepherd, at the height of her run as the brash, independent Maddie on "Moonlighting," in the eighties. Now the brand is represented by Heather Locklear, the tough and sexy star of "Melrose Place." All the L'Oreal spokeswomen are blondes, but blondes of a particular type. In his brilliant 1995 book, "Big Hair: A Journey into the Transformation of Self," the Canadian anthropologist Grant McCracken argued for something he calls the "blondness periodic table," in which blondes are divided into six categories: the "bombshell blonde" (Mae West, Marilyn Monroe), the "sunny blonde" (Doris Day, Goldie Hawn), the "brassy blonde" (Candice Bergen), the "dangerous blonde" (Sharon Stone), the "society blonde" (C.Z. Guest), and the "cool blonde" (Marlene Dietrich, Grace Kelly). L'Oreal's innovation was to carve out a niche for itself in between the sunny blondes-the "simple, mild, and innocent" blondes-and the smart, bold, brassy blondes, who, in McCracken's words, "do not mediate their feelings or modulate their voices."

  This is not an easy sensibility to capture. Countless actresses have auditioned for L'Oreal over the years and been turned down. "There was one casting we did with Brigitte Bardot," Ira Madris recalls (this was for another L'Oreal product), "and Brigitte, being who she is, had the damnedest time saying that line. There was something inside of her that didn't believe it. It didn't have any conviction." Of course it didn't: Bardot is bombshell, not sassy. Clairol made a run at the Preference sensibility for itself, hiring Linda Evans in the eighties as the pitchwoman for Ultress, the brand aimed at Preference's upscale positioning. This didn't work, either. Evans, who played the adoring wife of Blake Carrington on "Dynasty," was too sunny. ("The hardest thing she did on that show," Michael Sennott says, perhaps a bit unfairly, "was rearrange the flowers.")

  Even if you got the blonde right, though, there was still the matter of the slogan. For a Miss Clairol campaign in the seventies, Polykoff wrote a series of spots with the tag line "This I do for me." But "This I do for me" was at best a halfhearted approximation of "Because I'm worth it" - particularly for a brand that had spent its first twenty years saying something entirely different. "My mother thought there was something too brazen about 'I'm worth it,'" Frick told me. "She was always concerned with what people around her might think. She could never have come out with that bald-faced an equation between hair color and self-esteem."

  The truth is that Polykoff's sensibility-which found freedom in assimilation-had been overtaken by events. In one of Polykoff's "Is it true blondes have more fun?" commercials for Lady Clairol in the sixties, for example, there is a moment that by 1973 must have been painful to watch. A young woman, radiantly blond, is by a lake, being swung around in the air by a darkly handsome young man. His arms are around her waist. Her arms are around his neck, her shoes off, her face aglow. The voice-over is male, deep and sonorous. "Chances are," the voice says, "she'd have gotten the young man anyhow, but you'll never convince her of that." Here was the downside to Shirley Polykoff's world. You could get what you wanted by faking it, but then you would never know whether it was you or the bit of fakery that made the difference. You ran the risk of losing sight of who you really were. Shirley Polykoff knew that the all-American life was worth it, and that "he" -the handsome man by the lake, or the reluctant boyfriend who finally whisks you off to Bermuda-was worth it. But, by the end of the sixties, women wanted to know that they were worth it, too.

  5. What Herta Herzog knew

  Why are Shirley Polykoff and Ilon Specht important? That seems like a question that can easily be answered in the details of their campaigns. They were brilliant copywriters, who managed in the space of a phrase to capture the particular feminist sensibilities of the day. They are an example of a strange moment in American social history when hair dye somehow got tangled up in the politics of assimilation and feminism and self-esteem. But in a certain way their stories are about much more: they are about the relationship we have to the products we buy, and about the slow realization among advertisers that unless they understood the psychological particulars of that relationship-unless they could dignify the transactions of everyday life by granting them meaning-they could not hope to reach the modern consumer. Shirley Polykoff and Ilon Specht perfected a certain genre of advertising which did just this, and one way to understand the Madison Avenue revolution of the postwar era is as a collective attempt to define and extend that genre. The revolution was led by a handful of social scientists, chief among whom was an elegant, Viennese-trained psychologist by the name of Herta Herzog. What did Herta Herzog know? She knew-or, at least, she thought she knew-the theory behind the success of slogans like "Does she or doesn't she?" and "Because I'm worth it," and that makes Herta Herzog, in the end, every bit as important as Shirley Polykoff and Ilon Specht.

  Herzog worked at a small advertising agency called Jack Tinker & Partners, and people who were in the business in those days speak of Tinker the way baseball fans talk about the 1927 Yankees. Tinker was the brainchild of the legendary adman Marion Harper, who came to believe that the agency he was running, McCann-Erickson, was too big and unwieldy to be able to consider things properly. His solution was to pluck a handful of the very best and brightest from McCann and set them up, first in the Waldorf Towers (in the suite directly below the Duke and Duchess of Windsor's and directly above General Douglas MacArthur's) and then, more permanently, in the Dorset Hotel, on West Fifty-fourth Street, overlooking the Museum of Modern Art. The Tinker Group rented the penthouse, complete with a huge terrace, Venetian-tiled floors, a double-height living room, an antique French polished-pewter bar, a marble fireplace, spectacular skyline views, and a rotating exhibit of modern art (hung by the partners for motivational purposes), with everything - walls, carpets, ceilings, furnishings - a bright, dazzling white. It was supposed to be a think tank, but Tinker was so successful so fast that clients were soon lined up outside the door. When Buick wanted a name for its new luxury coupé, the Tinker Group came up with Riviera. When Bulova wanted a name for its new quartz watch, Tinker suggested Accutron. Tinker also worked with Coca-Cola and Exxon and Westinghouse and countless others, whose names - according to the strict standards of secrecy observed by the group - they would not divulge. Tinker started with four partners and a single phone. But by the end of the sixties it had taken over eight floors of the Dorset.

  What distinguished Tinker was its particular reliance on the methodology known as motivational research, which was brought to Madison Avenue in the nineteen-forties by a cadre of European intellectuals trained at the University of Vienna. Advertising research up until that point had been concerned with counting heads - with recording who was buying what. But the motivational researchers were concerned with why: Why do people buy what they do? What motivates them when they shop? The researchers devised surveys, with hundreds of questions, based on Freudian dynamic psychology. They used hypnosis, the Rosenzweig Picture-Frustration Study, role-playing, and Rorschach blots, and they invented what we now call the focus group. There was Paul Lazarsfeld, one of the giants of twentieth-century sociology, who devised something called the Lazarsfeld-Stanton Program Analyzer, a little device with buttons to record precisely the emotional responses of research subjects. There was Hans Zeisel, who had been a patient of Alfred Adler's in Vienna, and went to work at McCann-Erickson. There was Ernest Dichter, who had studied under Lazarsfeld at the Psychological Institute in Vienna, and who did consulting for hundreds of the major corporations of the day. And there was Tinker's Herta Herzog, perhaps the most accomplished motivational researcher of all, who trained dozens of interviewers in the Viennese method and sent them out to analyze the psyche of the American consumer.

  "For Puerto Rican rum once, Herta wanted to do a study of why people drink, to tap into that below-the-surface kind of thing," Rena Bartos, a former advertising executive who worked with Herta in the early days, recalls. "We would would invite someone out to drink and they would order whatever they normally order, and we would administer a psychological test. Then we'd do it again at the very end of the discussion, after the drinks. The point was to see how people's personality was altered under the influence of alcohol." Herzog helped choose the name of Oasis cigarettes, because her psychological research suggested that the name-with its connotations of cool, bubbling springs-would have the greatest appeal to the orally-fixated smoker.

  "Herta was graceful and gentle and articulate," Herbert Krugman, who worked closely with Herzog in those years, says. "She had enormous insights. Alka-Seltzer was a client of ours, and they were discussing new approaches for the next commercial. She said, 'You show a hand dropping an Alka-Seltzer tablet into a glass of water. Why not show the hand dropping two? You'll double sales.' And that's just what happened. Herta was the gray eminence. Everybody worshipped her."

  Herta Herzog is now eighty-nine. After retiring from Tinker, she moved back to Europe, first to Germany and then to Austria, her homeland. She wrote an analysis of the TV show "Dallas" for the academic journal Society. She taught college courses on communications theory. She conducted a study on the Holocaust for the Vidal Sassoon Center for the Study of Anti-Semitism, in Jerusalem. Today, she lives in the mountain village of Leutasch, half an hour's hard drive up into the Alps from Innsbruck, in a white picture-book cottage with a sharply pitched roof. She is a small woman, slender and composed, her once dark hair now streaked with gray. She speaks in short, clipped, precise sentences, in flawless, though heavily accented, English. If you put her in a room with Shirley Polykoff and Ilon Specht, the two of them would talk and talk and wave their long, bejeweled fingers in the air, and she would sit unobtrusively in the corner and listen. "Marion Harper hired me to do qualitative research - the qualitative interview, which was the specialty that had been developed in Vienna at the Österreichische Wirtschaftspsychologische Forschungsstelle," Herzog told me. "It was interviewing not with direct questions and answers but where you open some subject of the discussion relevant to the topic and then let it go. You have the interviewer not talk but simply help the person with little questions like 'And anything else?' As an interviewer, you are not supposed to influence me. You are merely trying to help me. It was a lot like the psychoanalytic method." Herzog was sitting, ramrod straight, in a chair in her living room. She was wearing a pair of black slacks and a heavy brown sweater to protect her against the Alpine chill. Behind her was row upon row of bookshelves, filled with the books of a postwar literary and intellectual life: Mailer in German, Riesman in English. Open and face down on a long couch perpendicular to her chair was the latest issue of the psychoanalytic journal Psyche. "Later on, I added all kinds of psychological things to the process, such as word-association tests, or figure drawings with a story. Suppose you are my respondent and the subject is soap. I've already talked to you about soap. What you see in it. Why you buy it. What you like about it. Dislike about it. Then at the end of the interview I say, 'Please draw me a figure - anything you want - and after the figure is drawn tell me a story about the figure.'"

  When Herzog asked her subjects to draw a figure at the end of an interview, she was trying to extract some kind of narrative from them, something that would shed light on their unstated desires. She was conducting, as she says, a psychoanalytic session. But she wouldn't ask about hair-color products in order to find out about you, the way a psychoanalyst might; she would ask about you in order to learn about hair-color products. She saw that the psychoanalytic interview could go both ways. You could use the techniques of healing to figure out the secrets of selling. "Does she or doesn't she?" and "Because I'm worth it" did the same thing: they not only carried a powerful and redemptive message, but-and this was their real triumph-they succeeded in attaching that message to a five-dollar bottle of hair dye. The lasting contribution of motivational research to Madison Avenue was to prove that you could do this for just about anything-that the products and the commercial messages with which we surround ourselves are as much a part of the psychological furniture of our lives as the relationships and emotions and experiences that are normally the subject of psychoanalytic inquiry.

  "There is one thing we did at Tinker that I remember well,"Herzog told me, returning to the theme of one of her, and Tinker's, coups. "I found out that people were using Alka-Seltzer for stomach upset, but also for headaches," Herzog said. "We learned that the stomach ache was the kind of ache where many people tended to say 'It was my fault.' Alka-Seltzer had been mostly advertised in those days as a cure for overeating, and overeating is something you have done. But the headache is quite different. It is something imposed on you." This was, to Herzog, the classic psychological insight. It revealed Alka-Seltzer users to be divided into two apparently incompatible camps-the culprit and the victim-and it suggested that the company had been wooing one at the expense of the other. More important, it suggested that advertisers, with the right choice of words, could resolve that psychological dilemma with one or, better yet, two little white tablets. Herzog allowed herself a small smile. "So I said the nice thing would be if you could find something that combines these two elements. The copywriter came up with 'the blahs.'" Herzog repeated the phrase, "the blahs," because it was so beautiful. "The blahs was not one thing or the other-it was not the stomach or the head. It was both."

  6.

  This notion of household products as psychological furniture is, when you think about it, a radical idea. When we give an account of how we got to where we are, we're inclined to credit the philosophical over the physical, and the products of art over the products of commerce. In the list of sixties social heroes, there are musicians and poets and civil-rights activists and sports figures. Herzog's implication is that such a high-minded list is incomplete. What, say, of Vidal Sassoon? In the same period, he gave the world the Shape, the Acute Angle, and the One-Eyed Ungaro. In the old "cosmology of cosmetology," McCracken writes, "the client counted only as a plinth . . . the conveyor of the cut." But Sassoon made individualization the hallmark of the haircut, liberating women's hair from the hair styles of the times - from, as McCracken puts it, those "preposterous bits of rococo shrubbery that took their substance from permanents, their form from rollers, and their rigidity from hair spray." In the Herzogian world view, the reasons we might give to dismiss Sassoon's revolution - that all he was dispensing was a haircut, that it took just half an hour, that it affects only the way you look, that you will need another like it in a month - are the very reasons that Sassoon is important. If a revolution is not accessible, tangible, and replicable, how on earth can it be a revolution?

  "Because I'm worth it" and "Does she or doesn't she?" were powerful, then, precisely because they were commercials, for commercials come with products attached, and products offer something that songs and poems and political movements and radical ideologies do not, which is an immediate and affordable means of transformation. "We discovered in the first few years of the 'Because I'm worth it' campaign that we were getting more than our fair share of new users to the category-women who were just beginning to color their hair," Sennott told me. "And within that group we were getting those undergoing life changes, which usually meant divorce. We had far more women who were getting divorced than Clairol had. Their children had grown, and something had happened, and they were reinventing themselves." They felt different, and Ilon Specht gave them the means to look different-and do we really know which came first, or even how to separate the two? They changed their lives and their hair. But it wasn't one thing or the other. It was both.

  7.

  Since the mid-nineties, the spokesperson for Clairol's Nice 'n Easy has been Julia Louis-Dreyfus, better known as Elaine, from "Seinfeld." In the Clairol tradition, she is the girl next door-a postmodern Doris Day. But the spots themselves could not be less like the original Polykoff campaigns for Miss Clairol. In the best of them, Louis-Dreyfus says to the dark-haired woman in front of her on a city bus, "You know, you'd look great as a blonde." Louis-Dreyfus then shampoos in Nice 'n Easy Shade 104 right then and there, to the gasps and cheers of the other passengers. It is Shirley Polykoff turned upside down: funny, not serious; public, not covert.

  L'Oreal, too, has changed. Meredith Baxter Birney said "Because I'm worth it" with an earnestness appropriate to the line. By the time Cybill Shepherd became the brand spokeswoman, in the eighties, it was almost flip - a nod to the materialism of the times - and today, with Heather Locklear, the spots have a lush, indulgent feel. "New Preference by L'Oreal," she says in one of the current commercials. "Pass it on. You're worth it." The "because" - which gave Ilon Specht's original punch line such emphasis - is gone. The forceful "I'm" has been replaced by "you're." The Clairol and L'Oreal campaigns have converged. According to the Spectra marketing firm, there are almost exactly as many Preference users as Nice 'n Easy users who earn between fifty thousand and seventy-five thousand dollars a year, listen to religious radio, rent their apartments, watch the Weather Channel, bought more than six books last year, are fans of professional football, and belong to a union.

  But it is a tribute to Ilon Specht and Shirley Polykoff's legacy that there is still a real difference between the two brands. It's not that there are Clairol women or L'Oreal women. It's something a little subtler. As Herzog knew, all of us, when it comes to constructing our sense of self, borrow bits and pieces, ideas and phrases, rituals and products from the world around us-over-the-counter ethnicities that shape, in some small but meaningful way, our identities. Our religion matters, the music we listen to matters, the clothes we wear matter, the food we eat matters-and our brand of hair dye matters, too. Carol Hamilton, L'Oreal's vice-president of marketing, says she can walk into a hair-color focus group and instantly distinguish the Clairol users from the L'Oreal users. "The L'Oreal user always exhibits a greater air of confidence, and she usually looks better-not just her hair color, but she always has spent a little more time putting on her makeup, styling her hair," Hamilton told me. "Her clothing is a little bit more fashion-forward. Absolutely, I can tell the difference." Jeanne Matson, Hamilton's counterpart at Clairol, says she can do the same thing. "Oh, yes," Matson told me. "There's no doubt. The Clairol woman would represent more the American-beauty icon, more naturalness. But it's more of a beauty for me, as opposed to a beauty for the external world. L'Oreal users tend to be a bit more aloof. There is a certain warmth you see in the Clairol people. They interact with each other more. They'll say, 'I use Shade 101.' And someone else will say, 'Ah, I do, too!' There is this big exchange."

  These are not exactly the brand personalities laid down by Polykoff and Specht, because this is 1999, and not 1956 or 1973. The complexities of Polykoff's artifice have been muted. Specht's anger has turned to glamour. We have been left with just a few bars of the original melody. But even that is enough to insure that "Because I'm worth it" will never be confused with "Does she or doesn't she?" Specht says, "It meant I know you don't think I'm worth it, because that's what it was with the guys in the room. They were going to take a woman and make her the object. I was defensive and defiant. I thought, I'll fight you. Don't you tell me what I am. You've been telling me what I am for generations." As she said "fight," she extended the middle finger of her right hand. Shirley Polykoff would never have given anyone the finger. She was too busy exulting in the possibilities for self-invention in her America-a land where a single woman could dye her hair and end up lying on a beach with a ring on her finger. At her retirement party, in 1973, Polykoff reminded the assembled executives of Clairol and of Foote, Cone & Belding about the avalanche of mail that arrived after their early campaigns: "Remember that letter from the girl who got to a Bermuda honeymoon by becoming a blonde?"

  Everybody did.

  "Well," she said, with what we can only imagine was a certain sweet vindication, "I wrote it."
GO TO TOP MENU

  Last week, New York City began confiscating the automobiles of people caught drinking and driving. On the first day of the crackdown, the police seized three cars, including one from a man who had been arrested for drunk driving on eight previous occasions. The tabloids cheered. Mothers Against Drunk Driving nodded in approval. After a recent series of brutal incidents involving the police tarnished the Giuliani administration, the Mayor's anti-crime crusade appeared to right itself. The city now has the toughest anti-drunk-driving policy in the country, and the public was given a welcome reminder that the vast majority of the city's thirty-eight thousand cops are neither racist nor reckless and that the justice they mete out is largely deserved. "There's a very simple way to stay out of this problem, for you, your family, and anyone else," a triumphant Giuliani said. "Do not drink and get behind the wheel of a car."

  Let's leave aside, for a moment, the question of whether the new policy is constitutional. That is a matter for the courts. A more interesting issue is what the willing acceptance of such a hard-line stance on drunk driving says about the sometimes contradictory way we discuss traffic safety. Suppose, for example, that I was stopped by the police for running a red light on Madison Avenue. I would get points on my license and receive a fine. If I did the same thing while my blood-alcohol level was above the prescribed limit, however, I would be charged with drunk driving and lose my car. The behavior is the same in both cases, but the consequences are very different. We believe, as a society, that the combination of alcohol and driving deserves particular punishment. And that punishment isn't necessarily based on what you have actually done. It's often based on what you could do - or, to be more precise, on the extra potential for harm that your drinking poses.

  There is nothing wrong with this approach. We have laws against threatening people with guns for the same reason. It hardly makes sense to wait for drunks or people waving guns to kill someone before we arrest them. But if merely posing a threat to others on the road is the threshold for something as drastic as civil forfeiture, then why are we stopping with drunks? Fifty per cent of all car accidents in the United States are attributed to driver inattention, for example. Some of that inattention is caused by inebriation, but there are other common and obvious distractions. Two studies made in the past three years - the first conducted at the Rochester Institute of Technology and the second published in the New England Journal of Medicine - suggest that the use of car phones is associated with a four-to-fivefold increase in the risk of accidents, and that hands-free phones may not be any safer than conventional ones. The driver on the phone is a potential risk to others, just as the driver who has been drinking is. It is also now abundantly clear that sport-utility vehicles and pickup trucks can - by virtue of their weight, high clearance, and structural rigidity - do far more damage in an accident than conventional automobiles can. S.U.V.s and light trucks account for about a third of the vehicles on the road. But a disproportionate number of the fatalities in two-vehicle crashes are caused by collisions between those bigger vehicles and conventional automobiles, and the people riding in the cars make up a stunning eighty-one per cent of those killed.

  The reason we don't like drunk drivers is that by making the decision to drink and drive an individual deliberately increases his or her chance of killing someone else with a vehicle. But how is the moral culpability of the countless Americans who have walked into a dealership and made a decision to buy a fifty-six-hundred-pound sport utility any different? Of course, there are careful S.U.V. drivers and careful car-phone users. Careful people can get drunk, too, and overcompensate for their impairment by creeping along at twenty-five miles an hour, and in New York City we won't hesitate to take away their vehicles. Obviously, Giuliani, even in his most crusading moments, isn't about to confiscate all the car phones and S.U.V.s on the streets of New York. States should, however, stop drivers from using car phones while the car is in motion, as some countries, including England, do. And a prohibitive weight tax on sport utilities would probably be a good idea. The moneys collected could be used to pay the medical bills and compensate the family of anyone hit by some cell-phone-wielding yuppie in a four-wheeled behemoth.
GO TO TOP MENU

  Is the hectic pace of contemporary life really to blame for A.D.D.? Not so fast.

  1.

  There has always been a temptation in American culture to think of drugs as social metaphors. In the early sixties, the pharmaceutical metaphor for the times was Valium. During the sexual revolution, it was the Pill, and that was followed, in quick succession, by marijuana in the nineteen-seventies, cocaine in the nineteen-eighties, and Prozac in the early nineteen-nineties. Today, of course, the drug that has come to symbolize our particular predicaments is Ritalin, the widely prescribed treatment for attention-deficit hyperactivity disorder, or attention-deficit disorder, as it is more popularly known. In his new book, "The Hyperactivity Hoax," the neuropsychiatrist Sydney Walker calls attention disorders and the rise of Ritalin "symptoms of modern life, rather than symptoms of modern disease." In "Ritalin Nation" the psychologist Richard DeGrandpre argues that Ritalin and A.D.H.D. are the inevitable by-products of a culture-wide addiction to speed - to cellular phones and beepers and faxes and overnight mail and computers with powerful chips and hard-driving rock music and television shows that splice together images at hundredth-of-a-second intervals, and a thousand other social stimulants that have had the effect of transforming human expectations. The soaring use of Ritalin, the physician Lawrence Diller concludes in his new book, "Running on Ritalin," reveals something about the kind of society we are at the turn of the millennium. It throws a spotlight on some of our most sensitive issues: what kind of parents we are, what kind of schools we have, what kind of health care is available to us. It brings into question our cultural standards for behavior, performance, and punishment; it reaches into the workplace, the courts, and the halls of Congress. It highlights the most basic psychological conundrum of nature versus nurture, and it raises fundamental philosophical questions about the nature of free will and responsibility.

  In a recent Time cover story on Ritalin, the mother of a child with A.D.H.D. is described as tearing up her daughter's Ritalin prescription. "I thought, maybe there is something else we can do," she says. "I knew that medicine can mask things." That is the kind of question that Ritalin provokes - not the simple, traditional "Will this cure my child?" but the harder, postmodern question "In curing my child, what deeper pathology might this drug be hiding?"

  It's important that we ask questions like this, particularly of drugs that are widely used. The problem with Ritalin is that many of the claims made to support the drug's status as a symbol turn out, on closer examination, to be vague or confusing. Diller, DeGrandpre, and Walker are all, for example, deeply suspicious of our reliance on Ritalin. They think that it is overprescribed - that it is being used to avoid facing broader questions about our values and our society. This sounds plausible: the amount of Ritalin consumed in the United States has more than tripled since 1990. Then again, it has been only in the last ten years that clinical trials have definitively proved that Ritalin is effective in treating A.D.H.D. And, even with that dramatic increase, the proportion of American children taking Ritalin is estimated at one or two per cent. Given that most estimates put the incidence of A.D.H.D. at between three and five per cent, are too many children taking the drug - or too few? "You really run into problems with teen-agers," William Pelham, a professor of psychology at SUNY Buffalo and a prominent A.D.H.D. expert, told me. "They don't want to take this medication. They don't feel they need to. It's part of the oppositional stuff you run into. The kids whom you most want to take it are the ones who are aggressive, and they are the most likely to blow it off."

  Or consider how A.D.H.D. is defined. According to the Diagnostic and Statistical Manual-IV, a child has A.D.H.D. if, for a period of six months, he or she exhibits at least six symptoms from a list of behavioral signs. Among them: "often has difficulty organizing tasks and activities," "often does not seem to listen when spoken to directly," "is often easily distracted by extraneous stimuli," "is often 'on the go' or acts as if 'driven by a motor,'" "often blurts out answers before questions have been completed," and so on. "Ritalin Nation" argues that all these are essentially symptoms of boredom - the impatience of those used to the rapid-fire pace of MTV, Nintendo, and the rest of contemporary culture. The A.D.H.D. child blurts out answers before questions have been completed because, DeGrandpre says, "listening is usually a waiting situation that provides a low level of stimulation." The A.D.H.D. child is easily distracted because, "by definition, extraneous stimuli are novel." Give A.D.H.D. kids something novel to do, something that can satisfy their addiction, DeGrandpre argues, and they'll be fine. Diller works with a different definition of A.D.H.D. but comes to some of the same conclusions. High-stimulus activities like TV and video games "constitute a strange sort of good-fit situation for distractible children," he writes. "These activities are among the few things they can concentrate on well."

  2.

  When A.D.H.D. kids are actually tested on activities like video games, however, this alleged "good fit" disappears. Rosemary Tannock, a behavioral scientist at the Hospital for Sick Children, in Toronto, recently looked at how well a group of boys between the ages of eight and twelve actually did at Pac Man and Super Mario World, and she found that the ones with A.D.H.D. completed fewer levels and had to restart more games than their unaffected peers. "They often failed to inhibit their forward trajectory and crashed headlong into obstacles," she explained. A.D.H.D. kids may like the stimulation of a video game, but that doesn't mean they can handle it. Tannock has also given a group of A.D.H.D. children what's called a letter-naming test. The child is asked to read as quickly as he can five rows of letters, each of which consists of five letters repeated in different orders - "A, B, C, D, E," for example, followed by "D, E, B, A, C," and so on. A normal eight-year-old might take twenty-five seconds to complete the list. His counterpart with attention deficit might take thirty-five seconds, which is the kind of performance usually associated with dyslexia. "Some of our most articulate [A.D.H.D.] youngsters describe how doing this test is like speaking a foreign language in a foreign land," Tannock told me. "You get exhausted. That's how they feel. They have a thousand different ideas crowding into their heads at the same time." This doesn't sound like a child attuned to the quicksilver rhythms of the modern age. This sounds like a garden-variety learning disorder.

  What further confounds the culture-of-Ritalin school is that A.D.H.D. turns out to have a considerable genetic component. As a result of numerous studies of twins conducted around the world over the past decade, scientists now estimate that A.D.H.D. is about seventy per cent heritable. This puts it up there with the most genetically influenced of traits - traits such as blood pressure, height, and weight. Meanwhile, the remaining thirty per cent - the environmental contribution to the disorder - seems to fall under what behavioral geneticists call "non-shared environment," meaning that it is likely to be attributable to such factors as fetal environment or illness and injury rather than factors that siblings share, such as parenting styles or socioeconomic class. That's why the way researchers describe A.D.H.D. has changed over the past decade. There is now less discussion of the role of bad parents, television, and diet and a lot more discussion of neurology and the role of specific genes.

  This doesn't mean that there is no social role at all in the expression of A.D.H.D. Clearly, something has happened to make us all suddenly more aware of the disorder. But when, for instance, Diller writes that "the conditions that have fueled the A.D.D. epidemic and the Ritalin boom" will not change until "America somehow regains its balance between material gain and emotional and spiritual satisfaction," it's clear that he is working with a definition of A.D.H.D. very different from that of the scientific mainstream. In fact, books like "Running on Ritalin" and "Ritalin Nation" don't seem to have a coherent definition of A.D.H.D. at all. This is what is so confusing about the popular debate over this disorder: it's backward. We've become obsessed with what A.D.H.D. means. Don't we first have to figure out what it is?

  3.

  One of the tests researchers give to children with A.D.H.D. is called a stop-signal task. A child sits down at a computer and is told to hit one key if he sees an "X" on the screen and another key if he sees an "O." If he hears a tone, however, he is to refrain from hitting the key. By changing the timing of the tone - playing it just before or just as or just a millisecond after the "X" or "O" appears on the screen - you can get a very good idea of how well someone can stop a response that is already under way. "Kids with A.D.H.D. have a characteristically longer reaction time," Gordon Logan, a cognitive psychologist at the University of Illinois, told me. "They're fifty per cent slower than other kids." Unless the tone is played very early, giving them plenty of warning, they can't stop themselves from hitting the keys.

  The results may seem a relatively trivial matter - these are differences measured in fractions of a second, after all. But for many researchers the idea that children with A.D.H.D. lack some fundamental ability to inhibit themselves, to stop a pre-programmed action, is at the heart of the disorder. Suppose, for example, that you have been given a particularly difficult math problem. Your immediate, impulsive response might be to throw down your pencil in frustration. But most of us wouldn't do that. We would check those impulses, and try to slog our way through the problem, and, with luck, maybe get it right. Part of what it takes to succeed in a complex world, in other words, is the ability to inhibit our impulses. But the child with A.D.H.D., according to the official diagnosis, "often does not follow through on instructions and fails to finish schoolwork, chores, or duties in the workplace" and "often runs about or climbs excessively in situations in which it is inappropriate." He cannot apply himself because he cannot regulate his behavior in a consistent manner. He is at the mercy of the temptations and distractions in his immediate environment. "It's not that a child or an individual is always hyperactive or always inattentive or distracted," Tannock says. "The same individual can one minute be restless and fidgeting or the next minute lethargic or yawning. The individual can be overfocussed one minute and incredibly distractible the next. It is this variability, from day to day and moment to moment, that is the most robust finding we have."

  Russell Barkley, a professor of psychiatry at the University of Massachusetts at Worcester, has done experiments that look at the way A.D.H.D. kids experience time, and the results demonstrate how this basic problem with self-regulation can have far-reaching consequences. In one experiment, he turns on a light for a predetermined length of time and then asks a child to turn the light back on and off for what the child guesses to be the same interval. Children without A.D.H.D. perform fairly consistently. At twelve seconds, for example, their guesses are just a little low. At thirty-six seconds, they are slightly less accurate - still on the low side - and at sixty seconds their guesses are coming in at about fifty seconds. A.D.H.D. kids, on the other hand, are terrible at this game. At twelve seconds, they are well over; apparently, twelve seconds seems much, much longer to them. But at sixty seconds their guesses are much lower than everyone else's; apparently, the longer interval is impossible to comprehend. The consequences of having so profoundly subjective a sense of time are obvious. It's no surprise that people with A.D.H.D. often have problems with punctuality and with patience. An accurate sense of time is a function of a certain kind of memory - an ability to compare the duration of ongoing events with that of past events, so that a red light doesn't seem like an outrageous imposition, or five minutes doesn't seem so impossibly long that you can imagine getting from one side of town to the other in that amount of time. Time is about imposing order, about exercising control over one's perceptions, and that's something that people with attention deficit have trouble with.

  This way of thinking about A.D.H.D. clarifies some of the more confusing aspects of the disorder. In DeGrandpre's formulation, the A.D.H.D. child can't follow through on instructions or behaves inappropriately because there isn't enough going on in his environment. What the inhibition theory implies is the opposite: that the A.D.H.D. child can't follow through or behaves inappropriately because there is too much going on; he falters in situations that require him to exercise self-control and his higher cognitive skills. DeGrandpre cannot explain why A.D.H.D. kids like video games but are also so bad at them. Shouldn't they thrive in that most stimulating of environments? If their problem is self-control, that apparent contradiction makes perfect sense. The A.D.H.D. child likes video games because they permit - even encourage - him to play impulsively. But he's not very good at them because to succeed at Pac Man or Super Mario World a child must learn to overcome the temptation posed by those games to respond impulsively to every whiz and bang: the child has to learn to stop and think (ever so quickly) before reacting.

  At the same time, this theory makes it a lot clearer what kind of problem A.D.H.D. represents. The fact that children with the disorder can't finish the hard math problem doesn't mean that they're not smart enough to know the answer. It means they can't focus long enough to get to the answer. As Barkley puts it, A.D.H.D. is a problem not of knowing what you should do but, rather, of doing what you know. Motivation and memory and higher cognitive skills are intact in people with attention deficit. "But they are secondarily delayed," Barkley says. "They have no chance. They are rarely engaged and highly ineffective, because impulsive actions take precedence." The inability to stop pressing that "X" or "O" key ends up causing much more serious problems down the road.

  This way of thinking about A.D.H.D. also demystifies Ritalin. Implicit in the popular skepticism about the drug has always been the idea that you cannot truly remedy something as complicated as A.D.H.D. with a pill. That's why the mother quoted in the Time story ripped up her child's Ritalin prescription, and why Diller places so much emphasis on the need for "real" social and spiritual solutions. But if A.D.H.D. is merely a discrete problem in inhibition, why couldn't Ritalin be a complete solution? People with A.D.H.D. don't need a brain overhaul. They just need a little help with stopping.

  4.

  There is another way to look at the A.D.H.D.-Ritalin question, which is known as the dopamine theory. This is by no means a conclusive account of A.D.H.D., but it may help clarify some of the issues surrounding the disorder. Dopamine is the chemical in the brain - the neurotransmitter - that appears to play a major role in things like attention and inhibition. When you tackle a difficult task or pay attention to a complex social situation, you are essentially generating dopamine in the parts of the brain that deal with higher cognitive tasks. If you looked at a thousand people at random, you would find a huge variation in their dopamine systems, just as you would if you looked at, say, blood pressure in a random population. A.D.H.D., according to this theory, is the name we give to people whose dopamine falls at the lower end of the scale, the same way we say that people suffer from hypertension if their blood pressure is above a certain point. In order to get normal levels of attention and inhibition, you have to produce normal levels of dopamine.

  This is what Ritalin does. Dopamine is released in the brain by specialized neurons, and each of those neurons has a "transporter," a kind of built-in vacuum cleaner that sucks up any excess dopamine floating around and carries it back inside the neuron. Ritalin blocks that transporter, so the amount of dopamine available for cognition remains higher than it would be otherwise. In about sixty-five per cent of those who take the drug, Ritalin appears to make them "normal," and in an additional ten per cent it appears to bring about substantial improvement. It does have a few minor side effects - appetite loss and insomnia, in some users - but by and large it's a remarkably safe drug, with remarkably specific effects.

  So what does the fact that we seem to be relying more and more on Ritalin mean? The beginning of the answer, I think, lies in the fact that Ritalin is not the only drug in existence that enhances dopamine. Cocaine affects the brain in almost exactly the same way. Nicotine, too, is a dopamine booster, although its mechanism is somewhat different. Obviously, taking Ritalin doesn't have the same consequences as snorting cocaine or smoking a cigarette. It's not addictive, and its effect is a lot more specific. Still, nicotine, cocaine, and Ritalin are all performing the same basic neurological function.

  What, for instance, was the appeal of cocaine at the beginning of the coke epidemic of the eighties? It was a feel-good drug. But it was a feel-good drug of a certain kind - a drug that people thought would help them master the complexity and the competitive pressures of the world around them. In the now infamous Time story on cocaine that ran in the summer of 1981, there is a picture of a "freelance artist" in Manhattan doing lines on his lunch break, with the caption "Feeling stronger, smarter, faster, more able to cope." Cocaine, the article begins, "is becoming the all-American drug," and its popularity, in the words of one expert, is a symptom of the fact that "right from childhood in this country there is pressure for accomplishment." At the moment of its greatest popularity, cocaine was considered a thinking drug, an achievement drug, a drug for the modern world. Does that sound familiar?

  Nicotine has a similar profile. Cigarettes aid concentration. Understandably, this isn't a fact that has received much publicity in recent years. But there are plenty of data showing that nicotine does exactly what you would expect a dopamine enhancer to do. In one experiment, for example, smokers were given three minutes to read randomly ordered letters, in rows of thirty, and cross out the letter "e" every time they encountered it. The smokers took the test twice, before and after smoking a cigarette, and, on average, they were able to read 161.5 more letters - or more than five extra lines - after smoking than before. It's no surprise that this test sounds a lot like the test that A.D.H.D. kids do so poorly on, because we are really talking about the same set of cognitive skills - the ability to concentrate and screen out distractions. Numerous studies have shown that children with A.D.H.D. are much more likely to smoke and take illegal drugs in later life; what the dopamine theory suggests is that many people resort to such substances as a way of medicating themselves. Nora Volkow, the chairman of medicine at Brookhaven National Laboratory, says that between ten and twenty per cent of drug addicts have A.D.H.D. "In studies, when they were given Ritalin they would stop taking cocaine," she told me. Timothy Wilens, a psychiatrist at Harvard Medical School, presented data at a recent National Institutes of Health conference on A.D.H.D. which showed that treating A.D.H.D. kids with Ritalin and the like lowered the risk of their developing drug problems in adolescence by an extraordinary sixty-eight per cent. Among people with dopamine deficits, Ritalin is becoming a safe pharmaceutical alternative to the more dangerous dopamine boosters of the past.

  Here, surely, is one of the deeper implications of the rise of Ritalin - particularly among adults, whose use of the drug has increased rapidly in recent years. For decades, in this country and around the world, millions of people used smoking as a way of boosting their dopamine and sharpening focus and concentration. Over the past twenty years, we have gradually taken away that privilege, by making it impossible for people to smoke at work and by marshalling an array of medical evidence to convince people that they should not start at all. From a public-health standpoint, this has been of critical importance: countless lives have been saved. But the fact remains that millions of people have lost a powerful pharmacological agent - nicotine - that they had been using to cope with the world around them. In fact, they have lost it precisely at a moment when the rising complexity of modern life would seem to make dopamine enhancement more important than ever. Among adults, Ritalin is a drug that may fill the void left by nicotine.

  Among children, Ritalin is clearly performing a similar function. We are extending to the young cognitive aids of a kind that used to be reserved exclusively for the old. It is this reliance on a drug - the idea that children should have to be medicated - that, of course, people like Diller, Walker, and DeGrandpre find so upsetting. If some children need to take a drug in order to be "normal," they think that the problem is with our definition of "normal." Diller asks, "Is there still a place for childhood in the anxious, downsizing America of the late nineteen-nineties? What if Tom Sawyer or Huckleberry Finn were to walk into my office tomorrow? Tom's indifference to schooling and Huck's 'oppositional' behavior would surely have been cause for concern. Would I prescribe Ritalin for them, too?" But this is just the point. Huck Finn and Tom Sawyer lived in an age when difficult children simply dropped out of school, or worked on farms, or drifted into poverty and violence. The "childhood" Diller romanticizes was a ruthlessly Darwinian place, which provided only for the most economically - and genetically - privileged. Children are now being put into situations that demand attention and intellectual consideration, and it is no longer considered appropriate simply to cast aside those who, because of some neurological quirk, have difficulty coping. Only by a strange inversion of moral responsibility do books like "Ritalin Nation" and "Running on Ritalin" seek to make those parents and physicians trying to help children with A.D.H.D. feel guilty for doing so. The rise of A.D.H.D. is a consequence of what might otherwise be considered a good thing: that the world we live in increasingly values intellectual consideration and rationality - increasingly demands that we stop and focus. Modernity didn't create A.D.H.D. It revealed it.

  Sagaponack Homeowners Association vs. Ira Rennert

  Now that the strange case of the Sagaponack Homeowners Association vs. Ira Rennert appears to be concluded, it may be time to reflect on what has - and has not - been learned from this, the season's most engrossing episode of rich-on-rich violence. Rennert, as is now widely known, is the multimillionaire industrialist who is building a forty-two-thousand-square-foot single-family home, reported to cost a hundred million dollars, on a sixty-three-acre Hamptons potato field. The Sagaponack Homeowners Association is the opposing group of angry neighbors whose petition to withdraw Rennert's building permits was voted down earlier this month by the Town of Southampton's Zoning Appeals Board. The battle has been a long and heated one, about zoning laws and ocean views, and it has left those of us without homes in the Hamptons more than a little confused. Herewith, then, a brief guide for the uninitiated.

  Let's start with the least technical, but perhaps the philosophically thorniest, question raised by the case. Why, in an area full of very big houses, is the prospect of a very, very big house so controversial? The answer is that "big" is a relative term. A standard new monster home in the Hamptons, for example, now runs between ten thousand and fourteen thousand square feet, which is to say that it probably has six or seven large bedrooms (each with its own bathroom), a great hall, a formal living room, an informal living room, a dining room, a media room, a library, a five-to-six-hundred-square-foot kitchen, maids' quarters, a pantry, and, say, a three- or four-car garage. "Some houses have a squash court, and I've built a few with bowling alleys," Kurt Andreassen, a local contractor, said.

  When Southampton decided, this fall, to place a limit on the size of all new houses, it settled on twenty thousand square feet, on the ground that that figure represents a reasonable limit, given the big-house norms of the area. At twenty thousand square feet, a house has perhaps ten or eleven bedrooms, a dozen bathrooms, a six-car garage, and maybe, oh, a mini-trading floor for the kids. By comparison, Rennert's house, at forty-two thousand square feet, has twenty-nine bedrooms, thirty-three bathrooms, and two bowling alleys. What the Town of Southampton was saying, in other words, is that twelve bedrooms and one bowling alley is fine, but twenty-nine bedrooms and two bowling alleys is not. Think of the twenty-thousand figure as the community standard - a social consensus - for the maximum size a Hamptons monster home ought to be. With that extra bowling alley and those seventeen additional bedrooms, Rennert just went too far.

  Which brings us to question two: If twenty thousand square feet represents the Hamptons monster-home limit, then why didn't the Town of Southampton pass this zoning restriction long ago, before Rennert started building his dream house? You might think that it was because the town had been lax about zoning. But that is not the case. For example, the Sagaponack potato field where Rennert is building his house is zoned R-120, which means that the minimum lot size in the area for anyone looking to build a new house is three acres. There is also virtually no land on eastern Long Island zoned specifically for multifamily rental apartments. If you wanted to build an apartment house in the town of Southampton for people who are not millionaires, you would have to go before the town board and ask for a special variance. In Sagaponack, according to Paul Houlihan, Southampton's chief building inspector, such a variance would be impossible to get.

  Suppose, then, that Rennert was not a multimillionaire but only a millionaire and couldn't afford the standard Sagaponack three-acre plot - which runs somewhere between seven hundred and fifty thousand dollars and a million dollars. He wants one acre. And suppose that, to make things more affordable, he wants to put up a small town house with a couple of apartments that he could rent out. Under Southampton zoning rules, his application would be denied outright, even though that little town house wouldn't block anyone's view of the beach. But a forty-two-thousand-square-foot house on sixty-three acres? According to the original zoning laws of the town, perfectly legal. In other words, the citizens of Southampton have lots of rules in place to protect their community from people who have less money than they do. It just never occurred to them, until Rennert came their way, that they also need to protect their community from people who have more money than they do.

  Science and the Perils of a Parable

  In the movie "A Civil Action, " the families of eight leukemia victims accuse two major corporations of contaminating the drinking water of Woburn, Massachusetts. John Travolta's portrayal of the lawyer who argues their case has been justifiably praised by critics for its subtlety: he is neither a villain nor a hero but an uncomfortable and ambiguous combination of the two - a man of equal parts greed and idealism who is in the grip of a powerful obsession. Curiously, though, when it comes to the scientific premise of the story, "A Civil Action" (like Jonathan Harr's best-seller, on which it is based) permits no ambiguity at all. It is taken as a given that the chemical allegedly dumped, trichloroethylene (TCE), is a human carcinogen - even though, in point of fact, TCE is only a probable human carcinogen: tests have been made on animals, but no human-based data have tied it to cancer. It is also taken as a given that the particular carcinogenic properties of TCE were what resulted in the town's leukemia outbreak, even though the particular causes and origins of that form of cancer remain mysterious. The best that can be said is that there might be a link between TCE and disease. But the difference between what "might be" and what "is" - which in scientific circles is all the difference in the world - does not appear to amount to much among the rest of us. We know that human character can be complex and ambiguous. But we want science to conform to a special kind of narrative simplicity: to begin from obvious premises and proceed, tidily and expeditiously, to morally satisfying conclusions.

  Consider the strange saga of silicone breast implants. Almost seven years ago, the Food and Drug Administration placed a moratorium on most uses of silicone implants, because the devices had been inadequately tested and the agency wanted to give researchers time to gather new data on their safety. Certain that the data would indict implants in the end, personal-injury lawyers rounded up hundreds of thousands of women in a massive class-action suit. By 1994, four manufacturers of implants had been instructed to pay out the largest class-action settlement in history: $4.25 billion. And when that amount proved insufficient for all the plaintiffs, the largest of the defendants - Dow Corning - filed for Chapter 11, offering $3.2 billion last November to settle its part of the suit.

  Now, however, we actually have the evidence on implant safety. More than twenty studies have been completed, by institutions ranging from Harvard Medical School to the Mayo Clinic. The governments of Germany, Australia, and Britain have convened scientific panels. The American College of Rheumatology, the American Academy of Neurology, and the Council on Scientific Affairs of the American Medical Association have published reviews of the evidence, and last month, in a long-awaited decision, an independent scientific panel, appointed by a federal court, released its findings. All of the groups have reached the same conclusion: there is little or no reason to believe that breast implants cause disease of any kind. The author of the toxicological section of the federal court's panel concluded, "There is no evidence silicone breast implants precipitate novel immune responses or induce systemic inflammation," and the author of the immunology section of the same report stated, "Women with silicone breast implants do not display a silicone-induced systemic abnormality in the types or functions of cells of the immune system."

  There is some sense now that, with the unequivocal findings of the December report, the tide against implants may finally be turning. But that is small consolation. For almost seven years, at a cost of billions and in the face of some twenty-odd studies to the contrary, the courts and the public clung to a conclusion with no particular merit other than that it sounded as if it might be true. Here, after all, was a group of profit-driven multinationals putting gooey, leaky, largely untested patties of silicone into the chests of a million American women. In the narrative we have imposed on science, that act ought to have consequences, just as the contamination of groundwater by a profit-seeking multinational ought to cause leukemia. Our moral sense said so, and, apparently, that was enough. Of course, if science always made moral sense we would not need scientists. We could staff our laboratories with clergy.

  It may be hard to shed a tear for implant manufacturers like Dow Corning, even though their shareholders have been royally ransomed for no good reason. Those who sell drugs and medical devices must expect to be held hostage, from time to time, by the irrationalities of the legal system. The women in this country with breast implants do, however, deserve our compassion. They chose cosmetic surgery in order to feel better about themselves. For this, they were first accused of an unnatural vanity and then warned that they had placed themselves in physical peril, and the first charge informed the second until the imagined threat of silicone implants took on the force of moral judgment: they asked for it, these women. They should have been satisfied with what God gave them and stayed home to reread "The Beauty Myth." Well, they didn't ask for anything, and what they did with their bodies turns out to have no larger meaning at all. Science, tempting though it is to believe otherwise, is not in the business of punishing the politically retrograde, nor is it a means of serving retribution to the wicked and the irresponsible. In the end, one may find that the true health toll of breast implants was the seven years of needless anxiety suffered by implant wearers at the hands of all those lawyers and health "advocates" who were ostensibly acting on their behalf.

  We all know that underdogs can win - that's what the David versus Goliath legend tells us, and we've seen it with our own eyes. Or have we? In David and Goliath, Malcolm Gladwell, with his unparalleled ability to grasp connections others miss, uncovers the hidden rules that shape the balance between the weak and the mighty, the powerful and the dispossessed. Gladwell examines the battlefields of Northern Ireland and Vietnam, takes us into the minds of cancer researchers and civil rights leaders, and digs into the dynamics of successful and unsuccessful classrooms - all in an attempt to demonstrate how fundamentally we misunderstand the true meaning of advantages and disadvantages. When is a traumatic childhood a good thing? When does a disability leave someone better off? Do you really want your child to go to the best school he or she can get into? Why are the childhoods of people at the top of one profession after another marked by deprivation and struggle?

  Drawing upon psychology, history, science, business, and politics, David and Goliath is a beautifully written book about the mighty leverage of the unconventional. Millions of readers have been waiting for the next Malcolm Gladwell book. That wait is over.

  There is a story that is usually told about extremely successful people, a story that focuses on intelligence and ambition. Gladwell argues that the true story of success is very different, and that if we want to understand how some people thrive, we should spend more time looking around them - at such things as their family, their birthplace, or even their birth date. And in revealing that hidden logic, Gladwell presents a fascinating and provocative blueprint for making the most of human potential.

  In The Tipping Point Gladwell changed the way we understand the world. In Blink he changed the way we think about thinking. In Outliers he transforms the way we understand success.

  The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire. Just as a single sick person can start an epidemic of the flu, so too can a small but precisely targeted push cause a fashion trend, the popularity of a new product, or a drop in the crime rate. This widely acclaimed bestseller, in which Malcolm Gladwell explores and brilliantly illuminates the tipping point phenomenon, is already changing the way people throughout the world think about selling products and disseminating ideas.