The Animal in Us All

Animals in Translation: Using the Mysteries of Autism to Decode Animal Behavior
by Temple Grandin and Catherine Johnson. Scribner (Simon & Schuster), 2005 ($25)

Temple Grandin has been known to crawl through slaughterhouses to get a sense of what the animals there are experiencing. An autistic woman who as a child was recommended for institutionalization, Grandin has managed not only to enter society's mainstream but ultimately to become prominent in animal research. An associate professor at Colorado State University, she designs facilities used worldwide for humane handling of livestock. She also invented a "hug machine" (based on a cattle-holding chute) that calms autistic children.

In Animals in Translation, co-authored with science writer Catherine Johnson, Grandin makes an intriguing argument that, psychologically, animals and autistic people have a great deal in common--and that both have mental abilities typically underestimated by normal people. The book is a valuable, if speculative, contribution to the discussion of both autism and animal intelligence, two subjects on which there is little scientific consensus.

Autistics, in Grandin's view, represent a "way station" between average people, with all their verbal and conceptual abilities, and animals. In touring animal facilities, Grandin often spots details--a rattling chain, say, or a fluttering piece of cloth--that disturb the animals but have been overlooked by the people in charge. She also draws on psychological studies to show how oblivious humans can be to their surroundings. Ordinary humans seem to be less detail-oriented than animals and autistics.

Grandin argues that animals have formidable cognitive capabilities, albeit specialized ones, whereas humans are cognitive generalists. Dogs are smell experts, birds are migration specialists, and so on. In her view, some animals have a form of genius--much as autistic savants can perform feats of memory and calculation far beyond the abilities of average people. Some dogs, for example, can predict when their owner is about to have a seizure.

Delving into animal emotion, aggression and suffering, Grandin gives tips that may be useful for caretakers of pets and farm animals. She also notes that humans seem to need, and thrive on, the proximity of animals. Indeed, she states provocatively, in the process of becoming human we gave up something primal, and being around animals helps us get a measure of that back. --Kenneth Silber

Killer Education

Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter
by Steven Johnson. Riverhead Books, 2005 ($23.95)

I am not a big fan of video games. Having watched friends devote weeks to slaughtering aliens in Halo, I have decided that time spent in virtual worlds is time wasted. It is just this kind of thinking that Steven Johnson tries to counter in Everything Bad Is Good for You.

A best-selling science writer who often tackles neuroscientific issues, Johnson argues against the presumption that popular media undermines our intellect. He claims that video games, television and movies are more complex than ever, to the benefit of viewers' cognitive skills. Whether we are mastering the intricacies of the simulation game SimCity or tracking the multiple plotlines in the TV drama 24, we are "honing ... mental skills that are just as important as the ones exercised by reading books," Johnson writes.

The learning does not come from content but from form, Johnson says. Video games, for example, enhance our problem-solving and decision-making skills as we test the limits of a game's logic; the aliens we are blasting are secondary. After making similar arguments for television, film and the Internet, he proposes that this increasingly challenging media environment may help explain the long-term upward trend in IQ scores known as the Flynn effect.

Unfortunately, Johnson uses only a modicum of neuroscience to back up his thesis. Elsewhere, and in the absence of footnotes, his arguments lack rigor. It may be true, as he writes, that a child's zombielike stare at the TV set is a sign of focus, but that positive spin pales beside a large body of research linking young children's excessive television viewing with attention, learning and social problems during childhood and the teen years.

Johnson also addresses video-game violence with more opinion than science. Even though he maintains that content does not matter, he often underplays the violent objectives of popular games. I am not convinced that the cognitive skills derived from building a virtual city equal those gleaned from shooting cops and innocent bystanders. In the end, Johnson has persuaded me that perhaps some of what is bad is good, but certainly not everything. --Aimee Cunningham

Older but Wiser

The Wisdom Paradox: How Your Mind Can Grow Stronger as Your Brain Grows Older
by Elkhonon Goldberg. Gotham Books, 2005 ($26)

The prospect of cognitive decline and dementia is among the most frightening aspects of aging. But according to New York University neuropsychologist Elkhonon Goldberg, brains get better in key respects as they get older. Moreover, he argues in The Wisdom Paradox, people can do much to ward off the debilities associated with aging.

The brain's capacity for pattern recognition is central to Goldberg's premise. Moving through middle age and beyond, the brain develops a vast store of "generic memories"--knowledge of the shared patterns in events or things. This reservoir gives older people an improved ability to size up situations and solve problems without going through the step-by-step assessments a younger person might need.

Such pattern recognition underlies competence and expertise and can compensate for age-related declines in attention or memory. Pattern recognition can even amount to "wisdom"--basically, knowing what to do. The author cites various elderly achievers to demonstrate that mental vigor can persist late in life. He notes that sculptor Eduardo Chillida retained formidable abilities even as his Alzheimer's disease progressed.

Delving into the relevant neurobiology, Goldberg points to a growing body of evidence that the brain's left hemisphere is oriented toward familiar patterns, whereas the right hemisphere focuses on novelty. He argues that this dichotomy matters more than nuts-and-bolts partitions, such as the left hemisphere handling language while the right handles spatial reasoning. Because accumulated experience turns novelty into familiar patterns, the left hemisphere becomes increasingly important over a person's lifetime.

Moreover, the brain is shaped by how it is used. For instance, musicians who practice consistently develop a larger Heschl's gyrus, an area involved in processing sound. And contrary to onetime scientific belief, the brain forms new neurons throughout adulthood.

Through such observations, Goldberg emphasizes the importance of maintaining an active mind as a defense against mental decline. Though the idea is not new, Goldberg impressively fits it into a wide-ranging picture of the aging brain. He speculates, for example, that art serves a central societal function in boosting mental acumen. He also outlines a "cognitive exercise program" he runs in which participants engage in computer-based exercises. The discussion would have benefited from suggestions of exercises readers might try at home.

Altogether, The Wisdom Paradox makes a compelling case for the possibility of maintaining a sharp mind far into old age. The book merits attention from the old and not so old alike. --Kenneth Silber

Ice-Pick Therapy

The Lobotomist: A Maverick Medical Genius and His Tragic Quest to Rid the World of Mental Illness
by Jack El-Hai. John Wiley & Sons, 2005 ($27.95)

Few words conjure up more gruesome connotations than "lobotomy"--surgically severing connections in the brain's frontal lobes in an attempt to relieve intractable psychiatric symptoms. And yet these operations--first performed in the U.S. in 1936 by psychiatrist and neurologist Walter Jackson Freeman and neurosurgeon James Winston Watts--continued for more than 40 years. In that time, Freeman, the procedure's champion, operated on the brains of 3,500 people.

Biographer Jack El-Hai chronicles lobotomy's reign through Freeman's quest to treat mental illness surgically. The tale follows this son and grandson of prominent physicians from his youth in Philadelphia during the early 1900s through his rise to national prominence and his eventual fall. Freeman emerges not merely as a maniacal devotee of radical "psychosurgery" but as an earnest advocate of potential treatments for otherwise intractable mental illness. Most of Freeman's work took place when state psychiatric hospitals overflowed with seemingly untreatable patients, many of whom suffered relentlessly. Effective psychiatric medications were not yet available, and lobotomy became a measure of last resort. El-Hai describes the experiments that transformed the complicated prefrontal lobotomy into the simpler transorbital lobotomy, nearly an outpatient procedure in which a physician entered the patient's brain with an ice-picklike tool driven through the thin bone at the top of the eye socket. A skilled practitioner could perform a transorbital lobotomy in minutes.

Surprisingly, many of Freeman's lobotomies were reported as successful, not only by Freeman but also by some patients and their families, who sent hundreds of letters expressing gratitude. Of course, many surgeries failed; Rosemary Kennedy, a sister of President John F. Kennedy who suffered from "agitated depression," was left "inert and unable to speak more than a few words," as El-Hai says, and was ultimately institutionalized. In 1950 Freeman and Watts reported that of 711 lobotomies they had performed, "45 percent yielded good results, 33 percent produced fair results, and 19 percent left the patient unimproved or worse off." Not surprisingly, many patients remained confused, disconnected, listless and plagued by complications such as seizures. As effective psychiatric drugs spread, lobotomies dwindled; physicians halted them altogether during the 1970s.

The tale of lobotomy's rise and fall entails far more than one man's quest to spearhead a dubious surgical method. It is a story of desperation among thousands of patients, families, clinicians and policymakers struggling to manage a population seemingly crippled by illnesses for which there was no help. It is also a worrisome account of physicians groping for solutions to problems that they could not adequately address. In this sense, El-Hai's treatment of this medical saga is also poignant and illuminating. --Richard Lipkin