Brainy Benefits: The Dyslexic Advantage: Unlocking the Hidden Potential of the Dyslexic Brain
by Brock L. Eide and Fernette F. Eide.
Hudson Street Press, 2011 ($25.95)
Perhaps the most challenging part of being dyslexic is the misconception that it makes people unintelligent or slow. In response, Brock and Fernette Eide have delivered a compelling call to action in their new book The Dyslexic Advantage: it is time to stop classifying dyslexia as a learning disability and start appreciating that different brain-wiring patterns allow people to process information in unique ways. When it comes to learning, they argue, there is no good or bad, right or wrong, only a difference in style, which should be fostered rather than corrected.
Although people with dyslexia may struggle with the fine-processing skills of reading and writing, often unintentionally interchanging letters and words, they can excel at “big picture” thinking. People with dyslexia frequently prefer thinking in narrative form, a proclivity that makes them natural storytellers, and they tend to have exceptional spatial navigation skills, visualizing 3-D structures with ease.
The Eides present functional MRI studies to illustrate what is different about the dyslexic brain. For instance, imaging shows that when people with dyslexia read, the right side of their brain dominates, which might help them absorb bigger themes in a text. They also exhibit deficits, however, in parts of the left hemisphere associated with reading, writing and understanding symbols. The nondyslexic brain splits the task more evenly between hemispheres.
The authors interweave case studies from their own psychological practice with current research on dyslexia. They also highlight a few of the world's dyslexic elite, such as acclaimed novelist Anne Rice and entrepreneur Richard Branson, both of whom struggled with traditional schooling before using their unique skills to thrive. Although it would be easy to assume that Rice and Branson flourished because they triumphed over their disability, the Eides contend that they succeeded because of their condition. Being dyslexic allowed them to break from conventional ways of thinking to dream of fantastic new worlds and create alternative solutions to vexing problems.
Despite offering a fresh perspective on dyslexia, the Eides agree with traditional psychologists on the need to intervene at an early age. But unlike their contemporaries, the authors are looking not to fix perceived weaknesses but rather to foster the individual strengths each child displays.
In 1972 Thomas Eagleton was chosen to run as the Democratic vice-presidential nominee under George McGovern in the race against Richard Nixon. But it soon emerged that Eagleton suffered from depression and had received shock treatment for it. A scandal erupted, and Eagleton stepped down, casting a cloud that still hovers over politics today.
Psychiatrist Nassir Ghaemi thinks the public is mistaken in wanting leaders who appear sane and mentally healthy. In A First-Rate Madness, he proposes that Eagleton may have actually been the best candidate to deal with a national crisis because of, not in spite of, his depression.
The crux of Ghaemi's argument is that people who are depressed exhibit what psychologists have dubbed “depressive realism”—an all too accurate view of the world. Since the 1970s, when the concept of depressive realism first surfaced, some studies have suggested that people who are mentally healthy actually have overly optimistic ideas about their place in the world.
Being depressed, on the other hand, can give people keener powers of perception and heightened abilities to assess complex or tumultuous situations. In fact, various studies have shown that being bipolar can make people more creative, resilient and in tune with their environment.
Ghaemi details “case studies” wherein he examines respected political figures—such as Abraham Lincoln, Winston Churchill and John F. Kennedy—who lived with depression or mania, or both, and argues that these qualities enhanced their leadership skills. Conversely, he asserts that leaders considered mentally healthy do well during times of peace and prosperity but falter during crises because they lack the practicality or creative thinking skills that leaders with mental disorders often exhibit. Ghaemi offers an anecdote in support of his point: the sane British prime minister Neville Chamberlain thought Adolf Hitler was someone who could be reasoned with, but Churchill saw from the beginning that the strategy would never work.
On the surface, the theory may seem counterintuitive. But Ghaemi provides exhaustive research and makes a compelling case for his point, which is perhaps best summed up by an aphorism from Martin Luther King, Jr.: “Human salvation lies in the hands of the creatively maladjusted.”
Although the Internet has redefined how we access information, many schools and employers still expect their students and staff to behave just as they did 100 years ago, working rigid hours and performing assembly line–like tasks. But digital games, social media and virtual environments are rewarding our brains differently, forging new ways to learn and do business.
In her new book Now You See It, Cathy N. Davidson—a self-identified “student of the Internet”—uses infant language learning to argue that our attention is strongly guided by experience and culture. Eastern and Western babies, for example, differ vastly in the phonemes they recognize at an early age. They each learn to pay attention to distinct sounds, those that elicit a reaction or a reward from their caretakers.
Davidson argues that the Internet has likewise altered where we focus our attention. Boundaries once drawn by physical distance, language or expertise can now be bridged with a backlit screen and a few mouse clicks. Through a series of anecdotes, she asserts that the true trailblazers of this shifting landscape, from small-town teachers to key players in giant corporations, are those melding skills needed online with those that serve both the classroom and the workplace. It is impossible to pay attention to everything at once, but by collaborating—sharing links on our favorite social media sites or working together in a multiplayer role-playing game—we learn how powerful the wisdom of the group can be.
Although the book provides glimpses of the brain's inner workings, Now You See It is not for those readers seeking the latest insight into the neuroscience of learning or attention. In fact, most of Davidson's explanations are oversimplified. But dismissing the book on those grounds alone would be shortsighted.
The book's purpose and strength are in detailing the important lessons we can glean from the online world. Rather than focusing on how games such as World of Warcraft or the social-networking services of Twitter and Facebook change our brains, Davidson believes we should foster these newfound skills, building curricula around interactive multiplayer games and training workers using virtual environments.
If Davidson is right, 21st-century society will move away from categorizing people based on standardized tests, which are crude measures of intelligence at best. Instead we will define new metrics, ones that are better aligned with the skills needed to succeed in the shifting global marketplace. And those who cannot embrace this multidisciplinary world will simply be left behind.
As a research psychologist, when I see a book that claims to reveal “everything you need to know about the mind,” I keep my hopes up and my expectations low.
This new book, edited by legendary literary agent John Brockman, dashed most of my hopes. It contains conversations with 16 prominent neuroscientists, biologists and psychologists, but only one is female—a clue about one of the book's flaws, namely, that much of its content is obsolete.
In the first chapter, for example, in a chat between Brockman and psychologist Steven Pinker of Harvard University, Pinker complains about theories of mind that are “decades out-of-date” while advancing an information-processing theory of the brain that is also out-of-date. This gaffe can be explained, perhaps, by the fact that the interview took place in 1997. Since that time, great strides have been made in neuroscience, which has gradually been coming to grips with the fact that the brain works nothing like a computer, contrary to Pinker's assertions. Fully half the interviews in this book took place in the 1990s—a serious problem when one is looking at one of the fastest-moving fields in all the sciences.
Also troubling, every chapter has long been available in unedited form on Brockman's Web site, Edge.org, created to be a forum where outstanding scholars and scientists could interact. As one might expect, the experts featured in the book are often talking to one another, not to the general public.
The biggest problem with the book, however, has to do with the diversity of topics it tackles. Eight of the book's 18 chapters say nothing about either brain or mind, focusing instead on topics such as birth order, morality and even protozoan parasites. How these various forays are related to either mind or each other is unclear.
These negatives notwithstanding, if you want a quick introduction to some of the smartest and most interesting thinkers around—Stanislas Dehaene of the Collège de France in Paris, Vilayanur S. Ramachandran of the University of California, San Diego, Steven Rose of the Open University in Milton Keynes, England, and others—read this book or simply click on Edge.org.