“Speak for Yourself,” by Ferris Jabr, is astounding in its implications. Helen Keller said she didn't have self-awareness or consciousness before she met her teacher and learned language. An aphasic stroke victim who loses inner dialogue loses the ability to structure thoughts in terms of past and future and so can exist only in the present.

Is it possible that the evolution of a primitive language predated the expansion of the human brain and our rise to lofty heights of intelligence? What if we hit a tipping point of rudimentary language that allowed us, as with Helen Keller, to become self-aware for the first time and begin to structure our thoughts and world in a more complex way, with past and future, layered emotion, and a capacity for structured planning?

And have we ourselves already done this experiment unwittingly? We have taught gorillas to use sign language so they could tell us how they think. But what if the act of teaching them a language fundamentally changed the way that they actually thought? What if we then saw not the intelligence of the average gorilla but rather a gorilla that was endowed with a heightened ability to reason beyond that of any other gorilla that ever existed? What an amazing example of the observer effect!

Jeremy Fox

JABR REPLIES: Fox has some fascinating ideas. Although human self-awareness seems to depend on verbal thought, that does not mean other animals construct self-awareness in the same way. No one definitively knows the difference between self-awareness and consciousness because consciousness is still such an ill-defined concept, but one could imagine a person or creature that is conscious—aware of its surroundings—yet does not understand that it has a “self.” It is possible that the young Helen Keller, stroke patients who have lost all inner speech, and even newborns fall into this category: certainly conscious, even without language, but not self-aware.


Here we go again. In “Taking Early Aim at Autism,” by Luciana Gravotta, I am being told before I even start to read the article that I have a “developmental disorder” and “deficits that will become debilitating.” The only trouble is, I am almost 89 years old, and when I was an infant no one told my mother about my “condition.” But my mother was a wise woman who managed to steer me through those awkward years when I did things like saying in a loud voice, “That lady is fat.” She allowed me to take an alternative route when pneumatic drills on the road terrified me. She worked out ways of managing my various behaviors, remarkably similar to those strategies now advised by experts working with children with autism.

I eventually managed to qualify as a medical practitioner and worked at my trade for nearly 40 years, still blissfully unaware of my “disorder.” In fact, I had already been retired from active practice for 10 years before the truth dawned on me. I did indeed suffer from several physical ailments that are now known to be sometimes associated with autism, and it was these rather than the psychiatric symptoms that led to my failure to keep practicing beyond the usual retirement age.

Current trends make me question whether the neurotypical majority is correct in labeling us as abnormal and whether perhaps many of the more severe grades of autism might actually be caused by the treatment we receive at the hands of that majority. Are you quite sure we are in fact abnormal? Could it just be that we are different?

J. Michael Hayman
Laurieton, Australia


I read with interest “Just Say No?” by Scott O. Lilienfeld and Hal Arkowitz [Facts and Fictions in Mental Health]. After reading it, I couldn't help but wonder what value there was in writing such an article. Your criticisms were of a D.A.R.E. program that is no longer being administered. I'm sure you were unaware of that fact; otherwise I'm confident that you would have mentioned that the D.A.R.E. program that exists today (D.A.R.E. keepin' it REAL) is indeed evidence-based. It has been evaluated to be a good use of police resources and is very beneficial to kids.

In the interest of fairness to the story, you were also remiss in not mentioning that the previous D.A.R.E. program did include the peer role playing that you claimed it lacked.

I've been a D.A.R.E. officer for 15 years and have never taught a version of the program in which the message is simply “just say no.” D.A.R.E. has always given students a variety of tools to help them make safe and responsible choices, not only about substance use but in almost any circumstance. Perhaps even more valuable, though seldom acknowledged, is that the program affords an opportunity for the police to build healthy and mutually beneficial relationships with schools, kids and their families. To suggest that D.A.R.E. “doesn't work” indicates that your expectation was that the program should act as a “vaccination” against drug use. Although it would be great if such a thing actually existed, the fact remains that it doesn't.

It would be nice to see some support for the effort and not have reputable publications such as yours use outdated and misguided data as a means of promoting damaging misinformation.

Scott Hilderley
RCMP Drugs and Organized Crime
Awareness Service
Victoria, B.C.

LILIENFELD AND ARKOWITZ REPLY: Hilderley's assertions to the contrary, the older and ineffective D.A.R.E. program is still administered widely in school districts in the U.S. and other countries. As we also noted in our article, “the good news is that some proponents of D.A.R.E. are now heeding the negative research findings and incorporating potentially effective elements, such as role playing with peers, into the intervention.” In fact, we did note that traditional D.A.R.E. programs sometimes afford opportunities for peer rehearsal but that these opportunities are insufficient. We also pointed out that there is reason for cautious optimism regarding revised D.A.R.E. programs.

Nevertheless, we do not share Hilderley's conclusion that the revised D.A.R.E. program is “evidence-based.” As Renee Singh of the University of California, Santa Barbara, and her co-authors observed in a 2011 review, preliminary evidence suggests that this new program may exert promising effects on attitudes toward substance abuse and substance-refusal skills, but “empirical evidence to date does not provide compelling evidence of effectiveness” for this intervention.


As causes of violence, heat and cold are both stressors, as Ajai Raj and Andrea Anderson detailed in “Heat-Fueled Rage” and “Cold Confusion” [Head Lines]. The prefrontal cortex (PFC) is the part of the brain that makes us do the hard thing or, conversely, inhibits us from doing the easy thing. When there is an excess of stress, whether mental or physical, it is more difficult for the PFC to intervene and prevent automatic behavior. It is reasonable that thermal stressors would have a similar effect.

The relative lack of violence in cold weather is probably related to the fact that it is easier to stay warm (and therefore mitigate stress) by remaining indoors. It would be interesting to know whether domestic violence increases in cold weather, when the opportunity to take stress out on strangers is lessened.

Pat King
Hot Springs, Ark.


In “Heat-Fueled Rage,” by Ajai Raj [Head Lines], the study by Solomon Hsiang and his colleagues is mistakenly cited as having appeared in Nature. The paper appeared in Science on September 13, 2013.