(This article originally appeared in the AMA Journal of Ethics, April 2015, Vol. 17, No. 4, pp. 348-352.)
In the basement of the Bureau International des Poids et Mesures (BIPM) headquarters in Sèvres, France, a suburb of Paris, there lies a piece of metal that has been secured since 1889 in an environmentally controlled chamber under three bell jars. It represents the world standard for the kilogram, and all other kilogram measurements around the world must be compared and calibrated to this one prototype. There is no such standard for the human brain. Search as you might, there is no brain that has been pickled in a jar in the basement of the Smithsonian Institution or the National Institutes of Health or anywhere else in the world that represents the standard to which all other human brains must be compared. Given that this is the case, how do we decide whether any individual human brain or mind is abnormal or normal? To be sure, psychiatrists have their diagnostic manuals. But when it comes to mental disorders, including autism, dyslexia, attention deficit hyperactivity disorder, intellectual disabilities, and even emotional and behavioral disorders, there appears to be substantial uncertainty concerning when a neurologically based human behavior crosses the critical threshold from normal human variation to pathology.
A major cause of this ambiguity is the emergence over the past two decades of studies suggesting that many disorders of the brain or mind bring with them strengths as well as weaknesses. People diagnosed with autism spectrum disorder (ASD), for example, appear to have strengths related to working with systems (e.g., computer languages, mathematical systems, machines) and in experiments are better than control subjects at identifying tiny details in complex patterns [1]. They also score significantly higher on the nonverbal Raven’s Matrices intelligence test than on the verbal Wechsler Scales [2]. A practical outcome of this new recognition of ASD-related strengths is that technology companies have been aggressively recruiting people with ASD for occupations that involve systemizing tasks such as writing computer manuals, managing databases, and searching for bugs in computer code [3].
Valued traits have also been identified in people with other mental disorders. People with dyslexia have been found to possess global visual-spatial abilities, including the capacity to identify “impossible objects” (of the kind popularized by M. C. Escher) [4], process low-definition or blurred visual scenes [5], and perceive peripheral or diffused visual information more quickly and efficiently than participants without dyslexia [6]. Such visual-spatial gifts may be advantageous in jobs requiring three-dimensional thinking such as astrophysics, molecular biology, genetics, engineering, and computer graphics [7, 8]. In the field of intellectual disabilities, studies have noted heightened musical abilities in people with Williams syndrome, the warmth and friendliness of individuals with Down syndrome, and the nurturing behaviors of persons with Prader-Willi syndrome [9, 10]. Finally, researchers have observed that subjects with attention deficit hyperactivity disorder (ADHD) and bipolar disorder display greater levels of novelty-seeking and creativity than matched controls [11-13].
Such strengths may suggest an evolutionary explanation for why these disorders are still in the gene pool. A growing number of scientists are suggesting that psychopathologies may have conferred specific evolutionary advantages in the past as well as in the present [14]. The systemizing abilities of individuals with autism spectrum disorder might have been highly adaptive for the survival of prehistoric humans. As Temple Grandin, an activist who herself has autism, surmised: “Some guy with high-functioning Asperger’s developed the first stone spear; it wasn’t developed by the social ones yakking around the campfire” [15].
Similarly, the three-dimensional thinking seen in some people with dyslexia may have been highly adaptive in preliterate cultures for designing tools, plotting out hunting routes, and constructing shelters, and would not have been regarded as a barrier to learning [16]. The key symptoms of ADHD, including hyperactivity, distractibility, and impulsivity, would have been adaptive traits in hunting and gathering societies in which people who were peripatetic in their search for food, quick in their response to environmental stimuli, and deft in moving toward or away from potential prey would have thrived [17]. There might also have been evolutionary advantages in prehistoric times for people with mania, since high energy and creative expression might have fueled sexual and reproductive success [18].
The cumulative effect of these studies suggests that a more judicious approach to treating mental disorders would be to replace a “disability” or “illness” paradigm with a “diversity” perspective that takes into account both strengths and weaknesses and the idea that variation can be positive in and of itself. To this end, a new term has arisen within the autism rights community: neurodiversity. Although the origin of the neurodiversity movement is often traced back to a speech entitled “Don’t Mourn for Us,” given by autism activist Jim Sinclair at the 1993 International Conference on Autism in Toronto [19], the word itself was first used by autism rights advocate Judy Singer and New York journalist Harvey Blume to articulate the needs of people with autism who did not want to be defined by a disability label but wished to be seen instead as neurologically different [20, 21]. Since that time, the use of the term has continued to grow beyond the autism rights movement to fields such as disability studies, special education, higher education, business, counseling, and medicine [22-27].
Embracing the concept of neurodiversity would bring the study of mental health disorders in line with movements that have already taken place over the past 50 years around biodiversity and cultural diversity [28, 29]. As Harvey Blume noted, “Neurodiversity may be every bit as crucial for the human race as biodiversity is for life in general. Who can say what form of wiring will prove best at any given moment?” How absurd it would be to label a calla lily as having “petal deficit disorder” or to diagnose a person from Holland as suffering from “altitude deprivation syndrome.” There is no normal flower or culture. Similarly, we ought to accept the fact that there is no normal brain or mind.
Thomas Armstrong, PhD, is the executive director of the American Institute for Learning and Human Development in Cloverdale, California. He is the author of 19 books, including The Power of Neurodiversity: Unleashing the Advantages of Your Differently Wired Brain (published in hardcover as Neurodiversity) (Da Capo Press, 2011) and Neurodiversity in the Classroom: Strength-Based Strategies to Help Students with Special Needs Succeed in School and Life (ASCD, 2012). His books have been translated into 28 languages, and he has lectured on learning and human development themes in 44 states and 29 countries over the past 32 years.