Unlike any other empirical object in Nature, the mind's presence is immediately apparent to itself, but opaque to all external observers.
—George Makari, Soul Machine, 2015

My life, as well as this column, is dedicated to understanding the conscious mind and how it relates to the brain. This presupposes that you, the reader, and I have a precise sense of what is referred to by such seemingly innocent terms as “consciousness” and “mind.” And lest it be forgotten, the allied concept of “soul” (or spirit), banned from scientific discourse, remains profoundly meaningful to vast throngs of humankind here and abroad.

But there's the rub! Unlike such material objects as “egg,” “dog” or “brain,” this triptych of intangible concepts is a historical construct, endowed with a universe of religious, metaphysical, cultural and scientific meaning, as well as an array of underlying assumptions, some clearly articulated, others wholly ignored. These meanings adapt over time as society changes in response to wars and revolutions, catastrophes, trade and treaties, invention and discovery. Psychiatrist and historian George Makari tries to illuminate this historical evolution in his Soul Machine: The Invention of the Modern Mind, published last November by W. W. Norton. His intellectual history masterfully describes how consciousness, mind and soul are shape-shifters that philosophers, theologians, scholars, scientists and physicians seek to tame, by conceptualizing, defining, reifying, denying and redefining these terms through the ages to come to grips with the mystery that is our inner life.

A Brief History of the Soul

The systematic search for answers goes back to Aristotle (384–322 B.C.), foremost of all biologists, taxonomists, embryologists and evolutionists. His De Anima (literally On the Soul) classifies the nature of living things and discusses his notion of the soul (psyche), which for him means the essence of a thing. The soul defines an organism. All living things have souls with distinct faculties. The vegetative soul embodies the life force that distinguishes living matter, be it plants, animals or people, from inanimate matter, such as a rock. It supports nutrition, growth and reproduction. The sensitive soul enables sense perception, pain and pleasure, memory, imagination and motion. It is common to animals and to humans. Both the vegetative and the sensitive souls are corporeal and, therefore, mortal. It is the rational soul, unique to people, that is responsible for intellect, thought and reason. The rational soul constitutes the quiddity of what it is to be a human. For Aristotle, although the rational soul is immaterial, it cannot exist independent of the body. Famously, of course, Socrates and Plato differed with Aristotle, arguing for the immortality of the soul on the death of the body.

Dominican friar and Scholastic philosopher Thomas Aquinas (1225–1274) cast these classical Greek ideas into a form that meshed with Christian ones and would remain an important influence through the Middle Ages. A triumvirate of souls makes up every living human—a nutrient soul common to all organisms, a sensitive (or appetitive) soul characteristic of animals and people, and a rational soul that is immortal, a repository of humanity's godhood, lifting people above the natural, material world. The rational soul could not become sick, because it was immaterial, but it could be possessed by the Devil or some of his demonic servants. Doctors could not help those so afflicted, but ecclesiastical authority could and did—saving their immortal souls one way or another as attested to by the fiery death of tens of thousands of both female and male witches.

For close to four centuries, this Thomist philosophy was the dominant intellectual narrative for Christians, noblemen and peasants alike. It offered solace to the weary and the dying, and it justified the divine right and the absolute power of kings and queens. Yet decades of bloody religious warfare among Christians for the “one true faith” during the first half of the 17th century led to widespread questioning of these received truths.

Superstition—as exemplified in William Blake's The Witch of Endor—received withering critiques from Enlightenment philosophers, including René Descartes, Thomas Hobbes and John Locke.

This is the chronological starting point for Soul Machine—it follows the philosophers, savants, doctors, writers and revolutionaries of the English, Scottish, French and German Enlightenment as they transmogrified the rational soul over two centuries into a mechanized, naturalized and desacralized thing. This process gave birth to psychology, neurology and psychiatry and to the knowledge that we, children of the 21st century, evolved from apes.

All of this starts with the reclusive Frenchman René Descartes (1596–1650) and the radical and outspoken Englishman Thomas Hobbes (1588–1679). The former is one of the fathers of modern science (he linked algebra to geometry, thereby giving us the Cartesian coordinate system). Descartes replaced the moth-eaten final causes and forms of the Scholastics—wood burns because it possesses an inherent form that seeks to burn—by mechanistic ones. In particular, he argued that the movements and actions of animals and humans are caused by particles of various shapes that jostle one another and move about. Nothing more and nothing less.

Descartes postulated that everything under the sun is made out of one of two substances. The stuff that can be touched and that has spatial extension is res extensa; it includes the bodies and brains of animals and people. The stuff that cannot be seen, that does not have extension, is thinking stuff, res cogitans. It alone enables humans to reason, to speak and to freely decide. Descartes's dualism divided the world into two magisteria: a mechanistic one that was to be the playground of experimental philosophers, the precursors of modern scientists and clinicians, and a theological one, the dominion of the immaterial and immortal soul. Descartes thereby safeguarded Christian dogma and ecclesiastical authority.

This dichotomy won Descartes the enmity of Hobbes, who published his celebrated Leviathan, a bold materialistic manifesto, considered the foundation for Western political philosophy. For Hobbes, everything was made out of matter. There was no necessity for any special thinking substance. Matter could think. Even though the bulk of Leviathan was a book-length argument for absolute monarchy (rather than religious authority) to prevent the kind of religiously motivated bloodshed of the European Wars of Religion (circa 1524–1648), Hobbes was considered blasphemous, and his books were burned.

English doctor John Locke (1632–1704) further naturalized the rational soul in his Essay Concerning Human Understanding, written while in exile in Holland and first published in an abridged French edition. Locke's work helped to turn the soul into something closer to the modern mind (from the Old English mynde), the theater of our subjective experience. The mind is populated by ideas that ultimately derive from the outside, from sensations, for the mind at birth is an empty slate, a tabula rasa. The ideas of God, justice, mathematics and the self, as well as everyday objects, whether implements, machines, animals or people, are not innate. Rather they are learned by experience, by reflection and by association. How the mind could carry out these tasks was a mystery for Locke as it was for Descartes, Hobbes and everybody else. For how mere brain matter could think, reason or speak was inexplicable given the mechanics and chemistry of the day. Thus, Locke postulated that God had superadded active forces to brain matter.

Common to Descartes, Hobbes, Locke, Baruch Spinoza and other radical thinkers was a disdain for superstition. Makari cites an entry from Locke's journal: “The three great things that govern mankind are reason, passion, and superstition. The first governs a few, the two last share the bulk of mankind and possess them in their turns. But superstition most powerfully produces the greatest mischief.” Two centuries later, Fyodor Dostoyevsky's Grand Inquisitor understood this mind-set well: “the only three forces that are able to conquer and hold captive forever the conscience of these weak rebels for their own happiness ... are miracle, mystery and authority.” Today, another two centuries onward, humanity continues to battle these forces.

As the mind of the closing years of the 17th century had lost many of its heavenly attributes and had become a part of nature, it could now suffer the corruptions all matter is prey to; it could become dysfunctional, sick or afflicted with melancholia (a widespread ailment). Or it could be fallible and form misassociations that led to cognitive errors, explaining the rising tide of religious fanatics, enthusiasts and prophets: the Anabaptists, Methodists, Seekers, Quakers, and other self-avowed divine messengers who wandered the world, preaching their own interpretation of God and the Bible. Perhaps God was not speaking through them, but rather they were simply deluded. Likewise, perhaps witches were not truly possessed. Maybe they were simply ill, sick to their souls or crazy, and they should not be burned.

If people had unbalanced minds, could these be righted? Could they be cured? How so? By confining them to madhouses? What kind of therapies would work best? How can one tell a mad person from an eccentric? These questions captivated the United Kingdom when the bizarre behavior of King George III—the sovereign who lost the American colonies—triggered a political crisis concerning his sanity and whether and how it could be restored. Echoes of these controversies can be heard even today in the ongoing dispute concerning whom to blame for mass shootings—deranged individuals or gun ownership and cultural factors.

Ever so slowly, with countless setbacks, as the decades turned into a century and then two, religious explanations of idiosyncratic behaviors turned into clinical ones, with attendant mental asylums and specialist doctors to treat the afflicted, now considered neither evil nor touched by God but patients in need of help.

Makari rightfully spends many pages on Prussian astronomer and philosopher Immanuel Kant (1724–1804), who did more than anybody else to delimit and plumb what the mind can know and what reason can deduce about the world. With rapierlike precision, Kant argued that our mind can never penetrate to the true nature of things.

Of Spirits and the Profane

The book does an outstanding job of relating changing epistemological narratives to the politics of the day. Possessions and exorcisms provided visible proof of the reality of the spiritual world. If these were now profane matters, subject to medicine and reason, where did this leave the divine justifications for the absolute rights of monarchs?

The odd behaviors of King George III entranced all his subjects. The 1994 film The Madness of King George rendered an account of this period.

Soul Machine ends in the mid-19th century, with a portrayal of German physicians Franz Joseph Gall (1758–1828) and his assistant Johann Spurzheim (1776–1832). Based on systematic dissection of human and animal brains, Gall formulated a thoroughly materialistic, empirically based account of the brain as the sole organ of the mind, one that is not homogeneous but an aggregate of distinct parts and, as a consequence, distinct “functions.” Gall argued for 27 functions, each one assigned to different and distinct regions of the brain. Every individual inherits a separate set of organs, some smaller, some larger, thus explaining individual differences. These views of the brain as a machine for producing thought and memory clashed with religious sentiments and public morality to such an extent that Gall had to leave Vienna and settle in postrevolutionary Paris.

Using the detailed curvature, shape and extent of the skull, Gall and Spurzheim claimed to be able to infer the size and import of the organ underneath the cranium and thereby diagnose the mental character of the individual examined. Their phrenological method proved immensely popular, as it appealed to the growing middle class as scientific, sophisticated and modern. Phrenology was used to classify criminals, lunatics, the eminent and the (in)-famous. It eventually lost favor as a reputable scientific method but lingered on until the early 20th century.

Although there is no discernible relation between the morphology of the external skull and the size and function of the underlying neural tissue, Gall's insistence on localization for specific cognitive functions in the cerebral cortex found validation in 1861 through the work of Parisian neurologist Paul Broca. The physician presented the landmark case of a patient unable to speak except for the single word “tan.” His brain proved to have suffered damage to its left frontal lobe. Thus, Broca concluded that meaningful speech was closely related to this region. An analysis of a second patient fortified his conclusion that a circumscribed region in the frontal cortex—the left inferior frontal gyrus, named Broca's area—was responsible for productive speech, that most human of all behaviors.

Overall, Soul Machine is a monumental work, replete with reproductions of contemporary engravings, that describes in sometimes overwhelming detail the work of a large cast of individuals—and their influences on one another—over the course of several centuries.

It seems strange that Makari stops short of describing Charles Darwin's influence on the conception of the human mind as an evolutionary refinement, an extension of the minds of apes, monkeys and other animals, shaped by natural selection to fit a particular socioecological niche. That is, we have the cognitive apparatus that we have precisely because it enabled our proximal and distal ancestors to better survive the struggle for existence. Our genetic endowment profoundly shapes the way we apprehend the world. This inborn bias to see the world in a particular way—for example, for most of us in a combination of three colors—also irredeemably shapes our perception and ultimately our knowledge about the world. This echoes Kant's celebrated argument for the existence of knowledge that cannot be logically derived yet is prior to our experience (synthetic a priori proposition).

Descartes theorized that the pineal gland—denoted “H” in this illustration from his 1662 De Homine—was the “seat of the soul.”

My far bigger complaint with Soul Machine is the book's complete neglect of the dominant strand of modern thinking about the mind—the information-processing paradigm. In this narrative (dominant in academic psychology and neuroscience), the brain transforms incoming sensory information to yield an internal representation of the external world. In conjunction with emotional and cognitive states and both conscious and unconscious memories, the mind generates—or computes, as the cognoscenti would have it—an appropriate response and produces the associated motor behaviors. Think of the human body as a robot, with its brain as a neuromorphic computer. Thanks to Steve Jobs, Bill Gates, Paul Allen and the other visionary entrepreneurs who gifted us with personal computers, this is the view of the mind that prevails, one as familiar to us all as mother's milk.

Descartes's ideas were rooted in his inability to conceive of procedures and mechanisms to explain intelligence, reasoning and language. In the 17th century nobody could envision how the mind-less application of innumerable, meticulously detailed, step-by-step instructions, what we today refer to as algorithms, could get a computing machine to play chess or Go, recognize faces, label photographs and translate Web pages. Descartes had to appeal to a mysterious, ethereal substance that, in some nebulous manner, did the thinking and reasoning.

A mere couple of decades later the seed of the computational paradigm was laid down by German rationalist philosopher, scientist and polymath Gottfried Wilhelm Leibniz (1646–1716), who developed the binary number system and, in fierce competition with Isaac Newton, invented calculus. He was on a lifelong quest to develop a universal calculus, what he termed a “calculus ratiocinator,” in conjunction with a universal conceptual language. If he had been capable at the time of creating such a thing, it would have resembled either a proto–computer program (software) or a description of a powerful calculating machine (hardware). Leibniz was looking for ways to cast any dispute into a rigorous mathematical form that could then be evaluated for its truth. As he wrote:

The only way to rectify our reasonings is to make them as tangible as those of the Mathematicians, so that we can find our error at a glance, and when there are disputes among persons, we can simply say: Let us calculate, without further ado, to see who is right.

Leibniz was no mere theoretician but an all-around talent who designed and built an early general digital calculator. His dream of a calculus ratiocinator motivated logicians of the late 19th and early 20th centuries, culminating in the 1930s with work by Kurt Gödel, Alonzo Church and Alan Turing that gave us two things. First, their labors placed absolute and formal limits on what can be proved by mathematics, bringing to an end its ancient, aspirational dream of formalizing truth, of constructing a universal alethiometer, that is, a truth meter. Second, their work gave birth to the universal Turing machine, a dynamic model of how any mathematical procedure can be implemented and evaluated on a very simple machine.
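The universal Turing machine lends itself to a compact illustration. The sketch below is my own, not from the book: it simulates a trivial machine—incrementing a binary number—using nothing but a tape, a read/write head and a table of transition rules. The names `run_turing_machine` and `INCREMENT` are invented for this example.

```python
def run_turing_machine(table, tape, state, head=0, blank="_", max_steps=1000):
    """Execute transitions until the machine halts or max_steps is exceeded."""
    tape = dict(enumerate(tape))           # sparse tape: position -> symbol
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in table:   # no applicable rule: machine halts
            break
        new_symbol, move, state = table[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Example machine: binary increment. The head starts at the rightmost digit;
# trailing 1s become 0s (the carry), then the final 1 is written.
INCREMENT = {
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "done"),
    ("carry", "_"): ("1", "R", "done"),
}

print(run_turing_machine(INCREMENT, "1011", "carry", head=3))  # 1011 + 1 = 1100
```

Any algorithm—chess, face recognition, translation—reduces, in principle, to a larger table of exactly this kind of mind-less rule following.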

These conceptual breakthroughs fed two related but distinct streams of inquiry, with profound implications for our contemporary view of the mind. One strand ushered in neural networks and computational neuroscience, demonstrating how large networks of interconnected nodes can learn to recognize letters, faces or objects, navigate a complex environment, speak and reason. The second strand completely upended society and our way of life because it gave rise to digital computers, first in the shape of a few large university- or government-operated centers, then on millions of desks in offices, and now living in the pockets and hands of billions of people.
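The first strand—networks of simple nodes that learn from examples—can be conveyed with a toy. Below is a single perceptron (a minimal illustrative sketch, not drawn from the book; the names `train_perceptron` and `OR_GATE` are my own) that learns the logical OR function by error correction, an ancestor of the learning rules behind modern letter and face recognition.

```python
def train_perceptron(samples, epochs=20, lr=0.5):
    """Learn weights w and bias b so that step(w.x + b) matches each label."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            prediction = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = label - prediction      # classic perceptron learning rule
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Four training examples of the OR function: inputs and desired outputs.
OR_GATE = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(OR_GATE)
outputs = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in OR_GATE]
print(outputs)  # matches the OR labels: [0, 1, 1, 1]
```

Stack thousands of such nodes in layers and the same error-driven adjustment of connection strengths yields the networks that now recognize objects and transcribe speech.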

Polymath Gottfried Wilhelm Leibniz invented an early digital calculator at the end of the 17th century, a hand-operated machine capable of performing the basic arithmetical operations of addition, subtraction, multiplication and division.

Even more critical, computers gave rise to the idea and later the practice of artificial intelligence, the design of machine minds that, although narrowly specialized, are increasingly able to match and exceed what the human mind is capable of. What would Aristotle, Aquinas and Descartes have made of a Roomba, a popular disk-shaped household robot for cleaning floors, or of IBM's Watson, the computer program that understands and speaks English and that bested humans in the quiz show Jeopardy? Judged purely by their behaviors, one would have to credit these technologies with possessing both sensitive and rational souls, with partaking of res cogitans. Yet the extent to which digital computers can experience anything and can be conscious in the way that people are remains controversial, with at least one popular theory of consciousness denying it. (To go still further and achieve a naturalized immortality, some of the more enthusiastic techno utopians postulate a heaven in the appropriately located Cloud, to which our digital simulacrum will eventually be uploaded, provided we practice the right brain-freezing technique.)

Supernatural meaning has been leeched from the modern conception of the computational mind by the acid bath of the Enlightenment. No brain, never mind! Yet by no means has our understanding of the interbraided leitmotifs of Soul Machine—consciousness, mind and soul—reached its final form. It will continue to evolve as scientists, clinicians and philosophers, newly joined by engineers, seek an ever more precise carving of nature at its joints, to use a beautiful Platonic idiom.

Soul Machine is an eminently readable account of how these concepts are shaped and determined by historical and cultural contingency in ways that science usually chooses to ignore.