We wake up to time, courtesy of an alarm clock, and go through a day run by time—the meeting, the visitors, the conference call, the luncheon are all set to begin at a particular hour. We can coordinate our own activities with those of others because we all implicitly agree to follow a single system for measuring time, one based on the inexorable rise and fall of daylight. In the course of evolution, humans have developed a biological clock set to this alternating rhythm of light and dark. This clock, located in the brain's hypothalamus, governs what I call body time [see “Times of Our Lives,” by Karen Wright].
But there is another kind of time altogether. “Mind time” has to do with how we experience the passage of time and how we organize chronology. Despite the steady tick of the clock, duration can seem fast or slow, short or long. And this variability can happen on different scales, from decades, seasons, weeks and hours, down to the tiniest intervals of music—the span of a note or the moment of silence between two notes. We also place events in time, deciding when they occurred, in which order and on what scale, whether that of a lifetime or of a few seconds.
How mind time relates to the biological clock of body time is unknown. It is also not clear whether mind time depends on a single timekeeping device or if our experiences of duration and temporal order rely primarily, or even exclusively, on information processing. If the latter alternative proves to be true, mind time must be determined by the attention we give to events and the emotions we feel when they occur. It must also be influenced by the manner in which we record those events and the inferences we make as we perceive and recall them.
Time and Memory
I was first drawn to the problems of time processing through my work with neurological patients. People who sustain damage to regions of the brain involved in learning and recalling new facts develop major disturbances in their ability to place past events in the correct epoch and sequence. Moreover, these amnesiacs lose the ability to estimate the passage of time accurately at the scale of hours, months, years and decades. Their biological clock, on the other hand, often remains intact, as do their abilities to sense brief durations lasting a minute or less and to order them properly. At the very least, the experiences of these patients suggest that the processing of time and certain types of memory must share some common neurological pathways.
The association between amnesia and time can be seen most dramatically in cases of permanent brain damage to the hippocampus, a region of the brain important to memory, and to the nearby temporal lobe, the region through which the hippocampus holds a two-way communication with the rest of the cerebral cortex. Damage to the hippocampus prevents the creation of new factual memories. The ability to form memories is an indispensable part of the construction of a sense of our own chronology. We build our time line event by event, and we connect personal happenings to those that occur around us. When the hippocampus is impaired, patients become unable to hold factual memories for longer than about one minute. Patients so afflicted are said to have anterograde amnesia.
Intriguingly, the memories that the hippocampus helps to create are not stored in it. They are distributed in neural networks located in parts of the cerebral cortex (including the temporal lobe) related to the material being recorded: areas dedicated to visual impressions, sounds, tactile information, and so forth. These networks must be activated to both lay down and recall a memory; when they are destroyed, patients cannot recover long-term memories, a condition known as retrograde amnesia. The memories most markedly lost in retrograde amnesia are precisely those that bear a time stamp: recollections of unique events that happened in a particular context on a particular occasion. For instance, the memory of one's wedding bears a time stamp. A different but related kind of recollection—say, that of the concept of marriage—carries no such date with it. The temporal lobe cortex that surrounds the hippocampus is critical for making and recalling such memories.
In patients who sustain damage to the temporal lobe cortex, years and even decades of autobiographical memory can be expunged irrevocably. Viral encephalitis, stroke and Alzheimer's disease are among the neurological insults responsible for the most profound impairments.
For one such patient, whom my colleagues and I studied for 25 years, the time gap went almost all the way to the cradle. When this patient was 46, he sustained damage both to the hippocampus and to parts of the temporal lobe. Accordingly, he had both anterograde and retrograde amnesia: he could not form new factual memories, and he could not recall old ones. The patient inhabited a permanent present, unable to remember what happened a minute earlier or 20 years before.
Indeed, he had no sense of time at all. He could not tell us the date, and when asked to guess, his responses were wild—as disparate as 1942 and 2013. He could estimate the time of day more accurately if he had access to a window, approximating it from the light and shadows. But if he was deprived of a watch or a window, morning was no different from afternoon, and night was no different from day; the clock of body time was of no help. This patient could not state his age, either. He could guess, but the guess tended to be wrong.
Two of the few specific things he knew for certain were that he was married and that he was the father of two children. But when did he get married? He could not say. When were the children born? He did not know. He could not place himself in the time line of his family life. He had indeed been married, but his wife had divorced him more than two decades earlier. His children had long since married and had children of their own.
How the brain assigns an event to a specific time and then puts that event in a chronological sequence—or in the case of my patient, fails to do so—is still a mystery. We know only that both the memory of facts and the memory of spatial and temporal relations between those facts are involved. Accordingly, when I was at the University of Iowa, my colleagues Daniel Tranel and Robert Jones and I decided to investigate how an autobiographical time line is established. By looking at people with different kinds of memory impairment, we hoped to identify what region or regions of the brain are required to place memories in the correct epoch.
We selected four groups of participants, 20 people in total. The first group consisted of patients with amnesia caused by damage in the temporal lobe. Patients with amnesia caused by damage in the basal forebrain, another area relevant for memory, made up the second set. The third group was composed of patients without amnesia who had damage in places other than the temporal lobe or basal forebrain. We chose as control subjects individuals without neurological disease who had normal memories and who were matched to the patients in terms of age and level of education.
Every participant completed a detailed questionnaire about key events in their life. We asked them about parents, siblings and various relatives, schools, friendships and professional activities. We verified the answers with relatives and records. We also established what the participants remembered of key public events, such as the election of officials, wars and natural disasters, and prominent cultural developments. Then we had each participant place a customized card that described a specific personal or public event on a board that laid out a year-by-year and decade-by-decade time line for the 1900s. For the participants, the task was rather like playing the board game Life. For the investigators, the setup permitted a measurement of the accuracy of each time placement.
Predictably, the amnesiac patients differed from the controls. Normal individuals were relatively accurate in their time placements: on average, they were wrong by 1.9 years. Amnesiac patients made far more errors, especially those with basal forebrain damage: although they recalled the events themselves accurately, their placements were off the mark by an average of 5.2 years. Their recall of events was actually superior to that of the temporal lobe amnesiacs, yet the temporal lobe group was more accurate at time stamping, erring by an average of only 2.9 years.
The results suggest that time stamping and event recall are processes that can be separated. More intriguingly, the outcome indicates that the basal forebrain may be critical in helping to establish the context that allows us to place memories in the right epoch. This notion is in keeping with the clinical observation of basal forebrain patients. Unlike certain of their counterparts with temporal lobe damage, these patients do learn new facts. But they often recall the facts they have just learned in the incorrect order, reconstructing sequences of events in a fictional narrative that can change from occasion to occasion.
Being Late for Consciousness
Most of us do not have to grapple with the large gaps of memory or the chronological confusion that many of my patients do. Yet we all share a strange mental time lag, a phenomenon first brought to light in the 1970s by the late neurophysiologist Benjamin Libet of the University of California, San Francisco. In one experiment, Libet documented a gap between the time an individual was conscious of the decision to flex his finger (and recorded the exact moment of that awareness) and the time his brain waves indicated that a flex was imminent. The brain activity occurred a third of a second before the person consciously decided to move his finger. In another experiment, conducted on patients who were awake during brain surgery (as most patients are in such operations), Libet tested whether a stimulus applied directly to the brain caused any sensation. He found that a mild electrical charge to the cortex produced a tingling in the patient's hand—but only a full half a second after the stimulus was applied.
Although the interpretation of those experiments, and of others in the field of consciousness studies, is entangled in controversy, one general fact emerged from Libet's work. It is apparent that a lag exists between the beginning of the neural events leading to awareness and the moment one actually experiences the consequence of those events.
This finding may be shocking at first glance, and yet the reasons for the delay are fairly obvious. It takes time for the physical changes that constitute an event to impinge on the body and to modify the sensory detectors of an organ such as the retina. It takes time for the resulting electrochemical modifications to be transmitted as signals to the central nervous system. It takes time to generate a neural pattern in the brain's sensory maps. Finally, it takes time to relate the neural map of the event and the mental image arising from it to the neural map and image of the self—that is, the notion of who we are—the last and critical step without which the event will never become conscious.
We are talking about mere milliseconds, but there is a delay nonetheless. This situation is so strange that the reader may well wonder why we are not aware of the delay. One attractive explanation is that because we all have similar brains that work similarly, we are all hopelessly late for consciousness and no one notices it. But perhaps other reasons apply. At the microtemporal level, the brain manages to “antedate” some events so that delayed processes can appear less delayed and differently delayed processes can appear to have similar delays.
This possibility, which Libet contemplated, may explain why we maintain the illusion of continuity of time and space when our eyes move quickly from one target to another. We notice neither the blur that attends the eye movement nor the time it takes to get the eyes from one place to the other. Patrick Haggard and John C. Rothwell of University College London suggest that the brain predates the perception of the target by as much as 120 milliseconds, thereby giving us the perception of seamless viewing.
The brain's ability to edit visual experiences and to impart a sense of volition after neurons have already acted is an indication of its exquisite sensitivity to time. Although our understanding of mind time is incomplete, we are gradually coming to know more about why we experience time so variably and about what the brain needs to create a time line.
How Hitchcock's Rope Stretches Time
The elasticity of time is perhaps best appreciated when we are the spectators of a performance, be it a film, a play, a concert or a lecture. The actual duration of the performance and its mental duration are different things. To illustrate the factors that contribute to this varied experience of time, we can turn to the example of Alfred Hitchcock's 1948 film Rope. This technically remarkable work was shot in continuous, unedited 10-minute takes. Few features have been produced in their entirety using this approach. Orson Welles in Touch of Evil, Robert Altman in The Player and Martin Scorsese in GoodFellas employed long, continuous shots but not as consistently as in Rope. (In spite of the many plaudits the innovation earned the director, filming proved a nightmare for all concerned, and Hitchcock used the method again only in part of his next film, Under Capricorn.)
Hitchcock invented this technique for a sensible reason: he was adapting a stage play whose action unfolds in continuous time. But he was limited by the amount of film that could be loaded into the camera, which was roughly enough for 10 minutes of action.
Now let us consider how Rope's real time plays in our mind. In an interview with François Truffaut in 1966, Hitchcock stated that the story begins at 7:30 P.M. and terminates at 9:15, 105 minutes later. Yet the film consists of eight reels of 10 minutes each: a total of 80 minutes, including the credits at the beginning and end. Where did the missing 25 minutes go? Do we experience the film as shorter than 105 minutes? Not really. The film never seems shorter than it should, and a viewer has no sense of haste or clipping. On the contrary, for many the film seems longer than its projection time.
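The mismatch can be laid out in a few lines of arithmetic (a trivial sketch; the figures are simply those quoted above):

```python
# Running-time arithmetic for Rope, using the figures quoted in the text.
REEL_MINUTES = 10   # footage per camera load
NUM_REELS = 8

story_start = 19 * 60 + 30      # 7:30 P.M., in minutes since midnight
story_end = 21 * 60 + 15        # 9:15 P.M.

story_minutes = story_end - story_start        # duration of the depicted story
projection_minutes = NUM_REELS * REEL_MINUTES  # footage actually shown

print(story_minutes)                       # 105
print(projection_minutes)                  # 80
print(story_minutes - projection_minutes)  # the 25 "missing" minutes
```

Nothing here is specific to the film; it merely makes explicit the subtraction behind the missing 25 minutes.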
I suspect that several factors account for this alteration of perceived time. First, most of the action takes place in the living room of a penthouse in summer, and the skyline of New York City is visible through a panoramic window. At the beginning of the film the light suggests late afternoon; by the end, night has set in. Our daily experience of fading daylight makes us perceive the real-time action as taking long enough to cover the several hours of the coming of night, when in fact Hitchcock artificially accelerated those changes in light.
In the same way, the nature and context of the depicted actions elicit other automatic judgments about duration. After the proverbial Hitchcock murder, which occurs at the beginning of the film's first reel, the story focuses on an elegant dinner party hosted by the two unsavory murderers and attended by the relatives and friends of the victim. The actual time during which food is served is about two reels. Yet viewers attribute more time to that sequence because we know that neither the hosts nor the guests, who look cool, polite and unhurried, would swallow dinner at such breakneck speed. When the action later splits—some guests converse in the living room in front of the camera, while others repair to the dining room to look at rare books—we sensibly attribute a longer duration to this offscreen episode than the few minutes it takes up in the actual film.
Another factor may also contribute to the deceleration of time. There are no jump cuts within each 10-minute reel; the camera glides slowly toward and away from each character. Yet to join each segment to the next, Hitchcock finished most takes with a close-up on an object. In most instances, the camera moves to the back of an actor wearing a dark suit, and the screen goes black for a few seconds; the next take begins as the camera pulls away from the actor's back. Although the interruption is brief and is not meant to signal a time break, it may nonetheless contribute to the elongation of time because we are used to interpreting breaks in the continuity of visual perception as a lapse in the continuity of time. Film-editing devices such as the dissolve and the fade often cause spectators to infer that time has passed between the preceding shot and the following one. In Rope, each of the seven breaks delays real time by a fraction of a second. But cumulatively for some viewers, the breaks may suggest that more time has passed.
The emotional content of the material may also extend time. When we are uncomfortable or worried, we often experience time more slowly because we focus on negative images associated with our anxiety. Studies in my laboratory suggest that the brain generates images at faster rates when we are experiencing positive emotions (perhaps this is why time flies when we're having fun) and reduces the rate of image making during negative emotions. On a recent flight with heavy turbulence, for instance, I experienced the passage of time as achingly slow, perhaps because my attention was directed to the discomfort of the experience. The unpleasantness of the situation in Rope may similarly conspire to stretch time.
Rope provides a noticeable discrepancy between real time and the audience's perception of time. In so doing, it illustrates how the experience of duration is a construct. It is based on factors as varied as the content of the events being perceived, the emotional reactions those events provoke and the way in which images are presented to us, as well as the conscious and unconscious inferences that accompany the film's images.—A.D.