Subconscious, split-second time measurements occur all the time in daily life. When the brain processes speech, judges the trajectory of a speeding ball, or plays (and even appreciates) music, it must combine past and present knowledge of incoming stimuli—be they spoken syllables, points in space or musical notes.

Dean Buonomano, a neurobiologist at the University of California, Los Angeles, Brain Research Institute, and Uma Karmarkar, a postdoctoral researcher at the University of California, Berkeley, have proposed a model of how the brain tells time that takes into account information from the immediate past while encoding a specific event.

The researchers believe their work could aid in understanding whether timing deficiencies underlie impaired linguistic functioning, as happens in dyslexia.

Their research, published in this week's issue of Neuron, contradicts internal-clock models for intervals shorter than half a second—models, proposed by several prominent neuroscientists, that measure time linearly by counting the pulses of an oscillator or pacemaker.

"We would argue," Buonomano says, "that this type of timing is such a fundamental aspect of what the brain does that it's not allocated some master clock—like your computer has a clock chip, and when the rest of the computer needs to know the time, it just asks the clock chip."

In his model, a beep, flash of light or other stimulus prompts a response from a baseline network of neurons, and a second beep or flash acts on that recently perturbed network. Buonomano uses the analogy of a pebble dropped into a pond causing ripples; if a second pebble is tossed in a short time later, one can gauge the relative time that has passed from how far the first pebble's waves have traveled when the second pebble hits the water. The ripples generated by pebble number two are influenced by the first pebble's waves—creating a record of both events.
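The pond analogy can be sketched as a toy calculation. The sketch below is not from the study: the exponential decay, the `ripple_state` and `decode_interval` names, and the time constant `TAU` are all illustrative assumptions standing in for the perturbed network state.

```python
import math

TAU = 0.2  # assumed decay constant in seconds; purely illustrative

def ripple_state(t_elapsed, tau=TAU):
    """Amplitude of the decaying 'ripple' t_elapsed seconds after a stimulus.

    A hypothetical stand-in for the state of the perturbed neural network.
    """
    return math.exp(-t_elapsed / tau)

def decode_interval(state, tau=TAU):
    """Read the elapsed time back off the decay curve."""
    return -tau * math.log(state)

# A second stimulus 100 ms after the first finds the network in a state
# that uniquely encodes the elapsed interval...
recovered = decode_interval(ripple_state(0.1))

# ...but after a couple of seconds the ripple has effectively died out,
# so the state can no longer distinguish one interval from another --
# echoing the model's roughly one-second limit discussed below.
faded = ripple_state(2.0)
```

The point of the sketch is only that a decaying trace of the first event lets the second event's arrival time be read off the current state, with no pulse-counting clock involved.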

"The entire sequence is coded as a single object—you can't break it into its individual parts," Karmarkar says. "It's a fairly familiar concept in other types of information processing," she adds, citing the subconscious judgments made to distinguish letters and words in Morse code as a prime example.

The pair tested their theory on human subjects, who were asked to judge the relative length of an interval between two auditory beeps. In two versions of the task they added a rogue first tone, which served as a distractor. The subjects could still make accurate comparisons when the distracting tone was consistently played 250 milliseconds before the interval. But they could no longer discriminate the time lapse when the gap between the distracting tone and the two interval-marking beeps varied randomly between 250 and 750 milliseconds. Thus, the randomness of the distracting tone's timing perturbed the subjects' gauging of the interval.
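The logic of the experiment can be mimicked with a toy simulation. This is an assumed sketch, not the authors' analysis: events leave exponentially decaying traces (constant `TAU` is invented), and the "reading" is the summed trace at the moment of the second interval-marking beep.

```python
import math

TAU = 0.2  # assumed decay constant (seconds); not a value from the study

def state_after(events, t_now, tau=TAU):
    """Summed decaying traces of all past events at time t_now."""
    return sum(math.exp(-(t_now - t) / tau) for t in events if t <= t_now)

def readings(gaps, interval):
    """Network state at the second interval-marking beep, one per trial.

    Each trial: distractor tone at t=0, first beep at t=gap,
    second beep at t=gap + interval.
    """
    return [state_after([0.0, gap], gap + interval) for gap in gaps]

# Fixed 250 ms gap: the state at the second beep is identical on every
# trial, so a fixed readout can compare intervals reliably.
fixed = readings([0.25] * 4, 0.1)

# Gap alternating between 250 and 750 ms (standing in for random
# variation): the same 100 ms interval now produces different states,
# confounding the comparison -- mirroring the subjects' impaired
# discrimination under randomized distractor timing.
jittered = readings([0.25, 0.75, 0.25, 0.75], 0.1)
```

In this caricature, randomizing the distractor's onset scrambles the network state in which the target interval is embedded, which is exactly why a state-dependent timer, unlike a dedicated clock, would fail on that condition.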

Karmarkar notes that the predictions of the new model do not hold beyond one second, and that an internal clock may actually account for longer time spans on the order of seconds or minutes. Buonomano, returning to his pond example, notes that the change induced by the first pebble can no longer be measured after several seconds have elapsed. "The model is just not applicable," he says, "because the system returns to its initial state after a certain period of time."

Corby Dale, a specialist at the University of California, San Francisco, Dynamic Neuroimaging Laboratory, says the current study provides a fine model for intervals shorter than 500 milliseconds, noting that "it's pretty common in the motor-timing literature that the systems that do timing below 500 milliseconds are different than what happens above 500 milliseconds." Warren Meck, a Duke University neuroscientist, agrees: "This paper has important implications for our everyday perception of the temporal relationships among all of the sights and sounds that we process." He adds that the study complements his research focusing on internal clock mechanisms—which he has localized to the basal ganglia at the brain's center—at durations of seconds to hours involving cognitive and memory processes.