Scott Diddams and Tom O'Brian, of the Time and Frequency Division of the National Institute of Standards and Technology, explain.
Just how fast an event is depends somewhat on your point of view. In nature around us there are various physical events that occur on time scales from the yoctosecond (10^-24 second) to the exasecond (10^18 seconds). In the time it just took your heart to beat once, the computer on the desk next to you completed about one billion clock cycles, whereas the electron of a hydrogen atom could have circled its proton about 1 quadrillion (10^15) times. On the other hand, that very slow heartbeat is actually quite fast and fleeting if one considers it relative to the 500 quadrillion (500 x 10^15) second lifetime of our universe. Within this tremendous range of time scales, the constantly improving state of science and technology determines how accurately different events can be measured or inferred.
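These comparisons are simple order-of-magnitude arithmetic, and a short Python sketch makes them concrete. The rates below are the illustrative round numbers from the text (a roughly one-second heartbeat, a ~1 GHz computer clock, ~10^15 classical electron orbits per second), not measured values:

```python
# Order-of-magnitude comparisons of events during one ~1-second heartbeat.
# All rates are illustrative round numbers, not measurements.

heartbeat = 1.0            # seconds, roughly one heartbeat
cpu_clock_hz = 1e9         # ~1 GHz desktop computer clock
hydrogen_orbit_hz = 1e15   # ~1 quadrillion classical electron "orbits" per second
universe_age_s = 500e15    # ~500 quadrillion seconds

print(f"CPU cycles per heartbeat:       {cpu_clock_hz * heartbeat:.0e}")
print(f"Electron orbits per heartbeat:  {hydrogen_orbit_hz * heartbeat:.0e}")
print(f"Heartbeats in the universe's lifetime: {universe_age_s / heartbeat:.0e}")
```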
For example, in the late 19th century, the best scientists and technologists struggled to measure time intervals on the order of a hundredth or thousandth of a second. In a well-known (and often mythologized) story, photography pioneer Eadweard Muybridge, on commission from Leland Stanford, took several years to develop a system of rapid-sequence photography to conclusively prove that a galloping or trotting horse briefly has all four feet aloft simultaneously--an event too fast for the human eye to follow. By 1877 Muybridge had perfected his system to the point of recording events on the scale of about 0.001 second.
But this story also points out a challenge in answering the original question: The answer depends on how one interprets the word "measured." This might sound like a pedantic dodge, but at the National Institute of Standards and Technology (NIST) we spend a lot of time trying to understand and apply the subtleties of measurement. Muybridge's photography was a record of short-duration events--possibly the best such record of its time--but was not a measurement of time interval in the strict sense. Both the recording or inference of short-duration events and accurate measurements of such events are of interest, so we suggest rephrasing the original question into two new questions: "What are the shortest time durations that can be measured with a particular accuracy?" and "What are the shortest duration events that can be recorded or inferred in experiments?"
To best answer the first question about measurement with a particular accuracy, let us agree to define measurement as a comparison to a generally accepted standard. By international treaty the standard for the second, the unit of time, is defined as exactly 9,192,631,770 cycles of a particular electron transition in cesium-133 atoms. So a time measurement is a direct or indirect comparison to this defined standard.
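Because the count of cycles per second is exact by definition, the duration of a single cesium cycle is also exactly fixed. A minimal Python sketch of that arithmetic:

```python
# The SI second is defined as exactly 9,192,631,770 cycles of the
# cesium-133 transition, so one cycle lasts an exactly fixed duration.

CS133_CYCLES_PER_SECOND = 9_192_631_770   # exact, by definition

one_cycle = 1 / CS133_CYCLES_PER_SECOND   # seconds per cesium cycle
print(f"One cesium cycle: {one_cycle * 1e12:.1f} ps")   # about 109 picoseconds
```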
Currently, the best technical approach to measuring time against this standard is to use laser-cooled cesium atomic fountain frequency standards, known as cesium atomic fountain clocks. The handful of these cesium atomic fountain devices operating around the world are actually frequency standards rather than clocks (timekeeping devices), and they are used to realize the defined cesium standard frequency with exceptional accuracy of about 1 part in 10^15. The best reported uncertainty at the time of this writing is about 6 x 10^-16 for the NIST-F1 fountain standard. Because 86,400 seconds make up one day, this relative uncertainty means the standard is accurate to about 50 picoseconds (50 x 10^-12 second) per day. Put another way, if the frequency standard could be operated indefinitely as a clock it would neither gain nor lose more than a second in 50 million years compared to a perfect clock.
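The per-day and per-50-million-year figures follow directly from the fractional uncertainty. A quick Python check of the arithmetic, using only the numbers quoted above:

```python
# Convert the NIST-F1 fractional frequency uncertainty into absolute
# error per day, and into the years needed to accumulate one second of error.

fractional_uncertainty = 6e-16   # reported NIST-F1 uncertainty (dimensionless)
seconds_per_day = 86_400

error_per_day = fractional_uncertainty * seconds_per_day       # ~5.2e-11 s
years_to_one_second = 1 / fractional_uncertainty / (365.25 * 24 * 3600)

print(f"Error per day: {error_per_day * 1e12:.0f} ps")         # ~52 ps
print(f"Years to gain or lose one second: {years_to_one_second:.1e}")
```

The result, roughly 5 x 10^7 years, matches the "one second in 50 million years" statement in the text.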
Cesium atomic fountain standards are the world's most accurate primary standards of any kind. No other standard--including ones used for length, mass and electrical current--has an accuracy within even a factor of 1,000 of the cesium atomic fountain clock. The atomic fountain standard uncertainty of about 1 x 10^-15 might seem to imply that these "clocks" could be used to measure events on the order of femtoseconds (10^-15 second), but in fact fountain standards are not generally useful for directly measuring short duration events.