Chris Oates, a physicist in the Time and Frequency Division of the National Institute of Standards and Technology (NIST), explains.

Despite the differences between light and sound, the same two basic methods have been used in most measurements of their respective speeds. The first method is based on simply measuring the time it takes a pulse of light or sound to traverse a known distance; dividing the distance by the transit time then gives the speed. The second method makes use of the wave nature common to these phenomena: by measuring both the frequency (f) and the wavelength (λ) of the propagating wave, one can derive the speed of the wave from the simple wave relation, speed = f×λ. (The frequency of a wave is the number of crests that pass per second, whereas the wavelength is the distance between crests.) Although the two phenomena share these measurement approaches, the fundamental differences between light and sound have led to very different experimental implementations, as well as different historical developments, in the determination of their speeds.
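The two methods reduce to simple arithmetic. The sketch below illustrates both with hypothetical numbers chosen for convenience (they are not measurements from this article):

```python
# Two basic methods for determining a wave's speed, with hypothetical
# illustrative numbers (not actual measurement data).

def speed_from_transit(distance_m, transit_time_s):
    """Method 1: time a pulse over a known distance."""
    return distance_m / transit_time_s

def speed_from_wave(frequency_hz, wavelength_m):
    """Method 2: use the wave relation, speed = f * wavelength."""
    return frequency_hz * wavelength_m

# A sound pulse crossing 663 m in 2.0 s gives 331.5 m/s.
print(speed_from_transit(663.0, 2.0))
# A 1,000 Hz tone with a 0.3315 m wavelength gives the same speed.
print(speed_from_wave(1000.0, 0.3315))
```

Both functions return the same value here by construction; in practice the two methods are independent checks on one another.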

In its simplest form, sound can be thought of as a longitudinal wave consisting of compressions and rarefactions of a medium along the direction of propagation. Because sound requires a medium through which to propagate, the speed of a sound wave is determined by the properties of the medium itself (such as density, stiffness, and temperature). These parameters thus need to be included in any reported measurements. In fact, one can turn such measurements around and actually use them to determine thermodynamic properties of the medium (the ratio of specific heats, for example).
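For an ideal gas, the dependence on the medium takes a simple closed form, the Newton–Laplace result v = √(γRT/M), where γ is the ratio of specific heats, T the temperature, and M the molar mass. A minimal sketch, using common textbook values for dry air (not figures from this article):

```python
import math

# Newton-Laplace formula for an ideal gas: v = sqrt(gamma * R * T / M).
# Shows how the medium's properties set the speed of sound. The constants
# below are standard textbook approximations for dry air.

R = 8.314          # J/(mol*K), molar gas constant
GAMMA_AIR = 1.40   # ratio of specific heats for air
M_AIR = 0.028964   # kg/mol, mean molar mass of dry air

def sound_speed_ideal_gas(t_kelvin, gamma=GAMMA_AIR, molar_mass=M_AIR):
    return math.sqrt(gamma * R * t_kelvin / molar_mass)

print(sound_speed_ideal_gas(273.15))  # roughly 331 m/s at 0 degrees C
```

Run the other way, a precise measurement of v at known T and M yields γ, which is the "turn the measurement around" point made above.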

The first known theoretical treatise on sound was provided by Sir Isaac Newton in his Principia, which predicted a value for the speed of sound in air that differs by about 16 percent from the currently accepted value. Early experimental values were based on measurements of the time it took the sound of cannon blasts to cover a given distance and were good to better than 1 percent of the currently accepted value of 331.5 m/s at 0 degrees Celsius. Daniel Colladon and Charles-François Sturm first performed similar measurements in water in Lake Geneva in 1826. They found a value only 0.2 percent below the currently accepted value of ~1,440 m/s at 8 degrees C. These measurements all suffered from variations in the media themselves over long distances, so most subsequent determinations have been performed in the laboratory, where environmental parameters could be better controlled, and a larger variety of gases and liquids could be investigated. These experiments often use tubes of gas or liquid (or bars of solid material) with precisely calibrated lengths. One can then derive the speed of sound from a measurement of the time that an impulse of sound takes to traverse the tube. Alternatively (and usually more accurately), one can excite resonant frequencies of the tube (much like those of a flute) by inducing a vibration at one end with a loudspeaker, tuning fork, or other type of transducer. Because the corresponding resonant wavelengths have a simple relationship to the tube length, one can then determine the speed of sound from the wave relation and make corrections for tube geometry for comparisons with speeds in free space.
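The resonance method can be sketched as follows. For an idealized tube open at both ends, successive resonant frequencies are spaced by v/(2L), so the speed of sound follows directly from the spacing. The "measured" frequencies below are hypothetical values chosen to illustrate the arithmetic, and the end corrections mentioned above are ignored:

```python
# Sketch of the resonance method for an idealized open-open tube of
# length L: adjacent resonances are spaced by v / (2L), so
# v = 2 * L * (mean spacing). Frequencies below are hypothetical.

TUBE_LENGTH_M = 1.0  # assumed, precisely calibrated length

def speed_from_resonances(resonant_freqs_hz, tube_length_m):
    # Average the spacing between adjacent resonant frequencies.
    spacings = [hi - lo for lo, hi in
                zip(resonant_freqs_hz, resonant_freqs_hz[1:])]
    mean_spacing = sum(spacings) / len(spacings)
    return 2.0 * tube_length_m * mean_spacing

freqs = [165.75, 331.5, 497.25, 663.0]  # hypothetical resonances, Hz
print(speed_from_resonances(freqs, TUBE_LENGTH_M))  # 331.5
```

Averaging over several resonances, as real experiments do, suppresses the error in any single frequency reading.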

The wave nature of light is quite different from that of sound. In its simplest form, an electromagnetic wave (such as light, radio, or microwave) is transverse, consisting of oscillating electric and magnetic fields that are perpendicular to the direction of propagation. Moreover, although the medium through which light travels does affect its speed (reducing it by the index of refraction of the material), light can also travel through a vacuum, thus providing a unique context for defining its speed. In fact, the speed of light in a vacuum, c, is a fundamental building block of Einstein's theory of relativity, because it sets the upper limit for speeds in the universe. As a result, it appears in a wide range of physical formulae, perhaps the most famous of which is E = mc². The speed of light can thus be measured in a variety of ways, but due to its extremely high value (~300,000 km/s or 186,000 mi/s), it was initially considerably harder to measure than the speed of sound. Early efforts, such as Galileo's pair of observers sitting on opposing hills flashing lanterns back and forth, lacked the technology needed to accurately measure transit times of only a few microseconds. Remarkably, astronomical observations in the 18th century led to a determination of the speed of light with an uncertainty of only 1 percent. Better measurements, however, required a laboratory environment. Louis Fizeau and Léon Foucault were able to perform updated versions of Galileo's experiment through the use of ingenious combinations of rotating mirrors (along with improved measurement technology), and they made a series of beautiful measurements of the speed of light. With still further improvements, Albert A. Michelson performed measurements good to nearly one part in ten thousand.
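The microsecond scale of those transit times is easy to verify. Assuming an illustrative hilltop separation of 1.5 km (a hypothetical figure, not from the historical record):

```python
# Why Galileo's lantern experiment was doomed: over hilltop distances
# the light transit time is only microseconds. The 1.5 km separation
# is an assumed illustrative value.

C = 299_792_458.0  # m/s, speed of light in vacuum

round_trip_m = 2 * 1500.0          # lantern signal out and back
transit_s = round_trip_m / C
print(f"{transit_s * 1e6:.1f} microseconds")  # about 10 microseconds
```

Human reaction times are on the order of a tenth of a second, some ten thousand times too slow to resolve such an interval.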

Metrology of the speed of light changed dramatically with a determination made here at NIST in 1972. This measurement was based on a helium-neon laser whose frequency was fixed by a feedback loop to match the frequency corresponding to the splitting between two quantized energy levels of the methane molecule. Both the frequency and wavelength of this highly stable laser were accurately measured, thereby leading to a 100-fold reduction in the uncertainty for the value of the speed of light. This measurement and subsequent measurements based on other atomic/molecular standards were limited not by the measurement technique, but by uncertainties in the definition of the meter itself. Because it was clear that future measurements would be similarly limited, the 17th Conférence Générale des Poids et Mesures (General Conference on Weights and Measures) decided in 1983 to redefine the meter in terms of the speed of light. The speed of light thus became a constant (defined to be 299,792,458 m/s), never to be measured again. As a result, the definition of the meter is directly linked (via the relation c = f×λ) to that of frequency, which is by far the most accurately measured physical quantity (presently the best cesium atomic fountain clocks have a fractional frequency uncertainty of about 1×10⁻¹⁵).
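With c fixed by definition, length follows from a frequency measurement via λ = c/f. A minimal sketch, using an approximate frequency of roughly 88.4 THz for a methane-stabilized helium-neon laser (an assumed round figure, used only to illustrate the arithmetic):

```python
# With c exact by definition (since 1983), a measured optical frequency
# determines a wavelength, and hence a length, via lambda = c / f.

C = 299_792_458  # m/s, exact by definition

def wavelength_m(frequency_hz):
    return C / frequency_hz

f_laser = 88.4e12  # Hz, approximate methane-stabilized He-Ne frequency
print(wavelength_m(f_laser))  # about 3.39e-6 m, i.e. ~3.39 micrometers
```

This inversion is the practical content of the 1983 redefinition: the meter is now realized from a frequency standard rather than from an independent length artifact.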