This story is a supplement to the feature "Hands On Computing: How Multi-touch Screens Could Change The Way We Interact With Computers and Each Other" which was printed in the July 2008 issue of Scientific American.
The most advanced multi-touch screens respond to the motion and pressure of many fingers at once. In the Perceptive Pixel design (below), projectors send images through an acrylic screen onto the surface facing the viewer. When fingers or other objects (such as a stylus) touch the surface, infrared light shone into the acrylic sheet by LEDs scatters off them and back to sensors. Software interprets the sensor data as finger movements. Tapping the screen calls up command menus.
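The software step described above can be sketched in code: the sensors see bright spots where fingers scatter the infrared light, and a program reduces each spot to a single touch point. This is a hypothetical simplification, not Perceptive Pixel's actual software; the frame values and threshold below are invented for illustration.

```python
def find_touches(frame, threshold=128):
    """Return (row, col) centroids of bright blobs in a 2D brightness grid."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one contiguous blob of bright pixels.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid is the reported touch point.
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                touches.append((cy, cx))
    return touches

# Two fingers: two bright regions on an otherwise dark frame (made-up data).
frame = [
    [0,   0,   0,   0,   0,   0],
    [0, 200, 210,   0,   0,   0],
    [0, 190, 205,   0,   0, 220],
    [0,   0,   0,   0,   0, 215],
    [0,   0,   0,   0,   0,   0],
]
print(find_touches(frame))  # one centroid per finger: [(1.5, 1.5), (2.5, 5.0)]
```

Real systems add filtering and calibration, but the core idea is the same: convert patches of scattered light into coordinates an interface can use.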
To create a signal, LEDs bounce light through the acrylic sheet, where total internal reflection keeps it from escaping. But when a finger is placed against the face (below), light scatters off the fingertip toward the sensors. A pressure-sensitive coating also flexes according to how firmly or lightly it is pressed, making the scattered fingertip signal appear slightly brighter or dimmer, which the computer interprets as more or less pressure.
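Because brightness stands in for pressure, the computer's job at this step amounts to mapping a blob's brightness onto a pressure scale. The sketch below shows one hypothetical way to do that; the calibration bounds are invented values, not figures from the actual device.

```python
# Assumed calibration values for illustration only.
LIGHT_TOUCH = 130   # brightness of the faintest registered touch
FIRM_PRESS = 250    # brightness at maximum pressure

def pressure_from_brightness(mean_brightness):
    """Map a blob's mean brightness to a pressure estimate from 0.0 to 1.0."""
    span = FIRM_PRESS - LIGHT_TOUCH
    p = (mean_brightness - LIGHT_TOUCH) / span
    return max(0.0, min(1.0, p))  # clamp readings outside the calibrated range

print(pressure_from_brightness(130))  # lightest touch -> 0.0
print(pressure_from_brightness(190))  # halfway       -> 0.5
print(pressure_from_brightness(250))  # firmest press -> 1.0
```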
A projector inside Microsoft’s multi-touch table, called Surface, sends imagery up through the acrylic top. An LED shines near-infrared light upward as well; that light reflects off objects or fingers back down to infrared cameras, and a computer monitors the reflections to track finger motions.