Whether they know it or not, the National Football League faithful who root for their teams every weekend have also become big fans of "augmented reality". If you're not sure what this is, take a look at your television on Sunday afternoon and notice the yellow "first down" stripe running the width of the field (as well as a blue one that delineates the line of scrimmage). That's augmented reality—technology that merges computer-generated images with real-world sights and sounds.

For decades, researchers also have been developing a more sophisticated, wearable technology that brings augmented reality to people in their daily lives. Think of a see-through visor in automobiles that would provide directions (not just text, but arrows) to drivers as they cruise down the highway. These kinds of mobile applications are not easy to engineer because the wearer's location is constantly changing, making it difficult for the cameras, angular velocity sensors and accelerometers that are part of a wearable augmented-reality system to determine which graphics to display, says Jurjen Caarls, a researcher at Eindhoven University of Technology in the Netherlands.

Caarls should know, because last week he received a PhD from Delft University of Technology in the Netherlands, where he and colleagues created a prototype augmented-reality system that superimposed computer-generated graphics over real-world scenes. The system included a helmet with a visor that projected images into the wearer's eyes to create the illusion that these images were part of reality. This effect was achieved using two small screens and two semitransparent mirrors built into the helmet.

Caarls's system used a camera and inertial sensors to measure the wearer's position. Cameras can obtain accurate absolute positions, but they do so rather slowly, according to an interview with Caarls in the university's Delta newspaper. Inertial sensors are faster but less accurate, providing only acceleration and rotation speed. To combine the two data streams, Caarls developed image-processing techniques and filters: the camera, aided by a single small visual marker, determines the absolute position, while information from the inertial sensors is used to interpolate between camera measurements. One problem Caarls could not solve completely, however, was the slight time lag (about 80 milliseconds) between what the user sees in the real world and the virtual projection.
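The fusion idea described above can be illustrated with a minimal sketch. This is not Caarls's actual filter (his system used more sophisticated filtering in three dimensions); it is a simplified one-dimensional example, with made-up names and rates, showing how infrequent, accurate camera fixes can correct fast but drift-prone inertial dead reckoning.

```python
def fuse(camera_fixes, imu_accels, imu_dt):
    """Estimate position at every inertial-sensor tick.

    camera_fixes: dict mapping an IMU tick index to an accurate
        absolute position measured by the camera (available only
        occasionally, since vision processing is slow).
    imu_accels: acceleration reading at each IMU tick (fast but noisy).
    imu_dt: time between IMU ticks, in seconds.
    """
    pos, vel = 0.0, 0.0
    estimates = []
    for i, accel in enumerate(imu_accels):
        if i in camera_fixes:
            # A camera measurement arrived: snap to the accurate
            # absolute position and damp accumulated velocity drift.
            pos = camera_fixes[i]
            vel *= 0.5
        else:
            # Between camera frames, dead-reckon by integrating
            # the inertial data.
            vel += accel * imu_dt
            pos += vel * imu_dt
        estimates.append(pos)
    return estimates


# Camera fixes at ticks 0 and 4; constant measured acceleration between.
trajectory = fuse({0: 0.0, 4: 1.0}, [1.0] * 8, 0.5)
```

In a real system the "snap" would be replaced by a weighted update (as in a Kalman filter), so that neither sensor is trusted absolutely, but the division of labor is the same: the camera anchors the estimate, and the inertial sensors fill the gaps between frames.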

"I can imagine a future in which people experience augmented reality by wearing glasses with integrated displays that project images on their retinas," Caarls said in a press release describing his research. "Think of it as a visual Walkman."

Any augmented-reality system includes three key features, according to Mark Billinghurst, director of the Human Interface Technology Laboratory at the University of Canterbury in New Zealand, writing in "Annotating the Real World," published in the October 2008 issue of Scientific American: "virtual information that is tightly registered, or aligned, with the real world; the ability to deliver information and interactivity in real-time; and seamless mixing of real-world and virtual information." The keys to advancing augmented reality, he wrote, are improved display technologies ("virtual" eyeglasses, for example), processors and graphics chips for mobile devices, and better tracking systems and cameras.

Researchers have been building prototype systems for more than three decades, according to an April 2002 Scientific American article entitled "Augmented Reality: A New Way of Seeing". The first was developed in the 1960s by computer graphics pioneer Ivan Sutherland and his students at Harvard University and the University of Utah. In the 1970s and '80s a small number of researchers studied augmented reality at institutions such as the U.S. Air Force's Armstrong Laboratory, NASA Ames Research Center and the University of North Carolina at Chapel Hill.

In the 1990s Boeing envisioned mechanics wearing head-mounted displays (HMDs) and carrying small computers that would contain a database of information for fixing Boeing products, says Eric Foxlin, founder and chief technology officer of InterSense, Inc., a maker of motion-tracking technology in Billerica, Mass., that helped Boeing with the project. Augmented reality would show mechanics how to replace parts without the need to consult a manual. More recently, Boeing has proposed its HMD technology (or some variation of it) as a way to help with repairs on space missions.

The publisher of The Official Michael Jackson Opus, a biography of the late performer due in December, plans to incorporate augmented reality into the project. Opus Media Group designed the 400-page, 12-kilogram book so that, when certain pages are held up to a Web camera, "the pages will come alive, with both video and music appearing on your screen while you hold the book up to [Webcam]," according to RollingStone.com.