MIT Media Lab’s Gershon Dublon and Joseph Paradiso believe that augmented reality will go far beyond Google Glass. Soon, data from sensors embedded in a growing number of places will add a new layer to human perceptual experience. But first, two things need to happen. One, sensor data that today are siloed within specific applications must be made available to any device that wants the information. Two, researchers must develop intuitive graphical interfaces that can make sense of the flood of data produced by a world of ubiquitous sensors.

The DoppelLab application embedded below is one such interface. Dublon and Paradiso developed the software to process data captured by sensors throughout the MIT Media Lab and depict it in real time on a translucent model of the building. Temperature, motion, sound and other properties are rendered as icons, making it possible to virtually inspect the conditions in the Media Lab at this moment. Dublon, Paradiso and their research team created this DoppelLab demo for Scientific American.
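The kind of pipeline the article describes can be sketched in a few lines: poll sensor readings and map each one to a positioned icon for a live building model. This is a minimal illustration only; the data format, room names, icon choices and thresholds below are assumptions, not DoppelLab's actual feeds or rules.

```python
from dataclasses import dataclass

# Hypothetical sensor reading; the real Media Lab sensor API is not public here.
@dataclass
class Reading:
    sensor_type: str   # e.g. "temperature", "motion", "sound"
    room: str          # location within the building model
    value: float       # normalized reading, roughly in [0, 1]

# Illustrative mapping from sensor type to a display icon.
ICONS = {"temperature": "thermometer", "motion": "figure", "sound": "speaker"}

def to_icon(reading: Reading) -> dict:
    """Turn one sensor reading into an icon placed in the 3-D model."""
    return {
        "icon": ICONS.get(reading.sensor_type, "dot"),
        "room": reading.room,
        "intensity": min(1.0, max(0.0, reading.value)),  # clamp to [0, 1]
    }

# One "frame" of the visualization: every current reading becomes an icon.
readings = [
    Reading("temperature", "E14-333", 0.7),
    Reading("sound", "atrium", 1.4),
]
frame = [to_icon(r) for r in readings]
```

A real interface would refresh such frames continuously and render the icons inside the translucent model, but the core idea, unifying heterogeneous sensor streams into one spatial display, is the same.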