E-Motion: Next-Gen Simulators to Blur the Line between Person and Avatar

New motion-sensing technology lets trainees react more naturally to action, without the need for controllers or sensor tags

The armed forces have been using simulators to train troops and pilots for battle since World War I, but this virtual combat has typically relied on joysticks, visors and other sensors to make the action realistic. New developments in motion-sensing technology, however, promise training simulations that let a user, or a group of users, control the on-screen action simply by moving their own arms and legs.
"Think of this as Second Life, but instead of controlling your avatar with your mouse, you are the avatar," says Andrew Tschesnok, CEO of New York City–based Organic Motion, a maker of computer vision- and motion-tracking systems. The company on Monday introduced the latest version of its OpenStage software, designed to allow individuals or, more importantly for the military, groups of people to roam untethered through computerized landscapes.
OpenStage relies on multiple 2-D video cameras positioned around a subject or group to track movement. The data gathered from each camera is fed into a vision processor that maps every pixel of information and triangulates a subject's location by determining where the various camera views intersect. The system can analyze hundreds of megabytes of data per second and dynamically decide which pixels matter most for reconstructing a particular scene, Tschesnok says. The result is 3-D avatars that mimic their subjects' real-world motion.
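The article does not spell out the math, but recovering a 3-D point from several calibrated 2-D views is a standard computer-vision operation. Below is a minimal sketch of that general principle using the direct linear transform; it is an illustration under those assumptions, not a reconstruction of Organic Motion's proprietary pipeline, and the camera matrices and pixel coordinates in the example are made up.

```python
import numpy as np

def triangulate(projection_matrices, pixel_coords):
    """Recover a 3-D point from its 2-D projections in several cameras.

    projection_matrices: list of 3x4 camera projection matrices P_i
    pixel_coords: list of (u, v) pixel observations, one per camera

    Each camera contributes two linear constraints on the homogeneous
    3-D point X (since its pixel ray must pass through X); stacking
    them and solving the least-squares system finds where the cameras'
    viewing rays intersect.
    """
    rows = []
    for P, (u, v) in zip(projection_matrices, pixel_coords):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # The solution is the right singular vector with the smallest
    # singular value; dehomogenize to get (x, y, z).
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Example: two cameras one meter apart observing the point (1, 2, 10).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])            # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-1.], [0.], [0.]])])  # shifted on x
print(triangulate([P1, P2], [(0.1, 0.2), (0.0, 0.2)]))   # ~ [1. 2. 10.]
```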
Previous generations of OpenStage could create a simulation for only one participant at a time. To track a group, the new version captures each individual's position and estimates the positions of any body parts hidden from the cameras' view.
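How OpenStage fills in occluded limbs is not described, but one common family of techniques exploits the skeleton's fixed bone lengths: a hidden joint must lie at a known distance from a visible neighbor. The toy sketch below illustrates that idea only; the function, joint values and extrapolation heuristic are assumptions for illustration, not the company's algorithm.

```python
import numpy as np

def estimate_occluded_joint(parent_pos, last_direction, bone_length):
    """Guess the position of a joint the cameras cannot currently see.

    Assumed heuristic (not Organic Motion's actual method): an occluded
    joint sits at a fixed bone length from its visible parent joint, so
    we extrapolate along the bone direction observed in the last frame.
    """
    d = last_direction / np.linalg.norm(last_direction)
    return parent_pos + bone_length * d

# Example: an elbow hidden behind another person, estimated from the
# visible shoulder and the bone direction seen in the previous frame.
shoulder = np.array([0.1, 1.4, 0.3])
prev_dir = np.array([0.0, -1.0, 0.2])   # shoulder -> elbow, last frame
elbow_est = estimate_occluded_joint(shoulder, prev_dir, bone_length=0.28)
```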
The company hopes its technology will become a key component of next-generation simulators offering scenarios such as urban warfare and close-combat tactical training for troops on foot. Tschesnok says he expects to hear from the U.S. government within six months or so about the bids in which Organic Motion is involved.
Organic Motion already supplies similar technology to New York City's Sony Wonder Technology Lab, where visitors can step in front of a motion-tracking camera and see themselves converted into onscreen avatars, but a consumer version of OpenStage will not be available until 2012.
The technology behind OpenStage is similar in principle to that of Microsoft's Project Natal, being developed to let the company's Xbox 360 customers play video games without the need for a controller. Tschesnok points out, however, that whereas Natal uses a single camera and sensor placed between the player and the game screen to capture player movement, Organic Motion's system relies on multiple cameras covering a 360-degree field of view and feeding the data to its animation software.