Navy developing augmented-reality training system
August 27, 2012
The Office of Naval Research (ONR) has completed the first year of a multi-year augmented-reality effort, developing a system that allows trainees to view simulated images superimposed on real-world landscapes.
“The training capability [that] augmented reality offers is revolutionary, because you can train in a real-world environment and inject simulated forces or entities,” said Dr. Peter Squire, ONR program manager for Human Performance Training and Education.
“This will decrease costs and allow trainers to execute a wide range of scenarios with a fraction of the support required for live training. You can construct simulations to meet your training needs and objectives rather than going to a training facility, enabling users to train anywhere.”
The technology uses advanced software algorithms and multiple sensors to determine a trainee’s viewpoint, while virtual aircraft, targets and munitions effects are inserted into the real-world view through glasses, goggles or a visor.
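The ONR system's actual algorithms are not public, but the core geometric step it describes — using the trainee's tracked viewpoint to place a virtual entity in the real-world view — can be sketched in a few lines. The function below is purely illustrative: the names, the top-down 2D simplification, and the linear field-of-view mapping are all assumptions, not details from the program.

```python
import math

def project_to_view(head_pos, head_yaw_deg, target_pos,
                    fov_deg=90.0, screen_w=1280):
    """Illustrative sketch: map a world-space target into a trainee's view.

    head_pos, target_pos: (x, y) world coordinates in meters (top-down).
    head_yaw_deg: tracked viewing direction in degrees (0 = +x axis).
    Returns a horizontal pixel coordinate for the overlay renderer,
    or None if the target falls outside the field of view.
    """
    dx = target_pos[0] - head_pos[0]
    dy = target_pos[1] - head_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Angle of the target relative to the view direction, wrapped to [-180, 180)
    rel = (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0
    half_fov = fov_deg / 2.0
    if abs(rel) > half_fov:
        return None  # off-screen: the display draws nothing for this entity
    # Map [-half_fov, +half_fov] linearly onto [0, screen_w] pixels
    return (rel + half_fov) / fov_deg * screen_w

# A trainee at the origin looking along +x sees a target 10 m ahead
# at the center of a 1280-pixel-wide display.
print(project_to_view((0, 0), 0.0, (10, 0)))  # → 640.0
```

A fielded system would of course work in full 3D with sensor fusion, lens distortion, and latency compensation; this sketch only conveys why accurate viewpoint tracking is the prerequisite for inserting virtual aircraft and targets convincingly.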
One application for augmented reality is Joint Terminal Attack Controller (JTAC) training. JTACs work on the ground to manage the attacks of nearby combat aircraft. Today, live JTAC training is conducted on a few specialized ranges with static targets and limited reconfigurability. This training also requires aircraft flight hours, range time and live artillery — all of which are scarce resources.
Augmented reality offers huge cost savings, since the only element needed is the terrain: aircraft, targets and effects can all be computer generated.
Researchers working on the project will present papers at ISMAR 2012, the International Symposium on Mixed and Augmented Reality, Nov. 5–8 in Atlanta, as well as at I/ITSEC 2012, the Interservice/Industry Training, Simulation, and Education Conference, Dec. 3–6 in Orlando, Fla.