Augmented Reality Sandtable

The Augmented Reality Sandtable (ARES) is an interactive, digital sand table that uses augmented reality (AR) technology to create a 3D battlespace map. It was developed by the Human Research and Engineering Directorate (HRED) at the Army Research Laboratory (ARL) to combine the strengths of traditional military sand tables with current digital technologies, better supporting soldier training and opening new possibilities for learning.[1] ARES uses a projector to display a topographical map on top of the sand in an ordinary sandbox, together with a motion sensor that tracks changes in the layout of the sand and adjusts the computer-generated terrain display accordingly.[2][3]
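
As a rough illustration of the sensing-and-projection idea described above, the sketch below converts a depth frame (distance from an overhead sensor to the sand, in millimetres) into sand heights and then into elevation bands that a projector could colour. It is a minimal, hypothetical Python/NumPy sketch; the function names, the synthetic frame, and the 20 mm band width are assumptions for illustration, not part of the government-developed ARES software.

    import numpy as np

    # Hypothetical sketch (not the ARES software): an overhead depth sensor reports
    # the distance to the sand; subtracting that from the distance to the empty table
    # gives sand height, which is then banded into contour-like elevation levels.

    def depth_to_heights(depth_mm, table_depth_mm):
        """Convert sensor-to-surface distances into sand heights above the table."""
        return np.clip(table_depth_mm - depth_mm, 0.0, None)

    def heights_to_bands(heights_mm, band_mm=20.0):
        """Assign each pixel an elevation band; the projector colours each band."""
        return (heights_mm // band_mm).astype(int)

    # Synthetic stand-in for one depth frame: a flat table 1000 mm from the sensor
    # with a 60 mm mound of sand in the middle.
    rows, cols = np.indices((120, 160))
    mound = 60.0 * np.exp(-(((rows - 60) ** 2) + ((cols - 80) ** 2)) / 800.0)
    depth_frame = 1000.0 - mound

    bands = heights_to_bands(depth_to_heights(depth_frame, table_depth_mm=1000.0))
    print("elevation bands present:", np.unique(bands))   # e.g. [0 1 2 3]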

An ARL study conducted in 2017 with 52 active-duty military personnel (36 males and 16 females) found that participants who used ARES spent less time setting up the table than participants who used a traditional sand table. ARES also yielded a lower perceived-workload score, as measured by NASA Task Load Index (NASA-TLX) ratings, than the traditional sand table. However, there was no significant difference between the two groups in post-test knowledge scores for recreating the visual map.[4]
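
For context on the workload measure, NASA-TLX combines six subscale ratings (mental demand, physical demand, temporal demand, performance, effort, and frustration) into a single score; in the unweighted "Raw TLX" variant the score is simply the mean of the six ratings. The sketch below shows that generic computation with invented ratings; the study's actual data and the TLX variant it used are not reproduced here.

    # Generic Raw TLX computation with invented ratings (0-100 per subscale);
    # a lower score indicates lower perceived workload.

    SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

    def raw_tlx(ratings):
        """Unweighted NASA-TLX: the mean of the six subscale ratings."""
        return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

    example = {"mental": 55, "physical": 20, "temporal": 40,
               "performance": 30, "effort": 45, "frustration": 25}
    print(f"Raw TLX workload: {raw_tlx(example):.1f}")   # 35.8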

Development

The ARES project was one of 25 ARL initiatives in development from 1995 to 2015 that focused on visualizing spatial data on virtual or sand-table interfaces.[5] It was developed by HRED's Simulation and Training Technology Center (STTC) with Charles Amburn as the principal investigator. Collaborators on ARES included Dignitas Technologies, Design Interactive (DI), the University of Central Florida's Institute for Simulation and Training, and the U.S. Military Academy at West Point.[6]

ARES was designed largely as a tangible user interface (TUI), in which digital information is manipulated through physical objects such as a person's hand. It was constructed from commercial off-the-shelf components, including a projector, a laptop, an LCD monitor, and Microsoft's Xbox Kinect sensor, together with government-developed ARES software. With both the projector and the Kinect sensor facing down at the surface of the sandbox, the projector provides a digital overlay on the sand while the Kinect sensor scans the surface of the map to detect user gestures within the boundaries of the sandbox.
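
The sketch below illustrates one plausible way a downward-facing depth sensor could restrict gesture detection to the sandbox boundary: pixels that sit well above the calibrated sand surface, and inside the sandbox rectangle, are treated as a possible hand. This is a hypothetical Python/NumPy illustration; the thresholds, array shapes, and function names are assumptions, not the Kinect SDK or the ARES implementation.

    import numpy as np

    # Hypothetical boundary-limited gesture check (not the ARES implementation):
    # anything much closer to the sensor than the calibrated sand surface, within
    # the sandbox rectangle, is counted as a possible hand or pointer.

    def detect_gesture(depth_mm, baseline_mm, sandbox, clearance_mm=80.0, min_pixels=50):
        """Return True if something rises above the sand inside the sandbox bounds."""
        above = (baseline_mm - depth_mm) > clearance_mm     # closer to the sensor than the sand
        mask = np.zeros_like(above)
        r0, r1, c0, c1 = sandbox                            # (row_min, row_max, col_min, col_max)
        mask[r0:r1, c0:c1] = True                           # ignore everything outside the box
        return int(np.count_nonzero(above & mask)) >= min_pixels

    baseline = np.full((120, 160), 1000.0)                  # calibrated distance to flat sand
    frame = baseline.copy()
    frame[40:60, 70:90] = 850.0                             # a "hand" 150 mm above the sand
    print(detect_gesture(frame, baseline, sandbox=(10, 110, 10, 150)))   # True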

During development, researchers explored incorporating multi-touch surfaces, 3D holographic displays, and virtual environments; however, budget restrictions limited the implementation of these ideas.

In September 2014, researchers from ARL showcased ARES for the first time at the Modern Day Marine exhibition in Quantico, Virginia.[7]

Uses

According to a 2015 technical report by ARL scientists, ARES provides the following capabilities.

Notes and References

  1. Amburn, Charles; Vey, Nathan; Boyce, Michael; Mize, Jerry (October 2015). The Augmented REality Sandtable (ARES). US Army Research Laboratory.
  2. "Microsoft's Kinect aids in 'augmented reality sand' mapping tool for Marines, Army". Marine Corps Times. September 23, 2014. Retrieved August 2, 2018.
  3. Mufson, Beckett (November 5, 2014). "Design Digital Terrain with the Army's Projection-Mapped Sandtable". Vice Creators. Retrieved August 2, 2018.
  4. Hale, Kelly; Riley, Jennifer; Amburn, Charles; Vey, Nathan (January 1, 2018). Evaluation of Augmented REality Sandtable (ARES) during Sand Table Construction. US Army Research Laboratory. Defense Technical Information Center.
  5. Garneau, Christopher; Boyce, Michael; Shorter, Paul; Vey, Nathan; Amburn, Charles (February 1, 2018). The Augmented Reality Sandtable (ARES) Research Strategy. US Army Research Laboratory. Defense Technical Information Center.
  6. Glass, Dolly (December 18, 2014). "Army and Marines research sand table technology". Team Orlando. Retrieved August 2, 2018.
  7. Hedelt, Carden (September 24, 2014). "New Sand Table Technology Featured at Modern Day Marine". CHIPS. Retrieved August 2, 2018.