Researchers explore benefits of immersive technology for soldiers

  
As part of their research efforts regarding the benefits of immersive technology for warfighters, Army researchers have developed the Mixed Reality Tactical Analysis Kit, or MRTAK, which functions as an experimental platform to perform assessments during collaborative mission planning and execution. MRTAK is now being developed as the mixed reality module of project AURORA (Accelerated User Reasoning for Operations, Research, and Analysis), as AURORA-MR. Credit: The Army Research Laboratory

The emergence of next-generation virtual and augmented reality devices such as the Oculus Rift and Microsoft HoloLens has increased interest in using mixed reality to simulate training, enhance command and control, and improve the effectiveness of warfighters on the battlefield.

It is thought that putting mission-relevant battlefield data, such as satellite imagery or body-worn sensor information, into an immersive environment will allow warfighters to retrieve information, collaborate and make decisions more effectively than they could with traditional methods.

However, there is currently little evidence in the scientific literature that using immersive technology provides any measurable benefits, such as increased task engagement or improved decision accuracy.

There are also few established metrics for assessing these benefits across display devices and tasks.

Researchers at RDECOM's Army Research Laboratory (ARL), the Army's corporate research laboratory, in collaboration with the University of Minnesota and the U.S. Army's Institute for Creative Technologies at the University of Southern California, have set out to change this.

In a recently published paper, the researchers survey potential methods for assessing the usefulness of immersive systems, discuss how the relevant data might be acquired in experimental and tactical scenarios, and pose open issues in multi-user collaboration.

This paper is one of the first to survey metrics and methods relevant to the unique problems warfighters may face when making decisions in command and control or intelligence analysis scenarios.

In addition, the researchers discuss the ARL-developed Mixed Reality Tactical Analysis Kit, or MRTAK, which functions as an experimental platform to perform these assessments during collaborative mission planning and execution. MRTAK is now being developed as the mixed reality module of project AURORA (Accelerated User Reasoning for Operations, Research, and Analysis), as AURORA-MR.

This research was recently presented at the 23rd International Command and Control Research and Technology Symposium held in Pensacola, Florida.

"Our survey of the existing literature determined that new methods and metrics are essential to ensure that future basic and applied research can efficiently and accurately assess performance differences between immersive technologies and traditional 2-D systems," said Dr. Mark Dennison, research psychologist in ARL's Battlefield Information Processing Branch stationed at ARL West in Playa Vista, California.

According to Dennison, their review showed that researchers in this field have often performed studies in which the collected data do not support useful metrics, making it difficult or impossible for key decision-makers to determine how, when and where immersive technology provides any benefit or deficit for specific mission or task needs.

"In this paper, we suggest a paradigm shift away from simply comparing non-immersive and immersive systems on similar tasks, and instead meticulously breaking down complex decision-making into component processes that can be more accurately modeled and compared across disparate display types," Dennison said. "For example, when studying the planning of a tactical operation, such as the breach and clear of a hostile building, the same spatial information must be present in the 2-D and VR experimental conditions to allow for precise quantitative comparisons.

As part of this research into collaborative immersive analytics, the researchers developed and deployed AURORA-MR, which serves as a testbed for tightly controlled basic and applied research on multi-user decision-making with distributed immersive systems.

Currently, AURORA-MR is being used for collaborative immersive analytics research in Maryland at ARL headquarters at the Adelphi Laboratory Center and Aberdeen Proving Ground, in California at ARL West and the ICT's Mixed Reality Lab, and at the University of Minnesota.

The system has also been demonstrated to NATO SET-256 and the Air Force's TAP Lab, and it was featured at the AUSA 2018 Global Force Innovator's Corner.

According to the researchers, studies conducted with AURORA-MR will enable Soldiers to understand when visualizing and interacting with critical battlefield information is best done in an immersive system, or in collaboration with others using traditional systems.

"Through virtualization of some or all elements of the Tactical Operations Center, commanders and intelligence analysts can communicate and collaborate without the constraints of a physical building and with a reduced footprint to enemy intelligence, surveillance and reconnaissance, or ISR," Dennison said.

The design of AURORA-MR seeks to enable easy integration with other databases, sensors and machine learning tools, so that joint research can occur more fluidly within ARL and externally with its academic and industry partners.

"Currently, we are evolving the network powering AURORA-MR, called AURORA-NET, to allow for greater control over the information that is sent and received by clients, while ensuring that the virtual environment is rendered at a comfortable frame rate to minimize the crippling effects of motion sickness on immersed users," Dennison said. "This will enable us to conduct research on how ingestion and analysis of data from noisy systems, such as the Internet of Battlefield Things, can be augmented through distributed collaboration in mixed reality."

Development of AURORA-MR began in December 2017 as a single-user VR environment. It is under active development, and a connection established in November 2018 now allows remote collaboration with researchers at APG.
