Using Unity 3D, the experiment recreated the interior of a space station. Participants navigated from the core node module to laboratory module I, where they operated the Space Raman Spectrometer (SRS). Tasks included retrieval, adjustment, installation, assembly, calibration, and testing. The setup used an HTC Vive Pro Eye headset with integrated headphones, together with Tobii Pro VR Analytics software for data collection. Visual instructions were displayed prominently in bold white type on a black background, while auditory instructions were delivered at 60 dB in a conversational tone.
The experimental design followed a 2 × 2 Latin square approach: participants were split into AB and BA sequence groups and received VR device training before the experiment. Researchers recorded completion time, error rates, and eye-movement data, and mental workload was assessed afterward with the NASA-TLX scale. The Shapiro-Wilk test checked data for normality; normally distributed data were compared with paired t-tests and non-normal data with Wilcoxon signed-rank tests. Results were expressed as mean ± standard deviation, with significance set at P < 0.05 and high significance at P < 0.01.
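The analysis pipeline described above (Shapiro-Wilk normality check, then a paired t-test for normal data or a Wilcoxon signed-rank test otherwise) can be sketched in Python with SciPy. The function and variable names below are illustrative, not taken from the study:

```python
import numpy as np
from scipy import stats

def compare_conditions(a, b, alpha=0.05):
    """Paired comparison of two conditions (e.g. AR visual vs. auditory).

    Runs Shapiro-Wilk on the paired differences; uses a paired
    t-test if they look normal, otherwise a Wilcoxon signed-rank test.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    diff = a - b

    # Normality of the paired differences decides which test to use
    if stats.shapiro(diff).pvalue > alpha:
        name, result = "paired t-test", stats.ttest_rel(a, b)
    else:
        name, result = "Wilcoxon signed-rank", stats.wilcoxon(a, b)

    p = float(result.pvalue)
    if p < 0.01:
        label = "P < 0.01 (highly significant)"
    elif p < 0.05:
        label = "P < 0.05 (significant)"
    else:
        label = "n.s."
    return {"test": name, "p": p, "significance": label}
```

A call such as `compare_conditions(visual_times, auditory_times)` then reports which test was applied and the resulting significance level.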
Eye movement data focused on defined Areas of Interest (AOIs) within the space station model, analyzed using Tobii Pro VR Analytics software. Interviews were conducted post-experiment to gather participant feedback on AR visual and auditory guidance.
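The AOI aggregation step can be sketched as follows, assuming a simplified list of fixation events (AOI name, duration in milliseconds) rather than the actual Tobii Pro VR Analytics export format; the AOI names are hypothetical:

```python
from collections import defaultdict

def summarize_aoi_fixations(fixations):
    """Aggregate fixation count and total dwell time per Area of Interest.

    `fixations` is an iterable of (aoi_name, duration_ms) events,
    a simplified stand-in for an eye-tracker fixation export.
    """
    summary = defaultdict(lambda: {"count": 0, "total_ms": 0})
    for aoi, duration_ms in fixations:
        summary[aoi]["count"] += 1
        summary[aoi]["total_ms"] += duration_ms
    return dict(summary)
```

Per-AOI fixation counts and total fixation durations of this kind are the raw material for the group comparisons reported below.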
After outlier filtering, valid data from 13 men and 12 women were retained. On the NASA-TLX scale, AR visual guidance produced lower mental demand, temporal demand, frustration, and total scores than auditory guidance. Task completion times and error rates also favored AR visual guidance, and eye-movement metrics (fixation count, scan-path distance, total fixation duration, and average pupil diameter) likewise favored the AR visual group.
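For reference, the raw (unweighted) NASA-TLX total score is simply the mean of the six subscale ratings, each on a 0-100 scale. This summary does not state whether the study used the raw or the weighted variant, so the sketch below assumes the raw version:

```python
def raw_tlx(ratings):
    """Raw (unweighted) NASA-TLX: mean of the six subscale ratings.

    `ratings` maps each standard subscale name to a 0-100 rating.
    """
    scales = ("mental", "physical", "temporal",
              "performance", "effort", "frustration")
    missing = [s for s in scales if s not in ratings]
    if missing:
        raise ValueError(f"missing subscales: {missing}")
    return sum(ratings[s] for s in scales) / len(scales)
```

Comparing these per-participant totals between the two guidance conditions yields the workload differences reported above.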
The study concluded that AR visual instructions outperformed auditory instructions in task completion time, error rates, and eye movement metrics, highlighting the advantages of AR guidance in reducing cognitive load and enhancing task performance. No significant differences were noted in physical demand, self-performance, or regression time between the two modes, likely due to the nature of the tasks.
This research provides valuable insights into human-computer interaction design, offering evidence to support the use of AR visual guidance in space missions to reduce cognitive load and improve efficiency.
Research Report: Effects of Visual and Auditory Instructions on Space Station Procedural Tasks
Related Links
Beijing Institute of Technology
Space Medicine Technology and Systems