Multimodal Approach to Enhance the Navigation System of the Shared E-scooter
Type
Master's Thesis
Abstract
Shared electric scooters are an important component of sustainable urban transportation, yet current navigation systems rely heavily on smartphone screens, introducing safety risks and usability limitations. This thesis addresses these issues by designing and evaluating a multimodal navigation interface that integrates ground-projected augmented reality (AR) with auditory instructions and a visual user interface (UI) display.

A user-centered design process guided the research. It began with qualitative methods (interviews, on-site observations) to identify user pain points and contextual needs, followed by quantitative analysis to deepen understanding of common issues and riding behaviors. Iterative design yielded concept sketches and wireframes, and then a VR prototype built in Unity. Final usability testing evaluated task performance and cognitive load across interaction modalities, using cognitive efficiency as the key metric. A controlled evaluation of the prototype with 30 participants compared six configurations, ranging from single-modality setups (e.g., UI only) to multimodal combinations.

Results showed that the AR + audio + UI configuration achieved the highest cognitive efficiency (E = 0.84 ± 1.20) and the lowest mental workload (NASA-TLX = 33.35 ± 16.81), with statistically significant improvements over all other configurations, while the UI-only system had the lowest efficiency (E = 1.46 ± 1.08) and the highest cognitive load (statistical tests confirmed differences in performance, effort, and perceived frustration). Key findings from user testing included the following: (1) channel redundancy may reduce cognitive efficiency, as users tend to rely primarily on one modality (e.g., AR alone significantly outperformed UI + AR; efficiency: p = .001); (2) complementary coordination of modalities yields significant improvements (e.g., UI + AR + audio significantly outperformed UI + AR [efficiency: p = .001] and UI + audio [efficiency: p = .002]). These findings highlight the potential of multimodal navigation systems, especially those that integrate projected AR, for improving safety, usability, and rider satisfaction in shared micromobility systems.
However, this study has limitations: user testing was conducted primarily in a static, stable VR setup, whereas real-world road environments are far more complex and would impose greater cognitive load on riders.
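The abstract does not state how cognitive efficiency (E) was computed. As an illustration only, the sketch below uses one common operationalization from the cognitive-load literature, the relative condition efficiency of Paas and van Merriënboer (1993), E = (z_performance − z_effort) / √2 per participant; the thesis's actual formula and sign convention may differ (here, higher E means better efficiency), and all data values are hypothetical.

```python
from statistics import mean, stdev
from math import sqrt

def z_scores(values):
    """Standardize raw scores to mean 0, sample SD 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def relative_efficiency(performance, effort):
    """Paas & van Merrienboer (1993) relative condition efficiency:
    E = (z_performance - z_effort) / sqrt(2), computed per participant.
    Higher E = better performance achieved with lower reported effort."""
    return [(zp - ze) / sqrt(2)
            for zp, ze in zip(z_scores(performance), z_scores(effort))]

# Hypothetical data for five riders in one configuration:
# navigation task scores and NASA-TLX workload ratings.
performance = [80, 92, 75, 88, 95]
effort = [40, 30, 55, 35, 25]
E = relative_efficiency(performance, effort)
```

Per-condition means of E can then be compared across the six interface configurations with paired tests, as the abstract describes.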
Keywords
Multimodality in Interaction Design, User Experience, Shared E-scooter, Navigation Systems, Augmented Reality, Cognitive Efficiency
