Multimodal Approach to Enhance the Navigation System of the Shared E-scooter
| dc.contributor.author | Wei, Peiran | |
| dc.contributor.department | Chalmers tekniska högskola / Institutionen för data och informationsteknik | sv |
| dc.contributor.department | Chalmers University of Technology / Department of Computer Science and Engineering | en |
| dc.contributor.examiner | Fjeld, Morten | |
| dc.contributor.supervisor | Diapoulis, Georgios | |
| dc.date.accessioned | 2026-01-19T08:02:16Z | |
| dc.date.issued | 2025 | |
| dc.date.submitted | ||
| dc.description.abstract | Shared electric scooters are an important component of sustainable urban transportation, yet current navigation systems rely heavily on smartphone screens, introducing safety risks and usability limitations. This thesis addresses these issues by designing and evaluating a multimodal navigation interface that integrates ground-projected augmented reality (AR) with auditory instructions and a visual user interface (UI) display. A user-centered design process guided the development of the prototype. Initial qualitative research, consisting of interviews and on-site observations, identified user pain points and contextual needs. A subsequent quantitative analysis then examined users' common issues and riding habits in depth. Iterative design phases produced concept sketches and wireframes, leading to a final system featuring AR turn arrows projected onto the ground, synchronized audio prompts, and a mobile screen interface for route overview. The design prototype was evaluated in a controlled study involving 30 participants, comparing six configurations ranging from single-modality (e.g., UI display only) to multimodal combinations. The results indicated that the configuration integrating AR, audio interaction, and the UI display achieved the highest cognitive efficiency (E = 0.84 ± 1.20) and the lowest mental workload (NASA-TLX = 33.35 ± 16.81), demonstrating statistically significant improvement over all other tested setups. The UI-only configuration produced the lowest efficiency (E = 1.46 ± 1.08) and the highest cognitive load. Statistical tests confirmed these differences across performance, effort, and perceived frustration. Participant feedback further validated the design: 18 out of 30 participants preferred the AR and audio combination, describing it as intuitive, fast, and less distracting. The visual UI was primarily used for route previews at rest points, not during active riding.
These findings highlight the potential of multimodal navigation systems, especially those that integrate projected AR and audio, for improving safety, usability, and rider satisfaction in shared micromobility systems. This work contributes a validated interaction model and design framework for future urban mobility applications. | |
| dc.identifier.coursecode | DATX05 | |
| dc.identifier.uri | http://hdl.handle.net/20.500.12380/310915 | |
| dc.language.iso | eng | |
| dc.setspec.uppsok | Technology | |
| dc.subject | Multimodality in Interaction Design | |
| dc.subject | User Experience | |
| dc.subject | Shared E-scooter | |
| dc.subject | Navigation Systems | |
| dc.subject | Cognitive Efficiency | |
| dc.title | Multimodal Approach to Enhance the Navigation System of the Shared E-scooter | |
| dc.type.degree | Examensarbete för masterexamen | sv |
| dc.type.degree | Master's Thesis | en |
| dc.type.uppsok | H | |
| local.programme | Interaction design and technologies (MPIDE), MSc |
