 

Multimodal Approach to Enhance the Navigation System of the Shared E-scooter

dc.contributor.author: Wei, Peiran
dc.contributor.department: Chalmers tekniska högskola / Institutionen för data och informationsteknik (sv)
dc.contributor.department: Chalmers University of Technology / Department of Computer Science and Engineering (en)
dc.contributor.examiner: Fjeld, Morten
dc.contributor.supervisor: Diapoulis, Georgios
dc.date.accessioned: 2026-01-19T08:02:16Z
dc.date.issued: 2025
dc.date.submitted:
dc.description.abstract: Shared electric scooters are an important component of sustainable urban transportation, yet current navigation systems rely heavily on smartphone screens, introducing safety risks and usability limitations. This thesis addresses these issues by designing and evaluating a multimodal navigation interface that integrates ground-projected augmented reality (AR) with auditory instructions and a visual user interface (UI) display. A user-centered design process guided the development of the prototype. Initial qualitative research, consisting of interviews and on-site observations, identified user pain points and contextual needs; a subsequent quantitative analysis examined users' common issues and riding habits in depth. Iterative design phases produced concept sketches and wireframes, leading to a final system featuring AR turn arrows projected onto the ground, synchronized audio prompts, and a mobile screen interface for route overview. The prototype was evaluated in a controlled study involving 30 participants, comparing six configurations ranging from single-modality (e.g., UI display only) to multimodal combinations. The results indicated that the configuration integrating AR, audio interaction, and the UI display achieved the highest cognitive efficiency (E = 0.84 ± 1.20) and the lowest mental workload (NASA-TLX = 33.35 ± 16.81), a statistically significant improvement over all other tested setups. The UI-only system produced the lowest efficiency (E = 1.46 ± 1.08) and the highest cognitive load. Statistical tests confirmed these differences across performance, effort, and perceived frustration. Participant feedback further validated the design: 18 out of 30 participants preferred the AR and audio combination, describing it as intuitive, fast, and less distracting. The visual UI was primarily used for route previews at rest points, not during active riding.
These findings highlight the potential of multimodal navigation systems, especially those that integrate projected AR and audio, for improving safety, usability, and rider satisfaction in shared micromobility systems. This work contributes a validated interaction model and design framework for future urban mobility applications.
dc.identifier.coursecode: DATX05
dc.identifier.uri: http://hdl.handle.net/20.500.12380/310915
dc.language.iso: eng
dc.setspec.uppsok: Technology
dc.subject: Multimodality in Interaction Design
dc.subject: User Experience
dc.subject: Shared E-scooter
dc.subject: Navigation Systems
dc.subject: Cognitive Efficiency
dc.title: Multimodal Approach to Enhance the Navigation System of the Shared E-scooter
dc.type.degree: Examensarbete för masterexamen (sv)
dc.type.degree: Master's Thesis (en)
dc.type.uppsok: H
local.programme: Interaction design and technologies (MPIDE), MSc

Download

Original bundle
Name: CSE 25-191 PW.pdf
Size: 2.15 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2.35 KB
Format: Item-specific license agreed upon to submission