Human intent-recognition system for safety-critical human-machine interactions

dc.contributor.author: Künzler, Simon
dc.contributor.department: Chalmers tekniska högskola / Institutionen för mekanik och maritima vetenskaper
dc.contributor.examiner: Boyraz Baykas, Pinar
dc.contributor.supervisor: Boyraz Baykas, Pinar
dc.date.accessioned: 2020-07-02T08:54:48Z
dc.date.available: 2020-07-02T08:54:48Z
dc.date.issued: 2020
dc.date.submitted: 2020
dc.description.abstract: The aim of this thesis was to investigate the potential of eye-tracking technology to help recognize human intent when working with a machine under shared control. An experiment was designed to study the eye-gaze behaviour of test subjects while they manipulated a two-degrees-of-freedom (DOF) SCARA robot. The subjects were tasked with manoeuvring the robot's end-effector through a sequence of LEDs located on the robot's action plane. The LED sequence differed for each experiment run and was not known to the subjects before the start of each run. In the first step, eye-gaze data was collected while the robot was unactuated. The fixation point of a subject's gaze was 4.5 times more likely to lie in the proximity of the goal LEDs they intended to connect than outside the intended area. In addition, when the subjects planned to move from one LED to the next, their gaze tended to fixate on the next LED between one and two seconds before the end-effector reached it, depending on the distance to be covered between the current LED and the next. After reaching the fixated position, the gaze shifted almost immediately (with a 0.1-0.2 s delay) onto the next LED, while movement onset was delayed by about 0.5 s. This information was then used to develop an algorithm to predict which LED a subject intends to reach. During a second set of tests, more data was collected, this time under shared control with the robot. The implemented algorithm was able to identify the next goal LED in the subject's planned path and to provide assistance in the movement of the robot arm. How far ahead of time the goals were recognized depended on how soon the subject's gaze shifted from a reached LED to the next planned goal LED. If a subject fixated on a goal LED 0.3 s before initiating the movement towards it, the robot was able to perform the whole movement between the LEDs; in most cases the algorithm initiated the support halfway through the subject's planned motion. No significant differences in the subjects' gaze data between passive robot manipulation and shared control could be identified.
dc.identifier.coursecode: MMSX30
dc.identifier.uri: https://hdl.handle.net/20.500.12380/301180
dc.language.iso: eng
dc.relation.ispartofseries: 2020:52
dc.setspec.uppsok: Technology
dc.subject: Human intent-recognition
dc.subject: Eye tracking
dc.subject: Shared-control
dc.subject: Human-robot shared manipulation
dc.title: Human intent-recognition system for safety-critical human-machine interactions
dc.type.degree: Examensarbete för masterexamen (Master's thesis)
dc.type.uppsok: H
local.programme: Complex adaptive systems (MPCAS), MSc
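
As an illustration of the timing-based intent prediction described in the abstract, the sketch below shows how a simple gaze-dwell rule could flag the next goal LED before movement onset, exploiting the reported 0.3 s fixation lead. This is a minimal Python sketch written for this record, not code from the thesis: the function predict_goal, the 0.03 m fixation radius, and the 0.3 s dwell threshold are illustrative assumptions chosen to mirror the reported timings, not the thesis's actual parameters or implementation.

    import numpy as np

    # Hypothetical parameters, chosen to mirror the timings reported in the
    # abstract; the thesis's actual thresholds and filtering are not given here.
    FIXATION_RADIUS = 0.03   # metres: gaze counts as "on" an LED within this radius
    DWELL_THRESHOLD = 0.3    # seconds: sustained fixation before a goal is declared

    def predict_goal(gaze_samples, led_positions, sample_dt):
        """Return the index of the LED the user is presumed to be targeting,
        or None if no LED has been fixated long enough.

        gaze_samples  : sequence of (x, y) gaze points on the action plane
        led_positions : sequence of (x, y) LED coordinates
        sample_dt     : time between gaze samples in seconds
        """
        needed = int(DWELL_THRESHOLD / sample_dt)  # samples required for a dwell
        if len(gaze_samples) < needed:
            return None
        recent = np.asarray(gaze_samples[-needed:])          # (needed, 2)
        leds = np.asarray(led_positions)                     # (M, 2)
        # Distance of every recent gaze sample to every LED: (needed, M)
        dists = np.linalg.norm(recent[:, None, :] - leds[None, :, :], axis=2)
        on_led = dists < FIXATION_RADIUS
        # An LED is the predicted goal only if all recent samples fixated it
        sustained = on_led.all(axis=0)
        hits = np.flatnonzero(sustained)
        return int(hits[0]) if hits.size else None

Usage under the same assumptions: with gaze sampled at 100 Hz, 30 consecutive samples near an LED correspond to the 0.3 s dwell the abstract associates with a recognizable goal.

    # Example: gaze hovering on LED 2 for 0.3 s at 100 Hz sampling
    leds = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.1)]
    gaze = [(0.2, 0.1)] * 30          # 30 samples * 0.01 s = 0.3 s dwell
    print(predict_goal(gaze, leds, sample_dt=0.01))  # -> 2
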
Download
Original bundle (showing 1 - 1 of 1)
Name: 2020-52 Simon Künzler.pdf
Size: 1.68 MB
Format: Adobe Portable Document Format
Description: Master thesis