Machine Learning Meets Localization

Master's thesis (Examensarbete för masterexamen)

Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.12380/304888
File: CSE 22-48 Stenhammar Bejmer.pdf (3.27 MB, Adobe PDF)
Bibliographical item details
Type: Master's thesis (Examensarbete för masterexamen)
Title: Machine Learning Meets Localization
Authors: STENHAMMAR, THEODOR
BEJMER, DAVID
Abstract: This thesis project was conducted in cooperation with Zenseact with the purpose of creating a solution for determining the lane in which an autonomous vehicle is driving. This task is part of the larger problem of localization and state estimation for autonomous vehicles and is referred to as Lane-Level Localization (LLL). The problem is connected to Early Time Series Classification, the field of applying supervised learning and time series classification techniques to classify time series accurately with as few observations as possible. LLL may be solved with a multi-hypothesis technique, which estimates a state by tracking several possible values (hypotheses) for that state and using a model to infer the most likely one. It is found that an architecture that allows output to be rejected, depending on the confidence with which a classification can be made, can be adapted to solve LLL for autonomous vehicles. In the evaluated scenario, the model achieved an accuracy of 99.5% while declining to classify in only 1% of the sequences.
Keywords: Engineering; thesis; Zenseact; machine learning; localization; autonomous driving; data; time series; early; early classification; lane-level localization
Issue Date: 2022
Publisher: Chalmers tekniska högskola / Institutionen för data och informationsteknik
URI: https://hdl.handle.net/20.500.12380/304888
Collection: Examensarbeten för masterexamen // Master Theses
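
The abstract above describes a multi-hypothesis technique with a reject option: several lane hypotheses are tracked in parallel, and a classification is emitted only once one hypothesis is sufficiently more likely than the rest. The Python sketch below is a minimal illustration of that idea under stated assumptions, not code from the thesis; the class name, the log-likelihood observation model, and the decision margin are all illustrative.

# Minimal sketch (assumptions, not the thesis implementation): a
# multi-hypothesis estimator with a reject option. Each lane hypothesis
# accumulates a log-likelihood as observations arrive, and the classifier
# only commits once the leading hypothesis is far enough ahead of the
# runner-up; otherwise it rejects and waits for more data.
import math
from typing import List, Optional


class LaneHypothesisTracker:
    def __init__(self, num_lanes: int, margin: float = 3.0):
        self.log_likelihoods = [0.0] * num_lanes  # one hypothesis per lane
        self.margin = margin  # required log-likelihood lead before committing

    def update(self, per_lane_likelihoods: List[float]) -> None:
        """Fold one observation into every lane hypothesis."""
        for i, p in enumerate(per_lane_likelihoods):
            self.log_likelihoods[i] += math.log(max(p, 1e-12))

    def classify(self) -> Optional[int]:
        """Return the winning lane index, or None to reject (stay undecided)."""
        ranked = sorted(range(len(self.log_likelihoods)),
                        key=lambda i: self.log_likelihoods[i], reverse=True)
        best, runner_up = ranked[0], ranked[1]
        if self.log_likelihoods[best] - self.log_likelihoods[runner_up] >= self.margin:
            return best
        return None  # not confident enough yet


# Toy usage: three lanes, observations gradually favouring lane 1.
tracker = LaneHypothesisTracker(num_lanes=3)
for obs in ([0.3, 0.5, 0.2], [0.2, 0.6, 0.2], [0.25, 0.55, 0.2], [0.1, 0.8, 0.1]):
    tracker.update(obs)
    print(tracker.classify())  # None, None, None, then 1

The margin parameter plays the same role as the rejection threshold discussed in the abstract: raising it trades a lower classification rate for higher accuracy on the sequences that are classified.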


