Annotation-free Learning for Sensor Fusion in ADAS


Type

Master's Thesis (Examensarbete för masterexamen)

Abstract

Vehicle automation has the potential to significantly improve road safety. Achieving comprehensive vehicle perception requires systems that optimally combine information from multiple sensor modalities, leveraging the strengths of each modality while compensating for its weaknesses. By continuously encoding and fusing information from cameras, LiDARs, RADARs and the motion of the ego-vehicle, a dynamic representation of the surrounding environment can be created and maintained. A major challenge for these systems is the large amount of annotated data required for training, as manual labelling creates a significant bottleneck for scalability. In this study, a pre-training task for a multi-modal machine learning model was implemented and evaluated. To circumvent labour-intensive labelling, self-supervision was employed, with both the model input and the supervision signal consisting of annotation-free data. The pre-training aimed to learn general features related to sensor pose changes by predicting ego-vehicle pose changes using odometry data. To assess pre-training performance, the learned features were then used as initial weights when fine-tuning a perception model. The performance of the perception model initialised with baseline weights trained on annotated data was similar to that obtained with weights trained on annotation-free data, indicating that the proposed method is viable. However, further testing is required to establish statistical significance. Future work could explore attention-based methods for feature matching between scene representations to improve model performance.
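The pre-training idea in the abstract can be sketched in highly simplified form: fit a model to predict odometry-derived pose changes (an annotation-free signal), then reuse the learned weights to initialise a downstream perception model. The toy linear model, the least-squares "training", and all variable names below are illustrative assumptions, not the thesis architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for fused multi-modal sensor features per frame pair;
# in the thesis these would come from camera/LiDAR/RADAR encoders.
n, d_feat = 500, 8
X = rng.normal(size=(n, d_feat))

# Annotation-free supervision: ego-vehicle pose change (dx, dy, dyaw)
# read from odometry, here simulated as a noisy linear function.
W_true = rng.normal(size=(d_feat, 3))
pose_delta = X @ W_true + 0.01 * rng.normal(size=(n, 3))

# "Pre-training": fit the pose-change predictor. Least squares stands in
# for gradient-based training of encoder + pose head.
W_pre, *_ = np.linalg.lstsq(X, pose_delta, rcond=None)

# "Transfer": copy the pre-trained weights as the initialisation of a
# downstream perception model, instead of random initialisation.
W_finetune_init = W_pre.copy()

pred = X @ W_pre
rmse = float(np.sqrt(np.mean((pred - pose_delta) ** 2)))
print(f"pose-change RMSE after pre-training: {rmse:.4f}")
```

The point of the sketch is the data flow, not the model: no labels are ever used, since the supervision signal (pose change) is produced by the vehicle's own odometry.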

Keywords

ADAS, Annotation-free, Ego-vehicle, Multi-modal, Perception, Pretraining, Sensor Fusion, Transformer
