Learning Neural SDEs for Bayesian Filtering and Smoothing
Published
Author
Type
Master's Thesis
Abstract
This thesis investigates neural stochastic differential equations (neural SDEs) trained
within a Wasserstein Generative Adversarial Network (WGAN) framework to approximate
conditional probability distributions over trajectories, with applications to
radar-based tracking. The study focuses on how different loss functions affect model
performance for smoothing (estimating past states) and filtering (estimating the current
state) from noisy observations. Experiments in one and two dimensions show that
neural SDEs effectively capture the complex nonlinear dynamics and uncertainty relevant
to radar tracking. Future research directions include extending the state space with
additional physical quantities, incorporating Lévy jump processes, and refining loss
functions for better accuracy near observations. Overall, the study demonstrates
the feasibility and versatility of WGAN-trained neural SDEs for Bayesian filtering
and smoothing.
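
For readers unfamiliar with the setup, the following is a minimal, hypothetical PyTorch sketch of the kind of model the abstract describes: a neural SDE path generator trained against a Wasserstein critic. It is not the thesis implementation; the names (NeuralSDE, Critic, drift_net, diff_net), the toy random-walk data, and the use of weight clipping as the Lipschitz constraint are assumptions made purely for illustration.

# Hypothetical sketch of a WGAN-trained neural SDE (not the thesis code).
import torch
import torch.nn as nn

class NeuralSDE(nn.Module):
    """Euler–Maruyama sampler with learned drift and (diagonal) diffusion."""
    def __init__(self, state_dim=2, hidden=64):
        super().__init__()
        self.drift_net = nn.Sequential(
            nn.Linear(state_dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, state_dim))
        self.diff_net = nn.Sequential(
            nn.Linear(state_dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim), nn.Softplus())  # keep diffusion positive

    def forward(self, x0, n_steps=50, dt=0.02):
        xs, x = [x0], x0
        for k in range(n_steps):
            t = torch.full((x.shape[0], 1), k * dt)
            inp = torch.cat([x, t], dim=-1)
            dw = torch.randn_like(x) * dt ** 0.5          # Brownian increment
            x = x + self.drift_net(inp) * dt + self.diff_net(inp) * dw
            xs.append(x)
        return torch.stack(xs, dim=1)                     # (batch, n_steps+1, state_dim)

class Critic(nn.Module):
    """Scores whole paths; flattening stands in for richer path features (e.g. signatures)."""
    def __init__(self, state_dim=2, n_steps=50, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear((n_steps + 1) * state_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, paths):
        return self.net(paths.flatten(start_dim=1))

# One illustrative adversarial update (weight clipping as the Lipschitz constraint).
gen, critic = NeuralSDE(), Critic()
g_opt = torch.optim.RMSprop(gen.parameters(), lr=5e-5)
c_opt = torch.optim.RMSprop(critic.parameters(), lr=5e-5)

real_paths = torch.randn(32, 51, 2).cumsum(dim=1)  # toy stand-in for observed trajectories
x0 = torch.zeros(32, 2)

# Critic step: maximise the score gap between real and generated paths.
c_loss = critic(gen(x0).detach()).mean() - critic(real_paths).mean()
c_opt.zero_grad(); c_loss.backward(); c_opt.step()
for p in critic.parameters():
    p.data.clamp_(-0.01, 0.01)

# Generator step: push generated paths toward higher critic scores.
g_loss = -critic(gen(x0)).mean()
g_opt.zero_grad(); g_loss.backward(); g_opt.step()

In the conditional setting used for filtering and smoothing, the generator would additionally be conditioned on the noisy radar observations, so that the sampled paths approximate the posterior distribution over trajectories rather than the prior.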
Subject/keywords
Neural SDEs, Wasserstein GAN, Adversarial Training, Bayesian Inference, Path Signatures, Particle Filtering, Doob’s h-Transform, Girsanov’s Theorem, Filtering, Smoothing, Generative Models, Stochastic Processes.