Decentralized Deep Learning under Distributed Concept Drift: A Novel Approach to Dealing with Changes in Data Distributions Over Clients and Over Time

Type
Master's thesis
Program
Complex adaptive systems (MPCAS), MSc
Data science and AI (MPDSC), MSc
Published
2023
Authors
Klefbom, Emilie
Örtenberg Toftås, Marcus
Abstract
In decentralized deep learning, clients train local models in a peer-to-peer fashion by sharing model parameters rather than data. This enables collective model training in cases where data is sensitive or otherwise cannot be transferred. In this setting, variations in data distributions across clients have been extensively studied; variations over time, however, have received no attention. This project addresses decentralized learning where the data distributions vary both across clients and over time. We propose a novel algorithm that adapts to the evolving concepts in the network without any prior knowledge or estimation of the number of concepts. The algorithm is evaluated on standard benchmarks adapted to the temporal setting, where it outperforms previous methods for decentralized learning.
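The peer-to-peer setting the abstract describes can be sketched in miniature: each client holds its own model and, per round, averages parameters only with peers whose models fit its local data well, so clients drifting toward different concepts naturally stop averaging with each other. This is an illustrative stand-in for drift-aware neighbor selection, not the thesis's actual algorithm; `local_loss`, `gossip_round`, and the one-parameter linear model are hypothetical simplifications.

```python
def local_loss(weights, data):
    # Mean squared error of a 1-parameter model y = w * x on (x, y) pairs.
    w = weights[0]
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def gossip_round(clients, data, k=1):
    """One round of similarity-gated decentralized averaging.

    clients: list of parameter vectors (one per client).
    data:    list of local datasets, data[i] belonging to client i.
    k:       number of best-fitting peers each client averages with.
    """
    new = []
    for i, w_i in enumerate(clients):
        # Rank peers by how well their model fits client i's own data;
        # peers trained on a different concept incur a high loss and
        # are excluded, so averaging stays within a concept cluster.
        peers = sorted(
            (j for j in range(len(clients)) if j != i),
            key=lambda j: local_loss(clients[j], data[i]),
        )[:k]
        group = [w_i] + [clients[j] for j in peers]
        # Coordinate-wise average over the selected group.
        new.append([sum(ws) / len(group) for ws in zip(*group)])
    return new
```

With two concepts in the network (say clients 0 and 1 see data from y = 2x while clients 2 and 3 see y = -2x), each client ends up averaging only within its own concept group, illustrating adaptation without knowing the number of concepts in advance.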
Subject / keywords
Federated Learning, Decentralized Learning, Machine Learning, Data Heterogeneity, Non-IID, Personalization, Concept Drift