Decentralized Deep Learning under Distributed Concept Drift: A Novel Approach to Dealing with Changes in Data Distributions Over Clients and Over Time

Type

Master's Thesis (Examensarbete för masterexamen)

Abstract

In decentralized deep learning, clients train local models in a peer-to-peer fashion by sharing model parameters rather than data. This allows collective model training in cases where data is sensitive or otherwise cannot be transferred. In this setting, variations in data distributions across clients have been extensively studied; variations over time, however, have received no attention. This project proposes a solution for decentralized learning where the data distributions vary both across clients and over time. We propose a novel algorithm that can adapt to the evolving concepts in the network without any prior knowledge or estimation of the number of concepts. The algorithm is evaluated on standard benchmarks adapted to the temporal setting, where it outperforms previous methods for decentralized learning.
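The peer-to-peer exchange described above can be illustrated with a minimal gossip-averaging sketch: each client mixes its own parameter vector with those of its neighbors, so only parameters (never raw data) cross the network. The function and topology below are illustrative assumptions, not the thesis's algorithm.

```python
def gossip_average(params, neighbors):
    """One round of decentralized (gossip) averaging.

    params:    dict client_id -> list of floats (local model parameters)
    neighbors: dict client_id -> list of peer client_ids

    Each client replaces its parameters with the mean of its own
    and its neighbors' parameters, element-wise.
    """
    updated = {}
    for cid, theta in params.items():
        group = [theta] + [params[p] for p in neighbors[cid]]
        updated[cid] = [sum(vals) / len(group) for vals in zip(*group)]
    return updated


# Three clients on a line topology: 0 -- 1 -- 2
params = {0: [0.0, 0.0], 1: [3.0, 3.0], 2: [6.0, 6.0]}
neighbors = {0: [1], 1: [0, 2], 2: [1]}
print(gossip_average(params, neighbors))
```

Under client-wise or temporal concept drift, uniform averaging like this can hurt: the thesis's point is that clients must instead decide, without knowing the number of concepts in advance, which peers currently share their data distribution before mixing parameters.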

Subject / keywords

Federated Learning, Decentralized Learning, Machine Learning, Data Heterogeneity, Non-IID, Personalization, Concept Drift
