Decentralized Deep Learning under Distributed Concept Drift: A Novel Approach to Dealing with Changes in Data Distributions Over Clients and Over Time

Type

Master's Thesis

Abstract

In decentralized deep learning, clients train local models in a peer-to-peer fashion by sharing model parameters rather than data. This allows collective model training in cases where data are sensitive or otherwise cannot be transferred. In this setting, variations in data distributions across clients have been extensively studied; variations over time, however, have received no attention. This project addresses decentralized learning where the data distributions vary both across clients and over time. We propose a novel algorithm that adapts to the evolving concepts in the network without any prior knowledge or estimation of the number of concepts. The algorithm is evaluated on standard benchmarks adapted to the temporal setting, where it outperforms previous methods for decentralized learning.
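To illustrate the peer-to-peer parameter sharing described in the abstract (not the thesis's proposed algorithm), the following minimal sketch shows one gossip-style aggregation step in which a client averages its model parameters with those received from neighboring clients. The function name gossip_average and the dict-of-arrays model representation are assumptions made for illustration only.

```python
import numpy as np

def gossip_average(local_params, peer_params_list):
    """Average a client's model parameters with those received from peers.

    local_params: dict mapping layer names to numpy arrays (this client's model).
    peer_params_list: list of such dicts received from neighboring clients.
    Returns the element-wise mean over the local model and all peer models.
    """
    all_models = [local_params] + peer_params_list
    return {
        name: np.mean([m[name] for m in all_models], axis=0)
        for name in local_params
    }

# Toy usage: one client averages with two peers, each holding a 2x2 weight matrix.
rng = np.random.default_rng(0)
local = {"w": rng.normal(size=(2, 2))}
peers = [{"w": rng.normal(size=(2, 2))} for _ in range(2)]
merged = gossip_average(local, peers)
print(merged["w"].shape)  # (2, 2)
```

In this simple sketch every neighbor is weighted equally; the thesis concerns the harder setting where clients must decide whom to aggregate with as the underlying concepts drift across clients and over time.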

Keywords

Federated Learning, Decentralized Learning, Machine Learning, Data Heterogeneity, Non-IID, Personalization, Concept Drift
