Investigating a Byzantine Resilient Framework for the Adam Optimizer

Type
Master's Thesis
Program
Data science and AI (MPDSC), MSc
Published
2023
Author
Fabris, Basil
Abstract
Over the past few years, the use of Machine Learning has grown tremendously across domains ranging from engineering to marketing. This widespread adoption has been made possible by advances in hardware, which have enabled the training of increasingly large models. These models, however, rely on ever-larger datasets, which raises concerns about data safety and privacy. To address these challenges, Distributed Machine Learning has emerged as a promising solution: by training models locally on participants' devices, it enhances privacy, since raw data never leaves those devices, and it reduces the need for specialized hardware, since most of the computation happens on the participants' side. Nonetheless, because there is no control over participants, Distributed Machine Learning is susceptible to attacks carried out by misbehaving (Byzantine) participants. This thesis introduces two Adam-based optimization frameworks for Distributed Machine Learning. Both frameworks are evaluated empirically on homogeneous and heterogeneous datasets, and their performance is assessed against multiple state-of-the-art attacks. Additionally, we present preliminary evidence of convergence for DRA (Distributed Robust Adam) on homogeneously distributed data.
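As a rough illustration of the setting the abstract describes, the sketch below shows a server-side training loop that combines a robust gradient aggregation rule with a standard Adam update. It is a minimal sketch under stated assumptions, not the thesis's DRA algorithm: the coordinate-wise median aggregator, the AdamState helper, and the sign-flip attack are all illustrative choices introduced here.

```python
# Minimal sketch of Byzantine-robust distributed Adam on a toy problem.
# Assumptions (not from the thesis): coordinate-wise median as the robust
# aggregator, an AdamState helper class, and a sign-flip attack model.
import numpy as np

class AdamState:
    """Holds Adam's moment estimates and produces parameter updates."""
    def __init__(self, dim, beta1=0.9, beta2=0.999, eps=1e-8, lr=1e-2):
        self.m = np.zeros(dim)   # first-moment (mean) estimate
        self.v = np.zeros(dim)   # second-moment (uncentered variance) estimate
        self.t = 0
        self.beta1, self.beta2, self.eps, self.lr = beta1, beta2, eps, lr

    def step(self, grad):
        """Standard Adam update applied to the aggregated gradient."""
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad**2
        m_hat = self.m / (1 - self.beta1**self.t)   # bias correction
        v_hat = self.v / (1 - self.beta2**self.t)
        return self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

def robust_aggregate(worker_grads):
    """Coordinate-wise median: robust to a minority of Byzantine gradients."""
    return np.median(np.stack(worker_grads), axis=0)

# Toy objective: minimize ||x - target||^2 with 7 honest and 3 Byzantine workers.
rng = np.random.default_rng(0)
target = np.ones(5)
x = np.zeros(5)
adam = AdamState(dim=5)

for _ in range(200):
    honest = [2 * (x - target) + 0.01 * rng.normal(size=5) for _ in range(7)]
    byzantine = [-100 * (x - target) for _ in range(3)]  # sign-flip attack
    g = robust_aggregate(honest + byzantine)
    x -= adam.step(g)

print("distance to optimum:", np.linalg.norm(x - target))
```

With a plain mean in place of the median, the three sign-flipped gradients would dominate the aggregate and drive the iterate away from the optimum; the coordinate-wise median ignores them as long as honest workers form a majority per coordinate.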
Subject / keywords
Machine Learning, Distributed, Byzantine Resilience, Adam Optimisation Algorithm.