Investigating a Byzantine Resilient Framework for the Adam Optimizer
dc.contributor.author | Fabris, Basil | |
dc.contributor.department | Chalmers tekniska högskola / Institutionen för matematiska vetenskaper | sv |
dc.contributor.examiner | Ringh, Axel | |
dc.contributor.supervisor | Farhadkhani, Sadegh | |
dc.contributor.supervisor | Ringh, Axel | |
dc.date.accessioned | 2023-08-15T13:30:19Z | |
dc.date.available | 2023-08-15T13:30:19Z | |
dc.date.issued | 2023 | |
dc.date.submitted | 2023 | |
dc.description.abstract | Over the past few years, the use of Machine Learning has grown tremendously across domains ranging from engineering to marketing. This widespread adoption has been made possible by advances in hardware, which have enabled the training of increasingly large machine learning models. These models, however, demand ever-larger datasets, raising concerns about data safety and privacy. To address these challenges, Distributed Machine Learning has emerged as a promising solution. By training models locally on participants’ devices, it enhances privacy, since raw data never leaves the device, and it reduces the need for specialized hardware, since most of the computation takes place on the participants’ devices. Nonetheless, because there is no control over the participants, Distributed Machine Learning is susceptible to attacks carried out by misbehaving (Byzantine) participants. This research introduces two Adam-based optimization frameworks for Distributed Machine Learning. Both frameworks are evaluated empirically on homogeneous and heterogeneous datasets, and their performance is assessed against multiple state-of-the-art attacks. Additionally, we present preliminary evidence of convergence for DRA (Distributed Robust Adam) on homogeneously distributed data. | |
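The abstract names the setting but not the mechanics, so the following is a minimal Python sketch of the general pattern it describes: a server that applies an Adam update to a Byzantine-robust aggregate of worker gradients. The coordinate-wise trimmed mean used here is a standard robust aggregator chosen purely for illustration; it is an assumption, not necessarily the rule used by the thesis's DRA, and the names RobustAdamServer and f (the number of tolerated Byzantine workers) are hypothetical.

    import numpy as np

    def trimmed_mean(grads, f):
        # Coordinate-wise trimmed mean: per coordinate, drop the f largest
        # and f smallest values, then average the rest. Requires more than
        # 2f workers. A standard robust aggregator, used here as a stand-in.
        g = np.sort(np.stack(grads), axis=0)      # shape (n_workers, dim)
        return g[f:len(grads) - f].mean(axis=0)

    class RobustAdamServer:
        # Server-side Adam driven by a robust aggregate of worker gradients.
        # Hyperparameters follow the standard Adam defaults.
        def __init__(self, dim, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, f=1):
            self.m = np.zeros(dim)                # first-moment estimate
            self.v = np.zeros(dim)                # second-moment estimate
            self.t = 0                            # step counter
            self.lr, self.b1, self.b2, self.eps, self.f = lr, b1, b2, eps, f

        def step(self, params, worker_grads):
            g = trimmed_mean(worker_grads, self.f)   # robust aggregation
            self.t += 1
            self.m = self.b1 * self.m + (1 - self.b1) * g
            self.v = self.b2 * self.v + (1 - self.b2) * g**2
            m_hat = self.m / (1 - self.b1**self.t)   # bias correction
            v_hat = self.v / (1 - self.b2**self.t)
            return params - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

In this pattern, honest workers compute gradients on their local data and send them to the server; because the trimmed mean discards the f most extreme values in each coordinate, up to f Byzantine gradients cannot pull the aggregate arbitrarily far from the honest gradients.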
dc.identifier.coursecode | MVEX03 | |
dc.identifier.uri | http://hdl.handle.net/20.500.12380/306857 | |
dc.language.iso | eng | |
dc.setspec.uppsok | PhysicsChemistryMaths | |
dc.subject | Machine Learning, Distributed, Byzantine Resilience, Adam Optimisation Algorithm | |
dc.title | Investigating a Byzantine Resilient Framework for the Adam Optimizer | |
dc.type.degree | Examensarbete för masterexamen | sv |
dc.type.degree | Master's Thesis | en |
dc.type.uppsok | H | |
local.programme | Data science and AI (MPDSC), MSc |