Studying Imperfect Communication In Distributed Optimization Algorithm
Type
Master's Thesis
Program
Computer systems and networks (MPCSN), MSc
High-performance computer systems (MPHPC), MSc
Published
2024
Authors
Math, Swati
Venkatesan, Madhumitha
Abstract
Distributed optimization methods are essential in machine learning, especially when data is distributed across multiple nodes or devices. These algorithms enable effective model training without data consolidation, improving privacy and reducing communication costs. However, their performance is strongly influenced by the quality of communication, which may degrade due to factors such as quantization and erasure. Quantization, which approximates values during transmission, can result in loss of information and requires strategic optimization to balance distortion against communication cost. Similarly, erasure causes loss of transmitted information, leading to delayed convergence and increased energy usage. This study explores how communication imperfections affect the performance of distributed optimization algorithms, with emphasis on convergence rates, scalability, and overall efficiency. The research examines how quantization and erasure impact distributed architectures such as Federated Learning and push-pull gradient methods under different network topologies, and suggests ways to mitigate their effects.
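
As a rough illustration of the two channel imperfections described in the abstract, the following minimal Python sketch (not taken from the thesis; the quantization step, erasure probability, and problem sizes are assumed values) simulates one federated-averaging round in which each client's gradient is uniformly quantized before transmission and may be erased in transit.

    # Illustrative sketch only -- not code from the thesis.
    import numpy as np

    rng = np.random.default_rng(0)

    def quantize(g, step=0.1):
        # Uniform quantization: round each coordinate to the nearest
        # multiple of `step`, modeling lossy transmission.
        return step * np.round(g / step)

    def transmit(g, erasure_prob=0.2):
        # Channel model: with probability `erasure_prob` the whole
        # message is lost and the server receives nothing.
        return None if rng.random() < erasure_prob else g

    # Local gradients from 5 clients on a 4-dimensional model.
    client_grads = [rng.normal(size=4) for _ in range(5)]

    received = [transmit(quantize(g)) for g in client_grads]
    delivered = [g for g in received if g is not None]

    # The server averages only the updates that survived the channel;
    # erasures shrink the effective sample, slowing convergence.
    if delivered:
        print("aggregated gradient:", np.mean(delivered, axis=0))
    else:
        print("all updates erased this round; server skips the step")

Raising the quantization step coarsens each update (more distortion, fewer bits), while raising the erasure probability drops more updates per round, which is the trade-off between distortion, communication cost, and convergence speed that the thesis investigates.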
Subject/keywords
Distributed optimization, machine learning, quantization, erasure, convergence, communication overhead, scalability, Federated Learning, push-pull gradient methods, distributed systems