Studying Imperfect Communication In Distributed Optimization Algorithm

dc.contributor.author: Math, Swati
dc.contributor.author: Venkatesan, Madhumitha
dc.contributor.department (sv): Chalmers tekniska högskola / Institutionen för data och informationsteknik
dc.contributor.department (en): Chalmers University of Technology / Department of Computer Science and Engineering
dc.contributor.examiner: Panahi, Ashkan
dc.contributor.supervisor: Panahi, Ashkan
dc.date.accessioned: 2025-01-13T09:10:19Z
dc.date.available: 2025-01-13T09:10:19Z
dc.date.issued: 2024
dc.date.submitted:
dc.description.abstract: Distributed optimization methods are essential in machine learning, especially when data is distributed across multiple nodes or devices. These algorithms enable effective model training without data consolidation, improving privacy and reducing communication costs. However, their performance is strongly influenced by communication quality, which may degrade due to factors such as quantization and erasure. Quantization, which approximates values before transmission, can cause loss of information and requires strategic optimization to balance distortion against communication cost. Similarly, erasure causes loss of transmitted information, leading to delayed convergence and increased energy usage. This study explores how such communication imperfections affect the performance of distributed optimization algorithms, with emphasis on convergence rates, scalability, and overall efficiency. The research examines how quantization and erasure impact distributed architectures such as Federated Learning and push-pull gradient methods under different network topologies, and suggests ways to mitigate their effects.
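To illustrate the kind of quantization the abstract refers to, the following is a minimal, hypothetical sketch (not the thesis's actual method) of an unbiased stochastic quantizer that a node might apply to a gradient vector before transmitting it; the function name, level count, and grid choice are assumptions for illustration only.

```python
import numpy as np

def quantize(x: np.ndarray, levels: int = 16, rng=None) -> np.ndarray:
    """Stochastically quantize a vector onto `levels` uniformly spaced
    values in [min(x), max(x)].

    Each entry is rounded up with probability equal to its fractional
    position between grid points, so the quantizer is unbiased in
    expectation -- a common way to keep distributed gradient steps
    correct on average while shrinking message size.
    """
    rng = np.random.default_rng(rng)
    lo, hi = x.min(), x.max()
    if hi == lo:  # constant vector: nothing to quantize
        return x.copy()
    # Scale entries to grid coordinates in [0, levels - 1].
    scaled = (x - lo) / (hi - lo) * (levels - 1)
    floor = np.floor(scaled)
    # Round up with probability equal to the fractional part (unbiased).
    q = floor + (rng.random(x.shape) < (scaled - floor))
    # Map grid coordinates back to the original value range.
    return lo + q * (hi - lo) / (levels - 1)

g = np.array([0.03, -1.2, 0.7, 2.5])   # a toy "gradient" to transmit
gq = quantize(g, levels=8, rng=0)      # 8-level quantized version
```

In a distributed setting, only the two range endpoints and the small integer grid indices would need to be sent, which is the communication saving that such schemes trade against the added distortion.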
dc.identifier.coursecode: DATX05
dc.identifier.uri: http://hdl.handle.net/20.500.12380/309073
dc.language.iso: eng
dc.setspec.uppsok: Technology
dc.subject: Distributed optimization
dc.subject: machine learning
dc.subject: quantization
dc.subject: erasure
dc.subject: convergence
dc.subject: communication overhead
dc.subject: scalability
dc.subject: Federated Learning
dc.subject: push-pull gradient methods
dc.subject: distributed systems
dc.title: Studying Imperfect Communication In Distributed Optimization Algorithm
dc.type.degree (sv): Examensarbete för masterexamen
dc.type.degree (en): Master's Thesis
dc.type.uppsok: H
local.programme: Computer systems and networks (MPCSN), MSc
local.programme: High-performance computer systems (MPHPC), MSc

Download

Original bundle

Name: CSE 24-187 SM MV.pdf
Size: 1.49 MB
Format: Adobe Portable Document Format
