“Distributed Subgradient Method with Quantization and Flexible Weights for Large-scale Convex Optimization: Convergence Analysis and Privacy”

by satcit

https://pubmed.ncbi.nlm.nih.gov/38117628

The study proposes a distributed subgradient method with random quantization and flexible weights for large-scale distributed optimization, providing convergence analyses for both convex and weakly convex objective functions while addressing communication quality and privacy concerns.
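The abstract's ingredients can be illustrated with a minimal sketch: agents on a ring graph exchange randomly (dithered) quantized states, average them with doubly stochastic mixing weights, and take a diminishing subgradient step on their local objectives. This is a generic distributed-subgradient toy, not the paper's exact algorithm; the example problem (minimizing a sum of absolute values, whose optimum is the median) and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, delta=0.01):
    # Unbiased random quantization: E[Q(x)] = x, since the dither is U[0, 1)
    return delta * np.floor(x / delta + rng.uniform(size=np.shape(x)))

# Hypothetical setup: n agents jointly minimize sum_i |x - a_i| over a ring graph.
n = 5
a = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # local data; the optimum is the median, 3.0
x = np.zeros(n)                          # each agent's local iterate

# Doubly stochastic mixing weights: each agent averages itself and two ring neighbors
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

T = 2000
for t in range(1, T + 1):
    q = dithered_quantize(x)             # agents broadcast only quantized states
    # Mix neighbors' quantized values, then take a local subgradient step
    # (sign is a subgradient of |x - a_i|) with diminishing step size 1/sqrt(t).
    x = W @ q - (1.0 / np.sqrt(t)) * np.sign(x - a)

print(np.round(x, 2))  # all iterates cluster near the optimum 3.0
```

Because the dithered quantizer is unbiased, its noise averages out under the diminishing step size, so the iterates still reach consensus near the optimum while each agent reveals only a coarse, randomized version of its state.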
