Abdurakhmon Sadiev

Publications
SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Non-convex Cross-Device Federated Learning
Cross-device training is a crucial subfield of federated learning, where the number of clients can reach into the billions. Standard …
Avetik Karagulyan, Egor Shulgin, Abdurakhmon Sadiev, Peter Richtárik
A Unified Theory of Stochastic Proximal Point Methods without Smoothness
This paper presents a comprehensive analysis of a broad range of variations of the stochastic proximal point method (SPPM). Proximal …
Peter Richtárik, Abdurakhmon Sadiev, Yury Demidovich
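For readers new to SPPM: instead of a gradient step, each iteration solves a small proximal subproblem on a sampled loss, x_{k+1} = argmin_x { f_i(x) + (1/(2γ))‖x − x_k‖² }. Below is a minimal Python sketch of this generic iteration for a sampled linear-regression loss, where the prox has a closed form; it illustrates the basic SPPM step only, not the specific variants analyzed in the paper, and the function name and quadratic loss are assumptions for demonstration.

```python
import numpy as np

def sppm_quadratic(A, b, x0, gamma=1.0, n_iters=100, seed=0):
    """Basic stochastic proximal point iteration on f_i(x) = 0.5*(a_i^T x - b_i)^2.

    Each step solves x_{k+1} = argmin_x f_i(x) + (1/(2*gamma))*||x - x_k||^2,
    which has a closed form for a sampled linear-regression loss.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = x0.copy()
    for _ in range(n_iters):
        i = rng.integers(n)
        a, bi = A[i], b[i]
        # Prox step solves (I + gamma * a a^T) x_new = x + gamma * bi * a.
        # Sherman-Morrison gives the solution without a d x d linear solve.
        r = x + gamma * bi * a
        x = r - (gamma * (a @ r) / (1.0 + gamma * (a @ a))) * a
    return x
```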
Stochastic proximal point methods for monotone inclusions under expected similarity
Monotone inclusions have a wide range of applications, including minimization, saddle-point, and equilibria problems. We introduce new …
Abdurakhmon Sadiev, Laurent Condat, Peter Richtárik
High-Probability Convergence for Composite and Distributed Stochastic Minimization and Variational Inequalities with Heavy-Tailed Noise
High-probability analysis of stochastic first-order optimization methods under mild assumptions on the noise has been gaining a lot of …
Eduard Gorbunov, Abdurakhmon Sadiev, Marina Danilova, Samuel Horváth, Gauthier Gidel, Pavel Dvurechensky, Alexander Gasnikov, Peter Richtárik
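As background for this entry: high-probability results under heavy-tailed noise in this line of work typically rely on gradient clipping, which caps the norm of the stochastic gradient before the step is taken. A minimal sketch of the standard clipping operator and a clipped-SGD step follows; it illustrates the general technique, not the paper's specific methods, and the names are assumptions.

```python
import numpy as np

def clip(g, lam):
    """Scale g so its Euclidean norm is at most lam: min(1, lam/||g||) * g."""
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g

def clipped_sgd_step(x, g, eta=0.1, lam=1.0):
    """One clipped-SGD step: rare huge gradients from heavy-tailed noise
    are tamed by clipping before the update."""
    return x - eta * clip(g, lam)
```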
Adaptive Compression for Communication-Efficient Distributed Training
We propose Adaptive Compressed Gradient Descent (AdaCGD), a novel optimization algorithm for communication-efficient training of …
Maksim Makarenko, Elnur Gasanov, Rustem Islamov, Abdurakhmon Sadiev, Peter Richtárik
Federated Optimization Algorithms with Random Reshuffling and Gradient Compression
Gradient compression is a popular technique for improving communication complexity of stochastic first-order methods in distributed …
Abdurakhmon Sadiev, Grigory Malinovsky, Eduard Gorbunov, Igor Sokolov, Ahmed Khaled, Konstantin Burlachenko, Peter Richtárik
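For context on this entry: a canonical unbiased gradient compressor is Rand-K sparsification, which transmits K randomly chosen coordinates and rescales them so the compressed vector is unbiased in expectation. A minimal sketch of that generic compressor is below; the paper's combined reshuffling-plus-compression algorithms are not reproduced here, and the function name is an assumption.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased Rand-K sparsifier: keep k random coordinates of x and
    scale them by d/k so that E[rand_k(x)] = x."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = (d / k) * x[idx]
    return out

# Quick unbiasedness check: averaging many compressions approaches x.
rng = np.random.default_rng(0)
x = np.arange(1.0, 6.0)
avg = np.mean([rand_k(x, 2, rng) for _ in range(20000)], axis=0)
print(np.round(avg, 2))  # close to [1. 2. 3. 4. 5.]
```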
Stochastic gradient methods with preconditioned updates
This work considers non-convex finite sum minimization. There are a number of algorithms for such problems, but existing methods often …
Abdurakhmon Sadiev, Aleksandr Beznosikov, Abdulla Jasem Almansoori, Dmitry Kamzolov, Rachael Tappenden, Martin Takáč
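For intuition on this entry: a preconditioned stochastic step rescales the gradient coordinate-wise by a curvature estimate, x_{k+1} = x_k − η D_k^{-1} g_k. The sketch below uses a running root-mean-square estimate of squared gradients as the diagonal D_k; this is a generic illustration, not the preconditioner constructed in the paper, and all names are assumptions.

```python
import numpy as np

def preconditioned_sgd_step(x, g, v, eta=0.1, beta=0.999, eps=1e-8):
    """One diagonal-preconditioned SGD step.

    v accumulates a running second-moment estimate of the gradients;
    dividing each coordinate of g by sqrt(v) rescales the step by a
    crude per-coordinate curvature proxy.
    """
    v = beta * v + (1.0 - beta) * g**2       # update second-moment estimate
    x = x - eta * g / (np.sqrt(v) + eps)     # coordinate-wise rescaled step
    return x, v
```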
AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods
We present AI-SARAH, a practical variant of SARAH. As a variant of SARAH, this algorithm employs the stochastic recursive gradient yet …
Zheng Shi, Abdurakhmon Sadiev, Nicolas Loizou, Peter Richtárik, Martin Takáč
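For context: the SARAH estimator that AI-SARAH builds on maintains a recursive gradient v_t = ∇f_{i_t}(x_t) − ∇f_{i_t}(x_{t−1}) + v_{t−1}, re-anchored with a full gradient at the start of each outer loop. Below is a minimal sketch of one plain SARAH outer loop with a fixed step size; AI-SARAH's adaptive, implicit step-size selection is the paper's contribution and is not shown, and the names are assumptions.

```python
import numpy as np

def sarah_epoch(grad_full, grad_i, x, n, eta=0.1, inner_iters=50, seed=0):
    """One outer loop of plain SARAH (Nguyen et al., 2017).

    grad_full(x): full gradient of the finite sum;
    grad_i(x, i): gradient of the i-th component function.
    """
    rng = np.random.default_rng(seed)
    v = grad_full(x)                 # anchor: full gradient at the snapshot
    x_prev, x = x, x - eta * v
    for _ in range(inner_iters):
        i = rng.integers(n)
        # Recursive estimator: v_t = grad_i(x_t) - grad_i(x_{t-1}) + v_{t-1}
        v = grad_i(x, i) - grad_i(x_prev, i) + v
        x_prev, x = x, x - eta * v
    return x
```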
Decentralized personalized federated min-max problems
Personalized Federated Learning (PFL) has recently seen tremendous progress, allowing the design of novel machine learning applications …
Ekaterina Borodich, Aleksandr Beznosikov, Abdurakhmon Sadiev, Vadim Sushko, Nikolay Savelyev, Martin Takáč, Alexander Gasnikov