Publications

(2023). High-Probability Convergence for Composite and Distributed Stochastic Minimization and Variational Inequalities with Heavy-Tailed Noise. arXiv preprint arXiv:2310.01860.

(2023). High-Probability Bounds for Stochastic Optimization and Variational Inequalities: the Case of Unbounded Variance. arXiv preprint arXiv:2302.00999.

(2022). Adaptive Compression for Communication-Efficient Distributed Training. arXiv preprint arXiv:2211.00188.

(2022). Communication acceleration of local gradient methods via an accelerated primal-dual algorithm with inexact prox. Advances in Neural Information Processing Systems 35 (NeurIPS 2022).

(2022). Federated Optimization Algorithms with Random Reshuffling and Gradient Compression. arXiv preprint arXiv:2206.07021.

(2022). Stochastic gradient methods with preconditioned updates. arXiv preprint arXiv:2206.00285.

(2022). An Approach for Non-convex Uniformly Concave Structured Saddle Point Problem. Computer Research and Modeling.

(2022). Optimal algorithms for decentralized stochastic variational inequalities. Advances in Neural Information Processing Systems 35 (NeurIPS 2022).

(2022). Decentralized personalized federated learning: Lower bounds and optimal algorithm for all personalization modes. EURO Journal on Computational Optimization.

(2022). AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods. arXiv preprint arXiv:2102.09700.

(2021). Decentralized personalized federated min-max problems. arXiv preprint arXiv:2106.07289.

(2021). Zeroth-order algorithms for smooth saddle-point problems. Mathematical Optimization Theory and Operations Research: Recent Trends, 20th International Conference, MOTOR 2021, Irkutsk, Russia, July 5–10, 2021, Revised Selected Papers.

(2021). Solving smooth min-min and min-max problems by mixed oracle algorithms. Mathematical Optimization Theory and Operations Research: Recent Trends, 20th International Conference, MOTOR 2021, Irkutsk, Russia, July 5–10, 2021, Revised Selected Papers.

(2020). Gradient-free methods with inexact oracle for convex-concave stochastic saddle-point problem. Mathematical Optimization Theory and Operations Research, 19th International Conference, MOTOR 2020, Novosibirsk, Russia, July 6–10, 2020, Revised Selected Papers.
