Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities

Aleksandr Beznosikov, Alexander Gasnikov

February 2023

Randomized gradient-free methods in convex optimization

Alexander Gasnikov, Darina Dvinskikh, Pavel Dvurechensky, Eduard Gorbunov, Aleksander Beznosikov, Alexander Lobanov

November 2022

Decentralized optimization over time-varying graphs: a survey

October 2022

SARAH-based Variance-reduced Algorithm for Stochastic Finite-sum Cocoercive Variational Inequalities

Aleksandr Beznosikov, Alexander Gasnikov

October 2022

Smooth Monotone Stochastic Variational Inequalities and Saddle Point Problems - Survey

August 2022

Compression and Data Similarity: Combination of Two Techniques for Communication-Efficient Solving of Distributed Variational Inequalities

Aleksandr Beznosikov, Alexander Gasnikov

June 2022

On Scaled Methods for Saddle Point Problems

June 2022

Stochastic Gradient Methods with Preconditioned Updates

Abdurakhmon Sadiev, Aleksandr Beznosikov, Abdulla Jasem Almansoori, Dmitry Kamzolov, Rachael Tappenden, Martin Takáč

June 2022

Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity

Poster at NeurIPS 2022 (virtual), proceedings

May 2022

Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods

Poster at AISTATS 2023 (Valencia), proceedings

February 2022

Optimal Algorithms for Decentralized Stochastic Variational Inequalities

Dmitry Kovalev, Aleksandr Beznosikov, Abdurakhmon Sadiev, Michael Persiianov, Peter Richtárik, Alexander Gasnikov

Poster at NeurIPS 2022 (virtual), proceedings

February 2022

The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems

Alexander Gasnikov, Anton Novitskii, Vasilii Novitskii, Farshed Abdukhakimov, Dmitry Kamzolov, Aleksandr Beznosikov, Martin Takáč, Pavel Dvurechensky, Bin Gu

Short talk at ICML 2022, proceedings

January 2022

A Unified Analysis of Variational Inequality Methods: Variance Reduction, Sampling, Quantization and Coordinate Descent

Aleksandr Beznosikov, Alexander Gasnikov, Karina Zainulina, Alexander Maslovskiy, Dmitry Pasechnyuk

January 2022

Random-reshuffled SARAH does not need full gradient computations

Aleksandr Beznosikov, Martin Takáč

Poster at NeurIPS 2021 Workshop on Optimization for Machine Learning (virtual)

November 2021

Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees

Poster at NeurIPS 2022 (virtual), proceedings

October 2021

Distributed Saddle-Point Problems Under Similarity

Poster at NeurIPS 2021 (virtual), proceedings

July 2021

Decentralized and Personalized Federated Learning

Abdurakhmon Sadiev, Ekaterina Borodich, Aleksandr Beznosikov, Darina Dvinskikh, Martin Takáč, Alexander Gasnikov

July 2021

Near-Optimal Decentralized Algorithms for Saddle Point Problems over Time-Varying Networks

July 2021

One-Point Gradient-Free Methods for Composite Optimization with Applications to Distributed Optimization

Ivan Stepanov, Artyom Voronov, Aleksandr Beznosikov, Alexander Gasnikov

July 2021

Decentralized Local Stochastic Extra-Gradient for Variational Inequalities

Poster at NeurIPS 2022 (virtual), proceedings

June 2021

Decentralized Personalized Federated Learning: Lower Bounds and Optimal Algorithm for All Personalization Modes

Ekaterina Borodich, Aleksandr Beznosikov, Abdurakhmon Sadiev, Vadim Sushko, Nikolay Savelyev, Martin Takáč, Alexander Gasnikov

June 2021

One-Point Gradient-Free Methods for Smooth and Non-Smooth Saddle-Point Problems

Aleksandr Beznosikov, Vasilii Novitskii, Alexander Gasnikov

MOTOR 2021 (Irkutsk, Russia), LNCS series

March 2021

Solving smooth min-min and min-max problems by mixed oracle algorithms

Egor Gladin, Abdurakhmon Sadiev, Alexander Gasnikov, Pavel Dvurechensky, Aleksandr Beznosikov, Mohammad Alkousa

MOTOR 2021 (Irkutsk, Russia), CCIS series

March 2021

Distributed Saddle-Point Problems: Lower Bounds, Optimal Algorithms and Robust Algorithms

Aleksandr Beznosikov, Valentin Samokhin, Alexander Gasnikov

February 2021

Decentralized Distributed Optimization for Saddle Point Problems

Alexander Rogozin, Aleksandr Beznosikov, Darina Dvinskikh, Dmitry Kovalev, Pavel Dvurechensky, Alexander Gasnikov

February 2021

Recent theoretical advances in decentralized distributed convex optimization

High-Dimensional Optimization and Probability (Springer)

November 2020

Zeroth-Order Algorithms for Smooth Saddle-Point Problems

Abdurakhmon Sadiev, Aleksandr Beznosikov, Pavel Dvurechensky, Alexander Gasnikov

MOTOR 2021 (Irkutsk, Russia), CCIS series

September 2020

Linearly Convergent Gradient-Free Methods for Minimization of Symmetric Parabolic Approximation

Aleksandra Bazarova, Aleksandr Beznosikov, Alexander Gasnikov

Computer Research and Modeling

September 2020

Gradient-Free Methods for Saddle-Point Problem

Aleksandr Beznosikov, Abdurakhmon Sadiev, Alexander Gasnikov

MOTOR 2020 (Novosibirsk, Russia), CCIS series

May 2020

On Biased Compression for Distributed Learning

Aleksandr Beznosikov, Samuel Horváth, Peter Richtárik, Mher Safaryan

Oral talk at NeurIPS 2020 Workshop on Scalability, Privacy and Security in Federated Learning (virtual)

February 2020

Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization

Aleksandr Beznosikov, Eduard Gorbunov and Alexander Gasnikov

Poster at IFAC World Congress 2020 (Berlin, Germany), IFAC Papers Online

November 2019

© 2020-2022 Aleksandr Beznosikov