
Biography

I am a first-year PhD student at the Moscow Institute of Physics and Technology, Phystech School of Applied Mathematics and Informatics. I work under the supervision of Professor Alexander Gasnikov. My research interests include Stochastic Optimization, Distributed Optimization, and their applications to Machine Learning and Deep Learning.

My contacts

Email №1:

anbeznosikov@gmail.com

Email №2:

beznosikov.an@phystech.edu

Computer skills:

Operating systems:
macOS, Linux, Microsoft Windows
Programming languages:
Python, LaTeX, SQL, C, C# 

Languages:

Russian (native)
English (upper intermediate) 

Interests:

Basketball
Tennis

Education

Sep 2016 - Aug 2020

BSc at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics/Department of Control and Applied Mathematics

Thesis (in Russian): Distributed decentralized gradient-free methods for solving non-smooth stochastic convex optimization problems

Advisor: Alexander Gasnikov 

Sep 2020 - Aug 2022

MSc at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics

Thesis: Methods for solving distributed saddle point problems: lower bounds, optimal and practical algorithms 

Advisor: Alexander Gasnikov 

Sep 2022 - Now

PhD at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics

Work Experience

Sep 2017 - Now

Moscow Institute of Physics and Technology

Teaching assistant for the following courses: Stochastic Processes, Probability Theory, Discrete Analysis, and Databases.

Feb 2021 - Now

MADE: Big Data Academy, Mail.ru Group

Teaching assistant.

Feb 2021 - Now

International Laboratory of Stochastic Algorithms and High-Dimensional Inference, HSE

Research assistant. Heads: Eric Moulines, Alexey Naumov.

Mar 2021 - Now

Laboratory of Advanced Combinatorics and
Network Applications, MIPT

Junior Researcher. Head: Andrei Raigorodskii.

Jul 2021 - Now                         

Yandex Research

Research intern in the Yandex Research team.

Sep 2022 - Now

Laboratory of Mathematical Methods of Optimization, MIPT

Researcher. Head: Alexander Gasnikov.

Oct 2022 - Now

Artificial Intelligence Research Centre, Innopolis University

Leading expert.

Jan 2023 - Now

MIPT-Yandex Fundamental Research Laboratory

Laboratory head.

Current posts


New paper out

"Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities" - joint work with Alexander Gasnikov.

Feb 15, 2023


Teacher of the Probability Theory course

In the new semester, I am teaching the Stochastic Processes and Numerical Optimization courses at MIPT as a seminar instructor. I am also a co-lecturer of the "Optimization in Machine Learning" course at VK Academy.

Feb 6, 2023


AISTATS: 1 accepted paper

We have 1 paper at the main conference (poster):

Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods (with Eduard Gorbunov, Hugo Berard and Nicolas Loizou) - arxiv

Jan 20, 2023


New paper out

"Randomized gradient-free methods in convex optimization" - joint work with Alexander Gasnikov, Darina Dvinskikh, Pavel Dvurechensky, Eduard Gorbunov and Alexander Lobanov.

Nov 24, 2022


New paper out

"Decentralized optimization over time-varying graphs: a survey" - joint work with Alexander Rogozin, Alexander Gasnikov and Dmitry Kovalev.

Oct 18, 2022


New paper out

"SARAH-based Variance-reduced Algorithm for Stochastic Finite-sum Cocoercive Variational Inequalities" - joint work with Alexander Gasnikov.

Oct 12, 2022


NeurIPS: 4 accepted papers

We have 4 papers at the main conference (posters). These papers are:

1) Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity (with Dmitry Kovalev, Ekaterina Borodich, Gesualdo Scutari, Alexander Gasnikov) - arxiv

2) Optimal Algorithms for Decentralized Stochastic Variational Inequalities (with Dmitry Kovalev, Abdurakhmon Sadiev, Michael Persiianov, Peter Richtárik, Alexander Gasnikov) - arxiv

3) Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees (with Peter Richtárik, Michael Diskin, Max Ryabinin, Alexander Gasnikov) - arxiv

4) Decentralized Local Stochastic Extra-Gradient for Variational Inequalities (with Pavel Dvurechensky, Anastasia Koloskova, Valentin Samokhin, Sebastian U Stich, Alexander Gasnikov) - arxiv

Sep 15, 2022

Current publications

Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities

Aleksandr Beznosikov, Alexander Gasnikov

February 2023

Randomized gradient-free methods in convex optimization

Alexander Gasnikov, Darina Dvinskikh, Pavel Dvurechensky, Eduard Gorbunov, Aleksandr Beznosikov, Alexander Lobanov

November 2022

Decentralized optimization over time-varying graphs: a survey

October 2022

SARAH-based Variance-reduced Algorithm for Stochastic Finite-sum Cocoercive Variational Inequalities

Aleksandr Beznosikov, Alexander Gasnikov

October 2022

© 2020-2022 Aleksandr Beznosikov