
Biography

I am a second-year student in the master's program at the Moscow Institute of Physics and Technology, Phystech School of Applied Mathematics and Informatics. I work under the supervision of Professor Alexander Gasnikov. My research interests include Stochastic Optimization, Distributed Optimization, and their applications to Machine and Deep Learning.

My contacts

Email №1:

anbeznosikov@gmail.com

Email №2:

beznosikov.an@phystech.edu

Computer skills:

Operating systems:
macOS, Linux, Microsoft Windows
Programming languages:
Python, LaTeX, SQL, C, C# 

Languages:

Russian (native)
English (upper intermediate) 

Interests:

Basketball:
Candidate Master of Sports in Russia

Education

Sep 2020 - Now

MSc at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics/Department of Control and Applied Mathematics

Sep 2016 - Aug 2020

BSc at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics/Department of Control and Applied Mathematics

Thesis (in Russian): Distributed decentralized gradient-free methods for solving non-smooth stochastic convex optimization problems

Advisor: Alexander Gasnikov 

Work Experience

Sep 2017 - Now

Moscow Institute of Physics and Technology

Teaching assistant for the following courses: Stochastic Processes, Probability Theory, Discrete Analysis, Databases.

Feb 2021 - Now

MADE: Big Data Academy, Mail.Ru Group

Teaching assistant.

Feb 2021 - Now

International Laboratory of Stochastic Algorithms and High-Dimensional Inference

Research assistant. Heads: Eric Moulines, Alexey Naumov.

Mar 2021 - Now

Laboratory of Advanced Combinatorics and
Network Applications

Junior Researcher. Head: Andrei Raigorodskii.

Jul 2021 - Now                         

Yandex Research

Research intern in the Yandex Research team.

Current posts


Grant for young researchers

Our team (Eduard Gorbunov, Alexander Rogozin, Vladislav Matyukhin, and I) won a grant for young research groups.

Nov 30, 2021


New paper out

"Random-reshuffled SARAH does not need a full gradient computations" - joint work with Martin Takac.

Nov 26, 2021


NeurIPS: 4 accepted papers

We have one paper at the main conference. We also submitted three papers to the OPT 2021: Optimization for Machine Learning and the New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership workshops. All were accepted! These papers are:

1) Distributed Saddle-Point Problems Under Similarity (with Gesualdo Scutari, Alexander Rogozin and Alexander Gasnikov) - main conference (poster) - arxiv

2) Decentralized Personalized Federated Learning: Lower Bounds and Optimal Algorithm for All Personalization Modes (with Abdurakhmon Sadiev, Ekaterina Borodich, Darina Dvinskikh, Martin Takac and Alexander Gasnikov) - OPT ML workshop (spotlight) - arxiv

3) Random-reshuffled SARAH does not need a full gradient computations (with Martin Takac) - OPT ML workshop (poster)

4) Decentralized Personalized Federated Min-Max Problems (with Ekaterina Borodich, Abdurakhmon Sadiev, Vadim Sushko, Nikolay Savelyev, Martin Takac and Alexander Gasnikov) - NF in FL workshop - arxiv

Oct 22, 2021


New paper out

"Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees" - joint work with Peter Richtárik, Michael Diskin, Max Ryabinin and Alexander Gasnikov.

Oct 7, 2021


I am at MBZUAI until December

My internship as a Research Assistant in Martin Takac's group started today! It is a great pleasure for me to be part of Martin's team and of MBZUAI!

Oct 3, 2021

Current publications

Random-reshuffled SARAH does not need a full gradient computations

Aleksandr Beznosikov, Martin Takac

Poster at NeurIPS 2021 Workshop on Optimization for Machine Learning (virtual)

November 2021

Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees


October 2021

Distributed Saddle-Point Problems Under Similarity


Poster at NeurIPS 2021 (virtual)

July 2021

Decentralized and Personalized Federated Learning


July 2021

© 2020-2021 Aleksandr Beznosikov