Biography

I am the head of three labs: the Theoretical Aspects of Federated Learning Lab at the Ivannikov Institute for System Programming (ISP RAS), the Mathematical Foundations of Distributed and Federated Optimization Lab at Innopolis University, and the MIPT-Yandex Fundamental Research Lab. I am also a research scientist at the Moscow Institute of Physics and Technology, Innopolis University, and IITP RAS. In addition to research, I give lectures at the Moscow Institute of Physics and Technology, Moscow State University, Skoltech, Innopolis University, and CC RAS.

I received my PhD in Computer Science from the Moscow Institute of Physics and Technology, Phystech School of Applied Mathematics and Informatics, where I worked under the supervision of Professor Alexander Gasnikov.

My research interests include Stochastic Optimization, Distributed Optimization and their applications to Machine, Deep and Federated Learning. Feel free to email me to discuss possible research collaborations.

My contacts

Email №1:

anbeznosikov@gmail.com

Email №2:

beznosikov.an@phystech.edu

Computer skills:

Operating systems:
macOS, Linux, Microsoft Windows
Programming languages:
Python, LaTeX, SQL, C, C# 

Languages:

Russian (native)
English (upper intermediate) 

Interests:

Basketball
Tennis

Education

Sep 2016 - Aug 2020

BSc at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics/Department of Control and Applied Mathematics

Thesis (in Russian): Distributed decentralized gradient-free methods for solving non-smooth stochastic convex optimization problems

Advisor: Alexander Gasnikov 

Sep 2020 - Aug 2022

MSc at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics

Thesis: Methods for solving distributed saddle point problems: lower bounds, optimal and practical algorithms 

Advisor: Alexander Gasnikov 

Sep 2022 - Aug 2023

PhD at Moscow Institute of Physics and Technology

Phystech School of Applied Mathematics and Informatics

Thesis: Gradient-Free Methods for Saddle-Point Problems and Beyond

Advisor: Alexander Gasnikov

Work Experience

Aug 2023 - Now

Center of Artificial Intelligence Technology, Skoltech

Research engineer. Head: Evgeny Burnaev.

Oct 2023 - Now

Department of Mathematical Methods of Forecasting, MSU

Mathematician.

Nov 2023 - Now

Sber AI Lab, Sber

Researcher. Heads: Gleb Gusev, Andrey Savchenko.

Sep 2017 - Now

Moscow Institute of Physics and Technology

Senior instructor for the following courses: Stochastic Processes, Probability Theory, Discrete Analysis, Databases.

Feb 2021 - Now

MADE: Big Data Academy, Mail.Ru Group

Teaching assistant.

Feb 2021 - Now

International Laboratory of Stochastic Algorithms and High-Dimensional Inference, HSE

Research assistant. Heads: Eric Moulines, Alexey Naumov.

Mar 2021 - Now

Laboratory of Advanced Combinatorics and Network Applications, MIPT

Junior Researcher. Head: Andrei Raigorodskii.

Jul 2021 - Now

Yandex Research

Research intern.

Sep 2022 - Now

Laboratory of Mathematical Methods of Optimization, MIPT

Researcher. Head: Alexander Gasnikov.

Oct 2022 - Now

Artificial Intelligence Research Centre, Innopolis University

Leading expert.

Jan 2023 - Now

MIPT-Yandex Fundamental Research Laboratory

Laboratory head.

Mar 2023 - Now

Machine Learning Department, MBZUAI

Researcher. Supervisor: Martin Takac.

Jun 2023 - Now

Lab 10, IITP RAS

Researcher. Head: Alexander Gasnikov.

Jul 2023 - Now

Department of Data Analysis, MIPT

Senior instructor.

Aug 2023 - Now

Department of Discrete Mathematics, MIPT

Lecturer.

Recent posts

AISTATS: 1 accepted paper

We have 1 paper at the main conference (poster):

Stochastic Frank-Wolfe: Unified Analysis and New Faces of Classical Method (with Ruslan Nazykov, Aleksandr Shestakov, Vladimir Solodkin, Gauthier Gidel, Alexander Gasnikov)

Jan 20, 2024

ICLR: 1 accepted paper

We have 1 paper at the main conference (poster):

Ito Diffusion Approximation of Universal Ito Chains for Sampling, Optimization and Boosting (with Aleksei Ustimenko)

Jan 20, 2024

New paper out

"Activations and Gradients Compression for Model-Parallel Training" - joint work with Mikhail Rudakov, Yaroslav Kholodov, Alexander Gasnikov.

Jan 18, 2024

New paper out

"Optimal data splitting in distributed optimization for machine learning" - joint work with Daniil Medyakov, Gleb Molodtsov, Aleksandr Beznosikov, Alexander Gasnikov.

Jan 18, 2024

New paper out

"Optimal Analysis of Method with Batching for Monotone Stochastic Finite-Sum Variational Inequalities" - joint work with Alexander Pichugin, Maksim Pechin, Alexander Gasnikov.

Jan 18, 2024

Yandex ML Prize

I received the Yandex ML Prize in the "Yandex researchers" category! Thank you to everyone who supported me!

Dec 16, 2023

Recent publications

Activations and Gradients Compression for Model-Parallel Training

Mikhail Rudakov, Aleksandr Beznosikov, Yaroslav Kholodov, Alexander Gasnikov

Doklady Rossijskoj akademii nauk. Matematika, informatika, processy upravlenia

AI Journey Best Paper Award

December 2023

Optimal Data Splitting in Distributed Optimization for Machine Learning

Daniil Medyakov, Gleb Molodtsov, Aleksandr Beznosikov, Alexander Gasnikov

December 2023

Optimal Analysis of Method with Batching for Monotone Stochastic Finite-Sum Variational Inequalities

Alexander Pichugin, Maksim Pechin, Aleksandr Beznosikov, Alexander Gasnikov

December 2023

About some works of Boris Polyak on convergence of gradient methods and their development

Seydamet Ablaev, Aleksandr Beznosikov, Alexander Gasnikov, Darina Dvinskikh, Aleksandr Lobanov, Sergei Puchinin, Fedor Stonyakin

November 2023

Bregman Proximal Method for Efficient Communications under Similarity

November 2023

© 2020-2024 Aleksandr Beznosikov