Talks

NeurIPS 2021, workshop on Optimization for Machine Learning
"Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization"

poster
Sydney, Australia (online)
11 December 2021

NeurIPS 2021, main conference
"Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization"

poster
Sydney, Australia (online)
8 December 2021

Martin Takac's class
"Derivative-Free Methods"

2 hour oral talk
Abu Dhabi, UAE
20-21 November 2021

OPTIMA 2021
"Near-Optimal Decentralized Algorithms for Saddle Point Problems over Time-Varying Networks"

20 min oral talk
Petrovac, Montenegro (online)
27 September 2021

All-Russian Optimization Seminar
"Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization"

45 min oral talk
Moscow, Russia (online)
30 June 2021

Control, Information and Optimization Summer School (OZON)
"On Saddle-Point Problems and Variational Inequalities"

1.5 hour lecture
Moscow, Russia
11 June 2021

MOCCA 2020
"On Distributed Saddle-Point Problems"

30 min oral talk
Moscow, Russia (online)
2 June 2021

MADE: Big Data Academy, Mail.ru Group
"Saddle-Point Problems and Variational Inequalities: Theory and Practice"

3 hour lecture
Moscow, Russia
13 April 2021

Communication Efficient Distributed Optimization Workshop
"Distributed Saddle-Point Problems: Lower Bounds, Optimal Algorithms and Federated GANs"

poster session
Online
9 April 2021

MOTOR 2020
"Gradient-Free Methods for Saddle-Point Problem"

15 min oral talk
Novosibirsk, Russia (online)
15 July 2020

IFAC 2020
"Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization"

video talk and poster
Berlin, Germany (online)
12 July 2020

QIPA 2019
"A Derivative Free Method for Distributed Optimization"

15 min oral talk
Moscow, Russia
2 December 2019

The 62nd MIPT Conference (winner)
"A Gradient-Free Variant of Sliding for Distributed Optimization Problems"

15 min oral talk
Moscow, Russia
23 November 2019

© 2020-2022 Aleksandr Beznosikov