Talks

[6] "On the convergence of SGD-like methods for convex and non-convex optimization problems"
Russian Optimization Seminar, online, 8 July, 2020 (in Russian)
[slides] [video]

[5] "A Unified Theory of SGD: Variance Reduction, Sampling, Quantization and Coordinate Descent"
SIERRA, INRIA, Paris, France, 18 October, 2019
[slides]

[4] 23rd International Symposium on Mathematical Programming
Section "New methods for stochastic optimization and variational inequalities"
Talk "An Accelerated Directional Derivative Method for Smooth Stochastic Convex Optimization"
Bordeaux, France, 6 July, 2018
[slides]

[3] Workshop "Optimization at Work"
Talk "An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization"
Moscow, Russia, 14 April, 2018
[slides] [video]

[2] 60th Scientific Conference of MIPT
Section of information transmission problems, data analysis and optimization
Talk "About Accelerated Directional Search with non-Euclidean prox-structure"
Moscow, Russia, 25 November, 2017
[slides]

[1] Workshop "Optimization at Work"
Talk "Accelerated Directional Search with non-Euclidean prox-structure"
Moscow, Russia, 27 October, 2017
[slides]

Posters

[6] Machine Learning Summer School 2020
Virtual poster "Linearly Converging Error Compensated SGD"
Online, 8 July, 2020
[video] [slides]

[5] ICLR 2020 
Virtual poster "A Stochastic Derivative Free Optimization Method with Momentum"
Online, 27 April, 2020

[4] NeurIPS 2019 workshop "Optimization Foundations for Reinforcement Learning"
Poster "A Stochastic Derivative Free Optimization Method with Momentum"
Based on joint work with Adel Bibi, Ozan Sener, El Houcine Bergou and Peter Richtárik
Vancouver, Canada, 14 December, 2019

[3] NeurIPS 2019 workshop "Beyond First Order Methods in ML"
Poster "An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization"
Based on joint work with Pavel Dvurechensky and Alexander Gasnikov
Vancouver, Canada, 13 December, 2019

[2] Traditional Youth School "Control, Information and Optimization" organized by Boris Polyak and Elena Gryazina
Poster "An Accelerated Directional Derivative Method for Smooth Stochastic Convex Optimization"
Voronovo, Russia, 10-15 June, 2018
My work was also selected for a talk, which won third prize in the competition for the best talks among participants.
[slides of the talk]

[1] KAUST Research Workshop on Optimization and Big Data
Poster "Stochastic Spectral Descent Methods"
Dmitry Kovalev, Eduard Gorbunov, Elnur Gasanov, Peter Richtárik
KAUST, Thuwal, KSA, 5 - 7 February, 2018