Papers

99% of Parallel Optimization is Inevitably a Waste of Time
Konstantin Mishchenko, Filip Hanzely, Peter Richtárik
Submitted, 2019

Accelerated Bregman Proximal Gradient Methods for Relatively Smooth Convex Optimization
Filip Hanzely, Peter Richtárik, Lin Xiao
Submitted, 2018

Slides

SEGA: Variance Reduction via Gradient Sketching 
Filip Hanzely, Konstantin Mishchenko, Peter Richtárik
NIPS 2018
Poster

Accelerated Coordinate Descent with Arbitrary Sampling and Best Rates for Minibatches 
Filip Hanzely, Peter Richtárik
Submitted, 2018

Slides

A Nonconvex Projection Method for Robust PCA
Aritra Dutta, Filip Hanzely, Peter Richtárik
AAAI 2019 (oral)

Slides, Poster

Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization
Robert Gower, Filip Hanzely, Sebastian Stich, Peter Richtárik
NIPS 2018
Poster

Fastest Rates for Stochastic Mirror Descent
Filip Hanzely, Peter Richtárik
Submitted, 2018

Slides

Privacy Preserving Randomized Gossip Algorithms via Controlled Noise Insertion
Filip Hanzely, Jakub Konečný, Nicolas Loizou, Peter Richtárik, Dmitry Grishchenko
Accepted to the NIPS 2018 Privacy Preserving Machine Learning Workshop

Poster

Privacy Preserving Randomized Gossip Algorithms
Filip Hanzely, Jakub Konečný, Nicolas Loizou, Peter Richtárik, Dmitry Grishchenko
Submitted, 2017

Slides

Testing for Causality in Reconstructed State Spaces by Optimized Mixed Prediction Method
Anna Krakovská, Filip Hanzely
Physical Review E, 2016