NeurIPS (2. Dec – 9. Dec 2018)
I am attending NeurIPS, presenting two posters in the main track and one poster at the PPML workshop.
Microsoft Research (21. Oct – 16. Nov 2018)
I have visited Lin Xiao in Seattle for almost a month. During the stay, I gave a talk at Microsoft Research on the following paper: Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization
INFORMS Annual Meeting (3. Nov – 7. Nov 2018)
I have attended the INFORMS Annual Meeting in Phoenix, where I organized a session with the following speakers: Naman Agarwal and Majid Jahani. I also gave a talk on Accelerated Coordinate Descent with Arbitrary Sampling and Best Rates for Minibatches (joint work with Peter Richtárik).
Paper accepted to NIPS 2018 Privacy Preserving Machine Learning workshop (31. Oct 2018)
Title: A Privacy Preserving Randomized Gossip Algorithm via Controlled Noise Insertion, joint work with Jakub Konečný, Nicolas Loizou, Peter Richtárik and Dmitry Grishchenko
NIPS Travel award ($1500) (Oct 2018)
I have received a travel award to attend NIPS (2–8 Dec 2018) in Montreal.
Paper out (10. Aug 2018)
Title: Accelerated Bregman Proximal Gradient Methods for Relatively Smooth Convex Optimization, joint work with Peter Richtárik and Lin Xiao
Amazon Research (15. Jun – 30. Sep 2018)
I have done a 3.5-month internship with an Amazon research team in Berlin, under Rodolphe Jenatton. It was also my pleasure to work with Matthias Seeger and Cedric Archambeau. This is how the Amazon stock dropped after I left:
3 papers out (May 2018)
Title: A Nonconvex Projection Method for Robust PCA, joint work with Aritra Dutta and Peter Richtárik
Title: Accelerated Coordinate Descent with Arbitrary Sampling and Best Rates for Minibatches, joint work with Peter Richtárik
Title: SEGA: Variance Reduction via Gradient Sketching, joint work with Konstantin Mishchenko and Peter Richtárik
Microsoft Research (Mar 2018)
I am visiting Lin Xiao for a week. We are working on the following project: Accelerated Relative Gradient Descent
INFORMS Conference on Optimization (Mar 2018)
I am attending the INFORMS Optimization conference (Denver, CO) and chairing one session. Talk title: Randomized and Accelerated Algorithms for Minimizing Relatively Smooth Functions.
New paper out (Feb 2018)
Title: Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization, joint work with Robert Gower, Sebastian Stich and Peter Richtárik
Optimization and Big Data (Feb 2018)
I am attending the OBD conference, giving a short talk and presenting a poster on the following topic: Randomized and Accelerated Algorithms for Minimizing Relatively Smooth Functions
End of Semester (Dec 2017)
I have passed KAUST qualifying exams, consisting of the following courses: Probability and Statistics, Numerical Linear Algebra and Partial Differential Equations.
OMS conference (Dec 2017)
I am attending the Optimization Methods and Software conference in Cuba. I am co-organizing one minisymposium and giving a talk on: Randomized and Accelerated Algorithms for Minimizing Relatively Smooth Functions
KAUST (Sep 2017 – ???)
I am joining KAUST. Officially, I am starting a fresh PhD; practically, I am transferring my PhD from the University of Edinburgh.