ABSTRACT
In this talk I will survey recent advances in improving the asymptotic running times of several fundamental problems in convex optimization. In particular, I will discuss key results on improving the theoretical performance of algorithms for regression, linear programming, and convex function minimization. Along the way, I will highlight the geometric and algorithmic insights underlying these results and discuss how they have been leveraged to improve the running times of core problems in combinatorial optimization, ranging from maximum flow to submodular function minimization and beyond.