About Me
I am a postdoctoral researcher at MIT, where I work with Ali Jadbabaie and Suvrit Sra on optimization algorithms for problems in theoretical computer science, machine learning, and control theory. I completed my Ph.D. in 2023 at the University of Washington, Seattle, where I was advised by Yin Tat Lee. [dissertation]
During my Ph.D., I interned with Jelena Diakonikolas at the University of Wisconsin, Madison; with Di Wang and Zhe Feng (Google Market Algorithms); and with Richard Q. Zhang and David P. Woodruff (DeepMind).
Before moving into optimization research, I studied electronics and electrical engineering, with research on rapid diagnostic tests for malaria, and spent some time in industry as a signal processing engineer working on pulse oximeters and Bluetooth speakers.
Publications
Improved Sample Complexity of Imitation Learning for Barrier Model Predictive Control [arXiv]
Daniel Pfrommer*, Swati Padmanabhan*, Kwangjun Ahn, Jack Umenberger, Tobia Marcucci, Zakaria Mhammedi, Ali Jadbabaie
On the Hardness of Meaningful Local Guarantees in Nonsmooth Nonconvex Optimization [arXiv]
Guy Kornowski*, Swati Padmanabhan*, Ohad Shamir
Under review at Mathematics of Operations Research
First-Order Methods for Linearly Constrained Bilevel Optimization [arXiv]
Guy Kornowski*, Swati Padmanabhan*, Kai Wang*, Zhe Zhang*, Suvrit Sra
Neural Information Processing Systems (NeurIPS) 2024 (to appear)
On the Sample Complexity of Imitation Learning for Smoothed Model Predictive Control [arXiv]
Daniel Pfrommer*, Swati Padmanabhan*, Kwangjun Ahn, Jack Umenberger, Tobia Marcucci, Zakaria Mhammedi, Ali Jadbabaie
Conference on Decision and Control (CDC) 2024 (to appear)
Improving the Bit Complexity of Communication for Distributed Convex Optimization [arXiv]
with Mehrdad Ghadiri, Yin Tat Lee, William Swartworth, David P. Woodruff, and Guanghao Ye
Symposium on Theory of Computing (STOC) 2024 [talk]
Computing Approximate $\ell_p$ Sensitivities [arXiv], [paper]
with David P. Woodruff and Qiuyi Zhang
Neural Information Processing Systems (NeurIPS) 2023 [slides]
Online Bidding Algorithms for Return-on-Spend Constrained Advertisers [arXiv]
with Zhe Feng and Di Wang
The Web Conference (formerly WWW) 2023 [slides]
A Fast Scale-Invariant Algorithm for Non-negative Least Squares with Non-negative Data [arXiv]
with Jelena Diakonikolas, Chenghui Li, and Chaobing Song
Neural Information Processing Systems (NeurIPS) 2022
A Gradient Sampling Method with Complexity Guarantees for Lipschitz Functions in High and Low Dimensions [arXiv], [paper]
with Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, and Guanghao Ye
Neural Information Processing Systems (NeurIPS) 2022, Selected for Oral Presentation [slides]
Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity [arXiv]
with Sally Dong, Haotian Jiang, Yin Tat Lee, and Guanghao Ye
Neural Information Processing Systems (NeurIPS) 2022
Computing Lewis Weights to High Precision [arXiv]
with Maryam Fazel, Yin Tat Lee, and Aaron Sidford
Symposium on Discrete Algorithms (SODA) 2022 [slides]
A Faster Interior-Point Method for Semidefinite Programming [arXiv]
with Haotian Jiang, Tarun Kathuria, Yin Tat Lee, and Zhao Song
Foundations of Computer Science (FOCS) 2020 [talk]
An $O(m/\epsilon^{3.5})$ Algorithm for Semidefinite Programs with Diagonal Constraints [arXiv]
with Yin Tat Lee
Conference on Learning Theory (COLT) 2020 [talk]
Manuscripts
Positive Semidefinite Programming: Mixed, Parallel, and Width-Independent [arXiv]
with Arun Jambulapati, Yin Tat Lee, Jerry Li, and Kevin Tian
[5min talk | 20min talk]
This paper appeared in STOC 2020. However, after publication, we found an error in the analysis (see the arXiv version), which we are working to fix.