About Me

I am a postdoctoral researcher at MIT, where I work with Ali Jadbabaie and Suvrit Sra on optimization algorithms for theoretical computer science and machine learning. I received my Ph.D. in 2023 from the University of Washington, Seattle, where I was advised by Yin Tat Lee. [dissertation]

During my Ph.D., I interned with Jelena Diakonikolas at the University of Wisconsin, Madison, with Di Wang and Zhe Feng (Google Market Algorithms), and with Richard Q. Zhang and David P. Woodruff (Google DeepMind Vizier).

Before starting research in optimization, I studied Electronics/Electrical Engineering, doing research on rapid diagnostic tests for malaria, and spent some time in industry as a signal processing engineer working on pulse oximeters and Bluetooth speakers.

Publications

Improving the Bit Complexity of Communication for Distributed Convex Optimization [arXiv]
with Mehrdad Ghadiri, Yin Tat Lee, William Swartworth, David P. Woodruff, and Guanghao Ye
Symposium on Theory of Computing (STOC) 2024

Computing Approximate $\ell_p$ Sensitivities [arXiv], [paper]
with David P. Woodruff and Qiuyi Zhang
Neural Information Processing Systems (NeurIPS) 2023 [slides]

Online Bidding Algorithms for Return-on-Spend Constrained Advertisers [arXiv]
with Zhe Feng and Di Wang
The Web Conference (formerly WWW) 2023 [slides]

A Fast Scale-Invariant Algorithm for Non-negative Least Squares with Non-negative Data [arXiv]
with Jelena Diakonikolas, Chenghui Li, and Chaobing Song
Neural Information Processing Systems (NeurIPS) 2022

A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions [arXiv], [paper]
with Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, and Guanghao Ye
Neural Information Processing Systems (NeurIPS) 2022, Selected for Oral Presentation [slides]

Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity [arXiv]
with Sally Dong, Haotian Jiang, Yin Tat Lee, and Guanghao Ye
Neural Information Processing Systems (NeurIPS) 2022

Computing Lewis Weights to High Precision [arXiv]
with Maryam Fazel, Yin Tat Lee, and Aaron Sidford
Symposium on Discrete Algorithms (SODA) 2022 [slides]

A Faster Interior-Point Method for Semidefinite Programming [arXiv]
with Haotian Jiang, Tarun Kathuria, Yin Tat Lee, and Zhao Song
Foundations of Computer Science (FOCS) 2020 [talk]

An $O(m/\epsilon^{3.5})$ Algorithm for Semidefinite Programs with Diagonal Constraints [arXiv]
with Yin Tat Lee
Conference on Learning Theory (COLT) 2020 [talk]

Manuscripts

Positive Semidefinite Programming: Mixed, Parallel, and Width-Independent [arXiv]
with Arun Jambulapati, Yin Tat Lee, Jerry Li, and Kevin Tian
[5min talk | 20min talk]
This paper appeared in STOC 2020. However, after publication, we found an error in the analysis (see the arXiv version), and we are working on a fix.