Hi, I'm Hunter Lang. Starting in Fall 2020, I will be a PhD student at MIT, advised by David Sontag. You can reach me at hjl at mit dot edu. I'm interested in approximate inference, stochastic optimization, and weak supervision.

Preprints:

Statistical adaptive stochastic gradient methods (in submission).
P. Zhang, HL, Q. Liu, L. Xiao.

Publications:

Using statistics to automate stochastic optimization (NeurIPS 2019).
HL, P. Zhang, L. Xiao.

Understanding the role of momentum in stochastic gradient methods (NeurIPS 2019).
I. Gitman, HL, P. Zhang, L. Xiao.

Block stability for MAP inference (AISTATS 2019, oral presentation).
HL, A. Vijayaraghavan, D. Sontag.

Optimality of approximate inference algorithms on stable instances (AISTATS 2018).
HL, A. Vijayaraghavan, D. Sontag.

Perturbation stability for approximate MAP inference (M.Eng. thesis, 2019, pdf).
HL.