Research Interests

My research interests include Bayesian machine learning, deep learning, stochastic algorithms, and generative models. Recently, I have been working on scalable Bayesian inference with theoretical guarantees and its applications in Bayesian deep learning.

Conference Papers

AMAGOLD: Amortized Metropolis Adjustment for Efficient Stochastic Gradient MCMC

Ruqi Zhang, A. Feder Cooper, Christopher De Sa
AISTATS, 2020 [Coming soon]

Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees (Spotlight)

Ruqi Zhang, Christopher De Sa
NeurIPS, 2019 [Paper] [Code] [Poster] [Slides]

Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning (Oral)

Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson
ICLR, 2020 [Paper] [Code]

Large Scale Sparse Clustering

Ruqi Zhang, Zhiwu Lu
IJCAI, 2016 [Paper]

Workshop Papers

Meta-Learning for Variational Inference

Ruqi Zhang, Yingzhen Li, Christopher De Sa, Sam Devlin, Cheng Zhang
Symposium on Advances in Approximate Bayesian Inference (AABI), 2019 [Paper]

Contact

Email: rz297@cornell.edu