Research Interests

My research interests include Bayesian machine learning, deep learning, stochastic algorithms, and generative models. Recently, I have been working on scalable Bayesian inference with theoretical guarantees and on Bayesian deep learning.

Conference Papers

AMAGOLD: Amortized Metropolis Adjustment for Efficient Stochastic Gradient MCMC

Ruqi Zhang, A. Feder Cooper, Christopher De Sa
AISTATS, 2020 [Paper] [Code]

Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning (Oral)

Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson
ICLR, 2020 [Paper] [Code]

Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees (Spotlight)

Ruqi Zhang, Christopher De Sa
NeurIPS, 2019 [Paper] [Code] [Poster] [Slides]

Large-Scale Sparse Clustering

Ruqi Zhang, Zhiwu Lu
IJCAI, 2016 [Paper]

Workshop Papers

Meta-Learning for Variational Inference

Ruqi Zhang, Yingzhen Li, Christopher De Sa, Sam Devlin, Cheng Zhang
Symposium on Advances in Approximate Bayesian Inference (AABI), 2019 [Paper]

Talks

Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning

Oral presentation at ICLR, April 2020
[Video]

Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees

Spotlight presentation at NeurIPS, December 2019
[Slides]

Contact

Email: rz297@cornell.edu