Research Interests

I am interested in building scalable, reliable, and efficient probabilistic models for machine learning and data science. My current focus is on developing fast and robust inference methods with theoretical guarantees, and on applying them with deep neural networks to real-world, large-scale data.

Conference Papers

Meta-Learning Divergences for Variational Inference

Ruqi Zhang, Yingzhen Li, Christopher De Sa, Sam Devlin, Cheng Zhang
Artificial Intelligence and Statistics (AISTATS), 2021
[Coming soon]

Asymptotically Optimal Exact Minibatch Metropolis-Hastings

Ruqi Zhang, A. Feder Cooper, Christopher De Sa
Neural Information Processing Systems (NeurIPS), 2020
Spotlight, acceptance rate 2.96%
[Paper] [Code] [Video] [Slides]

AMAGOLD: Amortized Metropolis Adjustment for Efficient Stochastic Gradient MCMC

Ruqi Zhang, A. Feder Cooper, Christopher De Sa
Artificial Intelligence and Statistics (AISTATS), 2020
[Paper] [Code]

Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning

Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson
International Conference on Learning Representations (ICLR), 2020
Oral, acceptance rate 1.85%
[Paper] [Code] [Video] [Slides]

Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees

Ruqi Zhang, Christopher De Sa
Neural Information Processing Systems (NeurIPS), 2019
Spotlight, acceptance rate 2.43%
[Paper] [Code] [Poster] [Slides]

Large Scale Sparse Clustering

Ruqi Zhang, Zhiwu Lu
International Joint Conference on Artificial Intelligence (IJCAI), 2016
[Paper]

Workshop Papers

Meta-Learning for Variational Inference

Ruqi Zhang, Yingzhen Li, Christopher De Sa, Sam Devlin, Cheng Zhang
Symposium on Advances in Approximate Bayesian Inference (AABI), 2019
[Paper]

Talks

Asymptotically Optimal Exact Minibatch Metropolis-Hastings

Spotlight talk at the Rising Stars in Data Science Workshop, University of Chicago, January 2021
Spotlight presentation at NeurIPS, December 2020 [Video] [Slides]

Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning

Oral presentation at ICLR, April 2020 [Video] [Slides]

Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees

Spotlight presentation at NeurIPS, December 2019 [Slides]

Contact

Email: rz297@cornell.edu