Thang Bui

I'm a lecturer (equivalent to tenure-track Assistant Professor) in Machine Learning and Data Science at the University of Sydney, Australia.
I'm currently on leave, spending time at Uber AI working on probabilistic modelling, approximate inference, continual learning, model-based reinforcement learning, and bandits.

I completed my doctoral training at the Cambridge Machine Learning group, supervised by Richard Turner and advised by Carl Rasmussen. I worked on Gaussian process models and generic approximation methods.

Preprints

Connecting the Thermodynamic Variational Objective and Annealed Importance Sampling

Thang Bui
note

Variational autoregressive Gaussian processes for continual learning

Sanyam Kapoor, Theofanis Karaletsos, Thang Bui
https://arxiv.org/abs/2006.05468

Partitioned Variational Inference: A unified framework encompassing federated and continual learning

Thang Bui, Cuong Nguyen, Siddharth Swaroop, Rich Turner
https://arxiv.org/abs/1811.11206

Conference and journal papers

Hierarchical Gaussian process priors for Bayesian neural network weights

Theofanis Karaletsos and Thang Bui
NeurIPS 2020
arxiv

Variational Continual Learning

Cuong Nguyen, Yingzhen Li, Thang Bui, and Rich Turner
ICLR 2018
arxiv

Neural graph learning: Training neural networks using graphs

Thang Bui, Sujith Ravi, and Vivek Ramavajjala
WSDM 2018
arxiv

A Unifying Framework for Sparse Gaussian Process Approximation using Power Expectation Propagation

Thang Bui, Josiah Yan, Rich Turner
JMLR 2017
arxiv code

Streaming Sparse Gaussian Process Approximations

Thang Bui, Cuong Nguyen, and Rich Turner
NIPS 2017
arxiv code

Deep Gaussian Processes for Regression using Approximate Expectation Propagation

Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, and Rich Turner
ICML 2016
paper code

Black-box alpha-divergence minimization

José Miguel Hernández-Lobato, Yingzhen Li, Mark Rowland, Daniel Hernández-Lobato, Thang Bui, and Rich Turner
ICML 2016
paper code

Learning stationary time series using Gaussian processes with nonparametric kernels

Felipe Tobar, Thang Bui, and Rich Turner
NIPS 2015 (Spotlight, acceptance rate = 3.6%)
paper

Tree-structured Gaussian process approximations

Thang Bui and Rich Turner
NIPS 2014 (Spotlight, acceptance rate = 3.6%)
paper code

Thesis

Efficient Deterministic Approximate Bayesian Inference for Gaussian Process Models

PhD thesis
University of Cambridge
pdf
typos in the print version

Workshop papers

Partitioned variational inference for federated Bayesian deep learning

Thang Bui, Cuong Nguyen, Siddharth Swaroop, and Rich Turner
NeurIPS Bayesian Deep Learning Workshop, 2018

Understanding and improving variational continual learning

Siddharth Swaroop, Cuong Nguyen, Thang Bui, and Rich Turner
NeurIPS Continual Learning Workshop, 2018

Natural variational continual learning

Hanna Tseran, Emtiyaz Khan, Tatsuya Harada, and Thang Bui
NeurIPS Continual Learning Workshop, 2018

Variational continual learning for deep models

Cuong Nguyen, Yingzhen Li, Thang Bui, and Rich Turner
NIPS Bayesian Deep Learning Workshop, 2017

Online variational Bayesian inference: Algorithms for sparse Gaussian processes and theoretical bounds

Cuong Nguyen, Thang Bui, Yingzhen Li, and Rich Turner
ICML Time Series Workshop, 2017

Importance weighted autoencoders with random neural network parameters

Daniel Hernández-Lobato, Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, and Rich Turner
NIPS Workshop on Bayesian Deep Learning, 2016

Black-box alpha divergence for generative models

Thang Bui, Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Yingzhen Li, and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2016

Circular Pseudo-point approximations for scaling Gaussian processes

Will Tebbutt, Thang Bui, and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2016

Bayesian Gaussian process state space models via Power-EP

Thang Bui, Carl Rasmussen, and Rich Turner
ICML Workshop on Data-Efficient Machine Learning, 2016

Training deep Gaussian processes using stochastic expectation propagation and probabilistic backpropagation

Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2015

Stochastic variational inference for Gaussian process latent variable models using back constraints

Thang Bui and Rich Turner
NIPS Workshop on Black Box Learning and Inference, 2015

Black-box alpha-divergence minimisation

José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Thang Bui, and Rich Turner
NIPS Workshops on Advances in Approximate Bayesian Inference and Black Box Learning and Inference, 2015

Stochastic expectation propagation for large scale Gaussian process classification

Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Yingzhen Li, Thang Bui, and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2015

Design of covariance functions using inter-domain inducing variables

Felipe Tobar, Thang Bui, and Rich Turner
NIPS Workshop on Time Series, 2015 (Best paper prize)

Misc

Sparse Approximations for Non-Conjugate Gaussian Process Regressions

Thang Bui and Rich Turner
report

Other things

Reviewer for JMLR (2016, 2017, 2018), NIPS (2016-20), ICLR (2017-21), ICML (2017-20), AISTATS (2018-21), UAI (2018) and various NIPS and ICML workshops

Contact

thang.buivn at gmail.com