Preprints
Conference and journal papers
Grokking beyond neural networks: An empirical exploration with model complexity. Jack Miller, Charles O'Neill, and Thang Bui. TMLR 2024.
q-Paths: Generalizing the geometric annealing path using power means. Vaden Masrani, Rob Brekelmans, Thang Bui, Frank Nielsen, Aram Galstyan, Greg Ver Steeg, and Frank Wood. UAI 2021.
Variational autoregressive Gaussian processes for continual learning. Sanyam Kapoor, Theofanis Karaletsos, and Thang Bui. ICML 2021. code
Hierarchical Gaussian process priors for Bayesian neural network weights. Theofanis Karaletsos and Thang Bui. NeurIPS 2020.
Variational continual learning. Cuong Nguyen, Yingzhen Li, Thang Bui, and Rich Turner. ICLR 2018. code
Neural graph learning: Training neural networks using graphs. Thang Bui, Sujith Ravi, and Vivek Ramavajjala. WSDM 2018.
A unifying framework for sparse Gaussian process approximations using Power Expectation Propagation. Thang Bui, Josiah Yan, and Rich Turner. JMLR 2017. code
Streaming sparse Gaussian process approximations. Thang Bui, Cuong Nguyen, and Rich Turner. NIPS 2017. code
Deep Gaussian processes for regression using approximate Expectation Propagation. Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, and Rich Turner. ICML 2016. code
Black-box alpha-divergence minimization. José Miguel Hernández-Lobato, Yingzhen Li, Mark Rowland, Daniel Hernández-Lobato, Thang Bui, and Rich Turner. ICML 2016. code
Learning stationary time series using Gaussian processes with nonparametric kernels. Felipe Tobar, Thang Bui, and Rich Turner. NIPS 2015 (Spotlight, acceptance rate = 3.6%).
Tree-structured Gaussian process approximations. Thang Bui and Rich Turner. NIPS 2014 (Spotlight, acceptance rate = 3.6%). code
Thesis
Workshop papers
AstroLLaMA: Towards specialised foundation models in Astronomy. Tuan Dung Nguyen, Yuan-Sen Ting, Ioana Ciuca, et al. ACL Workshop on Information Extraction from Scientific Publications, 2023.
Annealed importance sampling with q-paths. Rob Brekelmans, Vaden Masrani, Thang Bui, Frank Wood, Aram Galstyan, Greg Ver Steeg, and Frank Nielsen. NeurIPS Workshop on Deep Learning through Information Geometry, 2020. Best paper award.
Variational autoregressive Gaussian processes for continual learning. Sanyam Kapoor, Theofanis Karaletsos, and Thang Bui. ICML Workshop on Continual Learning, 2020.
Gaussian process meta-representations for hierarchical neural network priors. Theofanis Karaletsos and Thang Bui. 2nd Symposium on Advances in Approximate Bayesian Inference, 2019.
Partitioned variational inference for federated Bayesian deep learning. Thang Bui, Cuong Nguyen, Siddharth Swaroop, and Rich Turner. NeurIPS Bayesian Deep Learning Workshop, 2018.
Understanding and improving variational continual learning. Siddharth Swaroop, Cuong Nguyen, Thang Bui, and Rich Turner. NeurIPS Continual Learning Workshop, 2018.
Natural variational continual learning. Hanna Tseran, Emtiyaz Khan, Tatsuya Harada, and Thang Bui. NeurIPS Continual Learning Workshop, 2018.
Variational continual learning for deep models. Cuong Nguyen, Yingzhen Li, Thang Bui, and Rich Turner. NIPS Bayesian Deep Learning Workshop, 2017.
Online variational Bayesian inference: Algorithms for sparse Gaussian processes and theoretical bounds. Cuong Nguyen, Thang Bui, Yingzhen Li, and Rich Turner. ICML Time Series Workshop, 2017.
Importance weighted autoencoders with random neural network parameters. Daniel Hernández-Lobato, Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, and Rich Turner. NIPS Workshop on Bayesian Deep Learning, 2016.
Black-box alpha divergence for generative models. Thang Bui, Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Yingzhen Li, and Rich Turner. NIPS Workshop on Advances in Approximate Bayesian Inference, 2016.
Circular pseudo-point approximations for scaling Gaussian processes. Will Tebbutt, Thang Bui, and Rich Turner. NIPS Workshop on Advances in Approximate Bayesian Inference, 2016.
Bayesian Gaussian process state space models via Power-EP. Thang Bui, Carl Rasmussen, and Rich Turner. ICML Workshop on Data Efficient Machine Learning, 2016.
Training deep Gaussian processes using stochastic expectation propagation and probabilistic backpropagation. Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, and Rich Turner. NIPS Workshop on Advances in Approximate Bayesian Inference, 2015.
Stochastic variational inference for Gaussian process latent variable models using back constraints. Thang Bui and Rich Turner. NIPS Workshop on Black Box Learning and Inference, 2015.
Black-box alpha-divergence minimisation. José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Thang Bui, and Rich Turner. NIPS Workshops on Advances in Approximate Bayesian Inference and Black Box Learning and Inference, 2015.
Stochastic expectation propagation for large scale Gaussian process classification. Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Yingzhen Li, Thang Bui, and Rich Turner. NIPS Workshop on Advances in Approximate Bayesian Inference, 2015.
Design of covariance functions using inter-domain inducing variables. Felipe Tobar, Thang Bui, and Rich Turner. NIPS Workshop on Time Series, 2015. Best paper award.
Misc