Stochastic tree ensembles for regularized nonlinear regression

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

10 Scopus Citations

Author(s)

He, Jingyu; Hahn, P. Richard

Related Research Unit(s)

Detail(s)

Original language: English
Pages (from-to): 551–570
Journal / Publication: Journal of the American Statistical Association
Volume: 118
Issue number: 541
Online published: 14 Jun 2021
Publication status: Published - Mar 2023

Abstract

This article develops a novel stochastic tree ensemble method for nonlinear regression, referred to as accelerated Bayesian additive regression trees, or XBART. By combining regularization and stochastic search strategies from Bayesian modeling with computationally efficient techniques from recursive partitioning algorithms, XBART attains state-of-the-art performance at prediction and function estimation. Simulation studies demonstrate that XBART provides accurate point-wise estimates of the mean function and does so faster than popular alternatives, such as BART, XGBoost, and neural networks (using Keras) on a variety of test functions. Additionally, it is demonstrated that using XBART to initialize the standard BART MCMC algorithm considerably improves credible interval coverage and reduces total run-time. Finally, two basic theoretical results are established: the single tree version of the model is asymptotically consistent and the Markov chain produced by the ensemble version of the algorithm has a unique stationary distribution.
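As a concrete illustration of the kind of simulation comparison described in the abstract, the short Python sketch below simulates data from Friedman's test function, a standard nonlinear regression benchmark, and evaluates one of the named baselines, XGBoost, against the true mean function. The test function, sample sizes, and hyperparameters are illustrative assumptions, not the settings used in the article, and XBART itself is omitted here because its interface is not described in this record.

    # Illustrative benchmark sketch (not the authors' code): simulate a nonlinear
    # test function and fit one of the baselines named in the abstract (XGBoost).
    import numpy as np
    from xgboost import XGBRegressor
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)

    def friedman(X):
        # Friedman's test function: only 5 of the covariates affect the mean.
        return (10 * np.sin(np.pi * X[:, 0] * X[:, 1]) + 20 * (X[:, 2] - 0.5) ** 2
                + 10 * X[:, 3] + 5 * X[:, 4])

    n, p = 5000, 10
    X_train = rng.uniform(size=(n, p))
    X_test = rng.uniform(size=(n, p))
    y_train = friedman(X_train) + rng.normal(scale=1.0, size=n)

    # Gradient-boosted trees as a point-estimation baseline for the mean function.
    model = XGBRegressor(n_estimators=500, max_depth=4, learning_rate=0.05)
    model.fit(X_train, y_train)

    # Accuracy of the estimated mean function at held-out covariate values.
    rmse = mean_squared_error(friedman(X_test), model.predict(X_test)) ** 0.5
    print(f"XGBoost RMSE against the true mean function: {rmse:.3f}")

In the article's simulation studies, XBART's pointwise estimates of the mean function are compared against baselines of this kind (BART, XGBoost, and Keras neural networks) on several such test functions.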

Research Area(s)

  • Machine learning, Markov chain Monte Carlo, Regression trees, Supervised learning, Bayesian

Citation Format(s)

Stochastic tree ensembles for regularized nonlinear regression. / He, Jingyu; Hahn, P. Richard.
In: Journal of the American Statistical Association, Vol. 118, No. 541, 03.2023, p. 551–570.
