XBART: Accelerated Bayesian Additive Regression Trees

Research output: Chapters, Conference Papers, Creative and Literary Works (RGC: 12, 32, 41, 45), 32_Refereed conference paper (with ISBN/ISSN)


Author(s)

He, Jingyu; Yalov, Saar; Hahn, P. Richard

Detail(s)

Original language: English
Title of host publication: Proceedings of Machine Learning Research
Pages: 1130-1138
Volume: 89
Publication status: Published - Apr 2019
Externally published: Yes

Conference

Title: 22nd International Conference on Artificial Intelligence and Statistics
Location: Naha
Place: Japan
City: Naha
Period: 16 - 18 April 2019

Abstract

Bayesian additive regression trees (BART) (Chipman et al., 2010) is a powerful predictive model that often outperforms alternative models at out-of-sample prediction. BART is especially well-suited to settings with unstructured predictor variables and substantial sources of unmeasured variation, as is typical in the social, behavioral and health sciences. This paper develops a modified version of BART that is amenable to fast posterior estimation. We present a stochastic hill climbing algorithm that matches the remarkable predictive accuracy of previous BART implementations, but is many times faster and less memory-intensive. Simulation studies show that the new method is comparable in computation time and more accurate at function estimation than both random forests and gradient boosting.
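The additive "sum-of-trees" structure underlying BART (and XBART) can be sketched as follows. This is an illustrative toy only: the tree shapes, split thresholds, and leaf values below are hypothetical, and the actual XBART fitting procedure (the stochastic hill climbing algorithm over trees described in the abstract) is not shown, only how an already-fitted ensemble produces a prediction by summing many weak trees' outputs.

```python
from dataclasses import dataclass
from typing import Optional, List


@dataclass
class Node:
    """One node of a binary regression tree."""
    feature: Optional[int] = None   # split feature index; None marks a leaf
    threshold: float = 0.0          # split threshold for internal nodes
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    value: float = 0.0              # leaf value (the mu parameter in BART)


def tree_predict(node: Node, x: List[float]) -> float:
    """Route x down a single regression tree to its leaf value."""
    while node.feature is not None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.value


def ensemble_predict(trees: List[Node], x: List[float]) -> float:
    """BART-style prediction: the sum of each tree's contribution."""
    return sum(tree_predict(t, x) for t in trees)


# Two tiny hypothetical trees whose contributions add up.
t1 = Node(feature=0, threshold=0.5,
          left=Node(value=-1.0), right=Node(value=1.0))
t2 = Node(feature=1, threshold=0.0,
          left=Node(value=0.25), right=Node(value=-0.25))

print(ensemble_predict([t1, t2], [0.8, -0.3]))  # 1.0 + 0.25 = 1.25
```

Each tree is deliberately weak; predictive power comes from summing many of them, which is the structure XBART accelerates posterior estimation over.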

Citation Format(s)

XBART: Accelerated Bayesian Additive Regression Trees. / He, Jingyu; Yalov, Saar; Hahn, P. Richard.

Proceedings of Machine Learning Research. Vol. 89, 2019, p. 1130-1138.
