Machine Learning Regression Marginal Effect Estimation: Extrapolation and Efficiency
Presented by Rodney Sparapani
Abstract: Machine learning regression techniques like deep learning and ensembles of trees currently offer the best known out-of-sample predictive performance. However, these methods can be viewed as black-box models, i.e., models with so many parameters and so much internal complexity that their meaning can only be gleaned from their predictions. This has sparked considerable interest in explaining such predictions via marginal effects. In this talk, I will focus on two popular marginal effect methods: Friedman's partial dependence function and Shapley values. These approaches apply broadly across machine learning and nonparametric regression. Here, I will concentrate on one particular method, Bayesian Additive Regression Trees, but the results likely extend to the wider class of such models.
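As background for the talk, Friedman's partial dependence function averages a fitted model's predictions over the observed data while holding the feature of interest fixed at each grid value. The sketch below is a minimal illustration with a hypothetical toy model standing in for a fitted black-box regressor; the function names and data are assumptions, not part of the talk.

```python
import numpy as np

def partial_dependence(predict, X, j, grid):
    """Friedman's partial dependence of feature j: for each grid value v,
    fix column j at v in every row and average the model's predictions."""
    pd_vals = []
    for v in grid:
        Xv = X.copy()
        Xv[:, j] = v          # hold feature j fixed at v across the sample
        pd_vals.append(predict(Xv).mean())
    return np.array(pd_vals)

# Toy stand-in for a fitted black-box model: f(x) = 2*x0 + x0*x1
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
f = lambda X: 2 * X[:, 0] + X[:, 0] * X[:, 1]

grid = np.linspace(-2, 2, 5)
pd0 = partial_dependence(f, X, 0, grid)
# Since x1 averages near 0 here, the partial dependence of x0 is roughly 2*x0
```

The same averaging idea applies to any predictor, including posterior predictions from Bayesian Additive Regression Trees, where the average is taken over both the data and the posterior draws.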