Multivariate Bayesian Inversion for Classification and Regression


Abstract

We propose a statistical modelling approach to supervised learning (i.e. predicting labels from features) as an alternative to algorithmic machine learning (ML). The approach is demonstrated by employing a multivariate general linear model (MGLM) describing the effects of labels on features, possibly accounting for covariates of no interest, in combination with prior distributions on the model parameters. ML "training" is translated into estimating the MGLM parameters via Bayesian inference, and ML "testing" or application is translated into Bayesian model comparison – a reciprocal relationship we refer to as multivariate Bayesian inversion (MBI). We devise MBI algorithms for the standard cases of supervised learning, discrete classification and continuous regression, derive novel classification rules and regression predictions, and use practical examples (simulated and real data) to illustrate the benefits of the statistical modelling approach: interpretability, incorporation of prior knowledge, and probabilistic predictions. We close by discussing further advantages, disadvantages and the future potential of MBI.
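The abstract describes a two-step recipe: "training" estimates the parameters of a generative model of features given labels, and "testing" compares label hypotheses via Bayes' rule to obtain probabilistic predictions. As a minimal sketch of this general idea (not the article's actual MGLM/MBI algorithm, whose priors and model evidence computations are defined in the full text), consider a simple Gaussian classifier where prediction is posterior model comparison over the two possible labels:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two classes whose labels shift the feature means
# (a toy stand-in for "effects of labels on features").
n, d = 100, 2
y_train = rng.integers(0, 2, n)
true_means = np.array([[0.0, 0.0], [2.0, 2.0]])
X_train = true_means[y_train] + rng.standard_normal((n, d))

# "Training": estimate class-conditional means and a shared covariance
est_means = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
resid = X_train - est_means[y_train]
cov = resid.T @ resid / n
cov_inv = np.linalg.inv(cov)

def predict_proba(x, prior=(0.5, 0.5)):
    """Posterior label probabilities via Bayes' rule (model comparison)."""
    # log-likelihood of x under each class-conditional Gaussian
    ll = np.array([-0.5 * (x - m) @ cov_inv @ (x - m) for m in est_means])
    log_post = ll + np.log(prior)
    log_post -= log_post.max()  # shift for numerical stability
    p = np.exp(log_post)
    return p / p.sum()

p = predict_proba(np.array([2.0, 2.0]))
print(p)  # posterior label probabilities; class 1 should dominate here
```

The output is a probability over labels rather than a hard decision, illustrating the "probabilistic predictions" the abstract highlights; the `prior` argument shows where prior knowledge about label frequencies enters.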
