High-Dimensional Bayesian Regularised Regression with the BayesReg Package

In conjunction with Enes Makalic, I have recently finished writing MATLAB and R code implementing efficient, high-dimensional Bayesian regression with continuous shrinkage priors. The package is flexible, fast and highly numerically stable, particularly for the horseshoe and horseshoe+ priors, whose heavy-tailed prior distributions cause problems for most other implementations. It supports the following data models:

  1. Gaussian (“L2 errors”)
  2. Laplace (“L1 errors”)
  3. Student-t (very heavy tails)
  4. Logistic regression (binary data)

It also supports a range of state-of-the-art continuous shrinkage priors to handle different underlying regression model structures:

  1. Ridge regression (“L2” shrinkage/regularisation)
  2. LASSO regression (“L1” shrinkage/regularisation)
  3. Horseshoe regression (global-local shrinkage for sparse models)
  4. Horseshoe+ regression (global-local shrinkage for ultra-sparse models)

The MATLAB code for Version 1.2 of the package can be downloaded here, and the R code can be obtained from CRAN under the package name “bayesreg”. The R package can also be installed from within R with the command install.packages("bayesreg"). If you use the package and wish to cite it in your work, please use the reference below.
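
As a quick illustration, below is a minimal R sketch of installing the package from CRAN and fitting a horseshoe regression to simulated sparse data. The argument names (model, prior) follow the package documentation, but treat this as a sketch and check ?bayesreg for the exact interface in your installed version.

    # Minimal sketch: install bayesreg from CRAN and fit a sparse regression.
    # Argument names assumed from the package documentation; see ?bayesreg.
    install.packages("bayesreg")
    library(bayesreg)

    # Simulate a sparse problem: 100 predictors, only the first 5 are non-zero
    set.seed(1)
    n <- 50; p <- 100
    X <- matrix(rnorm(n * p), n, p)
    beta <- c(rep(2, 5), rep(0, p - 5))
    y <- as.vector(X %*% beta + rnorm(n))
    df <- data.frame(y = y, X)

    # Gaussian likelihood ("L2 errors") with a horseshoe prior for sparsity
    fit <- bayesreg(y ~ ., data = df, model = "normal", prior = "hs")
    summary(fit)

The other data models and shrinkage priors listed above are selected in the same way, by changing the model and prior arguments.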


References

  1. “High-Dimensional Bayesian Regularised Regression with the BayesReg Package”, E. Makalic and D. F. Schmidt, arXiv:1611.06649 [stat.CO], 2016