This page contains links to several software tools/packages I have written. This code is provided as-is, with no guarantees. If you do happen to find any bugs, please let me know!


High-Dimensional Bayesian Regularised Regression with the BayesReg Package

MATLAB and R code implementing efficient, high-dimensional Bayesian regression with continuous shrinkage priors. The package is flexible, fast and highly numerically stable, particularly for the horseshoe/horseshoe+ priors, whose heavy tails cause problems for many other implementations.

In addition to efficiency, we have also attempted to make the software as user-friendly as possible. Most features are straightforward to use, and the toolbox can work directly with MATLAB tables (including automatic handling of categorical variables) or with standard MATLAB matrices.
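As a hedged illustration, a typical call might look like the sketch below. The argument pattern follows the toolbox's documentation, but option names may differ between versions, so check the bundled example scripts for the exact interface:

```matlab
% Sketch: horseshoe regression with Gaussian errors on a data matrix X
% and response vector y. Option names ('nsamples', 'burnin') are
% assumptions based on the documented interface; consult the toolbox's
% own examples for your version.
[beta, beta0, retval] = bayesreg(X, y, 'gaussian', 'hs', ...
                                 'nsamples', 1000, 'burnin', 1000);
```

As described above, a MATLAB table can be passed in place of the matrix X, in which case categorical variables are handled automatically.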

We have just released Version 1.6, which adds the ability to sample the coefficients in blocks, allowing the code to be applied to problems with very large numbers of predictors; an efficient MATLAB implementation of the logistic regression sampler; and a new parallelised C++ implementation of the logistic sampling algorithms. Using the compiled C++ code can result in substantial speed improvements and is recommended.

Precompiled MEX files for Windows, Linux and macOS can be obtained here. To use these, simply download the file and unzip its contents into the “bayesreg” folder.

Key features of BayesReg:

  1. Gaussian errors (“L2”)
  2. Laplace errors (“L1”)
  3. Student-t errors (very heavy tails)
  4. Logistic regression (binary data)
  5. Ridge regression (“L2” shrinkage/regularisation)
  6. LASSO regression (“L1” shrinkage/regularisation)
  7. Horseshoe regression (global-local shrinkage for sparse models)
  8. Horseshoe+ regression (global-local shrinkage for ultra-sparse models)
  9. User-friendly, robust code that directly supports MATLAB tables
  10. A flexible prediction function that also provides prediction statistics
  11. Support for categorical predictor variables
  12. Multiple groupings of variables into potentially overlapping groups via the ‘groups’ option (MATLAB only)
  13. Grouping works with the HS, HS+ and lasso priors (MATLAB only)
  14. Ability to sample the coefficients in blocks, so the code can be applied to very high-dimensional problems even on standard PCs
  15. Reporting of goodness-of-fit statistics, including the WAIC
  16. Extensive example code demonstrating the main features of the toolbox

The MATLAB code for Version 1.5 of the package can be downloaded from the MATLAB File Exchange, and the R code can be obtained from CRAN under the package name “bayesreg”. The R package can also be installed from within R using the command install.packages("bayesreg").

If you use the package, and wish to cite it in your work, please use the reference below. If you find this package useful, it would be great if you could leave a comment or rating on the MathWorks File Exchange page.

Previous Releases:

Version 1.4

Version 1.3

Version 1.2

Version 1.1


  1. “High-Dimensional Bayesian Regularised Regression with the BayesReg Package”, E. Makalic and D. F. Schmidt, arXiv:1611.06649 [stat.CO], 2016


Robust lasso regression with Student-t residuals

This MATLAB code implements lasso-based estimation of linear models in which the residuals follow a Student-t distribution, using the expectation-maximisation (EM) algorithm. By varying the degrees-of-freedom parameter of the Student-t likelihood, the model can be made more resistant to outlying observations.
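The EM approach exploits the fact that a Student-t distribution is a scale mixture of Gaussians, so the E-step reduces to computing per-observation weights that shrink the influence of outliers. A minimal sketch of that standard weight update (this is the textbook scale-mixture result, not code from the toolbox; nu, sigma and beta denote current parameter estimates):

```matlab
% E-step weights for Student-t residuals (standard scale-mixture result):
% observations with large residuals receive small weights, which is what
% makes the fit resistant to outliers.
e = y - X*beta;                          % residuals at current estimate
w = (nu + 1) ./ (nu + (e./sigma).^2);    % conditional expectations of the
                                         % latent inverse scales
% The M-step then solves a weighted lasso problem using these weights.
```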

The software has the following features:

  1. Automatic generation of complete lasso regularisation paths for a given degrees-of-freedom value
  2. Selection of the lasso regularisation parameter and degrees-of-freedom by either cross-validation or information criteria

The code is straightforward to run, is efficient, and comes with several examples that recreate the analyses from the paper below. To cite this toolbox, please use the reference below.

The code can be obtained from MathWorks File Exchange.



  1. “Robust Lasso Regression with Student-t Residuals”, D. F. Schmidt and E. Makalic, Lecture Notes in Artificial Intelligence, Vol. 10400, pp. 365–374, 2017


Bayesian LASSO for estimation of stationary autoregressive models

This MATLAB code implements the Bayesian LASSO sampling hierarchy for inference of autoregressive models from an observed time series. The idea behind the approach is to place Laplace prior distributions over the partial autocorrelations of an AR(k) model, which leads to a relatively simple Gibbs sampling scheme and guarantees stationarity. Both empirical Bayes and fully Bayesian estimation of the shrinkage hyperparameter are available.
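The stationarity guarantee comes from the one-to-one map between the AR coefficients and the partial autocorrelations: any vector of partials, each in (-1, 1), corresponds to a stationary AR(k) model. A minimal sketch of that map, using the standard Levinson-Durbin step-up recursion (the function name is mine for illustration, not from the toolbox):

```matlab
function a = pacf2ar(phi)
% Convert a column vector of partial autocorrelations phi (each in (-1,1))
% into AR coefficients via the Levinson-Durbin step-up recursion.
% Any such phi yields a stationary AR(k) model.
    a = phi(1);
    for j = 2:length(phi)
        a = [a - phi(j)*a(end:-1:1); phi(j)];
    end
end
```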

Once downloaded and extracted, all three folders and subfolders should be added to the MATLAB path. [code]



  1. "Estimation of Stationary Autoregressive Models with the Bayesian LASSO", D. F. Schmidt and E. Makalic, Journal of Time Series Analysis, Vol. 34, No. 5, pp. 517--531, 2013

