
Least-angle regression


In statistics, least-angle regression (LARS) is an algorithm for fitting linear regression models to high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani.

Suppose we expect a response variable to be determined by a linear combination of a subset of potential covariates. Then the LARS algorithm provides a means of producing an estimate of which variables to include, as well as their coefficients.

Instead of giving a single vector result, the LARS solution consists of a curve denoting the solution for each value of the L1 norm of the parameter vector. The algorithm is similar to forward stepwise regression, but instead of including variables outright at each step, the estimated parameters are increased in a direction equiangular to each predictor's correlation with the residual.
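This piecewise-linear solution path can be inspected directly with scikit-learn's lars_path function. The sketch below uses synthetic data (the sizes and coefficient values are illustrative, not from the article); each column of coefs is the coefficient vector at one breakpoint of the path, and the full path ends at the ordinary least-squares fit.

```python
import numpy as np
from sklearn.linear_model import lars_path

# Synthetic data: 100 observations, 5 predictors, two of them irrelevant
rng = np.random.default_rng(42)
n, p = 100, 5
X = rng.standard_normal((n, p))
y = X @ np.array([4.0, -2.0, 0.0, 0.0, 1.0]) + 0.1 * rng.standard_normal(n)

# method="lar" computes the plain least-angle regression path;
# coefs[:, k] holds the coefficients at the k-th breakpoint.
alphas, active, coefs = lars_path(X, y, method="lar")

print(active)                       # order in which predictors entered
print(np.abs(coefs).sum(axis=0))    # L1 norm at each breakpoint, starting at 0
```

The path starts at the all-zero coefficient vector and, once every predictor has entered, coincides with the least-squares solution.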

The advantages of the LARS method are:

- It is computationally just as fast as forward selection.
- It produces a full piecewise-linear solution path, which is useful in cross-validation or similar attempts to tune the model.
- If two variables are almost equally correlated with the response, then their coefficients increase at approximately the same rate; the algorithm thus behaves as intuition would expect, and is also more stable.
- It is easily modified to produce solution paths for related estimators, such as the lasso.
- It is effective when the number of predictors is much larger than the number of observations.

The disadvantages of the LARS method include:

- Because it is based on iteratively refitting the residuals, it appears to be especially sensitive to the effects of noise in the response.
- With noisy, high-dimensional, multicollinear data there is no guarantee that the selected variables are the underlying causal ones, a criticism that applies to most variable-selection procedures but was raised against LARS in particular by Weisberg in the discussion of the original paper.

The basic steps of the least-angle regression algorithm are:

1. Start with all coefficients equal to zero.
2. Find the predictor most correlated with the response.
3. Increase that predictor's coefficient in the direction of the sign of its correlation, taking residuals along the way, until some other predictor has as much correlation with the current residual.
4. Increase both coefficients in their joint least-squares (equiangular) direction until a further predictor earns its way into the tie.
5. Continue until all predictors are in the model, at which point the fit reaches the full least-squares solution.
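The steps above can be sketched in pure NumPy. This is a minimal illustration of the plain LARS path (no lasso drop step, more observations than predictors); the function name is ours, and the columns of X are assumed centered and scaled to unit norm.

```python
import numpy as np

def lars_steps(X, y):
    """Minimal sketch of plain least-angle regression (no lasso drops).

    Assumes the columns of X are centered and scaled to unit norm and
    that y is centered. Returns the coefficients after each step.
    """
    n, p = X.shape
    beta = np.zeros(p)
    mu = np.zeros(n)                       # current fitted vector
    active = []                            # predictors in the model
    path = [beta.copy()]
    for _ in range(p):
        c = X.T @ (y - mu)                 # current correlations
        inact = [k for k in range(p) if k not in active]
        j = inact[int(np.argmax(np.abs(c[inact])))]
        active.append(j)                   # most-correlated predictor enters
        inact.remove(j)
        C = np.max(np.abs(c[active]))
        s = np.sign(c[active])
        XA = X[:, active] * s              # sign-adjusted active columns
        G = XA.T @ XA
        Ginv1 = np.linalg.solve(G, np.ones(len(active)))
        A = 1.0 / np.sqrt(Ginv1.sum())
        w = A * Ginv1
        u = XA @ w                         # equiangular direction
        if inact:
            # advance until an inactive predictor ties in correlation
            a = X[:, inact].T @ u
            cand = np.concatenate([(C - c[inact]) / (A - a),
                                   (C + c[inact]) / (A + a)])
            gamma = cand[np.isfinite(cand) & (cand > 1e-12)].min()
        else:
            gamma = C / A                  # final step: full least squares
        beta[active] += gamma * w * s
        mu += gamma * u
        path.append(beta.copy())
    return np.array(path)
```

After the final step the fitted values coincide with the ordinary least-squares fit, since the residual is then uncorrelated with every predictor.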

Least-angle regression is implemented in R via the lars package, in Python via the scikit-learn package, and in SAS via the GLMSELECT procedure.
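As an illustration of the scikit-learn interface, the Lars estimator can stop the path early once a chosen number of predictors have entered. The data and parameter choices below are ours, not from the article.

```python
import numpy as np
from sklearn.linear_model import Lars

# Synthetic data with two informative predictors out of six
rng = np.random.default_rng(1)
X = rng.standard_normal((80, 6))
y = X @ np.array([5.0, 0.0, 0.0, -3.0, 0.0, 0.0]) + 0.1 * rng.standard_normal(80)

# Stop the LARS path after two predictors have entered the model
model = Lars(n_nonzero_coefs=2).fit(X, y)
print(model.active_)   # indices of the selected predictors
print(model.coef_)     # coefficients at that point on the path
```

Truncating the path in this way gives a sparse model without computing the full least-squares fit.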

Wikipedia