10 Rank‐Based Shrinkage Estimation

This chapter introduces the R‐estimates and provides a comparative study of the ridge regression estimator (RRE), the least absolute shrinkage and selection operator (LASSO), the preliminary test estimator (PTE), and the Stein‐type estimator, based on the theory of rank‐based statistics for a linear model with a non‐orthogonal design matrix.

10.1 Introduction

It is well known that the usual rank estimators (REs) are robust in linear regression models and are asymptotically unbiased with minimum variance. However, a data analyst may point out some deficiencies of the R‐estimators when "prediction accuracy" and "interpretation" are considered. To overcome these concerns, we propose the rank‐based least absolute shrinkage and selection operator (RLASSO) estimator. It defines a continuous shrinking operation that can produce coefficients that are exactly "zero," is competitive with rank‐based "subset selection" and the RRE, and retains the good features of both while preserving the robustness of the R‐estimators. The RLASSO simultaneously estimates and selects the coefficients of a given linear regression model.
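To make the idea concrete, the following minimal sketch (not the authors' algorithm) forms a rank‐based LASSO criterion by adding an L1 penalty to Jaeckel's rank dispersion with Wilcoxon scores and minimizing it numerically. The data, the penalty level lam, and the use of a derivative‐free optimizer are illustrative assumptions; a general‐purpose optimizer only drives small coefficients toward zero, whereas a dedicated nonsmooth solver would set them exactly to zero.

```python
import numpy as np
from scipy.optimize import minimize

def wilcoxon_dispersion(beta, X, y):
    """Jaeckel's rank dispersion with Wilcoxon scores:
    D(beta) = sum_i a(R_i) * e_i, where e_i are the residuals,
    R_i their ranks, and a(i) = sqrt(12) * (i/(n+1) - 1/2)."""
    e = y - X @ beta
    n = len(e)
    ranks = np.argsort(np.argsort(e)) + 1            # ranks of residuals
    scores = np.sqrt(12.0) * (ranks / (n + 1) - 0.5)
    return np.sum(scores * e)

def rlasso_objective(beta, X, y, lam):
    """Rank dispersion plus an L1 penalty: a rank-based LASSO criterion."""
    return wilcoxon_dispersion(beta, X, y) + lam * np.sum(np.abs(beta))

# Toy data: sparse coefficient vector, heavy-tailed (t_3) errors,
# the setting where rank-based methods are attractive.
rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true + rng.standard_t(df=3, size=n)

# Derivative-free minimization of the convex but nonsmooth objective.
res = minimize(rlasso_objective, x0=np.zeros(p),
               args=(X, y, 5.0), method="Powell")
print(np.round(res.x, 3))   # coefficients shrunk toward (near) zero
```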

In addition, there are rank‐based PTEs and Stein‐type estimators (see Saleh 2006; Jureckova and Sen 1996; Puri and Sen 1986). These R‐estimators shrink toward a target value but do not select coefficients, which limits their usefulness for prediction and interpretation. Hoerl and Kennard (1970) introduced ridge regression based on the Tikhonov (1963) regularization, and Tibshirani (1996) introduced the LASSO.
