Superiority of the MCRR Estimator Over Some Estimators in a Linear Model

The modified (r, k) class ridge regression (MCRR) estimator, which includes the unbiased ridge regression (URR), (r, k) class, principal components regression (PCR), and ordinary least squares (OLS) estimators as special cases, was proposed in regression analysis to overcome the problem of multicollinearity. In this paper, we derive necessary and sufficient conditions for the superiority of the MCRR estimator over each of these estimators under the Mahalanobis loss function by the average loss criterion. We then compare these estimators with each other using the same criterion. Finally, a numerical example is given to illustrate the theoretical results.


INTRODUCTION
The Gauss–Markov theorem is based on the general linear regression model

Y = XB + ε,

where Y is an n × 1 vector of responses, X is an n × p observed matrix of regressor variables assumed to have full rank, i.e., rank(X) = p, B is a p × 1 vector of unknown parameters to be estimated, and ε is an n × 1 vector of error terms assumed to be multivariate normally distributed with mean 0 and variance–covariance matrix σ²I_n. It is well known that the ordinary least squares (OLS) estimator of B is

\hat{B}_{LS} = (X'X)^{-1} X'Y.

The standard regression model assumes that the column vectors of X are not linearly dependent. In many practical applications, however, engineering in particular, these column vectors are often nearly linearly dependent; we then say that the multicollinearity problem is present. This problem inflates the diagonal elements of (X'X)^{-1}, implying that the estimated variance of \hat{B}_{LS} will be large. Multicollinearity may be present if small changes in the design matrix X cause the estimated coefficients to change sign (see, e.g., Hoerl and Kennard (1970); Marquardt (1970); Mayer and Willke (1973); Swindel (1976); Batah and Gore (2008, 2009); Batah et al. (2008, 2009)). OLS estimation is not stable in the presence of multicollinearity. Hence, alternative estimation techniques were designed to eliminate this instability; they result in biased estimators but reduce the variance of the regression coefficients. Consequently, there is considerable interest in various biased estimators of B.

(Education College for Pure Science, University of Al-Anbar, Ramadi, Iraq)
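As a small numerical illustration of this variance inflation (our own sketch, not part of the paper), one can compare the diagonal of (X'X)^{-1} for a well-conditioned design and a nearly collinear one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

x1 = rng.normal(size=n)
x2 = rng.normal(size=n)

# Well-conditioned design: two independent regressors.
X_good = np.column_stack([x1, x2])

# Nearly collinear design: the second column is almost a copy of the first.
X_bad = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n)])

for name, X in [("independent", X_good), ("collinear", X_bad)]:
    # Diagonal of (X'X)^{-1}: proportional to the variances of the OLS coefficients.
    print(name, np.diag(np.linalg.inv(X.T @ X)))
```

The nearly collinear design produces diagonal entries several orders of magnitude larger, which is exactly the inflation described above.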
The best known of these is the ordinary ridge regression (ORR) estimator of Hoerl and Kennard (1970), which is intended to overcome the problem of multicollinearity by adding a positive value k, usually 0 < k < 1, to the diagonal elements of the matrix X'X. The ORR estimator of B,

\hat{B}(k) = (X'X + kI)^{-1} X'Y,

contains the OLS estimator when k = 0. As k increases, the ORR estimator approaches 0, which is a stable but biased estimator of B. Swindel (1976) illustrated a technique for combining prior information with ridge regression, namely the modified ridge regression (MRR) estimator, extending Hoerl and Kennard's (1970) model as

\hat{B}_{MRR}(k, b) = (X'X + kI)^{-1}(X'Y + kb),

where b is a fixed vector of prior estimates of B. Crouse et al. (1995) showed how to incorporate prior information in the ORR, namely the unbiased ridge regression (URR) estimator,

\hat{B}_{URR}(k, J) = (X'X + kI)^{-1}(X'Y + kJ), for k > 0,

where J is a prior information vector for B. Another possibility for removing the information responsible for increased imprecision in estimation is offered by the principal components regression (PCR) estimator (see Massy (1965), Marquardt (1970) and Gunst and Mason (1977)). For this, consider the spectral decomposition of the matrix X'X given as

X'X = T Λ T'.

Baye and Parker (1984) proposed the application of ridge methods to improve the PCR estimator, namely the (r, k) class estimator,

\hat{B}_r(k) = T_r(T_r'X'XT_r + kI_r)^{-1} T_r'X'Y.

Thus, for suitable choices of the incorporated principal components, r, the ridge parameter, k, and the prior information, J, the MCRR estimator is a general estimator which includes the URR, the PCR, the (r, k) class and the OLS estimators as special cases. It is interesting to note that studies of these biased estimators use the mean square error (MSE) criterion, or equivalently the quadratic loss function, as a measure of estimator performance. This article extends those studies through the choice of the loss function used to decide on a preferred estimator of B. We adopt the Mahalanobis loss function in order to compare the MCRR estimator with the OLS
estimator, the PCR estimator, and the ORR estimator. The Mahalanobis loss function was previously used by Peddada et al. (1989) for comparing the generalized ridge regression (GRR) estimator with the OLS estimator. In this article we compare the MCRR estimator with the OLS, PCR, and ORR estimators by the average loss criterion, which may be defined as follows. Let \hat{B}_1 and \hat{B}_2 be two estimators of a parameter B. The estimator \hat{B}_1 is superior to \hat{B}_2 iff

E(L(\hat{B}_1, B)) < E(L(\hat{B}_2, B)),

where L denotes the loss function. Clearly, when squared error is used as the loss function, this reduces to the MSE criterion. For an estimator \hat{B}_1 of B, the Mahalanobis loss function is defined as

L(\hat{B}_1, B) = (1/σ²)(\hat{B}_1 - B)' X'X (\hat{B}_1 - B).

As special cases, we also obtain the comparison of the ORR estimator with the OLS estimator, of the PCR estimator with the OLS estimator, and of the PCR estimator with the ORR estimator under the Mahalanobis loss function by the average loss criterion. Finally, we consider a numerical example to justify the superiority of the mentioned estimators.
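The closed-form estimators reviewed above can be sketched in a few lines of numpy. This is our own illustration; the function names and test data are not from the paper. Note that for fixed arguments the MRR and URR formulas coincide algebraically; they differ only in the statistical interpretation of the prior vector:

```python
import numpy as np

def ols(X, y):
    # OLS: (X'X)^{-1} X'y.
    return np.linalg.solve(X.T @ X, X.T @ y)

def orr(X, y, k):
    # Ordinary ridge regression (Hoerl and Kennard, 1970): (X'X + kI)^{-1} X'y.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def mrr(X, y, k, b):
    # Modified ridge regression (Swindel, 1976): (X'X + kI)^{-1} (X'y + kb).
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y + k * b)

def urr(X, y, k, J):
    # Unbiased ridge regression (Crouse et al., 1995): same closed form,
    # with J interpreted as prior information about B.
    return mrr(X, y, k, J)

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=30)

# With k = 0 the ridge family collapses to OLS.
print(np.allclose(orr(X, y, 0.0), ols(X, y)))
```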

THE PERFORMANCE OF THE MCRR ESTIMATOR UNDER THE AVERAGE LOSS CRITERION
The MCRR estimator is biased (Batah et al. (2009)), and we adopt the Mahalanobis loss function as the criterion for gauging its performance in comparison with the OLS, PCR, and ORR estimators. Let Δ = L(\hat{B}_{LS}, B) - L(\hat{B}_{MCRR}, B) denote the difference between the Mahalanobis losses of the OLS and MCRR estimators. It can be seen that p - r is positive, and C is a nonnegative definite matrix. We obtain the necessary and sufficient condition for the superiority of the OLS estimator over the MCRR estimator under the Mahalanobis loss function as follows:

E(Δ) ≤ 0 ⟺ (p - r)σ² ≤ k B'TCT'B.

This completes the proof. We also give a bound on k under which the OLS estimator can be superior to the MCRR estimator.

Theorem 2: For k > 0, the OLS estimator is better than the MCRR estimator under the Mahalanobis loss function iff

k ≥ σ²(p - r) / (B'TCT'B), with B'TCT'B > 0.

Similarly, using the same argument as in the proof of Theorem 1, we can compare the MCRR estimator with the other estimators it contains as special cases, as mentioned above, under the same criterion.
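The average loss comparison above can be explored by simulation. The sketch below is our own illustration (the matrix C of the theorem is not reproduced here, and J is taken as a zero prior vector, an assumption): it estimates the average Mahalanobis loss of the OLS and MCRR estimators by Monte Carlo. Under this loss the OLS average equals p exactly, since the scaled loss follows a chi-squared distribution with p degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, r, sigma = 40, 4, 2, 1.0
X = rng.normal(size=(n, p))
B = np.array([2.0, -1.0, 1.5, 0.5])
XtX = X.T @ X

# Eigendecomposition of X'X, largest eigenvalues first.
eigval, T = np.linalg.eigh(XtX)
order = np.argsort(eigval)[::-1]
eigval, T = eigval[order], T[:, order]
Tr, Lam_r = T[:, :r], np.diag(eigval[:r])
J = np.zeros(p)  # assumed prior information vector

def mahalanobis_loss(b_hat):
    d = b_hat - B
    return d @ XtX @ d / sigma**2

k = 0.5
loss_ols, loss_mcrr = [], []
for _ in range(2000):
    y = X @ B + rng.normal(scale=sigma, size=n)
    b_ols = np.linalg.solve(XtX, X.T @ y)
    # MCRR: T_r (T_r'X'XT_r + kI_r)^{-1} (T_r'X'y + k T_r'J).
    b_mcrr = Tr @ np.linalg.solve(Lam_r + k * np.eye(r),
                                  Tr.T @ X.T @ y + k * Tr.T @ J)
    loss_ols.append(mahalanobis_loss(b_ols))
    loss_mcrr.append(mahalanobis_loss(b_mcrr))

# OLS average loss is p; MCRR trades variance for bias from the
# discarded components and the ridge term.
print(np.mean(loss_ols), np.mean(loss_mcrr))
```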
Here Λ = diag(Λ_r, Λ_{p-r}) and T = (T_r, T_{p-r}) are matrices such that the main diagonal elements of the r × r matrix Λ_r are the r largest eigenvalues of X'X, while the main diagonal elements of the (p - r) × (p - r) matrix Λ_{p-r} are the remaining p - r eigenvalues. The p × p orthogonal matrix T is partitioned so that T_r contains the eigenvectors corresponding to the r largest eigenvalues and T_{p-r} contains the remaining p - r columns of T. The PCR estimator of B can be written as

\hat{B}_r = T_r(T_r'X'XT_r)^{-1} T_r'X'Y.

Batah et al. (2009) suggested a new class of estimators for the regression parameter, obtained by modifying the URR estimator along the lines of the PCR estimator; it is called the modified (r, k) class ridge regression (MCRR) estimator and is given by

\hat{B}_{MCRR}(k, J) = T_r(T_r'X'XT_r + kI_r)^{-1}(T_r'X'Y + k T_r'J).

It may be noted that the proposed estimator is obtained in the same way as the (r, k) class estimator, but with the URR estimator used instead of the ORR estimator. When J = 0, the MCRR estimator reduces to the (r, k) class estimator \hat{B}_r(k).
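A direct numpy transcription of these definitions (our own sketch) makes the special-case structure easy to check: with k = 0 the MCRR estimator reduces to PCR, and with r = p and k = 0 it reduces to OLS:

```python
import numpy as np

def _top_eigvecs(X, r):
    # Columns of T corresponding to the r largest eigenvalues of X'X.
    eigval, T = np.linalg.eigh(X.T @ X)
    return T[:, np.argsort(eigval)[::-1][:r]]

def pcr(X, y, r):
    # PCR: T_r (T_r'X'XT_r)^{-1} T_r'X'y.
    Tr = _top_eigvecs(X, r)
    return Tr @ np.linalg.solve(Tr.T @ X.T @ X @ Tr, Tr.T @ X.T @ y)

def mcrr(X, y, r, k, J):
    # MCRR (Batah et al., 2009): T_r (T_r'X'XT_r + kI_r)^{-1} (T_r'X'y + k T_r'J).
    Tr = _top_eigvecs(X, r)
    A = Tr.T @ X.T @ X @ Tr + k * np.eye(r)
    return Tr @ np.linalg.solve(A, Tr.T @ X.T @ y + k * Tr.T @ J)

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 4))
y = X @ np.array([1.0, 0.5, -1.0, 2.0]) + rng.normal(size=30)
b_ols = np.linalg.solve(X.T @ X, X.T @ y)

print(np.allclose(mcrr(X, y, 2, 0.0, np.zeros(4)), pcr(X, y, 2)))  # k = 0 gives PCR
print(np.allclose(mcrr(X, y, 4, 0.0, np.zeros(4)), b_ols))         # r = p, k = 0 gives OLS
```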
It is known that for fixed J, and in particular when J = 0, the covariance matrix of the MCRR estimator is the same as that of the (r, k) class estimator:

Cov(\hat{B}_{MCRR}(k, J)) = σ² T_r(T_r'X'XT_r + kI_r)^{-1} Λ_r (T_r'X'XT_r + kI_r)^{-1} T_r'.

Now we can write the Mahalanobis loss function of the MCRR estimator as follows:

L(\hat{B}_{MCRR}, B) = (1/σ²)(\hat{B}_{MCRR} - B)' X'X (\hat{B}_{MCRR} - B).
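The covariance expression above can be checked by simulation (our own sketch, with J held fixed at zero): the empirical covariance of the MCRR estimator over repeated samples should match σ² T_r(Λ_r + kI_r)^{-1} Λ_r (Λ_r + kI_r)^{-1} T_r', using T_r'X'XT_r = Λ_r:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, r, k, sigma = 30, 4, 2, 0.5, 1.0
X = rng.normal(size=(n, p))
B = np.array([1.0, -1.0, 0.5, 2.0])

eigval, T = np.linalg.eigh(X.T @ X)
order = np.argsort(eigval)[::-1]
eigval, T = eigval[order], T[:, order]
Tr, Lam_r = T[:, :r], np.diag(eigval[:r])
A_inv = np.linalg.inv(Lam_r + k * np.eye(r))

# Theoretical covariance: sigma^2 T_r (Lam_r + kI)^{-1} Lam_r (Lam_r + kI)^{-1} T_r'.
cov_theory = sigma**2 * Tr @ A_inv @ Lam_r @ A_inv @ Tr.T

# Empirical covariance of the MCRR estimator over repeated samples (J = 0).
draws = []
for _ in range(20000):
    y = X @ B + rng.normal(scale=sigma, size=n)
    draws.append(Tr @ A_inv @ (Tr.T @ X.T @ y))
cov_emp = np.cov(np.array(draws).T)

print(np.max(np.abs(cov_emp - cov_theory)))
```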