Chapter 10
Multicollinearity: What Happens if Explanatory Variables are Correlated
One of the CLRM assumptions is: there is no perfect multicollinearity—no exact linear relationships among explanatory variables, Xs, in a multiple regression.
In practice, one rarely encounters perfect multicollinearity, but cases of near (very high) multicollinearity, in which the explanatory variables are approximately linearly related, arise frequently in applications.
The objectives of this chapter:
● The nature of multicollinearity;
● Is multicollinearity really a problem?
● The theoretical consequences of multicollinearity;
● How to detect multicollinearity;
● The remedial measures that can be used to eliminate multicollinearity.
The Nature of Multicollinearity: The Case of Perfect Multicollinearity
In the case of a perfect linear relationship, or perfect multicollinearity, among the explanatory variables, we cannot obtain unique estimates of all parameters. And since we cannot obtain their unique estimates, we cannot draw any statistical inferences (e.g., hypothesis testing) about them from a given sample.
Yi = A1 + A2 X2i + A3 X3i + μi

Suppose the explanatory variables are exactly linearly related:

X3i = 300 - 2 X2i

Substituting this into the model:

Yi = A1 + A2 X2i + A3 (300 - 2 X2i) + μi
   = (A1 + 300 A3) + (A2 - 2 A3) X2i + μi
   = C1 + C2 X2i + μi
Estimation: OLS yields unique estimators of

C1 = A1 + 300 A3,  C2 = A2 - 2 A3.

But these are only two equations in the three unknowns A1, A2 and A3, so from the estimators of C1 and C2 we cannot recover the estimators of A1, A2 and A3 individually.
That is: in cases of perfect multicollinearity, estimation of and hypothesis testing about individual regression coefficients in a multiple regression are not possible. We can obtain estimates only of linear combinations of the original coefficients, not of each coefficient individually.
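The breakdown of estimation under perfect multicollinearity can be seen numerically: when X3 = 300 - 2 X2 exactly, the design matrix has fewer independent columns than parameters, so the OLS normal equations have no unique solution. A minimal sketch with hypothetical simulated data:

```python
import numpy as np

# Hypothetical data (illustrative only): X3 is an exact linear
# function of X2, as in the text's example X3 = 300 - 2*X2.
rng = np.random.default_rng(0)
X2 = rng.uniform(10, 100, size=20)
X3 = 300 - 2 * X2

# Design matrix with an intercept column, X2, and X3.
X = np.column_stack([np.ones_like(X2), X2, X3])

# Three columns, but only two are linearly independent, so the
# rank is 2 and X'X is singular: no unique OLS solution exists.
print(np.linalg.matrix_rank(X))      # 2, not 3
```

Any attempt to invert X'X here fails (or, in software that uses a pseudoinverse, silently returns one of infinitely many parameter vectors), which is exactly why only combinations such as C1 and C2 are estimable.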
The Case of Near, or Imperfect, or High Multicollinearity
When we talk about multicollinearity, we usually mean imperfect (near) multicollinearity.
X3i = B1 + B2 X2i + ei
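The relation above is only approximate because of the disturbance ei, so the design matrix keeps full rank and OLS estimates exist. A minimal sketch with hypothetical data, adding a small disturbance to the earlier exact relation:

```python
import numpy as np

# Hypothetical data (illustrative only): X3 is *almost* a linear
# function of X2; the small noise term plays the role of ei.
rng = np.random.default_rng(1)
X2 = rng.uniform(10, 100, size=50)
X3 = 300 - 2 * X2 + rng.normal(0, 5, size=50)

# With two explanatory variables, the correlation coefficient r
# measures the strength of the near collinearity.
r = np.corrcoef(X2, X3)[0, 1]
print(round(r, 3))                   # close to -1: high, but not perfect

# Unlike the perfect case, the design matrix now has full rank,
# so unique OLS estimates of all coefficients exist.
X = np.column_stack([np.ones_like(X2), X2, X3])
print(np.linalg.matrix_rank(X))      # 3
```

Here |r| is near 1 but not exactly 1: estimation goes through, though, as later sections discuss, the variances of the estimators are inflated.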
If there are just two explanatory variables, the coefficient of correlation r can be used as a measure of the degree or strength of collinearity. Bu