
Durbin–Wu–Hausman test

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.
Statistical hypothesis test in econometrics

The Durbin–Wu–Hausman test (also called Hausman specification test) is a statistical hypothesis test in econometrics named after James Durbin, De-Min Wu, and Jerry A. Hausman. The test evaluates the consistency of an estimator by comparing it to an alternative, less efficient estimator that is already known to be consistent. It helps one evaluate whether a statistical model corresponds to the data.

Details

Consider the linear model y = Xb + e, where y is the dependent variable, X is the matrix of regressors, b is a vector of coefficients, and e is the error term. We have two estimators for b: b0 and b1. Under the null hypothesis, both estimators are consistent, but b1 is efficient (it has the smallest asymptotic variance), at least within the class of estimators containing b0. Under the alternative hypothesis, b0 remains consistent, whereas b1 does not.

Then the Wu–Hausman statistic is:

{\displaystyle H=(b_{1}-b_{0})'{\big (}\operatorname {Var} (b_{0})-\operatorname {Var} (b_{1}){\big )}^{\dagger }(b_{1}-b_{0}),}

where † denotes the Moore–Penrose pseudoinverse. Under the null hypothesis, this statistic asymptotically follows the chi-squared distribution with degrees of freedom equal to the rank of the matrix Var(b0) − Var(b1).
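As a numerical sketch, the statistic can be computed directly from two coefficient vectors and their estimated covariance matrices; the numbers below are made up purely for illustration, and the pseudoinverse and rank come from NumPy:

```python
import numpy as np

# Hypothetical estimates for illustration: b0 is the consistent estimator,
# b1 the efficient-under-H0 estimator, with covariance estimates V0 and V1.
b0 = np.array([1.10, 0.48])
b1 = np.array([1.02, 0.51])
V0 = np.array([[0.040, 0.004],
               [0.004, 0.030]])
V1 = np.array([[0.010, 0.001],
               [0.001, 0.008]])

def hausman(b0, b1, V0, V1):
    """Wu-Hausman statistic H and its degrees of freedom."""
    d = b1 - b0
    Vdiff = V0 - V1                                 # may be singular in practice
    H = float(d @ np.linalg.pinv(Vdiff) @ d)        # Moore-Penrose pseudoinverse
    df = np.linalg.matrix_rank(Vdiff)               # df = rank of Var(b0) - Var(b1)
    return H, df

H, df = hausman(b0, b1, V0, V1)
```

H is then compared to a chi-squared critical value with `df` degrees of freedom; the pseudoinverse is used because the variance difference need not be invertible.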

If we reject the null hypothesis, it means that b1 is inconsistent. This test can be used to check a variable for endogeneity, by comparing instrumental variable (IV) estimates to ordinary least squares (OLS) estimates. It can also be used to check the validity of extra instruments, by comparing IV estimates using a full set of instruments Z to IV estimates using a proper subset of Z. For the test to work in the latter case, we must be certain of the validity of the subset of Z, and that subset must contain enough instruments to identify the parameters of the equation.
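The endogeneity application can be sketched in a small simulation: the data-generating process below (coefficients, sample size, and instrument strength all chosen arbitrarily for illustration) makes x correlated with the error, so OLS is biased while IV is consistent, and the Hausman statistic comes out large:

```python
import numpy as np

# Illustrative simulation: x is endogenous (correlated with u), z is a valid
# instrument. All parameter values are assumptions made for this sketch.
rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                      # instrument: related to x, not to u
u = rng.normal(size=n)                      # structural error
x = 0.8 * z + 0.5 * u + rng.normal(size=n)  # endogenous regressor
y = 1.0 + 2.0 * x + u                       # true slope is 2

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

# OLS: efficient under H0 (x exogenous), inconsistent under H1
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
V_ols = (y - X @ b_ols).var() * np.linalg.inv(X.T @ X)

# IV: exactly identified here, so 2SLS reduces to (Z'X)^{-1} Z'y;
# consistent under both hypotheses, less efficient under H0
A = np.linalg.inv(Z.T @ X)
b_iv = A @ (Z.T @ y)
V_iv = (y - X @ b_iv).var() * A @ (Z.T @ Z) @ A.T

d = b_ols - b_iv
H = float(d @ np.linalg.pinv(V_iv - V_ols) @ d)   # large H => reject exogeneity
```

With this design the OLS slope is pushed above 2 by the correlation between x and u, the IV slope stays near 2, and H far exceeds the chi-squared critical value, so exogeneity is rejected.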

Hausman also showed that the covariance between an efficient estimator and the difference of an efficient and inefficient estimator is zero.

Derivation


Assume joint normality of the estimators:

{\displaystyle {\sqrt {N}}{\begin{bmatrix}b_{1}-b\\b_{0}-b\end{bmatrix}}{\xrightarrow {d}}{\mathcal {N}}\left({\begin{bmatrix}0\\0\end{bmatrix}},{\begin{bmatrix}\operatorname {Var} (b_{1})&\operatorname {Cov} (b_{1},b_{0})\\\operatorname {Cov} (b_{1},b_{0})&\operatorname {Var} (b_{0})\end{bmatrix}}\right)}

Consider the difference of the two estimators: {\displaystyle q=b_{0}-b_{1}\Rightarrow \operatorname {plim} q=0}

By the delta method

{\displaystyle {\begin{aligned}&{\sqrt {N}}(q-0){\xrightarrow {d}}{\mathcal {N}}\left(0,{\begin{bmatrix}1&-1\end{bmatrix}}{\begin{bmatrix}\operatorname {Var} (b_{1})&\operatorname {Cov} (b_{1},b_{0})\\\operatorname {Cov} (b_{1},b_{0})&\operatorname {Var} (b_{0})\end{bmatrix}}{\begin{bmatrix}1\\-1\end{bmatrix}}\right)\\&\operatorname {Var} (q)=\operatorname {Var} (b_{1})+\operatorname {Var} (b_{0})-2\operatorname {Cov} (b_{1},b_{0})\end{aligned}}}

Using the result, shown by Hausman, that the covariance of an efficient estimator with its difference from an inefficient estimator is zero, this simplifies to

{\displaystyle \operatorname {Var} (q)=\operatorname {Var} (b_{0})-\operatorname {Var} (b_{1})}
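Both the zero-covariance result and the resulting variance identity can be checked numerically in a toy setting. In the sketch below (all distributions and sample sizes are assumptions made for the illustration), b1 is the inverse-variance weighted mean of heteroskedastic observations, which is the efficient estimator of a common mean, and b0 is the plain average, which is consistent but inefficient:

```python
import numpy as np

# Toy Monte Carlo check of Cov(b1, q) ~ 0 and Var(q) ~ Var(b0) - Var(b1).
rng = np.random.default_rng(1)
sigmas = np.array([1.0, 2.0, 3.0])      # known per-observation std deviations
w = 1.0 / sigmas ** 2                   # inverse-variance weights
reps = 20000

draws = rng.normal(0.0, sigmas, size=(reps, 3))   # true common mean is 0
b1 = draws @ w / w.sum()    # weighted mean: the efficient estimator
b0 = draws.mean(axis=1)     # plain mean: consistent but inefficient
q = b0 - b1

cov_b1_q = np.cov(b1, q)[0, 1]                    # should be near zero
gap = np.var(q) - (np.var(b0) - np.var(b1))       # should be near zero
```

Across the replications both `cov_b1_q` and `gap` sit near zero up to simulation noise, which is exactly Hausman's point: the efficient estimator is uncorrelated with its difference from any inefficient consistent estimator, so the covariance terms drop out of Var(q).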

The chi-squared test is based on the Wald criterion

{\displaystyle H=\chi ^{2}[K-1]=(b_{1}-b_{0})'{\big (}\operatorname {Var} (b_{0})-\operatorname {Var} (b_{1}){\big )}^{\dagger }(b_{1}-b_{0}),}

where † denotes the Moore–Penrose pseudoinverse and K denotes the dimension of vector b.

Panel data

The Hausman test can be used to choose between a fixed effects model and a random effects model in panel data analysis. Under the null hypothesis the random effects (RE) estimator is preferred because of its higher efficiency, while under the alternative only the fixed effects (FE) estimator remains consistent and is therefore preferred.

                       H0 is true               H1 is true
b1 (RE estimator)      Consistent, efficient    Inconsistent
b0 (FE estimator)      Consistent, inefficient  Consistent
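The panel-data case can be sketched with a short simulation. Everything below is an assumption made for the illustration: a balanced panel, a single regressor, variance components treated as known (a real application would estimate them), and an individual effect deliberately correlated with x so that RE is inconsistent:

```python
import numpy as np

# Illustrative panel: N individuals, T periods, true slope 2. The individual
# effect alpha is correlated with x, so H1 holds and RE is biased.
rng = np.random.default_rng(2)
N, T = 500, 5
sig_a, sig_e = 1.0, 1.0     # variance components, treated as known here

x = rng.normal(size=(N, T))
alpha = 0.8 * x.mean(axis=1) + sig_a * rng.normal(size=N)  # correlated effect
y = 2.0 * x + alpha[:, None] + sig_e * rng.normal(size=(N, T))

# Fixed effects (within): demeaning removes alpha, so FE is always consistent
xw = x - x.mean(axis=1, keepdims=True)
yw = y - y.mean(axis=1, keepdims=True)
b_fe = (xw * yw).sum() / (xw ** 2).sum()
v_fe = sig_e ** 2 / (xw ** 2).sum()

# Random effects (quasi-demeaned GLS): efficient under H0, but only
# consistent when alpha is uncorrelated with x
theta = 1.0 - np.sqrt(sig_e ** 2 / (sig_e ** 2 + T * sig_a ** 2))
xr = x - theta * x.mean(axis=1, keepdims=True)
yr = y - theta * y.mean(axis=1, keepdims=True)
b_re = (xr * yr).sum() / (xr ** 2).sum()
v_re = sig_e ** 2 / (xr ** 2).sum()

q = b_fe - b_re
H = q ** 2 / (v_fe - v_re)   # chi-squared with 1 df under H0
```

Because the individual effect is correlated with x, the RE slope is pulled away from 2 while the FE slope is not, and H lands well past the 1-df chi-squared critical value, pointing to the fixed effects model.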

References

  1. Durbin, James (1954). "Errors in variables". Review of the International Statistical Institute. 22 (1/3): 23–32. doi:10.2307/1401917. JSTOR 1401917.
  2. Wu, De-Min (July 1973). "Alternative Tests of Independence between Stochastic Regressors and Disturbances". Econometrica. 41 (4): 733–750. doi:10.2307/1914093. ISSN 0012-9682. JSTOR 1914093.
  3. Hausman, J. A. (November 1978). "Specification Tests in Econometrics". Econometrica. 46 (6): 1251–1271. doi:10.2307/1913827. hdl:1721.1/64309. ISSN 0012-9682. JSTOR 1913827.
  4. Nakamura, Alice; Nakamura, Masao (1981). "On the Relationships Among Several Specification Error Tests Presented by Durbin, Wu, and Hausman". Econometrica. 49 (6): 1583–1588. doi:10.2307/1911420. JSTOR 1911420.
  5. Greene, William (2012). Econometric Analysis (7th ed.). Pearson. pp. 234–237. ISBN 978-0-273-75356-8.
  6. Greene, William H. (2012). Econometric Analysis (7th ed.). Pearson. pp. 379–380, 420. ISBN 978-0-273-75356-8.
