
Working–Hotelling procedure

Method of simultaneous inference

In statistics, particularly regression analysis, the Working–Hotelling procedure, named after Holbrook Working and Harold Hotelling, is a method of simultaneous estimation in linear regression models. One of the first developments in simultaneous inference, it was devised by Working and Hotelling for the simple linear regression model in 1929. It provides a confidence region for multiple mean responses, that is, it gives the upper and lower bounds of more than one value of a dependent variable at several levels of the independent variables at a certain confidence level. The resulting confidence bands are known as the Working–Hotelling–Scheffé confidence bands.

Like the closely related Scheffé's method in the analysis of variance, which considers all possible contrasts, the Working–Hotelling procedure considers all possible values of the independent variables; that is, in a particular regression model, the probability that all the Working–Hotelling confidence intervals cover the true value of the mean response is the confidence coefficient. As such, when only a small subset of the possible values of the independent variable is considered, it is more conservative and yields wider intervals than competitors like the Bonferroni correction at the same level of confidence. It outperforms the Bonferroni correction as more values are considered.

Statement

Simple linear regression

Consider a simple linear regression model {\displaystyle Y=\beta _{0}+\beta _{1}X+\varepsilon }, where {\displaystyle Y} is the response variable and {\displaystyle X} the explanatory variable, and let {\displaystyle b_{0}} and {\displaystyle b_{1}} be the least-squares estimates of {\displaystyle \beta _{0}} and {\displaystyle \beta _{1}} respectively. Then the least-squares estimate of the mean response {\displaystyle E(Y_{i})} at the level {\displaystyle X=x_{i}} is {\displaystyle {\hat {Y_{i}}}=b_{0}+b_{1}x_{i}}. Assuming that the errors are independently and identically normally distributed, it can then be shown that a {\displaystyle 1-\alpha } confidence interval for the mean response at a given level of {\displaystyle X} is as follows:

{\displaystyle {\hat {y}}_{i}\in \left[\,b_{0}+b_{1}x_{i}\pm t_{\alpha /2,{\text{df}}=n-2}{\sqrt {\left({\frac {1}{n-2}}\sum _{j=1}^{n}e_{j}^{\,2}\right)\left({\frac {1}{n}}+{\frac {(x_{i}-{\bar {x}})^{2}}{\sum _{j=1}^{n}(x_{j}-{\bar {x}})^{2}}}\right)}}\,\right],}

where {\displaystyle {\frac {1}{n-2}}\sum _{j=1}^{n}e_{j}^{\,2}} is the mean squared error and {\displaystyle t_{\alpha /2,{\text{df}}=n-2}} denotes the upper {\displaystyle \alpha /2} percentile of Student's t-distribution with {\displaystyle n-2} degrees of freedom.
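
For concreteness, the pointwise interval above can be computed directly. The following is a minimal sketch in Python with NumPy and SciPy; the function name mean_response_ci and the inputs x, y, x0 are illustrative choices, not part of the original text.

    import numpy as np
    from scipy import stats

    def mean_response_ci(x, y, x0, alpha=0.05):
        # Least-squares fit of the simple linear regression y = b0 + b1*x
        x, y = np.asarray(x, float), np.asarray(y, float)
        n = len(x)
        sxx = np.sum((x - x.mean()) ** 2)
        b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx
        b0 = y.mean() - b1 * x.mean()
        # Mean squared error: (1/(n-2)) * sum of squared residuals
        mse = np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2)
        # Standard error of the estimated mean response at level x0
        se = np.sqrt(mse * (1.0 / n + (x0 - x.mean()) ** 2 / sxx))
        t = stats.t.ppf(1 - alpha / 2, df=n - 2)   # upper alpha/2 percentile
        fit = b0 + b1 * x0
        return fit - t * se, fit + t * se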

However, when several mean responses are estimated with such intervals, the simultaneous confidence level declines rapidly. To fix the simultaneous confidence coefficient at {\displaystyle 1-\alpha }, the Working–Hotelling approach employs an F-based critical value instead:

{\displaystyle {\hat {y}}_{i}\in \left[\,b_{0}+b_{1}x_{i}\pm W{\sqrt {\left({\frac {1}{n-2}}\sum _{j=1}^{n}e_{j}^{\,2}\right)\left({\frac {1}{n}}+{\frac {(x_{i}-{\bar {x}})^{2}}{\sum _{j=1}^{n}(x_{j}-{\bar {x}})^{2}}}\right)}}\,\right],}

where {\displaystyle W^{2}=2F_{\alpha ,{\text{df}}=(2,n-2)}} and {\displaystyle F_{\alpha ,{\text{df}}=(2,n-2)}} denotes the upper {\displaystyle \alpha } percentile of the F-distribution with {\displaystyle (2,n-2)} degrees of freedom. The confidence level is {\displaystyle 1-\alpha } simultaneously over all values of {\displaystyle X}, i.e. {\displaystyle x_{i}\in \mathbb {R} }.
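
The only change from the pointwise interval is that the t quantile is replaced by {\displaystyle W}. A minimal sketch of the multiplier, again in Python with SciPy (the name wh_multiplier is illustrative):

    import numpy as np
    from scipy import stats

    def wh_multiplier(n, alpha=0.05):
        # W^2 = 2 * F_{alpha; 2, n-2}, so W = sqrt(2 F)
        return np.sqrt(2 * stats.f.ppf(1 - alpha, dfn=2, dfd=n - 2))

    # Band at any level x0: (b0 + b1*x0) +/- wh_multiplier(n, alpha) * se(x0),
    # with se(x0) computed exactly as in mean_response_ci above.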

Multiple linear regression

The Working–Hotelling confidence bands can be easily generalised to multiple linear regression. Consider a general linear model as defined in the linear regressions article, that is,

{\displaystyle \mathbf {Y} =\mathbf {X} {\boldsymbol {\beta }}+{\boldsymbol {\varepsilon }},}

where

{\displaystyle \mathbf {Y} ={\begin{pmatrix}Y_{1}\\Y_{2}\\\vdots \\Y_{n}\end{pmatrix}},\quad \mathbf {X} ={\begin{pmatrix}\mathbf {x} _{1}^{\rm {T}}\\\mathbf {x} _{2}^{\rm {T}}\\\vdots \\\mathbf {x} _{n}^{\rm {T}}\end{pmatrix}}={\begin{pmatrix}x_{11}&\cdots &x_{1p}\\x_{21}&\cdots &x_{2p}\\\vdots &\ddots &\vdots \\x_{n1}&\cdots &x_{np}\end{pmatrix}},\quad {\boldsymbol {\beta }}={\begin{pmatrix}\beta _{1}\\\beta _{2}\\\vdots \\\beta _{p}\end{pmatrix}},\quad {\boldsymbol {\varepsilon }}={\begin{pmatrix}\varepsilon _{1}\\\varepsilon _{2}\\\vdots \\\varepsilon _{n}\end{pmatrix}}.}

Again, it can be shown that the least-squares estimate of the mean response {\displaystyle E(Y_{i})=\mathbf {x} _{i}^{\rm {T}}{\boldsymbol {\beta }}} is {\displaystyle {\hat {Y}}_{i}=\mathbf {x} _{i}^{\rm {T}}\mathbf {b} }, where {\displaystyle \mathbf {b} =(\mathbf {X} ^{\rm {T}}\mathbf {X} )^{-1}\mathbf {X} ^{\rm {T}}\mathbf {Y} } is the vector of least-squares estimates of the entries in {\displaystyle {\boldsymbol {\beta }}}. Likewise, it can be shown that a {\displaystyle 1-\alpha } confidence interval for a single mean response is as follows:

{\displaystyle {\hat {y}}_{i}\in \left[\,\mathbf {x} _{i}^{\rm {T}}\mathbf {b} \pm t_{\alpha /2,{\text{df}}=n-p}{\sqrt {\operatorname {MSE} \,(\mathbf {x} _{i}^{\rm {T}}(\mathbf {X} ^{\rm {T}}\mathbf {X} )^{-1}\mathbf {x} _{i})}}\,\right],}

where {\displaystyle \operatorname {MSE} =(\mathbf {Y} ^{\rm {T}}\mathbf {Y} -\mathbf {b} ^{\rm {T}}\mathbf {X} ^{\rm {T}}\mathbf {Y} )/(n-p)} is the observed value of the mean squared error.
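
A sketch of the analogous computation in the multiple-regression case (Python with NumPy/SciPy; the function name and arguments are illustrative, and the design matrix X is assumed to already contain a column of ones if an intercept is wanted):

    import numpy as np
    from scipy import stats

    def mean_response_ci_mlr(X, Y, x0, alpha=0.05):
        X, Y, x0 = np.asarray(X, float), np.asarray(Y, float), np.asarray(x0, float)
        n, p = X.shape
        XtX_inv = np.linalg.inv(X.T @ X)
        b = XtX_inv @ X.T @ Y                      # b = (X'X)^{-1} X'Y
        mse = (Y @ Y - b @ X.T @ Y) / (n - p)      # SSE / (n - p)
        se = np.sqrt(mse * (x0 @ XtX_inv @ x0))    # s.e. of the mean response at x0
        t = stats.t.ppf(1 - alpha / 2, df=n - p)   # upper alpha/2 percentile
        fit = x0 @ b
        return fit - t * se, fit + t * se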

The Working–Hotelling approach to simultaneous estimation is the same as in simple linear regression, except that the critical value is now based on the F-distribution with {\displaystyle (p,n-p)} degrees of freedom:

{\displaystyle {\hat {y}}_{i}\in \left[\,\mathbf {x} _{i}^{\rm {T}}\mathbf {b} \pm W{\sqrt {\operatorname {MSE} \,(\mathbf {x} _{i}^{\rm {T}}(\mathbf {X} ^{\rm {T}}\mathbf {X} )^{-1}\mathbf {x} _{i})}}\,\right],}

where {\displaystyle W^{2}=pF_{\alpha ,{\text{df}}=(p,n-p)}}.
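
As before, only the critical value changes for simultaneous coverage over the whole regression surface; a short sketch (Python/SciPy, illustrative name):

    import numpy as np
    from scipy import stats

    def wh_multiplier_mlr(n, p, alpha=0.05):
        # W^2 = p * F_{alpha; p, n-p}
        return np.sqrt(p * stats.f.ppf(1 - alpha, dfn=p, dfd=n - p))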

Graphical representation

In the simple linear regression case, Working–Hotelling–Scheffé confidence bands, drawn by connecting the upper and lower limits of the mean response at every level, take the shape of hyperbolas. For plotting, they are sometimes approximated by the Graybill–Bowden confidence bands, which are linear and hence easier to graph:

{\displaystyle \beta _{0}+\beta _{1}(x_{i}-{\bar {x}})\in \left[\,b_{0}+b_{1}(x_{i}-{\bar {x}})\pm m_{\alpha ,2,{\text{df}}=n-2}{\sqrt {\operatorname {MSE} }}\left({\frac {1}{\sqrt {n}}}+{\frac {|x_{i}-{\bar {x}}|}{\sqrt {\sum _{j=1}^{n}(x_{j}-{\bar {x}})^{2}}}}\right)\right],}

where {\displaystyle m_{\alpha ,2,{\text{df}}=n-2}} denotes the upper {\displaystyle \alpha } percentile of the Studentized maximum modulus distribution with two means and {\displaystyle n-2} degrees of freedom.
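
SciPy has no built-in Studentized maximum modulus distribution, but its upper percentile can be approximated by Monte Carlo from the definition (the maximum of two independent absolute standard normals, divided by an independent chi-based scale estimate). A rough sketch with illustrative names, not a tabulated value:

    import numpy as np

    def smm_upper_percentile(nu, alpha=0.05, reps=200_000, seed=0):
        rng = np.random.default_rng(seed)
        # max of |Z_1|, |Z_2| over independent standard normals
        z_max = np.abs(rng.standard_normal((reps, 2))).max(axis=1)
        # independent scale estimate: sqrt(chi-square_nu / nu)
        s = np.sqrt(rng.chisquare(nu, reps) / nu)
        return np.quantile(z_max / s, 1 - alpha)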

The simple linear regression model with a Working–Hotelling confidence band.

Numerical example

The same data as in the ordinary least squares article are used in this example:

Height (m) 1.47 1.50 1.52 1.55 1.57 1.60 1.63 1.65 1.68 1.70 1.73 1.75 1.78 1.80 1.83
Weight (kg) 52.21 53.12 54.48 55.84 57.20 58.57 59.93 61.29 63.11 64.47 66.28 68.10 69.92 72.19 74.46

A simple linear regression model is fitted to these data. The least-squares estimates are {\displaystyle b_{0}=-39.06} and {\displaystyle b_{1}=61.27}. The goal is to estimate the mean mass of women at given heights at the 95% confidence level. The critical value is {\displaystyle W={\sqrt {2F}}=2.758828}, where {\displaystyle F} is the upper 5% point of the F-distribution with {\displaystyle (2,15-2)} degrees of freedom. It was also found that {\displaystyle {\bar {x}}=1.651}, {\displaystyle \sum _{j=1}^{n}e_{j}^{\,2}=7.490558}, {\displaystyle \operatorname {MSE} =0.5761968} and {\displaystyle \sum _{j=1}^{n}(x_{j}-{\bar {x}})^{2}=0.1827}. Then, for the mean mass of all women of a particular height, the following Working–Hotelling–Scheffé band is obtained:

{\displaystyle {\hat {y}}_{i}\in \left[\,-39.06+61.27x_{i}\pm 2.758828{\sqrt {0.5761968\left({\frac {1}{15}}+{\frac {(x_{i}-1.651)^{2}}{0.1827}}\right)}}\,\right],}

which produces the band shown in the figure above.
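
The quantities quoted in this example can be reproduced directly from the data table above; a sketch in Python (variable names are illustrative) recovers {\displaystyle b_{0}}, {\displaystyle b_{1}}, the MSE, {\displaystyle W} and the band at any height:

    import numpy as np
    from scipy import stats

    height = np.array([1.47, 1.50, 1.52, 1.55, 1.57, 1.60, 1.63, 1.65,
                       1.68, 1.70, 1.73, 1.75, 1.78, 1.80, 1.83])
    weight = np.array([52.21, 53.12, 54.48, 55.84, 57.20, 58.57, 59.93, 61.29,
                       63.11, 64.47, 66.28, 68.10, 69.92, 72.19, 74.46])

    n = len(height)
    sxx = np.sum((height - height.mean()) ** 2)
    b1 = np.sum((height - height.mean()) * (weight - weight.mean())) / sxx
    b0 = weight.mean() - b1 * height.mean()
    mse = np.sum((weight - (b0 + b1 * height)) ** 2) / (n - 2)
    W = np.sqrt(2 * stats.f.ppf(0.95, 2, n - 2))       # about 2.758828

    x0 = 1.70                                          # any height of interest
    half_width = W * np.sqrt(mse * (1 / n + (x0 - height.mean()) ** 2 / sxx))
    band = (b0 + b1 * x0 - half_width, b0 + b1 * x0 + half_width)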

Comparison with other methods

Bonferroni bands for the same linear regression model, based on estimating the response variable given the observed values of X. The confidence bands are noticeably tighter.

The Working–Hotelling approach may give tighter or looser confidence limits than the Bonferroni correction. In general, for small families of statements, the Bonferroni bounds may be tighter, but as the number of estimated values increases, the Working–Hotelling procedure yields narrower limits. This is because the confidence level of the Working–Hotelling–Scheffé bounds is exactly {\displaystyle 1-\alpha } when all values of the independent variable, i.e. {\displaystyle x_{i}\in \mathbb {R} }, are considered. Algebraically, the critical value {\displaystyle \pm W} remains constant as the number of estimates increases, whereas the corresponding Bonferroni value, {\displaystyle \pm t_{\alpha /(2g),{\text{df}}=n-p}}, grows with the number {\displaystyle g} of estimates. Therefore, the Working–Hotelling method is better suited to large-scale comparisons, whereas Bonferroni is preferred if only a few mean responses are to be estimated. In practice, both bounds are often computed first and the narrower one chosen.
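
This trade-off can be seen numerically by comparing the two multipliers as the number of simultaneous estimates {\displaystyle g} grows; a sketch for the simple linear regression case (Python/SciPy, with illustrative values of n and g):

    import numpy as np
    from scipy import stats

    n, alpha = 15, 0.05
    W = np.sqrt(2 * stats.f.ppf(1 - alpha, 2, n - 2))   # constant in g
    for g in (1, 2, 5, 10, 50):
        t_bonf = stats.t.ppf(1 - alpha / (2 * g), df=n - 2)
        print(f"g={g:3d}  W={W:.3f}  Bonferroni t={t_bonf:.3f}")

For a handful of estimates the Bonferroni multiplier is the smaller of the two; as g grows it eventually exceeds the constant W.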

Another alternative to the Working–Hotelling–Scheffé band is the Gavarian band, which is used when a confidence band is needed that maintains equal widths at all levels.

The Working–Hotelling procedure is based on the same principles as Scheffé's method, which gives simultaneous confidence intervals for all possible contrasts. Their proofs are almost identical, because both methods estimate linear combinations of the mean responses at all factor levels. However, the Working–Hotelling procedure deals not with contrasts but with different levels of the independent variable, so there is no requirement that the coefficients of the parameters sum to zero. It therefore has one more degree of freedom.

