6.4 One-Way ANOVA
6.4.1 One-Way ANOVA
The general form of the one-way ANOVA model is
\[ y_{ij} = \mu + \alpha_{i} + \epsilon_{ij}, \; \; \; \; \; i=1, \cdots, a \; \; \; \; \; j=1, \cdots, N_i \]
\[ n=\sum_{i=1}^a N_i \\ E(\epsilon_{ij})=0, \; Var(\epsilon_{ij})=\sigma^2, \; Cov(\epsilon_{ij}, \epsilon_{ab})=0 \]
- \(\alpha_i\) is the \(i\)-th treatment (group) effect.
- The model is balanced if \(N_i = b\) for all \(i\).
- The model is unbalanced if the \(N_i\) are not all equal.
6.4.2 More About Models
- Example 4.1.1:
\(a = 3, \; N_1 = 5, \; N_2 = 3, \; N_3 = 3\),
\[ Y = X \beta + \epsilon = \begin{pmatrix} J_5 & J_5 & 0 & 0 \\ J_3 & 0 & J_3 & 0 \\ J_3 & 0 & 0 & J_3 \end{pmatrix} \begin{pmatrix} \mu \\ \alpha_1 \\ \alpha_2 \\ \alpha_3 \end{pmatrix} + \begin{pmatrix} \epsilon_{11} \\ \epsilon_{12} \\ \vdots \\ \epsilon_{33} \end{pmatrix} \]
Let \(N_1 = N_2 = N_3 = 5\). Then
\[ X = \begin{pmatrix} J_3 \otimes J_5 & I_3 \otimes J_5 \end{pmatrix} \]
In general, for a balanced design with \(i = 1, \cdots, a\) and \(j = 1, \cdots, b\):
\[ X = \begin{pmatrix} J_a \otimes J_b & I_a \otimes J_b \end{pmatrix} \]
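The Kronecker-product form of the balanced design matrix can be checked directly. The sketch below (pure Python, with a tiny hand-rolled Kronecker product; the helper names `kron`, `J`, and `I` are mine, not from the notes) builds \(X = (J_a \otimes J_b, \; I_a \otimes J_b)\) for \(a = 3\), \(b = 5\):

```python
# Sketch: build the balanced one-way ANOVA design matrix
# X = [ J_a (x) J_b , I_a (x) J_b ] with a small pure-Python Kronecker product.

def kron(A, B):
    """Kronecker product of two matrices given as lists of lists."""
    return [[a * b for a in row_a for b in row_b]
            for row_a in A for row_b in B]

def J(r, c=1):
    """r x c matrix of ones (J_r is the r x 1 column of ones)."""
    return [[1] * c for _ in range(r)]

def I(n):
    """n x n identity matrix."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

a, b = 3, 5                   # a groups, b observations per group
col_mu = kron(J(a), J(b))     # J_a (x) J_b: the intercept column (all ones)
Z = kron(I(a), J(b))          # I_a (x) J_b: the group-indicator columns
X = [mu_row + z_row for mu_row, z_row in zip(col_mu, Z)]

# X is 15 x 4: a column of ones followed by three indicator columns,
# so every row has exactly two ones and each indicator column has b ones.
```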
- Notation: \(J_r^c \equiv J_r J_c' = J_r \otimes J_c'\) is an \(r \times c\) matrix of \(1\)’s.
Let \(Z\) be the model matrix for the alternative one-way analysis of variance model
\[ y_{ij} = \mu_i + \epsilon_{ij} \; \; \; \; \; i=1, \cdots, a \; \; \; \; \; j= 1, \cdots, N_i \]
then, writing \(Z = (X_1, \cdots, X_a)\) with group-indicator columns satisfying \(X_i ' X_j = N_i \delta_{ij}\), where \(\delta_{ij} = 1\) for \(i=j\) and \(0\) for \(i \not= j\),
\[ \begin{aligned} X &= \begin{bmatrix} J & Z \end{bmatrix} = \begin{bmatrix} J & X_1 & \cdots & X_a \end{bmatrix} \\ \Longrightarrow \quad \mathcal{C}(X) &= \mathcal{C}(Z) \\ Z'Z &= \operatorname{diag}(N_1, N_2, \cdots, N_a) \\ Z(Z'Z)^{-1}Z' &= \operatorname{Blk\,diag}\Big[ N_i^{-1} J_{N_i}^{N_i} \Big] \\ M &= X(X'X)^{-}X' = Z(Z'Z)^{-1}Z' \\ Z_\ast &= (I - M_J)Z \\ M_\alpha &= Z_\ast(Z_\ast ' Z_\ast)^{-}Z_\ast ' = M - M_J = M - \dfrac{1}{n}J_n^n \\ M &= M_J + M_\alpha \end{aligned} \]
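These identities can be verified numerically. A sketch in pure Python (exact arithmetic via `fractions.Fraction`, so the equality checks are not polluted by floating point) for the group sizes of Example 4.1.1, \(N_1 = 5, N_2 = 3, N_3 = 3\):

```python
# Sketch: check Z'Z = diag(N_i), that Z (Z'Z)^{-1} Z' is block diagonal with
# blocks N_i^{-1} J_{N_i}^{N_i}, and that M_alpha = M - M_J is a projection.

from fractions import Fraction

N = [5, 3, 3]
a, n = len(N), sum(N)
groups = [i for i, Ni in enumerate(N) for _ in range(Ni)]

# Z: n x a matrix of group indicators (columns X_1, ..., X_a).
Z = [[Fraction(int(groups[r] == i)) for i in range(a)] for r in range(n)]

# Z'Z is diagonal with entries N_i, so (Z'Z)^{-1} = diag(1/N_i) and
# M = Z (Z'Z)^{-1} Z' has (r, c) entry 1/N_i when rows r, c share group i.
M = [[sum(Z[r][i] * Fraction(1, N[i]) * Z[c][i] for i in range(a))
      for c in range(n)] for r in range(n)]

block_diag = [[Fraction(1, N[groups[r]]) if groups[r] == groups[c] else Fraction(0)
               for c in range(n)] for r in range(n)]
assert M == block_diag                      # Blk diag[ N_i^{-1} J_{N_i}^{N_i} ]

M_J = [[Fraction(1, n)] * n for _ in range(n)]
M_alpha = [[M[r][c] - M_J[r][c] for c in range(n)] for r in range(n)]

def matmul(P, Q):
    return [[sum(P[r][k] * Q[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

assert matmul(M_alpha, M_alpha) == M_alpha  # M_alpha is a projection
```

The trace of `M_alpha` comes out to \(a - 1 = 2\), the degrees of freedom for treatments.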
6.4.3 Estimating and Testing Contrasts
A contrast in the one-way ANOVA has the form
\[ \lambda ' \beta = \sum_{i=1}^a \lambda_i \alpha_i \quad \text{with} \quad \lambda ' J_{a+1} = \sum_{i=1}^a \lambda_i = 0 \]
For estimable \(\lambda ' \beta\), find \(\rho\) so that \(\rho ' X = \lambda '\); one choice is \(\rho ' = \Big( \dfrac{\lambda_1}{N_1} J_{N_1} ', \; \cdots, \; \dfrac{\lambda_a}{N_a} J_{N_a} ' \Big)\).
- Proposition 4.2.1.
\(\lambda ' \alpha = \rho ' X \beta\) is a contrast \(\iff\) \(\rho ' J = 0\).
- Proposition 4.2.2.
\(\lambda ' \alpha = \rho ' X \beta\) is a contrast \(\iff\) \(M \rho \in \mathcal{C}(M_\alpha)\).
Since \(\sum_{i=1}^a \lambda_i = 0\),
\[ \sum_{i=1}^a \lambda_i \alpha_i = \sum_{i=1}^a \lambda_i (\mu + \alpha_i), \]
and because each \(\mu + \alpha_i\) is estimable with unique LSE \(\bar y_{i+}\), the unique LSE of the contrast is \(\sum_{i=1}^a \lambda_i \bar y_{i+}\).
At significance level \(\alpha\), \(H_0: \lambda ' \alpha=0\) is rejected if
\[ \begin{aligned} F &= \dfrac{ \Big( \sum_{i=1}^a \lambda_i \bar y_{i+} \Big)^2 \Big/ \sum_{i=1}^a \dfrac{\lambda_i^2}{N_i} }{MSE} > F\Big(1-\alpha, \; 1, \; dfE\Big) \\ \iff \quad t &= \dfrac{ \Big\vert \sum_{i=1}^a \lambda_i \bar y_{i+} \Big\vert }{ \sqrt{ MSE \sum_{i=1}^a \dfrac{\lambda_i^2}{N_i} } } > t\Big(1-\dfrac{\alpha}{2}, \; dfE\Big) \end{aligned} \]
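The contrast test is easy to compute by hand. A pure-Python sketch with hypothetical data (the observations and the contrast \(\lambda = (1, -\tfrac12, -\tfrac12)\) are invented for illustration, using the unbalanced group sizes of Example 4.1.1):

```python
# Sketch: the contrast t statistic
#   t = | sum lambda_i * ybar_{i+} | / sqrt( MSE * sum lambda_i^2 / N_i ),
# to be compared with t(1 - alpha/2, dfE); equivalently F = t**2 with
# F(1 - alpha, 1, dfE).

import math

# Hypothetical observations for a = 3 groups (N1 = 5, N2 = 3, N3 = 3).
y = {
    1: [5.1, 4.8, 5.3, 5.0, 4.9],
    2: [6.2, 6.0, 6.4],
    3: [5.9, 6.1, 6.0],
}
lam = {1: 1, 2: -0.5, 3: -0.5}   # contrast coefficients, summing to zero

N = {i: len(obs) for i, obs in y.items()}
ybar = {i: sum(obs) / N[i] for i, obs in y.items()}   # group means ybar_{i+}

n = sum(N.values())
a = len(y)
dfE = n - a                                            # error degrees of freedom
SSE = sum((obs - ybar[i]) ** 2 for i in y for obs in y[i])
MSE = SSE / dfE

estimate = sum(lam[i] * ybar[i] for i in y)            # unique LSE of the contrast
se = math.sqrt(MSE * sum(lam[i] ** 2 / N[i] for i in y))
t = abs(estimate) / se
```

With these invented numbers the contrast estimate is \(5.02 - \tfrac12(6.2) - \tfrac12(6.0) = -1.08\); the critical value \(t(1-\alpha/2, dfE)\) would come from a t table or a statistics library.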
6.4.4 Cochran’s Theorem
Let \(A_1 , \cdots, A_m\) be \(n \times n\) symmetric matrices, and let \(A = \sum_{j=1}^m A_j\) with \(rank(A_j) = n_j\). Consider the following four statements:
- \(A_j\) is an orthogonal projection for all \(j\).
- \(A\) is an orthogonal projection (possibly \(A=I\)).
- \(A_j A_k = 0\) for all \(j \not = k\).
- \(\sum_{j=1}^m n_j = n\).
If any two of these conditions hold, then all four hold.
- Note: Cochran’s theorem is a standard result that forms the basis of the ANalysis Of VAriance: if the total sum of squares can be written as a sum of sum-of-squares components, and if the degrees of freedom add up, then the \(A_j\) must be projections that are orthogonal to each other and jointly span \(\mathbb{R}^n\).
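A small numerical illustration of the theorem, using the one-way ANOVA decomposition \(I = M_J + M_\alpha + (I - M)\) on a toy balanced design (two groups of two, my choice for brevity; exact arithmetic via `fractions.Fraction`):

```python
# Sketch: verify all four Cochran conditions for A_1 = M_J, A_2 = M_alpha,
# A_3 = I - M: each A_j is an orthogonal projection, A = I is a projection,
# A_j A_k = 0 for j != k, and the ranks (traces of projections) sum to n.

from fractions import Fraction

N = [2, 2]                      # toy balanced design: a = 2 groups of 2
n = sum(N)
groups = [i for i, Ni in enumerate(N) for _ in range(Ni)]

M = [[Fraction(1, N[groups[r]]) if groups[r] == groups[c] else Fraction(0)
      for c in range(n)] for r in range(n)]
M_J = [[Fraction(1, n)] * n for _ in range(n)]
I_n = [[Fraction(int(r == c)) for c in range(n)] for r in range(n)]

A = [M_J,
     [[M[r][c] - M_J[r][c] for c in range(n)] for r in range(n)],   # M_alpha
     [[I_n[r][c] - M[r][c] for c in range(n)] for r in range(n)]]   # I - M

def matmul(P, Q):
    return [[sum(P[r][k] * Q[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

def trace(P):
    return sum(P[r][r] for r in range(n))

zero = [[Fraction(0)] * n for _ in range(n)]
for j, Aj in enumerate(A):
    assert matmul(Aj, Aj) == Aj              # each A_j is a projection
    for Ak in A[j + 1:]:
        assert matmul(Aj, Ak) == zero        # A_j A_k = 0 for j != k
assert trace(A[0]) + trace(A[1]) + trace(A[2]) == n   # ranks add to n
```

The traces come out to \(1, \; a-1, \; n-a\), the familiar degrees of freedom for the grand mean, treatments, and error.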