Exercise 1
(a light theoretical warm-up)

Question 1.

  1. In a two-group randomized experimental design, a response y is measured, and the model

    yi = μi + εi
    is to be fitted, where

    μi = β0 + β1 xi,
    εi are i.i.d. with zero means and

    xi=0 in Group 1, xi=1 in Group 2.

    Express the parameters β0 and β1 in terms of the group means μ1 and μ2.

  2. The same model is fitted, but now xi is defined by

    xi=-1/2 in Group 1, xi=1/2 in Group 2.

    What do parameters β0 and β1 represent now?

  3. A third definition of xi is

    xi=a in Group 1, xi=b in Group 2

    where a and b are arbitrary constants. What do parameters β0 and β1 represent in this general case?
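If it helps to check the algebra in Question 1 numerically, the following sketch (assuming NumPy is available; the data and group means are simulated and purely illustrative) fits the same two-group model under each of the three codings and prints the coefficients next to the sample group means, so the pattern the algebra should confirm can be seen directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated two-group data with arbitrary illustrative group means
n1, n2 = 50, 60
y = np.concatenate([2.0 + rng.normal(size=n1),    # Group 1, true mean 2.0
                    3.5 + rng.normal(size=n2)])   # Group 2, true mean 3.5
group = np.concatenate([np.zeros(n1), np.ones(n2)])  # 0 = Group 1, 1 = Group 2

def ols(x):
    """Least squares fit of y = b0 + b1*x; returns (b0, b1)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

print("group means:      ", y[group == 0].mean(), y[group == 1].mean())
print("coding 0, 1:      ", ols(group))            # part 1
print("coding -1/2, 1/2: ", ols(group - 0.5))      # part 2
a, b = -3.0, 7.0                                   # arbitrary a, b for part 3
print(f"coding a={a}, b={b}:", ols(np.where(group == 0, a, b)))
```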

Question 2.

For the regression model

yi=g(xi)+εi

with a continuous explanatory variable x1 and a factor x2 (x2i = 0 for i=1,...,n1; x2i = 1 for i=n1+1,...,n1+n2), give plots that represent the following linear models, where x3 = x1²:

  1. g(x) = β0 + β1 x1 + β2 x2
  2. g(x) = β0 + β1 x1 + β3 x3
  3. g(x) = β0 + β1 x1 + β2 x2 + β3 x3
  4. g(x) = β0 + β1 x1 + β3 x3 + β4 x1 x2
  5. g(x) = β0 + β1 x1 + β3 x3 + β5 x3 x2
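As a complement to hand-drawn sketches of the models above, the short matplotlib script below plots the group-wise curves for models 3 and 4; the coefficient values are chosen arbitrarily for illustration only, and the other models can be drawn the same way by editing the coefficient terms:

```python
import numpy as np
import matplotlib.pyplot as plt

x1 = np.linspace(0, 4, 200)
x3 = x1 ** 2  # x3 = x1^2

# Arbitrary illustrative coefficients (not part of the exercise)
b0, b1, b2, b3, b4 = 1.0, 0.8, 2.0, -0.15, 0.5

fig, axes = plt.subplots(1, 2, figsize=(9, 4), sharey=True)

# Model 3: the x2 term shifts the Group 2 curve vertically (parallel curves)
for x2, label in [(0, "Group 1"), (1, "Group 2")]:
    axes[0].plot(x1, b0 + b1 * x1 + b2 * x2 + b3 * x3, label=label)
axes[0].set_title("Model 3: b0 + b1*x1 + b2*x2 + b3*x3")

# Model 4: the x1*x2 interaction changes the linear slope in x1 for Group 2
for x2, label in [(0, "Group 1"), (1, "Group 2")]:
    axes[1].plot(x1, b0 + b1 * x1 + b3 * x3 + b4 * x1 * x2, label=label)
axes[1].set_title("Model 4: b0 + b1*x1 + b3*x3 + b4*x1*x2")

for ax in axes:
    ax.set_xlabel("x1")
    ax.legend()
axes[0].set_ylabel("g(x)")
plt.tight_layout()
plt.show()
```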

Question 3.

For a simple linear regression model yi = β0 + β1 xi + εi, i=1,...,n, show that the diagonal entries of the projection (hat) matrix H are

$h_{ii}=\frac{1}{n}+\frac{(x_i-\bar{x})^2}{\sum_{j=1}^n(x_j-\bar{x})^2}$
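The identity can be checked numerically before attempting the algebra. The sketch below (arbitrary design points, NumPy assumed) forms H = X(XᵀX)⁻¹Xᵀ directly and compares its diagonal with the displayed formula:

```python
import numpy as np

x = np.array([0.3, 1.1, 2.4, 2.5, 3.9, 5.0])   # arbitrary design points
X = np.column_stack([np.ones_like(x), x])       # design matrix for y = b0 + b1*x

H = X @ np.linalg.inv(X.T @ X) @ X.T            # hat matrix
h_direct = np.diag(H)

xbar = x.mean()
h_formula = 1 / len(x) + (x - xbar) ** 2 / np.sum((x - xbar) ** 2)

print(np.allclose(h_direct, h_formula))         # True
```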

Question 4.

Consider a standard linear model but with correlated errors: y = Xβ + ε, where the εi have zero means and Var(ε) = V, i.e. Var(εi) = Vii and Cov(εi, εj) = Vij.
  1. Find the vector of weighted least squares estimators of β that minimizes (y - Xβ)ᵀ V⁻¹ (y - Xβ).
  2. Is it an unbiased estimator of β?
  3. Find a variance-covariance matrix for the vector of weighted least squares estimators.
  4. What is the hat-matrix H for weighted least squares linear regression?
  5. Show that if, in addition, the εi are normal, then the weighted least squares estimators are also the maximum likelihood estimators.
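For a concrete feel for the quantities in parts 1-4, the sketch below (simulated data, an arbitrary positive-definite V, NumPy and SciPy assumed) evaluates the usual generalized least squares closed form, which the derivation in part 1 should recover, and cross-checks it against a direct numerical minimization of the weighted criterion; it is only a numerical cross-check, not a substitute for the algebraic derivation asked for.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, p = 30, 3

X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, -2.0, 0.5])

# An arbitrary positive-definite error covariance matrix V
A = rng.normal(size=(n, n))
V = A @ A.T + n * np.eye(n)
eps = rng.multivariate_normal(np.zeros(n), V)
y = X @ beta_true + eps

Vinv = np.linalg.inv(V)

# Closed-form weighted (generalized) least squares estimate
beta_wls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

def crit(b):
    """Weighted criterion (y - Xb)' V^{-1} (y - Xb)."""
    r = y - X @ b
    return r @ Vinv @ r

# Direct numerical minimization of the criterion as a cross-check
beta_num = minimize(crit, np.zeros(p)).x
print(np.allclose(beta_wls, beta_num, atol=1e-4))   # should be True

# Hat matrix for weighted least squares: fitted values are H @ y
H = X @ np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv)
print(np.allclose(H @ y, X @ beta_wls))             # should be True
```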