
Linear Models


Lecturer: Prof. Felix Abramovich
Lecture Hours: Wednesday 15-18, Shenkar 104


Regression analysis plays a central role in statistics as one of its most powerful and commonly used techniques. It deals with finding appropriate models to represent the relationship between a response variable and a set of explanatory variables, based on data collected from a series of experiments. Such models are used both to describe existing data and to predict new observations. The basic regression models are linear ones. Although they are the simplest and hence best-studied models, they nevertheless work well in numerous problems. Sometimes a non-linear model can be transformed into a linear one by suitable transformations of the variables; in other cases, linearization of a complex non-linear model may be used. In this course we will try to understand how linear models work and when they can be used efficiently.
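As a small illustration of the linearization idea (a sketch with simulated data, not course material): the exponential model y = a·exp(b·x) is non-linear in its parameters, but taking logarithms gives log(y) = log(a) + b·x, which is linear in (log(a), b) and can be fitted by ordinary least squares.

```python
import numpy as np

# Hypothetical example: exponential model with multiplicative noise,
# linearized by taking logarithms of the response.
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
a_true, b_true = 2.0, 0.7
y = a_true * np.exp(b_true * x) * np.exp(rng.normal(0, 0.05, x.size))

# Fit the linearized model log(y) = c0 + c1 * x by least squares
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]
print(a_hat, b_hat)  # estimates close to a_true and b_true
```

Note that least squares on the log scale assumes the noise is multiplicative in the original model; with additive noise the transformed model is only an approximation.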


  1. Introduction
    • regression models
    • linear regression models, examples of linear regression models
  2. Least Squares Estimates
    • derivation of LSE for regression coefficients
    • statistical properties of LSE
    • Gauss-Markov theorem
    • geometrical interpretation of LSE
    • multiple correlation coefficient
  3. Statistical Inference
    • maximum likelihood estimators for normal models
    • confidence intervals and confidence regions for regression coefficients
    • hypothesis testing: t-test, likelihood ratio test (F-test)
  4. Model Criticism
    • analysis of residuals
    • influential observations
    • the Box-Cox transformation family
  5. Prediction and Forecasting
  6. Model Selection
    • criteria for model selection: correlation coefficient, penalized least squares, cross-validation
    • model selection and dimensionality reduction in high dimensions: stepwise procedures, the lasso, principal component regression, partial least squares
  7. Some Special Topics
    • ridge regression
    • polynomial regression, orthogonal polynomials
    • piecewise-polynomial regression, splines
  8. Generalized Least Squares
    • motivation, derivation of generalized LSE for regression coefficients
    • some special cases: unequal variances, repeated measurements, hierarchical models
  9. Random and mixed effects models
    • ANOVA models with fixed and random effects
    • variance component (mixed effects) models
  10. Robust Regression
    • robustness and resistance
    • M-estimators
    • robust regression
  11. Nonlinear Regression
    • least squares estimation, the Gauss-Newton method
    • statistical inference
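As a taste of topic 2 (a minimal sketch with simulated data, not course material): for the model y = Xβ + ε with X of full column rank, the least squares estimate minimizes ||y − Xβ||² and is given by the normal equations, β̂ = (XᵀX)⁻¹Xᵀy; geometrically, the fitted values Xβ̂ are the orthogonal projection of y onto the column space of X.

```python
import numpy as np

# Minimal least squares sketch: simulated design matrix and response.
rng = np.random.default_rng(1)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design with intercept
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(0, 0.1, n)

# Normal equations: (X'X) beta_hat = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Residuals are orthogonal to every column of X (projection property)
residuals = y - X @ beta_hat
print(beta_hat)
print(np.abs(X.T @ residuals).max())  # numerically zero
```

In practice one solves the normal equations (or uses a QR decomposition) rather than explicitly inverting XᵀX, which is both cheaper and numerically more stable.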


Recommended Reading:

  • Draper, N. and Smith, H. Applied Regression Analysis.
  • Faraway, J.J. Linear Models with R.
  • Rao, C.R. and Toutenburg, H. Linear Models. Least Squares and Alternatives.
  • Ryan, T.P. Modern Regression Methods.
  • Seber, G. A. Linear Regression Analysis.
  • Sen, A. and Srivastava, M. Regression Analysis: Theory, Methods and Applications.
  • and many others

Example files:

Homework Exercises:


The course assumes extensive use of the computer. There are no restrictions on the statistical packages and software used for this course, although the data examples considered in class will be "R-oriented". Installation instructions and manuals for R can be found on the R Home page. The following R-based books may be helpful for this course: