| Lecturer | Prof. Felix Abramovich (felix@math.tau.ac.il) |
| Lecture Hours | Tuesday 16-19, Dan David 211 |

### Purpose

Regression analysis plays a central role in statistics, being one of its most powerful and commonly used techniques. It deals with finding appropriate models to represent the relationship between a response variable and a set of explanatory variables, based on data collected from a series of experiments. These models are used both to describe existing data and to predict new observations. The basic regression models are *linear* ones. Although they are the simplest and (hence) best-studied models, they nevertheless *do* work in numerous problems. Sometimes it is even possible to transform an originally non-linear model into a linear one by suitable transformations of the variables; in other cases a linearization of a complex non-linear model may be used. In this course we will try to understand how linear models work and when they can be used efficiently.

### Topics

- Introduction
  - regression models
  - linear regression models; examples of linear regression models
  - ANOVA and ANCOVA as linear regression models

- Least Squares Estimates
  - derivation of the LSE for regression coefficients
  - statistical properties of the LSE
  - the Gauss-Markov theorem
  - geometrical interpretation of the LSE
  - the multiple correlation coefficient
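As a minimal illustration of the least squares topics above (the class examples use R; this sketch uses Python/NumPy with made-up data), the LSE solves the normal equations, and the resulting residuals are orthogonal to the columns of the design matrix — the geometric fact behind the projection interpretation:

```python
import numpy as np

# Illustrative (simulated) data: y = 1 + 2x + noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
X = np.column_stack([np.ones_like(x), x])            # design matrix with intercept
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=50)

# LSE: beta_hat = (X'X)^{-1} X'y, computed stably via least squares
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fitted values are the orthogonal projection of y onto col(X),
# so the residuals are orthogonal to every column of X.
resid = y - X @ beta_hat
```

Checking `X.T @ resid` confirms it is numerically zero, which is exactly the geometrical interpretation of the LSE covered in the course.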

- Statistical Inference
  - maximum likelihood estimates for normal models
  - confidence intervals and confidence regions for regression coefficients
  - hypothesis testing: *t*-test, likelihood ratio test (*F*-test)

- Model Criticism
  - analysis of residuals
  - influential observations
  - the Box-Cox transformation family

- Prediction and Forecasting; Calibration
- Model Selection
  - "goodness-of-model" criteria: correlation coefficient, *AIC*, Mallows' *C_p*, cross-validation
  - strategies for model building
- Some Special Topics
  - ridge regression
  - polynomial regression, orthogonal polynomials
  - piecewise-polynomial regression, splines
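Of the special topics above, ridge regression admits a one-line closed form. A hedged sketch (Python/NumPy rather than the course's R; data and penalty value are made up for illustration) showing how the ridge penalty shrinks coefficients when columns are nearly collinear:

```python
import numpy as np

# Simulated design with two nearly collinear columns
rng = np.random.default_rng(1)
n, p = 40, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)        # near-collinearity
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.1 * rng.normal(size=n)

def ridge(X, y, lam):
    # Ridge estimator: beta_hat(lam) = (X'X + lam I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)      # lam = 0 recovers the ordinary LSE
b_ridge = ridge(X, y, 1.0)    # lam > 0 shrinks the coefficient vector
```

The norm of the ridge estimate is smaller than that of the LSE, trading a little bias for a large variance reduction under collinearity.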

- Generalized Least Squares
  - motivation; derivation of the generalized LSE for regression coefficients
  - some special cases: unequal variances, repeated measurements, hierarchical models

- Random and Mixed Effects Models
  - ANOVA models with fixed and random effects
  - variance component (mixed effects) models

- Robust Regression
  - robustness and resistance
  - M-estimators
  - robust regression

- Nonlinear Regression
  - least squares estimation; the Gauss-Newton method
  - statistical inference
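The Gauss-Newton method listed above iterates linearized least squares steps. A minimal sketch under assumed conditions (Python/NumPy instead of the course's R; the exponential-growth model, data, and starting values are chosen purely for illustration):

```python
import numpy as np

# Nonlinear model: f(x; a, b) = a * (1 - exp(-b x)), simulated data
rng = np.random.default_rng(2)
x = np.linspace(0.1, 5, 30)
y = 2.0 * (1 - np.exp(-1.5 * x)) + 0.02 * rng.normal(size=30)

theta = np.array([1.0, 1.0])                          # starting values
for _ in range(20):
    a, b = theta
    f = a * (1 - np.exp(-b * x))
    # Jacobian of f with respect to (a, b)
    J = np.column_stack([1 - np.exp(-b * x),          # df/da
                         a * x * np.exp(-b * x)])     # df/db
    # Gauss-Newton step: least squares fit of the residuals on J
    step, *_ = np.linalg.lstsq(J, y - f, rcond=None)
    theta = theta + step
```

Each iteration replaces the nonlinear problem by the linear least squares problem for the current linearization, so the whole machinery of the linear theory applies locally.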

### Literature

- Draper, N. and Smith, H. Applied Regression Analysis.
- Rao, C.R. and Toutenburg, H. Linear Models. Least Squares and Alternatives.
- Ryan, T.P. Modern Regression Methods.
- Seber, G. A. Linear Regression Analysis.
- Sen, A. and Srivastava, M. Regression Analysis: Theory, Methods and Applications.
- Faraggi D. and Goldenshluger A. Applied Regression (in Hebrew).
- and many more

### Homework Exercises:

- Exercise 1 (5 November)
- Exercise 2 (19 November)
- Exercise 3 (3 December)
- Exercise 4 (31 December)
- Exercise 5 (21 January)

### Exams:

- Final Project (12 February)
- Theoretical Part (4 February, 12:00)

### Computing:

The course assumes extensive use of computers. There are no restrictions on the statistical packages and software used for this course, although the data examples considered in class will be R-oriented. Installation instructions and manuals for R can be found on the R Home page. The following R-based books may be helpful for this course:

- Aitkin, M., Francis, B., Hinde, J. and Darnell, R. Statistical Modelling in R.
- Faraway, J.J. Linear Models with R.
- Venables, W.N. and Ripley, B.D. Modern Applied Statistics with S.
