STAT 242b/542b  THEORY OF STATISTICS:  DRAFT OF COURSE SYLLABUS, Spring 2007

 

Instructor: Harrison H. Zhou.
e-mail: huibin.zhou@yale.edu
Office hours: Tuesday 11:00am-12:00pm and 5:30pm-6:30pm (Room 204, 24 Hillhouse Avenue)

Class Time: MWF 9:30AM-10:20AM.

T.A.: James Hu.
e-mail: xing.hu@yale.edu.
TA session: Tuesday 6:30pm-7:30pm. (Homework discussion. We meet in Room 107, 24 Hillhouse Avenue. Optional but recommended. Sessions begin on the second Tuesday, January 23, 2007.)

Textbook: “All of Statistics” by Larry Wasserman
We will cover almost all of the material in chapters 6, 9, 10, and 13, and some of the material in chapters 8 and 11.
Recommended reference: “Mathematical Statistics and Data Analysis” by John Rice.

Grading:
Weekly Homework: 25%
Midterm: 25%
Final Exam: 40%
Participation: 10%

Course Homepage: http://www.stat.yale.edu/~hz68/242/

 

Schedule:

 

WEEK 1: PROBABILITY REVIEW (Ch. 1, 2, 3, 4, 5).

* Overview of this course (Point Estimation, Confidence set, Hypothesis Testing, Linear Model).

* The normal, chi-square, t, and F distributions for statistics based on samples from a normal population. The multinomial, exponential, gamma, Poisson, and uniform distributions. The quantile function.

* Expected values and variances of sample means. CLT (Central Limit Theorem). (Ch. 3).

 

WEEK 2: PRELIMINARIES ON INFERENCE (Ch. 6).

* CLT. Confidence sets. Hypothesis testing.

* Point estimation. Overview of statistical inference (examples and questions: parametric vs. nonparametric, frequentist vs. Bayesian, consistency and efficiency).

 

WEEK 3: PRELIMINARIES ON INFERENCE (Ch. 9.1).

* Method of moments.

* Maximum likelihood estimator.

* Comparison of the method of moments and the maximum likelihood estimator.

 

WEEK 4: PARAMETRIC INFERENCE (Sec. 9.5, 9.7, 9.8, 9.9, 9.10).

* Log-likelihood Function. Fisher information. Efficiency.

* Asymptotic normality of the MLE. [Idea based on a Taylor expansion, the CLT, and the Fisher information; a one-line statement is recalled after this list.]

* Estimation of the standard deviation of the MLE.
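
  [For reference, a minimal statement of the result sketched above, not the full treatment given in lecture. With I(\theta) the Fisher information of a single observation and under the usual regularity conditions,

      \sqrt{n I(\theta)}\,(\hat{\theta}_n - \theta) \to N(0, 1) in distribution,

  so that se(\hat{\theta}_n) \approx 1/\sqrt{n I(\theta)}. The argument Taylor-expands the score equation \ell'(\hat{\theta}_n) = 0 around \theta and applies the CLT to the score.]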

 

WEEK 5: PARAMETRIC INFERENCE (Sec. 9.5, 9.6, 9.7, 9.8, 11.1, 11.2).

* Delta Method.

* Cramér-Rao inequality. [This bound and the delta method are restated informally after this list.]

* Bayes methods.
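
  [For reference, informal statements of the two results named above. Delta method: if \sqrt{n}\,(\hat{\theta}_n - \theta) \to N(0, \sigma^2) in distribution and g is differentiable with g'(\theta) \neq 0, then \sqrt{n}\,(g(\hat{\theta}_n) - g(\theta)) \to N(0, [g'(\theta)]^2 \sigma^2). Cramér-Rao: under regularity conditions, any unbiased estimator \hat{\theta} based on n i.i.d. observations satisfies Var(\hat{\theta}) \geq 1/(n I(\theta)).]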

 

WEEK 6: PARAMETRIC INFERENCE (Sec. 9.8, 9.9, 9.11, 9.13).

* Bayes methods (continued).

* Large-sample properties of Bayes procedures.

* Comparison of the MSE of the Bayes estimator and the MLE. Posterior intervals.

 

WEEK 7: TESTING STATISTICAL HYPOTHESES (Sec. 10.1, 10.2).

* Sufficient statistics and likelihood factorization.

* Notions of simple and composite hypotheses concerning distributions and their parameters. The Wald Test.

* The Neyman-Pearson lemma for optimal tests of simple versus simple hypotheses.
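
  [For reference, an informal statement of the lemma in the last item: for testing H_0: f = f_0 against H_1: f = f_1, the test that rejects when the likelihood ratio f_1(x)/f_0(x) exceeds a cutoff k, with k chosen so that the test has level \alpha, has the largest power among all level-\alpha tests.]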

 

WEEK 8: MORE ON TESTING HYPOTHESES (Sec. 10.3, 10.4, 10.6).

* Questions about the midterm exam. Neyman-Pearson lemma.

* MIDTERM EXAM

* Neyman-Pearson lemma (continued).

 

SPRING BREAK

 

WEEK 9: MORE ON TESTING HYPOTHESES AND REVIEW (Sec. 10.6, 10.8, 10.5).

* Review

* The likelihood ratio test.

* p-values.

  -- Accounting for degrees of freedom.

  -- Example.

* The chi-square test. The goodness-of-fit test.
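
  [For reference, the statistic behind the last item. With O_j the observed and E_j the expected counts in k categories, T = \sum_{j=1}^{k} (O_j - E_j)^2 / E_j is approximately \chi^2 with k - 1 degrees of freedom under the null, reduced by one further degree of freedom for each parameter estimated from the data.]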

 

WEEK 10: LINEAR MODEL (Sec. 13.1, 13.2).

* Simple linear regression.

* The least squares estimator (LSE) and the MLE.
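
  [For reference, the least squares estimates in the simple model Y_i = \beta_0 + \beta_1 X_i + \epsilon_i: \hat{\beta}_1 = \sum_i (X_i - \bar{X})(Y_i - \bar{Y}) / \sum_i (X_i - \bar{X})^2 and \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}; with normal errors the least squares estimates of the coefficients coincide with the MLE.]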

 

WEEK 11: LINEAR MODEL (Sec. 13.3, 13.4).

* LSE.

* Transformation.

* Residual plots. Standard errors. Confidence intervals. Testing. R^2.

 

WEEK 12: LINEAR MODEL (Sec. 13.5).

* Prediction Interval.

* Multiple Regression.

* The LSE and its properties.
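
  [For reference, the matrix form behind the last item: writing the model as Y = X\beta + \epsilon with X the n x (p+1) design matrix of full rank, the LSE is \hat{\beta} = (X^T X)^{-1} X^T Y, which is unbiased with Var(\hat{\beta}) = \sigma^2 (X^T X)^{-1} when the errors are uncorrelated with constant variance \sigma^2.]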

 

WEEK 13: LINEAR MODEL (Sec. 13.6, 13.7).

* Confidence intervals. Testing. Prediction intervals.

* Residual plots. Standard errors. Final prediction error. R^2.

 

WEEK 14: READING WEEK.

* Review.

* Review problems.