Yale University
Department of Statistics
Monday, October 7, 1996
Marten Wegkamp
Department of Statistics
Yale University
Seminar to be held in Room 107, 24 Hillhouse
We shall study the regression model
$$ y_i = g(x_i) + e_i, $$
where the $ e_i $ are i.i.d. random variables with zero mean and finite
variance. We express our a priori knowledge about the regression function
by writing
$$ g \in {\cal G}, $$
and estimate it by employing the method of least squares, possibly in a
non-parametric context.
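Concretely, the least squares estimator referred to here is customarily defined as the minimizer of the empirical squared error over the class (a standard formulation, stated here for orientation rather than taken verbatim from the talk):
$$ \hat{g}_n = \mathop{\rm arg\,min}_{g \in {\cal G}} \; \frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - g(x_i) \bigr)^2. $$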
If we have little information about $g$, i.e. if the class ${\cal G}$ is ``too
large'', it is impossible to estimate $g$ consistently. Hence the question
arises for which classes consistent estimation is possible. In other
words, we are interested in the connection between geometric features of
the class ${\cal G}$ and statistical properties of the least squares
estimator. In particular we shall show that consistency can be stated in
terms of necessary and sufficient metric entropy conditions on the class
${\cal G}$.
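For orientation, the metric entropy of ${\cal G}$ with respect to a metric $d$ is customarily defined via covering numbers (this standard definition is supplied here for the reader; the precise conditions are the subject of the talk):
$$ H(\varepsilon, {\cal G}, d) = \log N(\varepsilon, {\cal G}, d), $$
where $ N(\varepsilon, {\cal G}, d) $ denotes the smallest number of balls of radius $\varepsilon$ (in the metric $d$) needed to cover ${\cal G}$.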
If time allows, we shall turn our attention to the problem of rates of
convergence. These, too, follow from metric entropy considerations, and in
many cases they are optimal. This result can be proved using techniques
borrowed from the theory of empirical processes.