Statistical procedures may be evaluated by their asymptotic performance, by their risk behaviour for fixed sample sizes at different values of a parameter, or by their conformity to some assumed prior distribution over the parameters. In this course, we will consider all three methods of evaluating procedures.
The central problem is the asymptotic risk of Bayes procedures for various loss functions and prior distributions. We will develop techniques for using multivariate cumulants and multivariate Edgeworth expansions; since Bayes procedures agree in the low-order terms of the risk, it is necessary to consider asymptotic approximations considerably more accurate than the usual asymptotic normal theory.
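To fix notation, here is the familiar one-dimensional Edgeworth expansion for the distribution function of a standardized sum with standardized third and fourth cumulants $\kappa_3, \kappa_4$; this display is a standard reference formula rather than course-specific material, and the multivariate versions developed in the course refine it in the same spirit:

```latex
F_n(x) = \Phi(x) - \phi(x)\left[
  \frac{\kappa_3}{6\sqrt{n}}\,(x^2 - 1)
  + \frac{1}{n}\left(
      \frac{\kappa_4}{24}\,(x^3 - 3x)
      + \frac{\kappa_3^2}{72}\,(x^5 - 10x^3 + 15x)
    \right)
\right] + O\!\left(n^{-3/2}\right).
```

It is the correction terms beyond $\Phi(x)$ that make comparisons between procedures possible once the leading normal terms cancel.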
With the aid of these asymptotic Bayes risks, we can examine standard procedures to see how well they may be approximated by Bayes procedures. For example, estimation by maximum likelihood in the case of exponential families, with Kullback-Leibler loss, corresponds to a certain standard prior distribution. In general, in two or more dimensions maximum likelihood corresponds asymptotically to no prior distribution, and so must be expected to be asymptotically inadmissible.
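As an illustration of the loss involved (the notation $\psi$ for the log-normalizer is ours, not taken from the course materials): for an exponential family $f(x;\theta) = \exp\{\theta^{\top}x - \psi(\theta)\}$, the Kullback-Leibler loss of reporting the density with parameter $a$ when the truth is $\theta$ has the closed form

```latex
D(\theta \,\|\, a)
  = \int f(x;\theta)\,\log\frac{f(x;\theta)}{f(x;a)}\,dx
  = \psi(a) - \psi(\theta) - (a - \theta)^{\top}\nabla\psi(\theta),
```

which is nonnegative and convex in $a$; minimizing its posterior expectation sets $\nabla\psi(a)$ equal to the posterior mean of $\nabla\psi(\theta)$, so Bayes procedures under this loss are averages in the mean-value parameterization.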
Density estimation by plugging in an estimate of an unknown parameter is shown to be asymptotically inferior to Bayesian averaging over the posterior. Other kinds of procedures have similar asymptotic evaluations. The general strategy is to identify weaknesses in a procedure by comparing its asymptotic risk with the risks of all Bayes procedures.
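The plug-in versus averaging comparison can be made concrete in a small Monte Carlo sketch. The setup below (a normal-mean model with known variance and a flat prior) is entirely our own illustration, not the course's: the posterior predictive density N(xbar, 1 + 1/n) has expected Kullback-Leibler risk (1/2) log(1 + 1/n), which is strictly smaller than the plug-in density's risk of 1/(2n), by roughly 1/(4n^2).

```python
import numpy as np

rng = np.random.default_rng(0)

# Model: X_1, ..., X_n i.i.d. N(theta, 1), flat prior on theta.
# Plug-in density:    N(xbar, 1)
# Predictive density: N(xbar, 1 + 1/n)  (the posterior-averaged density)
# KL divergence from the true density N(theta, 1) to a candidate N(m, v):
#   KL = 0.5 * ( log v + (1 + (theta - m)^2) / v - 1 )

def kl_normal(theta, m, v):
    """KL divergence of N(m, v) from the true density N(theta, 1)."""
    return 0.5 * (np.log(v) + (1.0 + (theta - m) ** 2) / v - 1.0)

n, reps, theta = 20, 200_000, 0.0
# Sampling distribution of the sample mean: N(theta, 1/n).
xbar = theta + rng.standard_normal(reps) / np.sqrt(n)

risk_plugin = kl_normal(theta, xbar, 1.0).mean()                 # ~ 1/(2n)
risk_predictive = kl_normal(theta, xbar, 1.0 + 1.0 / n).mean()   # ~ log(1+1/n)/2

print(f"plug-in risk:    {risk_plugin:.5f}")
print(f"predictive risk: {risk_predictive:.5f}")
```

The predictive density is deliberately slightly overdispersed, which is exactly what Bayesian averaging buys: it hedges against the estimation error in xbar, and the risk gap of order n^{-2} is invisible to first-order normal theory but captured by the higher-order expansions above.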
The course will require mathematical skills at the level of advanced calculus.