Andrew R. Barron
Professor of Statistics and Electrical Engineering
The familiar Cramér-Rao inequality shows, for unbiased estimators Y = g(X)
of a d-dimensional parameter vector a, that the risk E[(Y-a)^T I(a) (Y-a)]
is never less than d, where I(a) is the Fisher information matrix of p(x|a).
Moreover, an extension of this inequality due to Van Trees, which handles
Bayes risk, may be used to show that the minimax risk satisfies the same
bound. Here we take the analysis further to provide similar lower
bounds for the Bayes risk under Hellinger or Kullback-Leibler loss in
estimating the density function p(·|a). Motivation for this work comes
from predictive density estimation, data compression, and model selection.
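As a quick numerical illustration of the bound described above (a sketch, not part of the talk): for the Gaussian location model X ~ N(a, I_d) with known unit covariance, the Fisher information is I(a) = I_d and the unbiased estimator Y = X attains the Cramér-Rao bound exactly, so E[(Y-a)^T I(a) (Y-a)] = d. The model, dimension, and sample size here are illustrative choices.

```python
import numpy as np

# Monte Carlo check of E[(Y-a)^T I(a) (Y-a)] >= d for X ~ N(a, I_d),
# where I(a) = I_d and Y = X is unbiased, so equality holds.
rng = np.random.default_rng(0)
d = 5                       # parameter dimension (illustrative)
a = np.arange(1.0, d + 1)   # arbitrary true parameter vector
n_trials = 200_000

# One observation per trial; the estimator is Y = X itself.
X = rng.normal(loc=a, scale=1.0, size=(n_trials, d))

# Quadratic-form risk with I(a) = I_d reduces to the squared error norm.
risk = np.mean(np.sum((X - a) ** 2, axis=1))

print(risk)  # concentrates near d = 5
```

The empirical risk settles near d, matching the equality case of the bound; any other unbiased estimator in this model would show a risk at or above d.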
Seminar to be held in Room 107, 24 Hillhouse Avenue at 4:15 pm