Yale University
Department of Statistics
Seminar

Monday, November 4, 2002

INFORMATION INEQUALITIES AND CENTRAL LIMIT THEOREMS

Andrew R. Barron
Department of Statistics
Yale University

We present and explain new inequalities quantifying the
entropy distance and the Fisher information distance from
the normal distribution for standardized sums of independent
random variables, as they arise in central limit theorems.
Explicit and simple bounds of order 1/n are given for these
distances for random variables with finite Poincaré constants
and finite Fisher information.  This is joint work with Oliver
Johnson from Cambridge University.
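As a sketch of the quantities involved (notation assumed here, not
taken from the abstract): with S_n the standardized sum of n
independent, mean-zero, unit-variance random variables with density
f_n, the entropy distance is the relative entropy to the standard
normal density phi, and the announced bounds take the form

    D(f_n || phi) = \int f_n(x) \log \frac{f_n(x)}{\phi(x)} \, dx
                  = O(1/n),

with an analogous O(1/n) bound for the standardized Fisher
information distance I(f_n) - I(\phi), under the stated finite
Poincaré constant and finite Fisher information assumptions.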


Seminar to be held in Room 107, 24 Hillhouse Avenue at 4:15 pm