Monday, November 4, 2002

INFORMATION INEQUALITIES AND CENTRAL LIMIT THEOREMS
Andrew R. Barron
Department of Statistics
Yale University

We present and explain new inequalities quantifying the
entropy distance and the Fisher information distance from
the normal distribution for standardized sums of independent
random variables such as arise in central limit theorems. Explicit
and simple bounds of order 1/n are given for these
distances for random variables with finite Poincaré constants
and finite Fisher information. This is joint work with Oliver
Johnson from Cambridge University.
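
For concreteness, the quantities named in the abstract can be written out as follows. This is a schematic LaTeX sketch using the standard definitions; the constant C and its exact dependence on the Poincaré constant R are placeholders, not the precise statement from the talk.

% Standardized sum of i.i.d. X_i with mean mu and variance sigma^2,
% and the standard normal density phi:
\[
  S_n \;=\; \frac{1}{\sigma\sqrt{n}}\sum_{i=1}^{n}\bigl(X_i-\mu\bigr),
  \qquad
  \phi(x) \;=\; \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}.
\]
% Entropy distance (relative entropy of the density f_n of S_n from phi)
% and Fisher information distance (rho_n = f_n'/f_n is the score of S_n;
% the score of the standard normal is -x, so J vanishes iff S_n is normal):
\[
  D(S_n) \;=\; \int f_n(x)\,\log\frac{f_n(x)}{\phi(x)}\,dx,
  \qquad
  J(S_n) \;=\; \mathrm{E}\!\left[\bigl(\rho_n(S_n)+S_n\bigr)^{2}\right].
\]
% Poincaré constant R of X: the smallest R with
% Var(g(X)) <= R * E[g'(X)^2] for all smooth g.
% The bounds described in the abstract then take the schematic form
% (C depending on R, sigma^2, and the Fisher information of X):
\[
  D(S_n) \;\le\; \frac{C}{n},
  \qquad
  J(S_n) \;\le\; \frac{C}{n}.
\]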