Pollard preprints

Tusnady3.pdf =  Tusnady's inequality revisited (with Andrew Carter)
Tusnady's inequality is the key ingredient in the KMT/Hungarian coupling of the empirical distribution function with a Brownian bridge. We present an elementary proof of a result that sharpens Tusnady's inequality, modulo constants. Our method uses the beta-integral representation of binomial tails, simple Taylor expansion, and some novel bounds for ratios of normal tail probabilities.
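For a concrete handle on the first ingredient, the beta-integral representation of binomial tails can be verified numerically. The sketch below (helper names are ours, and the midpoint quadrature is only illustrative, not the paper's method) compares the exact tail sum with the integral representation.

```python
from math import comb

def binom_upper_tail(n, k, p):
    # P{X >= k} for X ~ Binomial(n, p), by direct summation
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def beta_integral_tail(n, k, p, m=100_000):
    # Same tail via the beta-integral representation:
    #   P{X >= k} = n! / ((k-1)! (n-k)!) * integral_0^p t^(k-1) (1-t)^(n-k) dt,
    # here approximated by a midpoint rule with m subintervals.
    coef = k * comb(n, k)          # equals n! / ((k-1)! (n-k)!)
    h = p / m
    s = sum(((i + 0.5) * h) ** (k - 1) * (1 - (i + 0.5) * h) ** (n - k)
            for i in range(m))
    return coef * h * s

# The two computations agree to quadrature error:
print(binom_upper_tail(20, 12, 0.45))
print(beta_integral_tail(20, 12, 0.45))
```

The identity is what lets binomial tail probabilities be analysed through a smooth integrand, to which Taylor expansion can then be applied.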

thoughts.pdf  = Some thoughts on Le Cam's statistical decision theory
The paper contains some musings about the abstractions introduced by Lucien Le Cam into the asymptotic theory of statistical inference and decision theory. A short, self-contained proof of a key result (existence of randomizations via convergence in distribution of likelihood ratios), and an outline of a proof of a local asymptotic minimax theorem, are presented as an illustration of how Le Cam's approach leads to conceptual simplifications of asymptotic theory.

pollardBDI.1july02.pdf = Maximal inequalities via bracketing with adaptive truncation
Abstract. The paper provides a recursive interpretation for the technique known as bracketing with adaptive truncation. By way of illustration, a simple bound is derived for the expected value of the supremum of an empirical process, thereby leading to a simpler derivation of a functional central limit theorem due to Ossiander. The recursive method is also abstracted into a framework that consists of only a small number of assumptions about processes and functionals indexed by sets of functions. In particular, the details of the underlying probability model are condensed into a single inequality involving finite sets of functions. A functional central limit theorem of Doukhan, Massart and Rio, for empirical processes defined by absolutely regular sequences, motivates the generalization.
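For orientation, Ossiander's theorem is usually stated under a finite bracketing-entropy integral (notation here is ours, not necessarily the paper's):

```latex
% Ossiander's bracketing condition, in its standard form: with
% N_{[\,]}(\epsilon, \mathcal{F}, L^2(P)) the bracketing number of the
% index class \mathcal{F}, the functional CLT holds when
\int_0^1 \sqrt{\log N_{[\,]}\bigl(\epsilon, \mathcal{F}, L^2(P)\bigr)}\,d\epsilon < \infty .
```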

convex.pdf  =  Asymptotics for minimisers of convex processes (with Nils Lid Hjort, May 1993)
University of Oslo and Yale University
Abstract. By means of two simple convexity arguments we are able to develop a general method for proving consistency and asymptotic normality of estimators that are defined by minimisation of convex criterion functions. This method is then applied to a fair range of different statistical estimation problems, including Cox regression, logistic and Poisson regression, least absolute deviation regression outside model conditions, and pseudo-likelihood estimation for Markov chains. Our paper has two aims. The first is to exposit the method itself, which in many cases, under reasonable regularity conditions, leads to new proofs that are simpler than the traditional proofs. Our second aim is to exploit the method to its limits for logistic regression and Cox regression, where we seek asymptotic results under regularity conditions as weak as possible. For Cox regression in particular we are able to weaken previously published regularity conditions substantially.
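The two convexity steps can be sketched as follows (a schematic in our own notation, not a quotation of the paper's statements):

```latex
% Step 1 (consistency). If each A_n is a convex random function with
% A_n(s) \to A(s) in probability pointwise, where A is convex with a
% unique minimiser s_0, then
\hat{s}_n = \arg\min_s A_n(s) \;\to\; s_0 \quad\text{in probability.}
% Step 2 (asymptotic normality). If, pointwise in s,
A_n(s) = s^{\top} U_n + \tfrac{1}{2}\, s^{\top} V s + o_p(1),
% with each A_n convex, V positive definite, and U_n converging in
% distribution to U, then
\hat{s}_n = -V^{-1} U_n + o_p(1),
% so that \hat{s}_n converges in distribution to -V^{-1} U.
```

Convexity is what upgrades pointwise convergence to the uniform-on-compacts control needed for minimisers, which is why the method dispenses with many of the traditional regularity conditions.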