Publications and Presentations of Andrew Barron

Ph.D. Dissertation:

    A. R. Barron (1985). Logically smooth density estimation. Stanford Univ., Stanford, CA.

Journal Publications:

  1. D. Cleveland, A. R. Barron, A. N. Mucciardi (1980). Methods for determining the depth of near-surface defects. Journal of Nondestructive Evaluation, Vol.1, pp.21-36.

  2. A. R. Barron (1985). The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. Annals of Probability, Vol.13, pp.1292-1303. (Finalist for the best paper prize by the IEEE Information Theory Society.)

  3. A. R. Barron (1986). Entropy and the central limit theorem. Annals of Probability, Vol.14, pp.336-342. (Finalist for the best paper prize by the IEEE Information Theory Society.)

  4. A. R. Barron (1986). Discussion on Diaconis and Freedman: the consistency of Bayes estimates. Annals of Statistics, Vol.14, pp.26-30.

  5. A. R. Barron and T. M. Cover (1988). A bound on the financial value of information. IEEE Transactions on Information Theory, Vol.34, pp.1097-1100.

  6. A. R. Barron (1989). Uniformly powerful goodness of fit tests. Annals of Statistics, Vol.17, pp.107-124.

  7. B. Clarke and A. R. Barron (1990). Information-theoretic asymptotics of Bayes methods. IEEE Transactions on Information Theory, Vol.36, pp.453-471. (Winner of the 1992 Browder J. Thompson Memorial Prize for the best paper in all IEEE journals by authors of age 30 or under at time of submission.)

  8. A. R. Barron and X. Xiao (1991). Discussion on Friedman's multivariate adaptive regression. Annals of Statistics, Vol.19, pp.67-82.

  9. A. R. Barron and C. Sheu (1991). Approximation of density functions by sequences of exponential families. Annals of Statistics, Vol.19, pp.1347-1369.

  10. A. R. Barron and T. M. Cover (1991). Minimum complexity density estimation. IEEE Transactions on Information Theory, Vol.IT-37, pp.1034-1054.

  11. A. R. Barron, L. Gyorfi, and E. C. van der Meulen (1992). Distribution estimation consistent in total variation and in two types of information divergence. IEEE Transactions on Information Theory, Vol.IT-38, pp.1437-1454.

  12. A. R. Barron (1993). Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory, Vol.IT-39, pp.930-944.

  13. A. R. Barron (1994). Approximation and estimation bounds for artificial neural networks. Machine Learning, Vol.14, pp.113-143.

  14. A. R. Barron (1994). Comment on Cheng and Titterington: Neural Networks, A Review from a Statistical Perspective. Statistical Science, Vol.9, No. 1, pp.33-35.

  15. B. Clarke and A. R. Barron (1994). Jeffreys' prior is asymptotically least favorable under entropy risk. Journal of Statistical Planning and Inference, Vol.41, pp.37-60.

  16. Q. Xie and A. R. Barron (1997). Minimax redundancy for the class of memoryless sources. IEEE Transactions on Information Theory, Vol.43, pp.646-657.

  17. Y. Yang and A. R. Barron (1998). An asymptotic property of model selection criteria. IEEE Transactions on Information Theory, Vol.44, pp.117-133.

  18. A. R. Barron, J. Rissanen and B. Yu (1998). The minimum description length principle in coding and modeling. (Invited Paper. Special issue in honor of 50 years since Claude Shannon's seminal work.) IEEE Transactions on Information Theory, Vol.44, pp.2734-2760.

  19. A. R. Barron and N. Hengartner (1998). Information theory and superefficiency. Annals of Statistics, Vol.26, pp.1800-1825.

  20. A. R. Barron, L. Birge and P. Massart (1999). Risk bounds for model selection by penalization. Probability Theory and Related Fields, Vol.113, pp.301-413.

  21. A. R. Barron, M. Schervish and L. Wasserman (1999). The consistency of posterior distributions in nonparametric problems. Annals of Statistics, Vol.27, pp.536-561.

  22. Y. Yang and A. R. Barron (1999). Information-theoretic determination of minimax rates of convergence. Annals of Statistics, Vol.27, pp.1564-1599.

  23. Q. Xie and A. R. Barron (2000). Asymptotic minimax regret for data compression, gambling, and prediction. IEEE Transactions on Information Theory, Vol.46, pp.431-445.

  24. G. Cheang and A. R. Barron (2000). A Better Approximation for Balls. Journal of Approximation Theory, Vol.104, pp.183-203.

  25. J. E. Cross and A. R. Barron (2003). Efficient Universal Portfolios for Past Dependent Target Classes. Mathematical Finance, Vol.13, no.2, pp.245-276.

  26. O. Johnson and A. R. Barron (2004). Fisher Information Inequalities and the Central Limit Theorem. Probability Theory and Related Fields, Vol.129, pp.391-409.

  27. F. Liang and A. R. Barron (2004). Exact Minimax Strategies for Predictive Density Estimation, Data Compression, and Model Selection. IEEE Transactions on Information Theory, Vol.50, pp.2708-2726.

  28. G. Leung and A. R. Barron (2006). Information theory and mixing least-squares regressions. IEEE Transactions on Information Theory, Vol.52, no.8, pp.3396-3410.

  29. M. Madiman and A. R. Barron (2007). Generalized entropy power inequalities and monotonicity properties of information. IEEE Transactions on Information Theory, Vol.53, no.7, pp.2317-2329.

  30. A. R. Barron, A. Cohen, W. Dahmen and R. DeVore (2008). Approximation and learning by greedy algorithms. Annals of Statistics, Vol. 36, pp. 64-94.

  31. J. Takeuchi, T. Kawabata and A. R. Barron (2012). Properties of Jeffreys Mixture for Markov Sources. IEEE Transactions on Information Theory, Vol.58, no.12.

  32. A. Joseph and A. R. Barron (2012). Least Squares Superposition Codes of Moderate Dictionary Size Are Reliable at Rates up to Capacity. IEEE Transactions on Information Theory, Vol.58, no.5, pp.2541-2557.

  33. A. Joseph and A. R. Barron (2014). Fast Sparse Superposition Codes have Exponentially Small Error Probability for R < C. IEEE Transactions on Information Theory, Vol.60.

  34. A. M. Kagan, Tinghui Yu, A. R. Barron and M. Madiman (2014). Contributions to the theory of Pitman estimators. Journal of Mathematical Sciences, Vol.199, no.2, pp.202-214.

  35. L. Jakobek, M. Boc and A. R. Barron (2015). Optimization of Ultrasonic-Assisted Extraction of Phenolic Compounds from Apples. Food Analytical Methods, Vol.8, pp.2612-2625.

  36. L. Jakobek and A. R. Barron (2016). Ancient Apple Varieties from Croatia as a Source of Bioactive Polyphenolic Compounds. Journal of Food Composition and Analysis, Vol.45, pp.9-15.

  37. X. Yang and A. R. Barron (2017). Minimax Compression and Large Alphabet Approximation through Poissonization and Tilting. IEEE Transactions on Information Theory, Vol.63.

  38. A. R. Barron, M. Bensic and K. Sabo (2016). Standardizing the Empirical Distribution Function Yields the Chi-Square Statistic. Submitted.

  39. J. M. Klusowski and A. R. Barron (2017). Minimax Lower Bounds for Ridge Combinations Including Neural Nets. Submitted.

Book Chapters:

  1. A. R. Barron (1984). Predicted squared error: a criterion for automatic model selection. Chapter 4 in Self-Organizing Methods in Modeling, S. J. Farlow (Editor), Marcel Dekker, New York, pp.87-103.

  2. R. L. Barron, A. N. Mucciardi, F. J. Cook, J. N. Craig, and A. R. Barron (1984). Adaptive learning networks. Chapter 2 in Self-Organizing Methods in Modeling, S. J. Farlow (Editor), Marcel Dekker, New York, pp.25-65.

  3. A. R. Barron (1987). Are Bayes rules consistent in information? In Open Problems in Communication and Computation, T. M. Cover and B. Gopinath (Editors), Springer-Verlag, New York, pp.85-91.

  4. A. R. Barron (1991). Complexity regularization with application to artificial neural networks. In Nonparametric Functional Estimation and Related Topics, G. Roussas (Editor), Kluwer Academic Publishers, Boston, MA and Dordrecht, The Netherlands, pp.561-576.

  5. A. R. Barron (1998). Information-theoretic Characterization of Bayes Performance and the Choice of Priors in Parametric and Nonparametric Problems. In Bayesian Statistics 6, J. M. Bernardo, J. O. Berger, A. P. Dawid and A. F. M. Smith (Editors). Oxford University Press, pp.27-52.

  6. J. Q. Li and A. R. Barron (2000). Mixture Density Estimation. In Advances in Neural Information Processing Systems, Vol.12, S. A. Solla, T. K. Leen and K.-R. Mueller (Editors). MIT Press, Cambridge, Massachusetts, pp.279-285.

  7. F. Liang and A. R. Barron (2005). Exact minimax predictive density estimation and MDL. In Advances in Minimum Description Length: Theory and Applications, P. Grunwald, I. J. Myung and M. Pitt (Editors). MIT Press, Cambridge, Massachusetts.

  8. A. R. Barron, C. Huang, J. Q. Li and Xi Luo (2008). MDL Principle, Penalized Likelihood, and Statistical Risk. In Festschrift for Jorma Rissanen, Peter Grunwald, Petri Myllymaki, Ioan Tabus, Marcelo Weinberger and Bin Yu (Editors). TICSP Series, #38, Tampere International Center for Signal Processing, Tampere University of Technology, Tampere, Finland.

Publications in Conference Proceedings: (4 to 27 pages)

  1. A. R. Barron, F. W. van Straten, and R. L. Barron (1977). Adaptive learning network approach to weather forecasting: a summary. Proceedings of the IEEE International Conference on Cybernetics and Society, Washington, DC, September 19-21. Published by IEEE, New York, pp.724-727.

  2. A. R. Barron and R. L. Barron (1988). Statistical learning networks: a unifying view. In Computing Science and Statistics: Proceedings of the 20th Symposium on the Interface, Reston, Virginia, April 20-23. E. Wegman, Ed., Published by the American Statistical Association, Alexandria, Virginia, pp.192-203. (Invited presentation).

  3. A. R. Barron (1989). Statistical properties of artificial neural networks. Proceedings of the IEEE International Conference on Decision and Control, Tampa, Florida, Dec. 13-15, pp.280-285, vol.1. Published by IEEE, New York. (Invited presentation).

  4. R. L. Barron, R. L. Cellucci, P. R. Jordan, N. E. Beam, P. Hess, and A. R. Barron (1990). Applications of polynomial neural networks to fault detection, isolation, and estimation (FDIE) and reconfigurable flight control. Proceedings of the National Aerospace Electronics Conference, Dayton, Ohio, May 23-25, pp.507-519, vol.2. (Winner of the best paper prize, 1990 NAECON). Republished in Proceedings 1998 NAECON, IEEE, pp.348-360.

  5. A. R. Barron (1991). Approximation and estimation bounds for artificial neural networks. In Computational Learning Theory: Proceedings of the Fourth Annual ACM Workshop, Santa Cruz, CA, August 5-7. L. Valiant, Ed., Morgan Kaufmann Publishers, Inc., San Mateo, California, pp.243-249. (Honored as one of the four papers that appeared by invitation in expanded form in a special issue of Machine Learning, representing the top presentations at the workshop.)

  6. A. R. Barron (1992). Neural Net Approximation. Proceedings of the 7th Yale Workshop on Adaptive and Learning Systems, May 20-22, K. S. Narendra (Editor), Center for Systems Science, Yale University, pp.69-72.

  7. D. Haussler and A. R. Barron (1993). How well do Bayes methods work for on-line prediction of + or -1 values? Computational Learning and Cognition: Proc. Third NEC Research Symposium, SIAM, Philadelphia, pp.74-101.

  8. J. Takeuchi and A. R. Barron (1997). Asymptotically minimax regret for exponential families. pp.665-668.

  9. J. Takeuchi and A. R. Barron (1998). Robustly Minimax Codes for Universal Data Compression. 21st Symposium on Information Theory and Its Applications. Gifu, Japan, December 2-5.

  10. G. H. L. Cheang and A. R. Barron (1999). Estimation with Two Hidden Layer Neural Nets. Proceedings of the 1999 International Joint Conference on Neural Networks (IJCNN), pp.375-378, vol.1.

  11. G. H. L. Cheang and A. R. Barron (2001). Penalized Least Squares, Model Selection, Convex Hull Classes, and Neural Nets. Proceedings of the 9th European Symposium on Artificial Neural Networks. M. Verleysen (Editor). pp.371-376.

  12. A. R. Barron (2000). Limits of Information, Markov Chains and Projection. IEEE International Symposium on Information Theory. Sorrento, Italy, June 25-30.

  13. J. Takeuchi and A. R. Barron (2001). Properties of Jeffreys mixture for Markov sources. Proc. Workshop on Information Based Induction Sciences (IBIS), pp.327-333.

  14. G. Leung and A. R. Barron (2005). Combining Least-squares Regressions: An Upper Bound on Mean Squared Error. Proc. International Symposium on Information Theory, September 4-9, pp.1711-1715.

  15. M. Madiman and A. R. Barron (2006). The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities. Proceedings of the 2006 IEEE International Symposium on Information Theory. Seattle, Washington, July 2006. pp.1021-1025.

  16. J.-I. Takeuchi, A. R. Barron, T. Kawabata (2006). Statistical curvature and stochastic complexity. Proceedings of the 2nd Symposium on Information Geometry and its Applications, Tokyo, Japan, December 12-16, pp.29-36.

  17. A. R. Barron and Xi Luo (2007). Adaptive Annealing. Proceedings 45th Annual Allerton Conference on Communication, Control, and Computing. Allerton House, UIUC, Illinois. September 26-28. pp.665-673.

  18. A. R. Barron, C. Huang, J. Q. Li, and Xi Luo (2008). MDL, Penalized Likelihood and Statistical Risk. IEEE Information Theory Workshop. Porto, Portugal, May 4-9. pp.247-257.

  19. A. R. Barron and Xi Luo (2008). MDL procedures with l_1 penalty and their statistical risk. First Workshop on Information Theoretic Methods in Science and Engineering. Tampere, Finland, August 18-20, 2008.

  20. M. Madiman, A. R. Barron, A. Kagan, T. Yu (2009). A Model for Pricing Data Bundles by Minimax Risks for Estimation of a Location Parameter. Proceedings of the IEEE Workshop on Information Theory. Volos, Greece, June 10-12, pp.106-109.

  21. A. R. Barron, A. Joseph (2010). Least Squares Superposition Codes of Moderate Dictionary Size, Reliable at Rates up to Capacity. Proc. IEEE International Symposium on Information Theory. Austin, Texas, June 13-18, 2010. pp.275-279.

  22. A. R. Barron, A. Joseph (2010). Towards fast reliable communication at rates near capacity with Gaussian noise. Proc. IEEE International Symposium on Information Theory. Austin, Texas, June 13-18, 2010. pp.315-319.

  23. A. R. Barron, A. Joseph (2011). Analysis of fast sparse superposition codes. Proc. IEEE International Symposium on Information Theory. St Petersburg, Russia, August 1-6, 2011. pp.1772-1776.

  24. E. Abbe, A. R. Barron (2011). Polar coding schemes for the AWGN channel. Proc. IEEE International Symposium on Information Theory. St Petersburg, Russia, August 1-6, 2011. pp.194-198.

  25. A. R. Barron and Sanghee Cho (2012). High-rate sparse superposition codes with iteratively optimal estimates. Proc. IEEE International Symposium on Information Theory. Cambridge, MA, July 2012, pp.120-124.

  26. C. Rush and A. R. Barron (2013). Using the method of nearby measures in superposition coding with a Bernoulli dictionary. 6th Workshop on Information Theory Methods in Science and Engineering (WITMSE). Tokyo, Japan, August 2013.

  27. X. Yang and A. R. Barron (2013). Large alphabet coding and prediction through Poissonization and tilting. Proceedings 6th Workshop on Information Theory Methods in Science and Engineering (WITMSE). Tokyo, Japan, August 2013.

  28. X. Yang and A. R. Barron (2014). Compression and prediction for large alphabet iid and Markov models. Proceedings 7th Workshop on Information Theoretic Methods in Science and Engineering (WITMSE). Honolulu, July 5-8. pp.31-34.

  29. J.-I. Takeuchi and A. R. Barron (2013). Asymptotically minimax regret by Bayes mixtures for non-exponential families. Proc. IEEE Information Theory Workshop. pp.1-5.

  30. J.-I. Takeuchi and A. R. Barron (2014). Asymptotically Minimax Regret for Models with Hidden Variables. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014. pp.3037-3041.

  31. J.-I. Takeuchi and A. R. Barron (2014). Stochastic Complexity for Tree Models. Proc. IEEE Information Theory Workshop. November 2-4, pp.222-226.

  32. S. Chatterjee and A. R. Barron (2014). Information Theoretic Validity of Penalized Likelihood. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014. pp.3027-3031.

  33. A. Barron, Teemu Roos, Kazuho Watanabe (2014). Bayesian Properties of Normalized Maximum Likelihood and its Fast Computation. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014. pp.1667-1671.

  34. X. Yang and A. R. Barron (2014). Compression and Predictive Distributions for Large Alphabet i.i.d. and Markov Models. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June. pp.2504-2508.

Newsletter Article:

    R. Venkataramanan, S. Tatikonda, A. Barron (2016). Sparse Regression Codes. IEEE Information Theory Society Newsletter, December 2016, pp. 7-15. [Based on ISIT Tutorial by R. Venkataramanan and A. Barron, Barcelona, July 2016.]

Other Conference Presentations (1983-2008): (proceedings containing not more than 1-page abstracts)

  • A. R. Barron (1983). Convergence of logically simple estimates of unknown probability densities. IEEE International Symposium on Information Theory, Saint Jovite, Canada, September 26-30.

  • A. R. Barron (1985). Entropy and the central limit theorem. IEEE International Symposium on Information Theory, Brighton, England, June 23-28.

  • A. R. Barron (1985). Ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. IEEE International Symposium on Information Theory, Brighton, England, June 23-28.

  • A. R. Barron (1985). Logically smooth density estimation. Joint IMS, ASA Annual Meeting, Las Vegas, Nevada, August 5-8.

  • A. R. Barron (1987). Applications of large deviations in statistics. Conference on Asymptotic Methods for Stochastic Systems: Large Deviations Theory and Practice, University of Maryland, October 25-27. (Invited Presentation).

  • A. R. Barron (1988). The convergence in information of probability density estimators. IEEE International Symposium on Information Theory, Kobe, Japan, June 19-24.

  • T. M. Cover and A. R. Barron (1988). A bound on the financial value of information. IEEE International Symposium on Information Theory, Kobe, Japan, June 19-24.

  • A. R. Barron (1989). Minimum complexity density estimation. IMS Regional Meeting, Lexington, Kentucky, March 19-22. (Invited Presentation).

  • A. R. Barron (1989). Portfolio selection based on nonparametric density estimates for the stock market. 21st Symposium on the Interface: Computing Science and Statistics, Orlando, Florida, April 9-12. (Invited Presentation).

  • A. R. Barron (1989). Minimum complexity estimation. IEEE Workshop on Information Theory, Center for Applied Math, Cornell University, June 26-30. (Session Organizer).

  • A. R. Barron (1989). Some statistical properties of polynomial networks and other artificial neural networks. Conference on Neural Information Processing Systems, Denver, Colorado, November 27-30. (Invited Plenary Presentation).

  • A. R. Barron (1990). An index of resolvability of probability density estimators. IEEE International Symposium on Information Theory, San Diego, California, January 14-19.

  • A. R. Barron (1990). Some statistical convergence properties of artificial neural networks. IEEE International Symposium on Information Theory, San Diego, California, January 14-19.

  • A. R. Barron (1990). The index of resolvability: statistical convergence of minimum complexity estimation. AAAI Symposium on Minimal-Length Encoding, Stanford California, March 27-29 (Invited Presentation).

  • A. R. Barron (1990). Some approximation and estimation theorems for artificial neural networks. IEEE Information Theory Workshop, Veldhoven, The Netherlands, June 10-15 (Invited Presentation).

  • A. R. Barron (1990). Statistical properties of artificial neural networks. SIAM Annual Meeting, Chicago, Illinois, July 20 (Invited Presentation).

  • A. R. Barron (1991). Information theory and the stock market: the effect of side information. Workshop on Coordination of Distributed Information and Decisions, Cornell University, April 11-13. (Invited Presentation).

  • A. R. Barron (1991). Approximation and estimation results for adaptively synthesized neural network architectures. Workshop on Theoretical Issues in Neural Nets, Center for Discrete Mathematics and Theoretical Computer Science, Rutgers University, May 20-23. (Invited Presentation).

  • A. R. Barron and R. L. Barron (1991). Artificial neural networks in industry: some developments using statistical techniques. IMS Special Topics Meeting on Statistics in Industry, Philadelphia, Pennsylvania, June 9-12 (Invited Presentation).

  • A. R. Barron (1991). Universal approximation bounds for superpositions of a sigmoidal function. IEEE International Symposium on Information Theory, Budapest, Hungary, June 23-29.

  • A. R. Barron (1991). Approximation and estimation bounds for sigmoidal and polynomial networks. Joint Statistical Meetings, Atlanta, Georgia, August 19-22. (Session Organizer).

  • A. R. Barron (1991). Approximation and estimation bounds for neural networks and projection pursuit. Workshop on Neural Information Processing Systems, Vail, Colorado, December 10-11. (Invited Presentation).

  • A. R. Barron (1991). Risk estimation, risk optimality, regularization, and neural nets. Workshop on Neural Information Processing Systems, Vail, Colorado, December 10-11.

  • A. R. Barron (1992). Approximation, estimation, and computation results for artificial neural networks. 24th Symposium on the Interface: Computing Science and Statistics, College Station, Texas, March 19-21. (Invited Presentation).

  • A. R. Barron (1992). Artificial neural networks: stochastic analysis and engineering applications? Symposium on Stochastic Processes in Engineering Applications, Otaniemi, Finland, April 7-9. (Invited Plenary Presentation).

  • A. R. Barron, D. Olive, and Y. Yang (1992). Asymptotically optimal complexity-based model selection. IMS Regional Meeting, Corvallis, Oregon, June 14-15. (Invited Presentation).

  • A. R. Barron (1992). Statistical accuracy of neural nets. Conference on Neural Information Processing Systems: Natural and Synthetic. Denver, Colorado, November 30 - December 2. (Invited Tutorial).

  • A. R. Barron (1993). Neural net approximation. Annual Meeting of the American Math Society, San Antonio, Texas, January 13-16. (Invited Presentation).

  • A. R. Barron, L. Gyorfi, and E. C. van der Meulen (1993). Universal coding of non-discrete sources based on distribution estimation consistent in expected information divergence. IEEE International Symposium on Information Theory, San Antonio, Texas, January 17-22.

  • A. R. Barron, B. Clarke, and D. Haussler (1993). Information bounds for the risk of Bayesian predictions and the redundancy of universal codes. IEEE International Symposium on Information Theory, San Antonio, Texas, January 17-22.

  • A. R. Barron (1993). Statistical accuracy of neural nets. Workshop on Information and Geometry, Hakone, Japan. March. (Invited Presentation).

  • A. R. Barron (1993). Optimal rate properties of minimum complexity estimation. Workshop on Descriptional Complexity, Schloss Dagstuhl, Wadern, Germany. May. (Invited Presentation).

  • A. R. Barron (1993). Do neural nets avoid the curse of dimensionality? Congress on Statistics, Vannes, France, May. (Invited Presentation).

  • A. R. Barron (1993). Performance bounds for neural net estimation and classification. Annual Meeting of the Classification Society of North America, Pittsburgh, June. (Invited Presentation).

  • A. R. Barron (1993). Do neural nets avoid the curse of dimensionality? NATO ASI on Statistics and Neural Nets, Les Arcs, France, June. (Invited Presentation).

  • A. R. Barron (1994). The accuracy of Bayes estimates of neural nets. 26th Symposium on the Interface: Computing Science and Statistics, Research Triangle Park, NC, June 15-18.

  • A. R. Barron, Y. Yang and B. Yu (1994). Asymptotically optimal function estimation by minimum complexity criteria. IEEE International Symposium on Information Theory, Trondheim, Norway, June 27 - July 1. (One page proceeding and three page summary available).

  • A. R. Barron (1994). Neural net approximation and estimation. IEEE Workshop on Information Theory, Moscow, Russia, July 3-6. (Invited Presentation).

  • A. R. Barron (1994) Minimum complexity estimation. ML/COLT: Workshop on Applications of Descriptional Complexity to Induction, Statistical and Visual Inference, New Brunswick, New Jersey, July 8-9. (Invited Presentation.)

  • A. R. Barron (1995). Statistics and neural nets. New England Statistics Conference, Storrs, Connecticut, April 22. (Invited Plenary Presentation).

  • A. R. Barron, Y. Yang (1995). Information-theoretic development of minimax rates of convergence. IEEE International Symposium on Information Theory, Whistler, British Columbia, September 17-22. (Added to session at recommendation of program chair; not in proceedings).

  • N. Hengartner and A. R. Barron (1995). Information theory and superefficiency. IMS regional meeting Stanford, California. (Invited Presentation).

  • B. S. Clarke and A. R. Barron (1995). Jeffreys' prior yields the asymptotic minimax redundancy. IEEE-IMS Workshop on Information Theory and Statistics, Alexandria, Virginia, October 27-29. (Invited Presentation).

  • A. R. Barron (1995). Asymptotically optimal model selection and neural nets. IEEE-IMS Workshop on Information Theory and Statistics, Alexandria, Virginia, October 27-29. (Invited Presentation).

  • Q. Xie and A. R. Barron (1996). Asymptotic minimax regret for data compression, gambling, and prediction. Workshop on sequence prediction, Santa Cruz, California, May 3-5. (Invited, jointly presented).

  • A. R. Barron (1996). The fundamental role of Kullback Information in large sample statistics. Solomon Kullback Memorial Conference, Washington, DC, May 23-24. (Invited Presentation).

  • A. R. Barron (1996). Adaptation and model selection. AMS-IMS-SIAM Summer Research Conference on adaptive selection of models and statistical procedures, Mount Holyoke, Massachusetts, June 22-28. (Conference Chairmen, A. Barron, P. Bickel, I. Johnstone, D. Donoho).

  • A. R. Barron and Y. Yang (1996). Information theory in nonparametric estimation. Nonparametric Estimation: The Road Ahead, Canberra, Australia, July 2-4. (Invited Presentation).

  • A. R. Barron (1996). Adaptive model selection and neural networks. Sydney International Statistics Congress, IMS regional meeting, Sydney, Australia, July 8-12. (Invited Presentation).

  • A. R. Barron (1996). Asymptotics of Bayes estimators. International Society of Bayesian Analysis, Regional Meeting, Chicago, Illinois, August 2-3. (Invited Presentation).

  • A. R. Barron and Y. Yang (1996). Adaptive model selection and the index of resolvability. Joint Statistical Meetings of the ASA and IMS, Chicago, Illinois, August 4-8. (Invited Presentation).

  • A. R. Barron and Q. Xie (1997). Asymptotic minimax regret for data compression, gambling and prediction. IEEE International Symposium on Information Theory, Ulm, Germany, June 29 - July 4.

  • A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. Computational Learning Theory: Tenth Annual ACM Workshop, Nashville, Tennessee, July 6-9. (Invited Plenary Presentation).

  • A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. International Conference on Combinatorics, Information Theory, and Statistics, University of Southern Maine, Portland, Maine, July 18-20. (Invited Presentation).

  • A. R. Barron and Y. Yang (1997). Information-theoretic determination of minimax rates of convergence. Symposium on Nonparametric Functional Estimation, Centre de Recherches Mathematiques, University of Montreal, October 16-18. (Invited Presentation).

  • A. R. Barron (1998). Nonlinear approximation, greedy algorithms, and neural networks. Ninth International Conference on Approximation Theory, January 4-6. (Invited Presentation).

  • A. R. Barron (1998). How information theory illuminates the behavior of risk functions of Bayes procedures. Purdue Workshop on the Interface between Paradigms of Statistics. June 17-19. (Invited Presentation).

  • A. R. Barron and J.-I. Takeuchi (1998). Mixture models achieving optimal coding regret. IEEE Information Theory Workshop. Killarney, Ireland, June 22-26.

  • J.-I. Takeuchi and A. R. Barron (1998). Asymptotic Minimax Regret by Bayes Mixtures. International Symposium on Information Theory. Cambridge, MA, August 16-21.

  • A. R. Barron (1998). Information theory in probability and statistics; Approximation and estimation bounds for Gaussian mixtures. CIRM Workshop on Information Theory, Statistics, and Image Analysis, Marseille, France, December 7-11. (Two Invited Presentations).

  • A. R. Barron (1999). Information theory in probability and statistics. IEEE Information Theory Workshop on Detection, Estimation, Classification, and Imaging, Santa Fe, New Mexico, February 24-26. (Invited Presentation).

  • A. R. Barron (1999). Decision theory of regret for universal coding, gambling, and prediction. DIMACS Workshop: Online Decision Making, Rutgers University, New Brunswick, NJ, July 12-15. (Invited Presentation).

  • A. R. Barron (2000). Limits of information, Markov chains and projection. IEEE International Symposium on Information Theory, Sorrento, Italy, June 26-30.

  • A. R. Barron, Laszlo Gyorfi, Michael Nussbaum (2000). Nonparametric Estimation, Neural Nets and Risk Asymptotics. Short Course. Mathematical Research Institute, Oberwolfach, Germany, June 10-17.

  • A. R. Barron (2000). Information theory in probability; Information theory in statistics. J. Bolyai Society Conference on Information Theory in Mathematics, honoring the 50th anniversary of the formation of what is now known as the Renyi Institute of Mathematics of the Hungarian Academy of Sciences. Balatonlelle, Hungary, July 3-7. (Two Invited Presentations).

  • A. R. Barron (2000). Prediction, data compression, gambling, and model selection: Do Bayes procedures nearly minimize the maximum of regret over all possible data sequences? AMS-IMS-SIAM Summer Research Conference on Bayes, Frequentist, and Likelihood Inference: a Synthesis. Mount Holyoke College, South Hadley, Massachusetts, July 9-13. (Invited Presentation.)

  • A. R. Barron (2001). Information-theoretic bounds for mixture modeling, model selection, and data compression. Workshop on Information Theory and Statistics, DIMACS, Rutgers University, March 2001. (Invited Presentation).

  • A. R. Barron (2001). Information theory in probability and statistics. 23rd European Meeting of Statistics, Funchal, Madeira, Portugal, August 13-18. (Invited Keynote Plenary Presentation).

  • F. Liang and A. R. Barron (2001). Minimax optimal predictive density estimation, data compression, and model selection. Workshop on MDL at Conference on Neural Information Processing Systems, Whistler, British Columbia, December 10. (Invited Presentation).

  • F. Liang and A. R. Barron (2002). Exact minimax strategies for predictive density estimation, data compression and model selection. IEEE International Symposium on Information Theory, Lausanne, Switzerland, July 1-5.

  • A. R. Barron, Jiangfeng Yu, and Wei Qui (2003). Maximum compounded wealth: portfolio estimation, option pricing, and stock selection. Workshop on Complexity and Inference, DIMACS, Rutgers University, June 2-5. (Invited Presentation).

  • A. R. Barron (2003). The role of information in the central limit problem. Symposium on Information Theory and Some Friendly Neighbors -- Ein Wunschkonzert, ZIF, Center for Interdisciplinary Research, Bielefeld, August 11-13. (Invited Presentation).

  • A. R. Barron (2003). Interplay of statistics and information theory in formulation and selection of models. Workshop on Model Building, Dortmund, Germany, November 13-14. (Invited Presentation).

  • G. Leung and A. R. Barron (2004). Information theory, model selection and model mixing for regression, Conference on Information Sciences and Systems, Princeton, NJ, March 17-19. (Invited Presentation in session on Information Theory, Computer Science, and Statistics.)

  • A. R. Barron and G. Leung (2004). Risk assessment for Bayes procedures and model mixing in regression. IVth Workshop on Bayesian Nonparametrics, Universita di Roma La Sapienza, Rome, Italy, June 12-16. (Invited Presentation).

  • A. R. Barron and G. Leung (2004). Risk assessment for model mixing. Workshop on Mathematical Foundations of Learning Theory, Barcelona, Spain, June 18-23. (Invited Presentation).

  • A. R. Barron (2004). Relative entropy in probability theory and mathematical statistics. Workshop on Entropy in the Mathematical, Physical, and Engineering Sciences, Padova, Italy, June 24-27. (Two Invited Presentations).

  • A. R. Barron (2004). Fitting functions of many variables: neural networks and beyond. 16th Conference on Computational Statistics (COMPSTAT 2004), Prague, Czech Republic, August 23-28. (Invited Keynote Plenary Presentation).

  • A. R. Barron (2005). Neural nets, mixture models, and adaptive kernel machines. Yale Workshop on Adaptive and Learning Systems, May 29-31, Center for Systems Science, Yale University. (Invited presentation).

  • A. R. Barron (2005). Challenges in high-dimensional function estimation and attempted solutions, Congress on Statistics, Pau, France, June 6-10. (Invited Presentation).

  • A. R. Barron (2005). Information theory and risk analysis. Medallion Lecture. Joint Statistical Meetings of the IMS and ASA, Minneapolis, Minnesota, August 7-11. (Presented with IMS Medallion Award; one-hour special invited presentation on Aug. 7).

  • A. R. Barron (2005). Information theory and statistics for machine learning, IEEE Workshop on Machine Learning for Signal Processing XV, Mystic, CT, October 28-30. (Invited Keynote Plenary Presentation).

  • M. Madiman and A. R. Barron (2006). Monotonicity of information in the central limit theorem, Workshop on Information Theory and its Applications, University of California, San Diego, February 6-9. (Invited Presentation).

  • A. R. Barron (2006). Simple risk bounds for mixing least squares regressions. Journees: Model Selection in Statistics: Different Approaches, Universite de Nice, Sophia-Antipolis, Nice, France, March 14-19. (Two one-hour invited presentations).

  • A. R. Barron (2006). Simple risk bounds for mixing least squares regressions. International Workshop on Applied Probability, University of Connecticut, Storrs, CT, May 18. (Invited Presentation).

  • A. R. Barron and Wei Qiu (2007). Maximum wealth portfolios, Workshop on Information Theory and its Applications, University of California, San Diego, January 29 - February 2. (Invited Presentation).

  • A. R. Barron, Cong Huang, and Xi Luo (2008). Penalized squared error and likelihood: risk bounds and fast algorithms, Workshop on Sparsity in High Dimensional Statistics and Learning Theory, Georgia Institute of Technology, Atlanta, Georgia, March 22-24. (Invited Three-Part Presentation).

  • A. R. Barron (2008). Information theory principles in probability and statistics, Elements of Information Theory Workshop, on the Occasion of Tom Cover's 70th Birthday, Stanford University, Stanford, CA, May 16. (Invited Presentation).

Invited Departmental Seminar Presentations (1985-2007): (These had short abstract announcements; some subsequent presentations appear in the links above.)

  • Purdue University, Joint Statistics Colloquium, October 3, 1985. Topic: Entropy and the central limit theorem.
  • Michigan State University, Department of Statistics and Probability, January 28, 1986. Topic: Generalized Shannon-McMillan-Breiman theorem.
  • University of Chicago, Department of Statistics, October 20, 1986. Topic: Uniformly powerful tests.
  • University of Virginia, Department of Mathematics, March 5, 1987. Topic: Convergence of Bayes estimators of probability density functions.
  • Stanford University, Department of Statistics, October 20, 1987. Topic: Convergence of Bayes estimators of probability density functions.
  • University of Chicago, Department of Statistics, March 7, 1988. Topic: Convergence of Bayes estimators of probability density functions.
  • McGill University, Joint Statistics Seminar for Montreal universities, March 31, 1988. Topic: Approximation of densities by sequences of exponential families.
  • Dupont Research Center, Dover, Delaware, April 26, 1988. Topic: Statistical learning networks.
  • IBM T. J. Watson Research Center, Yorktown Heights, New York, August 10, 1988. Topic: Statistical learning networks.
  • Stanford University, Information Systems Laboratory, November 3, 1988. Topic: Minimum complexity density estimation.
  • IBM Technical Education Center, Thornwood, New York, January 11-12, 1989. Statistical learning networks. In the short course on Knowledge Acquisition from Data.
  • Cornell University, Department of Economics and Program of Statistics (Co-hosts), February 1, 1989. Topic: Convergence of Bayes estimators of probability density functions.
  • Purdue University, Department of Statistics, September 7, 1989. Topic: Minimum complexity density estimation.
  • University of Lowell, Massachusetts, Joint Seminar, Department of Mathematics and Department of Electrical Engineering, March 14, 1990. Topic: Statistical properties of polynomial networks and other artificial neural networks.
  • Carnegie Mellon University, Department of Statistics, April 4, 1990. Topic: Statistical properties of polynomial networks and other artificial neural networks.
  • University of Chicago, Department of Statistics, October 15, 1990. Topic: Statistical properties of artificial neural networks.
  • University of California, San Diego, Department of Mathematics, January 7, 1991. Topic: Approximation bounds for artificial neural networks.
  • University of California, San Diego, Department of Mathematics, January 8, 1991. Topic: Complexity regularization for nonlinear model selection.
  • Siemens Corporation, Princeton, New Jersey, February 28, 1991. Topic: Universal approximation bounds for superpositions of a sigmoidal function.
  • University of Wisconsin, Department of Statistics, April 3, 1991. Topic: Complexity regularization for nonlinear model selection.
  • University of Wisconsin, Department of Mathematics, April 4, 1991. Topic: Approximation bounds for artificial neural networks.
  • Technical University of Budapest, Department of Electrical Engineering, July 2, 1991. Topic: Universal approximation bounds for superpositions of a sigmoidal function.
  • Mathematical Sciences Research Institute, Berkeley, California, September 25, 1991. Topic: Empirical process bounds for artificial neural networks.
  • Stanford University, Department of Statistics, October 15, 1991. Topic: Approximation and estimation bounds for artificial neural networks.
  • University of California, Santa Cruz, Department of Computer and Information Sciences, October 17, 1991. Topic: Computationally efficient approximation and estimation of functions using artificial neural networks.
  • Yale University, Department of Statistics, January 13, 1992. Topic: Neural network estimation.
  • University of Virginia, Department of Electrical Engineering, Eminent Speaker Series, February 21, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • North Carolina State University, Department of Statistics, February 29, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • Cornell University, Center for Applied Mathematics, March 6, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • University of North Carolina, Department of Statistics, March 30, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • University of Joensuu, Finland, Department of Statistics, April 9, 1992. Topic: Introduction to artificial neural networks.
  • University of Paris VI, Department of Statistics, April 15, 1992, and University of Paris, Orsay, Department of Statistics, April 16, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • University of Paris VI, Department of Statistics, April 22, 1992, and University of Paris, Orsay, Department of Statistics, April 23, 1992. Topic: Performance bounds for complexity-based model selection.
  • Princeton University, Department of Electrical Engineering, May 14, 1992. Topic: Overview of approximation results for sigmoidal networks.
  • University of Massachusetts at Lowell, Joint Seminar, Department of Mathematics and Department of Electrical Engineering, October 21, 1992. Topic: Statistical accuracy of neural nets.
  • University of Tokyo, Japan, Department of Information, Physics and Engineering, March 1993. Topic: Information theory and model selection.
  • University of Paris VI, Department of Statistics, May 1993. Topic: Optimal rate properties of minimum complexity estimation.
  • University of Paris VI, Department of Statistics, May 1993. Topic: Information-theoretic proof of martingale convergence.
  • University of Pennsylvania, Wharton School, October 21, 1993. Topic: Neural networks and statistics.
  • Massachusetts Institute of Technology, Center for Biological and Computational Learning, October 27, 1993. Topic: Neural networks and statistics.
  • Rutgers University, Department of Statistics, October 5, 1994. Topic: Statistical accuracy of neural nets.
  • University of South Carolina, Department of Mathematics, Spring 1995. Topic: Neural net approximation.
  • Carnegie Mellon University, Department of Statistics, Fall 1995. Topic: Information risk and superefficiency.
  • Massachusetts Institute of Technology, Department of Applied Mathematics, March 1996. Topic: Consistent and uniformly consistent classification.
  • Columbia University, Department of Statistics, Fall 1996. Topic: Consistency of posterior distributions in nonparametric problems.
  • Northeastern University, Joint Mathematics Colloquium with MIT, Harvard, and Brandeis, February 27, 1997. Topic: Information theory in probability and statistics.
  • Iowa State University, Department of Statistics, March 28, 1997. Topic: Information theory in probability and statistics: The fundamental role of Kullback divergence.
  • Washington University, St. Louis, Department of Electrical Engineering, Center for Imaging Systems, April 16, 1997. Topic: Universal data compression, prediction, and gambling.
  • Massachusetts Institute of Technology, LIDS Colloquium, May 5, 1998. Topic: Simple universal portfolio selection.
  • University of California, Santa Cruz, Baskin Center for Computer Engineering, October 1998. Topic: Approximation bounds for Gaussian mixtures.
  • Lucent, Bell Laboratories, Murray Hill, New Jersey, March 1999. Topic: Approximation and estimation bounds for mixture density estimation.
  • Stanford University, Department of Statistics, Probability Seminar, May 24, 1999. Topic: Information, martingales, Markov chains, convex projections, and the CLT.
  • Stanford University, Department of Statistics, Statistics Seminar, May 25, 1999. Topic: Mixture density estimation.
  • Rice University, Departments of Statistics and Electrical Engineering, November 4, 2000. Topic: Information theory and statistics -- best invariant predictive density estimators.
  • University of Chicago, Department of Statistics, November 21, 2000. Topic: Information theory and statistics -- best invariant predictive density estimators.
  • Yale University, Department of Computer Science, Alan J. Perlis Seminar, April 26, 2001. Topic: Neural nets, Gaussian mixtures, and statistical information theory.
  • Brown University, Department of Applied Mathematics, May 9, 2001. Topic: I do not recall.
  • University of Massachusetts at Lowell, Department of Mathematics, September 19, 2001. Topic: Mixture density estimation.
  • University of California at Los Angeles, Department of Statistics, May 21, 2002. Topic: Nonlinear approximation, estimation, and neural nets (I do not recall the specific title).
  • Columbia University, Department of Statistics, October 28, 2002. Topic: Information inequalities in probability and statistics.
  • University of Georgia, Department of Statistics, November 26, 2002. Topic: Information inequalities in probability and statistics.
  • University of Pennsylvania, Wharton School, October 29, 2003. Topic: Portfolio estimation for compounding wealth.
  • University of North Carolina (in conjunction with Duke University), Departments of Statistics, November 3, 2004. Topic: Risk assessment and advantages of model mixing for regression.
  • University of South Carolina, Department of Mathematics, April 7, 2005. IMI Distinguished Lecture. Topic: Statistical theory for nonlinear function approximation: neural nets, mixture models, and adaptive kernel machines.
  • Helsinki University and Helsinki Institute of Information Technology, Helsinki, Finland. August 22-25, 2005. Two talks: (1) Statistical foundations and analysis of the minimum description length principle. (2) Consequences of MDL for neural nets and Gaussian mixtures.
  • Princeton University, Department of Operations Research and Financial Engineering, October 4, 2005. Topic: Statistical perspectives on growth rate optimal portfolio estimation.
  • IBM Research Laboratories, Yorktown Heights, September 22, 2006. Topic: Generalized entropy power inequalities and the central limit theorem.
  • Purdue University, Department of Computer Science, February 26, 2007. Prestige Lecture Series on the Science of Information. Topic: The interplay of information theory, probability, and statistics.
  • University of Illinois, Joint Seminar, Department of Statistics and Department of Electrical and Computer Engineering, February 27, 2007. Prestige Conference Series. Topic: The interplay of information theory and probability.
  • Boston University, Department of Statistics, March 1, 2007. Prestige Conference Series. Topic: Information inequalities and the central limit theorem.
  • University of California at Berkeley, Department of Computer Science, October 4, 2007. Topic: Fast and accurate greedy algorithm for L1 penalized least squares. Primarily presented by Cong Huang.
  • Rutgers University, Department of Statistics, December 12, 2007. Topic: Fast and accurate L1 penalized least squares. Co-presented with Cong Huang.

Ph.D. Dissertations Supervised:

  1. Bertrand S. Clarke (1989). Asymptotic Cumulative Risk and Bayes Risk under Entropy Loss, with Applications. University of Illinois at Urbana-Champaign. [Was Assistant Professor at Purdue University; then Professor at University of British Columbia and University of Miami; Now Chairman, Department of Statistics, University of Nebraska]

  2. Chyong-Hwa Sheu (1989). Density Estimation with Kullback-Leibler Loss. University of Illinois at Urbana-Champaign.

  3. Yuhong Yang (1996). Minimax Optimal Density Estimation. Yale University. [Was Assistant Professor at Iowa State University; Now Professor at University of Minnesota, Department of Statistics.]

  4. Qun (Trent) Xie (1997). Minimax Coding and Prediction. Yale University. [Was at GE Capital, Inc., Fairfield, CT. Then an Assistant Professor at Tsinghua Univ.]

  5. Gerald Cheang (1998). Neural Net Approximation and Estimation of Functions. Yale University. [Now at Singapore Technical and Education University.]
  6. Qiang (Jonathan) Li (1999). Estimation of Mixture Models. Yale University. [Was at KPMG Financial Services, New York. Then at Stanford Research Institute, Palo Alto. Now at Radar Networks, Inc., San Francisco.]

  7. Feng Liang (2002). Exact Minimax Predictive Density Estimation. Yale University. [Was Assistant Professor, Department of Statistics, Duke University. Now Associate Professor, Department of Statistics, University of Illinois at Urbana-Champaign.]
  8. Wei Qiu (2007). Maximum Wealth Portfolios. Yale University. [J.P. Morgan Chase, Columbus, Ohio.]

  9. Cong Huang (2008). Risk of Penalized Least Squares, Greedy Selection and L1-Penalization for Flexible Function Libraries. Yale University. [Was at Columbia University, Department of Statistics. Now in China.]

  10. Xi Luo (Rossi) (2009). Penalized Likelihoods: Fast Algorithms and Risk Bounds. Yale University. [Was at University of Pennsylvania, postdoc with Tony Cai, Department of Statistics. Now Assistant Professor at Brown University, Department of Biostatistics.]

  11. Antony Joseph (2012). Achieving Information Theoretic Limits with High Dimensional Regression. Yale University. Co-developer at Yale of capacity-achieving sparse superposition codes. [Was at University of California, Berkeley, postdoc with Bin Yu, Department of Statistics. Now at Walmart Research.]

  12. Sanghee Cho (2014). High-Dimensional Regression with Random Design, including Sparse Superposition Codes. Yale University. [At GE Research, Schenectady, New York]

  13. Sabyasachi Chatterjee (2014). Adaptation in Estimation and Annealing. Yale University. [Postdoc with John Lafferty, Department of Statistics, University of Chicago.]

  14. Xiao (Grace) Yang (2015). Compression and Predictive Distributions for Large Alphabets. Yale University. [Now at Apple Inc., Cupertino, California.]

  15. Cynthia Rush (May 2016). Iterative Algorithms for Inference and Optimization, with Applications in Communications and Compressed Sensing. Yale University. [Now at the Department of Statistics, Columbia University.]