Andrew R. Barron
Yale University
Professor of Statistics (100% Appointment)
Professor of Electrical Engineering (Courtesy Appointment)
Co-Director of Graduate Studies in Statistics and Data Science

Address: Department of Statistics and Data Science, 24 Hillhouse Avenue, New Haven, CT 06511
Phone: +1-203-997-5229, 203-432-0666
Fax: 203-432-0633
E-Mail: Andrew.Barron@yale.edu

Education:

  • Ph.D., Electrical Engineering, Stanford University, 1985.
  • M.S., Electrical Engineering, Stanford University, 1982.
  • B.S. (Magna Cum Laude), E.E. and Math Science, Rice University, 1981.
  • W. T. Woodson H.S., Fairfax, Virginia, 1977.

Experience:

  • 2016 - present   Professor, Department of Statistics and Data Science, Yale University
  • 1992 - 2016      Professor, Department of Statistics, Yale University
  • 1990 - 1992      Associate Professor of Statistics and Electrical & Computer Engineering, University of Illinois
  • 1984 - 1998      Consultant, Barron Associates, Inc., Stanardsville, Virginia
  • 1992 Spring      Visiting Research Scholar, Barron Associates, Inc., Stanardsville, Virginia
  • 1991 Fall        Visiting Scholar, Mathematical Sciences Research Institute, Berkeley, California
  • 1985 - 1990      Assistant Professor of Statistics and Electrical & Computer Engineering, University of Illinois
  • 1982 - 1985      Research Assistant, Stanford University
  • 1981 - 1983      Consultant, Adaptronics, Inc., McLean, Virginia
  • 1977 - 1980      Engineer, Adaptronics, Inc., McLean, Virginia (Summers)

Honors:

  • Fellow of the IEEE, for contributions to information theory and statistics.
  • Best paper prize for all IEEE journals in 1990-1991, Browder J. Thompson Memorial Prize,
    for authors of age 30 or under at time of submission.
  • IMS Medallion Award Winner. Presented at 2005 Joint ASA-IMS Annual Meetings, Minneapolis, MN.
  • Best paper prize, National Aerospace Electronics Conference, 1990.
  • Finalist for the best paper prize, Information Theory Society, IEEE, 1987.
  • Nominated for the Marconi Young Scientist Award, 1990.
  • Board of Governors, IEEE Information Theory Society, Four terms, 1995-1997, 1998-2000, 2013-2016, 2017-2019.
  • Appointed Secretary, Board of Governors, IEEE Information Theory Society, 1989-1990.
  • Keynote speaker at several conferences.
  • Chairman, AMS Summer Research Conference, Adaptive Selection of Models and Procedures, 1996.
  • Program Committee, IMS-ASA Joint Statistical Meetings, 1991.
  • Program Committees, IEEE International Symposium on Information Theory, Numerous times.
  • Program Committees, IEEE Workshop on Information Theory, 1989, 2008.
  • Program Committee, World Congress on Neural Networks, 1995.
  • Program Committee, Neural Information Processing Systems: Natural and Synthetic, 1995.
  • Program Committee, ACM Workshop on Computational Learning Theory, 1991, 1997.
  • James Waters Creativity Award for best undergraduate research at Rice, 1981.
  • Houston Telephone Engineers scholarship, top student in communication theory, 1981.
  • Top Award for leadership, scholarship and service, Woodson High School, Fairfax, VA, 1977.

Research Interests:

  • Entropy Power Inequalities and Central Limit Theorems.
  • Capacity-achieving Sparse Superposition Codes for the Gaussian Channel. Communication by Regression.
  • Entropy Rates, Likelihood Stabilization, and Inference for Dependent Processes.
  • Foundations of Minimum Description Length Principle of Inference and Universal Data Compression.
  • Statistical Risk Analysis for Penalized Criteria for Model Selection.
  • Statistical Risk Analysis for Bayes Procedures.
  • Statistical Perspectives and Analysis of Artificial Neural Networks.
  • Nonlinear Approximation and Estimation for High-dimensional Libraries of Functions.
  • Greedy Algorithms for Subset Selection, Mixture Density Estimation, and L1 Penalty Optimization.
  • Maximum Wealth Stock Indices and Growth Rate Optimal Portfolio Estimation.

PUBLICATIONS: Papers available electronically on www.stat.yale.edu/~arb4

Ph.D. Dissertation:

  1. A. R. Barron (1985). Logically smooth density estimation. Stanford Univ., Stanford, CA.

Monograph:

  1. R. Venkataramanan, S. Tatikonda and A. R. Barron (2019). Sparse Regression Codes. Foundations and Trends in Communications and Information Theory. Vol.15, No.1-2, pp.1-195.

Journal Publications:

  1. D. Cleveland, A. R. Barron, A. N. Mucciardi (1980). Methods for determining the depth of near-surface defects. Journal of Nondestructive Evaluation, Vol.1, pp.21-36.
  2. A. R. Barron (1985). The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. Annals of Probability, Vol.13, pp.1292-1303. (Finalist for the best paper prize by the IEEE Information Theory Society.)
  3. A. R. Barron (1986). Entropy and the central limit theorem. Annals of Probability, Vol.14, pp.336-342. (Finalist for the best paper prize by the IEEE Information Theory Society.)
  4. A. R. Barron (1986). Discussion on Diaconis and Freedman: the consistency of Bayes estimates. Annals of Statistics, Vol.14, pp.26-30.
  5. A. R. Barron and T. M. Cover (1988). A bound on the financial value of information. IEEE Transactions on Information Theory, Vol.34, pp.1097-1100.
  6. A. R. Barron (1989). Uniformly powerful goodness of fit tests. Annals of Statistics, Vol.17, pp.107-124.
  7. B. Clarke and A. R. Barron (1990). Information-theoretic asymptotics of Bayes methods. IEEE Transactions on Information Theory, Vol.IT-36, pp.453-471. (Winner of the 1992 Browder J. Thompson Memorial Prize for the best paper in all IEEE journals by authors of age 30 or under at time of submission.)
  8. A. R. Barron and X. Xiao (1991). Discussion on Friedman's multivariate adaptive regression. Annals of Statistics, Vol.19, pp.67-82.
  9. A. R. Barron and C. Sheu (1991). Approximation of density functions by sequences of exponential families. Annals of Statistics, Vol.19, pp.1347-1369.
  10. A. R. Barron and T. M. Cover (1991). Minimum complexity density estimation. IEEE Transactions on Information Theory, Vol.IT-37, pp.1034-1054.
  11. A. R. Barron, L. Gyorfi, and E. C. van der Meulen (1992). Distribution estimation consistent in total variation and in two types of information divergence. IEEE Transactions on Information Theory, Vol.IT-38, pp.1437-1454.
  12. A. R. Barron (1993). Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory, Vol.IT-39, pp.930-944.
  13. A. R. Barron (1994). Approximation and estimation bounds for artificial neural networks. Machine Learning, Vol.14, pp.113-143.
  14. A. R. Barron (1994). Comment on Cheng and Titterington: Neural Networks, A Review from a Statistical Perspective. Statistical Science, Vol.9, no.1, pp.33-35.
  15. B. Clarke and A. R. Barron (1994). Jeffreys' prior is asymptotically least favorable under entropy risk. Journal of Statistical Planning and Inference, Vol.41, pp.37-60.
  16. Q. Xie and A. R. Barron (1997). Minimax redundancy for the class of memoryless sources. IEEE Transactions on Information Theory, Vol.43, pp.646-657.
  17. Y. Yang and A. R. Barron (1998). An asymptotic property of model selection criteria. IEEE Transactions on Information Theory, Vol.44, pp.117-133.
  18. A. R. Barron, J. Rissanen and B. Yu (1998). The minimum description length principle in coding and modeling. (Invited paper; special issue in honor of 50 years since Claude Shannon's seminal work.) IEEE Transactions on Information Theory, Vol.44, pp.2734-2760.
  19. A. R. Barron and N. Hengartner (1998). Information theory and superefficiency. Annals of Statistics, Vol.26, pp.1800-1825.
  20. A. R. Barron, L. Birge and P. Massart (1999). Risk bounds for model selection by penalization. Probability Theory and Related Fields, Vol.113, pp.301-413.
  21. A. R. Barron, M. Schervish and L. Wasserman (1999). The consistency of posterior distributions in nonparametric problems. Annals of Statistics, Vol.27, pp.536-561.
  22. Y. Yang and A. R. Barron (1999). Information-theoretic determination of minimax rates of convergence. Annals of Statistics, Vol.27, pp.1564-1599.
  23. Q. Xie and A. R. Barron (2000). Asymptotic minimax regret for data compression, gambling, and prediction. IEEE Transactions on Information Theory, Vol.46, pp.431-445.
  24. G. Cheang and A. R. Barron (2000). A Better Approximation for Balls. Journal of Approximation Theory, Vol.104, pp.183-203.
  25. J. E. Cross and A. R. Barron (2003). Efficient Universal Portfolios for Past Dependent Target Classes. Mathematical Finance, Vol.13, no.2, pp.245-276.
  26. O. Johnson and A. R. Barron (2004). Fisher Information Inequalities and the Central Limit Theorem. Probability Theory and Related Fields, Vol.129, pp.391-409.
  27. F. Liang and A. R. Barron (2004). Exact Minimax Strategies for Predictive Density Estimation, Data Compression, and Model Selection. IEEE Transactions on Information Theory, Vol.50, pp.2708-2726.
  28. G. Leung and A. R. Barron (2006). Information theory and mixing least-squares regressions. IEEE Transactions on Information Theory, Vol.52, no.8, pp.3396-3410.
  29. M. Madiman and A. R. Barron (2007). Generalized entropy power inequalities and monotonicity properties of information. IEEE Transactions on Information Theory, Vol.53, no.7, pp.2317-2329.
  30. A. R. Barron, A. Cohen, W. Dahmen and R. DeVore (2008). Approximation and learning by greedy algorithms. Annals of Statistics, Vol.36, pp.64-94.
  31. J. Takeuchi, T. Kawabata and A. R. Barron (2013). Properties of Jeffreys Mixture for Markov Sources. IEEE Transactions on Information Theory, Vol.59, no.1, January, pp.438-457.
  32. A. Joseph and A. R. Barron (2012). Least Squares Superposition Codes of Moderate Dictionary Size Are Reliable at Rates up to Capacity. IEEE Transactions on Information Theory, Vol.58, no.5, May, pp.2541-2557.
  33. A. Joseph and A. R. Barron (2014). Fast Sparse Superposition Codes have Exponentially Small Error Probability for R < C. IEEE Transactions on Information Theory, Vol.60, no.2, February, pp.919-942.
  34. A. M. Kagan, Tinghui Yu, A. R. Barron and M. Madiman (2014). Contributions to the theory of Pitman estimators. Journal of Mathematical Sciences, Vol.199, no.2, pp.202-214.
  35. L. Jakobek, M. Boc and A. R. Barron (2015). Optimization of Ultrasonic-Assisted Extraction of Phenolic Compounds from Apples. Food Analytical Methods, Vol.8, pp.2612-2625.
  36. L. Jakobek and A. R. Barron (2016). Ancient Apple Varieties from Croatia as a Source of Bioactive Polyphenolic Compounds. Journal of Food Composition and Analysis, Vol.45, pp.9-15.
  37. X. Yang and A. R. Barron (2017). Minimax Compression and Large Alphabet Approximation through Poissonization and Tilting. IEEE Transactions on Information Theory, Vol.63.
  38. A. R. Barron, M. Bensic and K. Sabo (2018). A Note on Weighted Least Square Distribution Fitting and Full Standardization of the Empirical Distribution Function. Test, Vol.27, no.4, pp.946-967.
  39. J. M. Klusowski and A. R. Barron (2016). Risk Bounds for High-dimensional Ridge Function Combinations Including Neural Networks. Submitted. ArXiv:1607.01434v4.
  40. J. M. Klusowski and A. R. Barron (2018). Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with l_1 and l_0 Controls. IEEE Transactions on Information Theory, Vol.64, no.12, pp.7649-7656.
  41. A. R. Barron and J. M. Klusowski (2018). Approximation and Estimation for High-Dimensional Deep Learning Networks. In revision for the IEEE Transactions on Information Theory. ArXiv:1809.03090v2, 18 Sept. 2018.
  42. A. R. Barron and J. M. Klusowski (2019). Complexity, Statistical Risk and Metric Entropy of Deep Nets Using Total Path Variation. Being incorporated into the revision of the preceding paper for the IEEE Transactions on Information Theory. ArXiv:1902.00800v2, 8 Feb. 2019.

Book Chapters:

  1. A. R. Barron (1984). Predicted squared error: a criterion for automatic model selection. Chapter 4 in Self-Organizing Methods in Modeling, S. J. Farlow (Editor), Marcel Dekker, New York, pp.87-103.
  2. R. L. Barron, A. N. Mucciardi, F. J. Cook, J. N. Craig, and A. R. Barron (1984). Adaptive learning networks. Chapter 2 in Self-Organizing Methods in Modeling, S. J. Farlow (Editor), Marcel Dekker, New York, pp.25-65.
  3. A. R. Barron (1987). Are Bayes rules consistent in information? In Open Problems in Communication and Computation, T. M. Cover and B. Gopinath (Editors), Springer-Verlag, New York, pp.85-91.
  4. A. R. Barron (1991). Complexity regularization with application to artificial neural networks. In Nonparametric Functional Estimation and Related Topics, G. Roussas (Editor), Kluwer Academic Publishers, Boston, MA and Dordrecht, The Netherlands, pp.561-576.
  5. A. R. Barron (1998). Information-theoretic Characterization of Bayes Performance and the Choice of Priors in Parametric and Nonparametric Problems. In Bayesian Statistics 6, J. M. Bernardo, J. O. Berger, A. P. Dawid and A. F. M. Smith (Editors). Oxford University Press, pp.27-52.
  6. J. Q. Li and A. R. Barron (2000). Mixture Density Estimation. In Advances in Neural Information Processing Systems, Vol.12, S. A. Solla, T. K. Leen and K.-R. Mueller (Editors). MIT Press, Cambridge, Massachusetts, pp.279-285.
  7. F. Liang and A. Barron (2005). Exact minimax predictive density estimation and MDL. [Plus the table with MDL parameter estimates.] In Advances in Minimum Description Length: Theory and Applications, P. D. Grunwald, I. J. Myung and M. A. Pitt (Editors). MIT Press, Cambridge, Massachusetts, pp.177-193.
  8. A. R. Barron, C. Huang, J. Q. Li and Xi Luo (2008). MDL Principle, Penalized Likelihood, and Statistical Risk. In Festschrift for Jorma Rissanen, P. Grunwald, P. Myllymaki, I. Tabus, M. Weinberger and B. Yu (Editors). Tampere International Center for Signal Processing, TICSP Series #38, Tampere University of Technology, Tampere, Finland.

Publications in Conference Proceedings: (4 to 27 pages)

  1. A. R. Barron, F. W. van Straten, and R. L. Barron (1977). Adaptive learning network approach to weather forecasting: a summary. Proceedings of the IEEE International Conference on Cybernetics and Society, Washington, DC, September 19-21. Published by IEEE, New York, pp.724-727.
  2. A. R. Barron and R. L. Barron (1988). Statistical learning networks: a unifying view. In Computing Science and Statistics: Proceedings of the 20th Symposium on the Interface, Reston, Virginia, April 20-23. E. Wegman (Editor), Published by the American Statistical Association, Alexandria, Virginia, pp.192-203. (Invited presentation).
  3. A. R. Barron (1989). Statistical properties of artificial neural networks. Proceedings of the IEEE International Conference on Decision and Control, Tampa, Florida, Dec.13-15. pp.280-285, vol.1. (Invited presentation).
  4. R. L. Barron, R. L. Cellucci, P. R. Jordan, N. E. Beam, P. Hess, and A. R. Barron (1990). Applications of polynomial neural networks to fault detection, isolation, and estimation (FDIE) and reconfigurable flight control. Proceedings of the National Aerospace Electronics Conference, Dayton, Ohio, May 23-25, pp.507-519, vol.2 (Winner of the best paper prize, 1990 NAECON). Republished in Proceedings 1998 NAECON, pp. 348-360.
  5. A. R. Barron (1991). Approximation and estimation bounds for artificial neural networks. In Computational Learning Theory: Proceedings of the Fourth Annual ACM Workshop, Santa Cruz, CA, August 5-7. L. Valiant, Ed., Morgan Kaufmann Publishers, Inc., San Mateo, California, pp.243-249. (Honored as one of the four papers that appeared by invitation in expanded form in the special issue of Machine Learning, representing the top presentations at the workshop.)
  6. A. R. Barron (1992). Neural Net Approximation. Proceedings of the 7th Yale Workshop on Adaptive and Learning Systems, May 20-22, K. S. Narendra (Editor), Center for Systems Science, Yale University, pp. 69-72.
  7. D. Haussler and A. R. Barron (1993). How well do Bayes methods work for on-line prediction of + or -1 values? Computational Learning and Cognition: Proc. Third NEC Research Symposium, SIAM, Philadelphia, pp.74-101.
  8. J. Takeuchi and A. R. Barron (1997). Asymptotically minimax regret for exponential families. 20th Symposium on Information Theory and Its Applications. pp.665-668.
  9. J. Takeuchi and A. R. Barron (1998). Robustly Minimax Codes for Universal Data Compression. 21st Symposium on Information Theory and Its Applications. Gifu, Japan, December 2-5.
  10. G.H.L. Cheang and A. R. Barron (1999). Estimation with Two Hidden Layer Neural Nets. Proceedings of the 1999 International Joint Conference on Neural Networks (IJCNN), pp.375-378, vol.1.
  11. G.H.L. Cheang and A. R. Barron (2001). Penalized Least Squares, Model Selection, Convex Hull Classes, and Neural Nets. Proceedings of the 9th European Symposium on Artificial Neural Networks. Bruges, Belgium, April 25-27, pp.371-376.
  12. A. R. Barron (2000). Limits of Information, Markov Chains and Projection. Proc. IEEE International Symposium on Information Theory. Sorrento, Italy, June 25-30.
  13. J. Takeuchi and A. R. Barron (2001). Properties of Jeffreys mixture for Markov sources. Proc. Workshop on Information Based Induction Sciences (IBIS), pp. 327-333.
  14. G. Leung and A. R. Barron (2005). Combining Least-squares Regressions: An Upper Bound on Mean Squared Error. Proc. International Symposium on Information Theory, September 4-9, pp. 1711-1715.
  15. M. Madiman and A. R. Barron (2006). The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities. Proc. IEEE International Symposium on Information Theory. Seattle, Washington, July 2006. pp. 1021-1025.
  16. J. Takeuchi, A. R. Barron, T. Kawabata (2006). Statistical curvature and stochastic complexity. Proceedings of the 2nd Symposium on Information Geometry and its Applications, Tokyo, Japan, December 12-16, pp. 29-36.
  17. A. R. Barron and Xi Luo (2007). Adaptive Annealing. Proceedings 45th Annual Allerton Conference on Communication, Control, and Computing. Allerton House, UIUC, Illinois. September 26-28. pp. 665-673.
  18. A. R. Barron, C. Huang, J. Q. Li, and Xi Luo (2008). MDL, Penalized Likelihood and Statistical Risk. IEEE Information Theory Workshop. Porto, Portugal, May 4-9. pp. 247-257.
  19. A. R. Barron and Xi Luo (2008). MDL procedures with l_1 penalty and their statistical risk. First Workshop on Information Theoretic Methods in Science and Engineering. Tampere, Finland, August 18-20, 2008.
  20. M. Madiman, A. R. Barron, A. Kagan, T. Yu (2009). A Model for Pricing Data Bundles by Minimax Risks for Estimation of a Location Parameter. Proceedings of the IEEE Workshop on Information Theory. Volos, Greece, June 10-12. pp. 106-109.
  21. A. R. Barron, A. Joseph (2010). Least Squares Superposition Codes of Moderate Dictionary Size, Reliable at Rates up to Capacity. Proc. IEEE International Symposium on Information Theory. Austin, Texas, June 13-18. pp. 275-279.
  22. A. R. Barron, A. Joseph (2010). Towards fast reliable communication at rates near capacity with Gaussian noise. Proc. IEEE International Symposium on Information Theory. Austin, Texas, June 13-18. pp. 315-319.
  23. A. R. Barron, A. Joseph (2011). Analysis of fast sparse superposition codes. Proc. IEEE International Symposium on Information Theory. St Petersburg, Russia, August 1-6. pp. 1772-1776.
  24. E. Abbe, A. R. Barron (2011). Polar coding schemes for the AWGN channel. Proc. IEEE International Symposium on Information Theory. St Petersburg, Russia, August 1-6, 2011. pp. 194-198.
  25. A. R. Barron and Sanghee Cho (2012). High-rate sparse superposition codes with iteratively optimal estimates. Proc. IEEE International Symposium on Information Theory. Cambridge, MA, July 2012, pp. 120-124.
  26. Cynthia Rush and A. R. Barron (2013). Using the method of nearby measures in superposition coding with a Bernoulli dictionary. 6th Workshop on Information Theory Methods in Science and Engineering (WITMSE). Tokyo, Japan, August 2013.
  27. Xiao Yang and A. R. Barron (2013). Large alphabet coding and prediction through Poissonization and tilting. Proc. 6th Workshop on Information Theoretic Methods in Science and Engineering (WITMSE). Tokyo, Japan, August 2013. pp. 68-74.
  28. Xiao Yang and A. R. Barron (2014). Compression and prediction for large alphabet iid and Markov models. Proc. 7th Workshop on Information Theoretic Methods in Science and Engineering (WITMSE). Honolulu, July 5-8. pp. 31-34.
  29. Jun'ichi Takeuchi and A. R. Barron (2013). Asymptotically minimax regret by Bayes mixtures for non-exponential families. Proc. IEEE Information Theory Workshop. pp. 1-5.
  30. Jun'ichi Takeuchi and A. R. Barron (2014). Asymptotically Minimax Regret for Models with Hidden Variables. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014, pp.3037-3041.
  31. Jun'ichi Takeuchi and A. R. Barron (2014). Stochastic Complexity for Tree Models. Proc. IEEE Information Theory Workshop. November 2-4, 2014, pp.222-226.
  32. Sabyasachi Chatterjee and A. R. Barron (2014). Information Theoretic Validity of Penalized Likelihood. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014, pp. 3027-3031.
  33. Grace Xiao Yang and A. R. Barron (2014). Compression and Predictive Distributions for Large Alphabet i.i.d. and Markov Models. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014, pp. 2504-2508.
  34. A. Barron, Teemu Roos, Kazuho Watanabe (2014). Bayesian Properties of Normalized Maximum Likelihood and its Fast Computation. Proc. IEEE International Symposium on Information Theory. Honolulu, HI, June 2014. pp.1667-1671.
  35. J. M. Klusowski and A. R. Barron (2017). Minimax Lower Bounds for Ridge Combinations Including Neural Nets. Proc. IEEE International Symposium on Information Theory. Aachen, Germany, July 2017. pp. 1376-1380.
  36. K. Miyamoto, A. R. Barron and J. Takeuchi (2019). Improved MDL Estimators Using Local Exponential Family Bundles Applied to Mixture Families. Proc. IEEE International Symposium on Information Theory. Paris, France, July 2019. pp.1442-1446.

Newsletter Article:

  1. R. Venkataramanan, S. Tatikonda, A. Barron (2016). Sparse Regression Codes. IEEE Information Theory Society Newsletter. December 2016, pp. 7-15. [Based on ISIT Tutorial by R. Venkataramanan and A. Barron, Barcelona, July 2016.]

Patents:

Technical Reports: (with details not in subsequent publications)

Other Conference Presentations: (proceedings containing not more than 1 page abstracts). Links to recent presentation files available at www.stat.yale.edu/~arb4

  • A. R. Barron (1983). Convergence of logically simple estimates of unknown probability densities. IEEE International Symposium on Information Theory, Saint Jovite, Canada, September 26-30.
  • A. R. Barron (1985). Entropy and the central limit theorem. IEEE International Symposium on Information Theory, Brighton, England, June 23-28.
  • A. R. Barron (1985). Ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. IEEE International Symposium on Information Theory, Brighton, England, June 23-28.
  • A. R. Barron (1985). Logically smooth density estimation. Joint IMS, ASA Annual Meeting, Las Vegas, Nevada, August 5-8.
  • A. R. Barron (1987). Applications of large deviations in statistics. Conference on Asymptotic Methods for Stochastic Systems: Large Deviations Theory and Practice, University of Maryland, October 25-27. (Invited Presentation).
  • A. R. Barron (1988). The convergence in information of probability density estimators. IEEE International Symposium on Information Theory, Kobe, Japan, June 19-24.
  • T. M. Cover and A. R. Barron (1988). A bound on the financial value of information. IEEE International Symposium on Information Theory, Kobe, Japan, June 19-24.
  • A. R. Barron (1989). Minimum complexity density estimation. IMS Regional Meeting, Lexington, Kentucky, March 19-22. (Invited Presentation).
  • A. R. Barron (1989). Portfolio selection based on nonparametric density estimates for the stock market. 21st Symposium on the Interface: Computing Science and Statistics, Orlando, Florida, April 9-12. (Invited Presentation).
  • A. R. Barron (1989). Minimum complexity estimation. IEEE Workshop on Information Theory, Center for Applied Math, Cornell University, June 26-30. (Session Organizer).
  • A. R. Barron (1989). Some statistical properties of polynomial networks and other artificial neural networks. Conference on Neural Information Processing Systems, Denver, Colorado, November 27-30. (Invited Plenary Presentation).
  • A. R. Barron (1990). An index of resolvability of probability density estimators. IEEE International Symposium on Information Theory, San Diego, California, January 14-19.
  • A. R. Barron (1990). Some statistical convergence properties of artificial neural networks. IEEE International Symposium on Information Theory, San Diego, California, January 14-19.
  • A. R. Barron (1990). The index of resolvability: statistical convergence of minimum complexity estimation. AAAI Symposium on Minimal-Length Encoding, Stanford California, March 27-29 (Invited Presentation).
  • A. R. Barron (1990). Some approximation and estimation theorems for artificial neural networks. IEEE Information Theory Workshop, Veldhoven, The Netherlands, June 10-15 (Invited Presentation).
  • A. R. Barron (1990). Statistical properties of artificial neural networks. SIAM Annual Meeting, Chicago, Illinois, July 20 (Invited Presentation).
  • A. R. Barron (1990). Complexity Regularization. NATO Advanced Study Institute on Nonparametric Functional Estimation and Related Topics, Spetses, Greece, July 29 - August 11 (Invited Presentation).
  • A. R. Barron (1991). Information theory and the stock market: the effect of side information. Workshop on Coordination of Distributed Information and Decisions, Cornell University, April 11-13. (Invited Presentation).
  • A. R. Barron (1991). Approximation and estimation results for adaptively synthesized neural network architectures. Workshop on Theoretical Issues in Neural Nets, Center for Discrete Mathematics and Theoretical Computer Science, Rutgers University, May 20-23. (Invited Presentation).
  • A. R. Barron and R. L. Barron (1991). Artificial neural networks in industry: some developments using statistical techniques. IMS Special Topics Meeting on Statistics in Industry, Philadelphia, Pennsylvania, June 9-12 (Invited Presentation).
  • A. R. Barron (1991). Universal approximation bounds for superpositions of a sigmoidal function. IEEE International Symposium on Information Theory, Budapest, Hungary, June 23-29.
  • A. R. Barron (1991). Approximation and estimation bounds for sigmoidal and polynomial networks. Joint Statistical Meetings, Atlanta, Georgia, August 19-22. (Session Organizer).
  • A. R. Barron (1991). Approximation and estimation bounds for neural networks and projection pursuit. Workshop on Neural Information Processing Systems, Vail, Colorado, December 10-11. (Invited Presentation).
  • A. R. Barron (1991). Risk estimation, risk optimality, regularization, and neural nets. Workshop on Neural Information Processing Systems, Vail, Colorado, December 10-11.
  • A. R. Barron (1992). Approximation, estimation, and computation results for artificial neural networks. 24th Symposium on the Interface: Computing Science and Statistics, College Station, Texas, March 19-21. (Invited Presentation).
  • A. R. Barron (1992). Artificial neural networks: stochastic analysis and engineering applications? Symposium on Stochastic Processes in Engineering Applications, Otaniemi, Finland, April 7-9. (Invited Plenary Presentation).
  • A. R. Barron, D. Olive, and Y. Yang (1992). Asymptotically optimal complexity-based model selection. IMS Regional Meeting, Corvallis, Oregon, June 14-15. (Invited Presentation).
  • A. R. Barron (1992). Statistical accuracy of neural nets. Conference on Neural Information Processing Systems: Natural and Synthetic. Denver, Colorado, November 30 - December 2. (Invited Tutorial).
  • A. R. Barron (1993). Neural net approximation. Annual Meeting of the American Math Society, San Antonio, Texas, January 13-16. (Invited Presentation).
  • A. R. Barron, L. Gyorfi, and E. C. van der Meulen (1993). Universal coding of non-discrete sources based on distribution estimation consistent in expected information divergence. IEEE International Symposium on Information Theory, San Antonio, Texas, January 17-22. p.51.
  • A. R. Barron, B. Clarke, and D. Haussler (1993). Information bounds for the risk of Bayesian predictions and the redundancy of universal codes. IEEE International Symposium on Information Theory, San Antonio, Texas, January 17-22.
  • A. R. Barron (1993). Statistical accuracy of neural nets. Workshop on Information and Geometry, Hakone, Japan. March. (Invited Presentation).
  • A. R. Barron (1993). Optimal rate properties of minimum complexity estimation. Workshop on Descriptional Complexity, Schloss Dagstuhl, Wadern, Germany. May. (Invited Presentation).
  • A. R. Barron (1993). Do neural nets avoid the curse of dimensionality? Congress on Statistics, Vannes, France, May. (Invited Presentation).
  • A. R. Barron (1993). Performance bounds for neural net estimation and classification. Annual Meeting of the Classification Society of North America, Pittsburgh, June. (Invited Presentation).
  • A. R. Barron (1993). Do neural nets avoid the curse of dimensionality? NATO ASI on Statistics and Neural Nets, Les Arcs, France, June. (Invited Presentation).
  • A. R. Barron (1994). The accuracy of Bayes estimates of neural nets. 26th Symposium on the Interface: Computing Science and Statistics, Research Triangle Park, NC, June 15-18.
  • A. R. Barron, Y. Yang and B. Yu (1994). Asymptotically optimal function estimation by minimum complexity criteria. IEEE International Symposium on Information Theory, Trondheim, Norway, June 27 - July 1. (One page proceeding and seven page original submission available).
  • A. R. Barron (1994). Neural net approximation and estimation. IEEE Workshop on Information Theory, Moscow, Russia, July 3-6. (Invited Presentation).
  • A. R. Barron (1994) Minimum complexity estimation. ML/COLT: Workshop on Applications of Descriptional Complexity to Induction, Statistical and Visual Inference, New Brunswick, New Jersey, July 8-9. (Invited Presentation.)
  • A. R. Barron (1995). Statistics and neural nets. New England Statistics Conference, Storrs, Connecticut, April 22. (Invited Plenary Presentation).
  • A. R. Barron, Y. Yang (1995). Information-theoretic development of minimax rates of convergence. IEEE International Symposium on Information Theory, Whistler, British Columbia, September 17-22. (Added to session at recommendation of program chair; not in proceedings).
  • N. Hengartner and A. R. Barron (1995). Information theory and superefficiency. IMS Regional Meeting, Stanford, California. (Invited Presentation).
  • B. S. Clarke and A. R. Barron (1995). Jeffreys' prior yields the asymptotic minimax redundancy. IEEE-IMS Workshop on Information Theory and Statistics, Alexandria, Virginia, October 27-29. (Invited Presentation).
  • A. R. Barron (1995). Asymptotically optimal model selection and neural nets. IEEE-IMS Workshop on Information Theory and Statistics, Alexandria, Virginia, October 27-29. (Invited Presentation).
  • Q. Xie and A. R. Barron (1996). Asymptotic minimax regret for data compression, gambling, and prediction. Workshop on sequence prediction, Santa Cruz, California, May 3-5. (Invited, jointly presented).
  • A. R. Barron (1996). The fundamental role of Kullback Information in large sample statistics. Solomon Kullback Memorial Conference, Washington, DC, May 23-24. (Invited Presentation).
  • A. R. Barron (1996). Adaptation and model selection. AMS-IMS-SIAM Summer Research Conference on adaptive selection of models and statistical procedures, Mount Holyoke, Massachusetts, June 22-28. (Conference Chairmen, A. Barron, P. Bickel, I. Johnstone, D. Donoho).
  • A. R. Barron and Y. Yang (1996). Information theory in nonparametric estimation. Nonparametric Estimation: The Road Ahead, Canberra, Australia, July 2-4. (Invited Presentation).
  • A. R. Barron (1996). Adaptive model selection and neural networks. Sydney International Statistics Congress, IMS regional meeting, Sydney, Australia, July 8-12. (Invited Presentation).
  • A. R. Barron (1996). Asymptotics of Bayes estimators. International Society of Bayesian Analysis, Regional Meeting, Chicago, Illinois, August 2-3. (Invited Presentation).
  • A. R. Barron and Y. Yang (1996). Adaptive model selection and the index of resolvability. Joint Statistical Meetings of the ASA and IMS, Chicago, Illinois, August 4-8. (Invited Presentation).
  • A. R. Barron and Q. Xie (1997). Asymptotic minimax regret for data compression, gambling and prediction. IEEE International Symposium on Information Theory, Ulm, Germany, June 29 - July 4.
  • A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. Computational Learning Theory: Tenth Annual ACM Workshop, Nashville, Tennessee, July 6-9. (Invited Plenary Presentation).
  • A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. International Conference on Combinatorics, Information Theory, and Statistics, University of Southern Maine, Portland, Maine, July 18-20. (Invited Presentation).
  • A. R. Barron and Y. Yang (1997). Information-theoretic determination of minimax rates of convergence. Symposium on Nonparametric Functional Estimation, Centre de Recherches Mathematiques, University of Montreal, October 16-18. (Invited Presentation).
  • A. R. Barron (1998). Nonlinear approximation, greedy algorithms, and neural networks. Ninth International Conference on Approximation Theory, January 4-6. (Invited Presentation).
  • A. R. Barron (1998). How information theory illuminates the behavior of risk functions of Bayes procedures. Purdue Workshop on the Interface between Paradigms of Statistics. June 17-19. (Invited Presentation).
  • A. R. Barron and J. Takeuchi (1998). Mixture models achieving optimal coding regret. IEEE Information Theory Workshop. Killarney, Ireland, June 22-26.
  • J. Takeuchi and A. R. Barron (1998). Asymptotic Minimax Regret by Bayes Mixtures. International Symposium on Information Theory. Cambridge, MA, August 16-21.
  • A. R. Barron (1998). Information theory in probability and statistics; Approximation and estimation bounds for Gaussian mixtures. CIRM Workshop on Information Theory, Statistics, and Image Analysis, Marseille, France, December 7-11. (Two Invited Presentations).
  • A. R. Barron (1999). Information theory in probability and statistics. IEEE Information Theory Workshop on Detection, Estimation, Classification, and Imaging, Santa Fe, New Mexico, February 24-26. (Invited Presentation).
  • A. R. Barron (1999). Decision theory of regret for universal coding, gambling, and prediction. DIMACS Workshop: Online Decision Making, Rutgers University, New Brunswick, NJ, July 12-15. (Invited Presentation).
  • A. R. Barron (2000). Limits of information, Markov chains and projection. IEEE International Symposium on Information Theory, Sorrento, Italy, June 26-30.
  • A. R. Barron, Laszlo Gyorfi, Michael Nussbaum (2000). Nonparametric Estimation, Neural Nets and Risk Asymptotics. Short Course. Mathematical Research Institute, Oberwolfach, Germany, June 10-17.
  • A. R. Barron (2000). Information theory in probability; Information theory in statistics. J. Bolyai Society Conference on Information Theory in Mathematics, honoring the 50th anniversary of the formation of what is now known as the Renyi Institute of Mathematics of the Hungarian Academy of Sciences. Balatonlelle, Hungary, July 3-7. (Two Invited Presentations).
  • A. R. Barron (2000). Prediction, data compression, gambling, and model selection: Do Bayes procedures nearly minimize the maximum of regret over all possible data sequences? AMS-IMS-SIAM Summer Research Conference on Bayes, Frequentist, and Likelihood Inference: a Synthesis. Mount Holyoke College, South Hadley, Massachusetts, July 9-13. (Invited Presentation.)
  • A. R. Barron (2001). Information-theoretic bounds for mixture modeling, model selection, and data compression. Workshop on Information Theory and Statistics, DIMACS, Rutgers University, March 2001. (Invited Presentation).
  • A. R. Barron (2001). Information theory in probability and statistics. 23rd European Meeting of Statistics, Funchal, Madeira, Portugal, August 13-18. (Invited Keynote Plenary Presentation).
  • F. Liang and A. R. Barron (2001). Minimax optimal predictive density estimation, data compression, and model selection. Workshop on MDL at Conference on Neural Information Processing Systems, Whistler, British Columbia, December 10. (Invited Presentation).
  • F. Liang and A. R. Barron (2002). Exact minimax strategies for predictive density estimation, data compression and model selection. IEEE International Symposium on Information Theory, Lausanne, Switzerland, July 1-5.
  • A. R. Barron, Jiangfeng Yu, and Wei Qiu (2003). Maximum compounded wealth: portfolio estimation, option pricing, and stock selection. Workshop on Complexity and Inference, DIMACS, Rutgers University, June 2-5. (Invited Presentation).
  • A. R. Barron (2003). The role of information in the central limit problem. Symposium on Information Theory and Some Friendly Neighbors -- Ein Wunschkonzert, ZIF, Center for Interdisciplinary Research, Bielefeld, August 11-13. (Invited Presentation).
  • A. R. Barron (2003). Interplay of statistics and information theory in formulation and selection of models. Workshop on Model Building, Dortmund, Germany, November 13-14. (Invited Presentation).
  • G. Leung and A. R. Barron (2004). Information theory, model selection and model mixing for regression, Conference on Information Sciences and Systems, Princeton, NJ, March 17-19. (Invited Presentation in session on Information Theory, Computer Science, and Statistics.)
  • A. R. Barron and G. Leung (2004). Risk assessment for Bayes procedures and model mixing in regression. IVth Workshop on Bayesian Nonparametrics, Universita di Roma La Sapienza, Rome, Italy, June 12-16. (Invited Presentation).
  • A. R. Barron and G. Leung (2004). Risk assessment for model mixing. Workshop on Mathematical Foundations of Learning Theory, Barcelona, Spain, June 18-23. (Invited Presentation).
  • A. R. Barron (2004). Relative entropy in probability theory and mathematical statistics. Workshop on Entropy in the Mathematical, Physical, and Engineering Sciences, Padova, Italy, June 24-27. (Two Invited Presentations).
  • A. R. Barron (2004). Fitting functions of many variables: neural networks and beyond. 16th Conference on Computational Statistics (COMPSTAT 2004), Prague, Czech Republic, August 23-28. (Invited Keynote Plenary Presentation).
  • A. R. Barron (2005). Neural nets, mixture models, and adaptive kernel machines. Yale Workshop on Adaptive and Learning Systems, May 29-31, Center for Systems Science, Yale University. (Invited presentation).
  • A. R. Barron (2005). Challenges in high-dimensional function estimation and attempted solutions, Congress on Statistics, Pau, France, June 6-10. (Invited Presentation).
  • A. R. Barron (2005). Information theory and risk analysis. Medallion Lecture. Joint Statistical Meetings of the IMS and ASA, Minneapolis, Minnesota, August 7-11. (Presented with IMS Medallion Award; one-hour special invited presentation on Aug. 7).
  • A. R. Barron (2005). Information theory and statistics for machine learning, IEEE Workshop on Machine Learning for Signal Processing XV, Mystic, CT, October 28-30. (Invited Keynote Plenary Presentation).
  • M. Madiman and A. R. Barron (2006). Monotonicity of information in the central limit theorem, Workshop on Information Theory and its Applications, University of California, San Diego, February 6-9. (Invited Presentation).
  • A. R. Barron (2006). Simple risk bounds for mixing least squares regressions. Journées: Model Selection in Statistics: Different Approaches, Université de Nice, Sophia-Antipolis, Nice, France, March 14-19. (Two one-hour invited presentations).
  • A. R. Barron (2006). Simple risk bounds for mixing least squares regressions. International Workshop on Applied Probability, University of Connecticut, Storrs, CT, May 18. (Invited Presentation).
  • A. R. Barron and Wei Qiu (2007). Maximum wealth portfolios, Workshop on Information Theory and its Applications, University of California, San Diego, January 29 - February 2. (Invited Presentation).
  • A. R. Barron, Cong Huang, and Xi Luo (2008). Penalized squared error and likelihood: risk bounds and fast algorithms, Workshop on Sparsity in High Dimensional Statistics and Learning Theory, Georgia Institute of Technology, Atlanta, Georgia, March 22-24. (Invited Three-Part Presentation).
  • A. R. Barron (2008). Principles of Information Theory in Probability and Statistics. Elements of Information Theory Workshop: CoverFest. On the Occasion of the 70th birthday of Tom Cover. Stanford University, May 16. (Invited Presentation).
  • A. R. Barron (2008). MDL Procedures with L_1 Penalty and their Statistical Risk. Information and Communication Conference, Renyi Institute, Budapest, August 25-28, on the occasion of the 70th birthday of Imre Csiszar. (Invited Presentation).
  • A. R. Barron (2011). Information Theory and Flexible High-Dimensional Non-Linear Function Estimation. Info-Metrics Institute Workshop, American University, Washington, DC, November 12. (Invited Presentation). Disclaimer: The proposed solution on page 19 to the differential equation for Adaptive Annealing is problematic due to discontinuity of the gradient at the origin.
  • A. R. Barron (2011). Information and Statistics and Practical Achievement of Shannon Capacity. (Three-part Invited Tutorial Presentation.) Workshop on Information Theory and its Applications, U.C. San Diego, February 9.
  • A. R. Barron (2011). Communication by Regression: Practical Achievement of Shannon Capacity, Workshop Infusing Statistics and Engineering, Harvard University, June 5-6. (Invited Presentation).
  • A. R. Barron (2011). Sparse Superposition Codes: low complexity and exponentially small error probability at all rates below capacity, Workshop on Information Theory Methods in Science and Engineering, Helsinki, Finland, August 8. (Invited Presentation.)
  • A. R. Barron (2014). Overview of Recent Developments in Penalized Likelihood and MDL. Workshop on Information Theory Methods in Science and Engineering, Waikiki, HI, July 5. (Invited Presentation). Includes results obtained with Teemu Roos and Kazuho Watanabe in a visit to Helsinki, Finland, September 2013.
  • A. R. Barron (2015). Information and Statistics, IMA Workshop on Information and Concentration Phenomena, Minneapolis, MN, April 13. (Invited Presentation)
  • A. R. Barron (2015). Information and Statistics (Invited Plenary Presentation). IEEE Information Theory Workshop, Jerusalem, Israel, April 30.
  • A. R. Barron (2015). Computationally Feasible Greedy Algorithms for Neural Nets. Non-Convex Optimization Workshop, NIPS, Montreal, Canada, December 12. (Invited Presentation).
  • A. R. Barron (2016). High-Dimensional Neural Networks: Statistical and Computational Properties. (Invited Plenary presentation). Conference on Operational Research, Osijek, Department of Economics, September 27.
  • A. R. Barron (2016). Probability for Normalized Maximum Likelihood Calculations. Workshop on Information Theory Methods in Science & Engineering. Helsinki, September 19. (Invited Presentation.) Based on joint work with X. Yang, T. Roos, K. Watanabe.
  • A.R. Barron (2017). Stochastic Diffusion for Gaussian Mixtures, Workshop on Information-Theoretic Inequalities. Newark, Delaware, April 22-23. (Invited Presentation).
  • A.R. Barron (2017). Central Limit Theory and Convergence to Invariance via Information Theory Inequalities, AIM Workshop on Entropy Power Inequalities. San Jose, May 1-5, 2017. (Co-organizer of Workshop with O. Johnson, M. Madiman, I. Kontoyiannis.)
  • A. R. Barron (2018). Approximating Multi-layer Learning Networks. Conference on Channels, Statistics, Information, Secrecy, Zero-error and Randomness: In honor of Imre Csiszar's 80th Birthday. Budapest, June 4-5. (Invited Presentation).
  • A. R. Barron (2018). Discussion of a Link Between Brown and Jordan, Lawrence D Brown Memorial Workshop, Philadelphia, December 1. (Invited Discussion).
  • A. R. Barron (2018). Approximating Deep Nets. Two days in Honor of Pascal Massart and Lucien Birge, Paris, June 28-29. (Invited Presentation).
  • A. R. Barron (2018). Risk Bounds for Neural Nets in High Dimensions. Workshop on High-Dimensional Statistics, Columbia University, New York, September 14-15. (Invited Presentation).
  • A. R. Barron (2019). Gaussian Complexity, Metric Entropy & Risk of Deep Nets. New England Statistics Symposium. Session in Celebration of Rick Vitale. May 16, 2019. (Invited Presentation).
  • A. R. Barron (2019). Gaussian Complexity, Metric Entropy, and the Statistical Learning of Deep Nets. Recent Progress in Foundational Data Science. Institute of Mathematics and Its Applications and the Institute for Research in Statistics and Its Applications, Minneapolis, September 16-17. (Invited Presentation.)
  • A. R. Barron (2019). Approximation, Complexity & Risk Properties of Deep Nets. Approximation and Data Analysis School & Conference, Nizhny Novgorod, 3-4 October. (Two Invited Presentations).
  • A. R. Barron (2020). Deep Network Approximation. International Zurich Seminar on Information and Communication. Zurich, 26-28 February. (Invited Plenary Presentation).

Invited Departmental Seminar Presentations: (These had short abstract announcements).
[Recent presentation files are included at www.stat.yale.edu.]

  • Purdue University, Joint Statistics Colloquium, October 3, 1985. Topic: Entropy and the central limit theorem.
  • Michigan State University, Department of Statistics and Probability, January 28, 1986. Topic: Generalized Shannon-McMillan-Breiman theorem.
  • University of Chicago, Department of Statistics, October 20, 1986. Topic: Uniformly powerful tests.
  • University of Virginia, Department of Mathematics, March 5, 1987. Topic: Convergence of Bayes estimators of probability density functions.
  • Stanford University, Department of Statistics, October 20, 1987. Topic: Convergence of Bayes estimators of probability density functions.
  • University of Chicago, Department of Statistics, March 7, 1988. Topic: Convergence of Bayes estimators of probability density functions.
  • McGill University, Joint Statistics Seminar for Montreal universities, March 31, 1988. Topic: Approximation of densities by sequences of exponential families.
  • Dupont Research Center, Dover, Delaware, April 26, 1988. Topic: Statistical learning networks.
  • IBM T. J. Watson Research Center, Yorktown Heights, New York, August 10, 1988. Topic: Statistical learning networks.
  • Stanford University, Information Systems Laboratory, November 3, 1988. Topic: Minimum complexity density estimation.
  • IBM Technical Education Center, Thornwood, New York, January 11-12, 1989. Statistical learning networks. In the short course on Knowledge Acquisition from Data.
  • Cornell University, Department of Economics and Program of Statistics (Co-hosts), February 1, 1989. Topic: Convergence of Bayes estimators of probability density functions.
  • Purdue University, Department of Statistics, September 7, 1989. Topic: Minimum complexity density estimation.
  • University of Lowell, Massachusetts, Joint Seminar, Department of Mathematics and Department of Electrical Engineering, March 14, 1990. Topic: Statistical properties of polynomial networks and other artificial neural networks.
  • Carnegie Mellon University, Department of Statistics, April 4, 1990. Topic: Statistical properties of polynomial networks and other artificial neural networks.
  • University of Chicago, Department of Statistics, October 15, 1990. Topic: Statistical properties of artificial neural networks.
  • University of California, San Diego, Department of Mathematics, January 7, 1991. Topic: Approximation bounds for artificial neural networks.
  • University of California, San Diego, Department of Mathematics, January 8, 1991. Topic: Complexity regularization for nonlinear model selection.
  • Siemens Corporation, Princeton, New Jersey, February 28, 1991. Topic: Universal approximation bounds for superpositions of a sigmoidal function.
  • University of Wisconsin, Department of Statistics, April 3, 1991. Topic: Complexity regularization for nonlinear model selection.
  • University of Wisconsin, Department of Mathematics, April 4, 1991. Topic: Approximation bounds for artificial neural networks.
  • Technical University of Budapest, Department of Electrical Engineering, July 2, 1991. Topic: Universal approximation bounds for superpositions of a sigmoidal function.
  • Mathematical Sciences Research Institute, Berkeley, California, September 25, 1991. Topic: Empirical process bounds for artificial neural networks.
  • Stanford University, Department of Statistics, October 15, 1991. Topic: Approximation and estimation bounds for artificial neural networks.
  • University of California, Santa Cruz, Department of Computer and Information Sciences, October 17, 1991. Topic: Computationally efficient approximation and estimation of functions using artificial neural networks.
  • Yale University, Department of Statistics, January 13, 1992. Topic: Neural network estimation.
  • University of Virginia, Department of Electrical Engineering, Eminent Speaker Series, February 21, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • North Carolina State University, Department of Statistics, February 29, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • Cornell University, Center for Applied Mathematics, March 6, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • University of North Carolina, Department of Statistics, March 30, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • University of Joensuu, Finland, Department of Statistics, April 9, 1992. Topic: Introduction to artificial neural networks.
  • University of Paris VI, Department of Statistics, April 15, 1992, and University of Paris, Orsay, Department of Statistics, April 16, 1992. Topic: Estimation of functions of several variables -- neural networks, Fourier decomposition, and Bayes methods.
  • University of Paris VI, Department of Statistics, April 22, 1992, and University of Paris, Orsay, Department of Statistics, April 23, 1992. Topic: Performance bounds for complexity-based model selection.
  • Princeton University, Department of Electrical Engineering, May 14, 1992. Topic: Overview of approximation results for sigmoidal networks.
  • University of Massachusetts at Lowell, Joint Seminar, Department of Mathematics and Department of Electrical Engineering, October 21, 1992. Topic: Statistical accuracy of neural nets.
  • University of Tokyo, Japan, Department of Information, Physics and Engineering, March 1993. Topic: Information theory and model selection.
  • University of Paris VI, Department of Statistics, May 1993. Topic: Optimal rate properties of minimum complexity estimation.
  • University of Paris VI, Department of Statistics, May 1993. Topic: Information-theoretic proof of martingale convergence.
  • University of Pennsylvania, Wharton School, October 21, 1993. Topic: Neural networks and statistics.
  • Massachusetts Institute of Technology, Center for Biological and Computational Learning, October 27, 1993. Topic: Neural networks and statistics.
  • Rutgers University, Department of Statistics, October 5, 1994, Topic: Statistical accuracy of neural nets.
  • University of South Carolina, Department of Mathematics, Spring 1995. Topic: Neural net approximation.
  • Carnegie Mellon University, Department of Statistics, Fall 1995. Topic: Information risk and superefficiency.
  • Massachusetts Institute of Technology, Department of Applied Mathematics, March 1996. Topic: Consistent and uniformly consistent classification.
  • Columbia University, Department of Statistics, Fall 1996. Topic: Consistency of posterior distributions in nonparametric problems.
  • Northeastern University, Joint Mathematics Colloquium with MIT, Harvard, and Brandeis, February 27, 1997. Topic: Information theory in probability and statistics.
  • Iowa State University, Department of Statistics, March 28, 1997. Topic: Information theory in probability and statistics: The fundamental role of Kullback divergence.
  • Washington University, St. Louis, Department of Electrical Engineering, Center for Imaging Systems, April 16, 1997. Topic: Universal data compression, prediction, and gambling.
  • Massachusetts Institute of Technology, LIDS Colloquium, May 5, 1998. Topic: Simple universal portfolio selection.
  • University of California, Santa Cruz, Baskin Center for Computer Engineering, October 1998. Topic: Approximation bounds for Gaussian mixtures.
  • Lucent, Bell Laboratories, Murray Hill, New Jersey, March 1999. Topic: Approximation and estimation bounds for mixture density estimation.
  • Stanford University, Department of Statistics, Probability Seminar, May 24, 1999. Topic: Information, martingales, Markov chains, convex projections, and the CLT.
  • Stanford University, Department of Statistics, Statistics Seminar, May 25, 1999. Topic: Mixture density estimation.
  • Rice University, Departments of Statistics and Electrical Engineering, November 4, 2000. Topic: Information theory and statistics -- best invariant predictive density estimators.
  • University of Chicago, Department of Statistics, November 21, 2000. Topic: Information theory and statistics -- best invariant predictive density estimators.
  • Yale University, Department of Computer Science, Alan J. Perlis Seminar, April 26, 2001. Topic: Neural nets, Gaussian mixtures, and statistical information theory.
  • Brown University, Department of Applied Mathematics, May 9, 2001. Topic: I do not recall.
  • University of Massachusetts at Lowell, Department of Mathematics, September 19, 2001. Topic: Mixture density estimation.
  • University of California at Los Angeles, Department of Statistics, May 21, 2002. Topic: Nonlinear approximation, estimation, and neural nets (I do not recall the specific title).
  • Columbia University, Department of Statistics, October 28, 2002. Topic: Information inequalities in probability and statistics.
  • University of Georgia, Department of Statistics, November 26, 2002. Topic: Information inequalities in probability and statistics.
  • University of Pennsylvania, Wharton School, October 29, 2003. Topic: Portfolio estimation for compounding wealth.
  • University of North Carolina (in conjunction with Duke University), Departments of Statistics, November 3, 2004. Topic: Risk assessment and advantages of model mixing for regression.
  • University of South Carolina, Department of Mathematics, April 7, 2005. IMI Distinguished Lecture. Topic: Statistical theory for nonlinear function approximation: neural nets, mixture models, and adaptive kernel machines.
  • Helsinki University and Helsinki Institute of Information Technology, Helsinki, Finland. August 22-25, 2005. Two talks: (1) Statistical foundations and analysis of the minimum description length principle. (2) Consequences of MDL for neural nets and Gaussian mixtures.
  • Princeton University, Department of Operations Research and Financial Engineering, October 4, 2005. Topic: Statistical perspectives on growth rate optimal portfolio estimation.
  • IBM Research Laboratories, Yorktown Heights, September 22, 2006. Topic: Generalized entropy power inequalities and the central limit theorem.
  • Purdue University, Department of Computer Science, February 26, 2007. Prestige Lecture Series on the Science of Information. Topic: The interplay of information theory, probability, and statistics.
  • University of Illinois, Joint Seminar, Department of Statistics and Department of Electrical and Computer Engineering, February 27, 2007. Prestige Conference Series. Topic: The interplay of information theory and probability.
  • Boston University, Department of Statistics, March 1, 2007. Prestige Conference Series. Topic: Information inequalities and the central limit theorem.
  • University of California at Berkeley, Department of Computer Science, October 4, 2007. Topic: A Simple Algorithm for L1-Penalized Least Squares. Primarily presented by Cong Huang.
  • Rutgers University, Department of Statistics, December 12, 2007. Topic: Fast and Accurate L1 Penalized Estimations. Co-presented with Cong Huang.
  • Harvard University, Department of Statistics. Topic: Information Theory and Flexible High-Dimensional Non-Linear Function Estimation, Oct. 2011.
  • University of Michigan, Department of Statistics, Topic: Communication by Regression. March 9, 2012.
  • Rice University, Department of Statistics, Topic: Communication by Regression. March 12, 2012.
  • Cambridge University, Statistics Laboratory, Topic: Communication by Regression: Sparse Superposition Codes, November 14, 2013.
  • University of Osijek, Croatia, Department of Mathematics. Topic: Neural Net Approximation and Estimation of Functions. March 12, 2015.
  • Massachusetts Institute of Technology, Department of Electrical Engineering, Laboratory for Information and Decision Sciences. April 8, 2014.
  • Princeton University, Department of Electrical Engineering. Topic: Computationally Feasible Greedy Algorithms for Neural Nets, February 22, 2016.
  • University of Osijek, Department of Mathematics, Topic: Computationally Feasible Greedy Algorithms for Sigmoidal and Polynomial Networks, March 16, 2016.
  • Yale University, Department of Statistics and Data Science, Topic: Accuracy of High-Dimensional Deep Learning Networks. September 17, 2019.
  • University of Southern California, Department of Statistics, Topic: Complexity Theory for Deep Learning. October 25, 2019.

Ph.D. Dissertations Supervised:

  1. Bertrand S. Clarke (1989). Asymptotic Cumulative Risk and Bayes Risk under Entropy Loss, with Applications. University of Illinois at Urbana-Champaign. [Was Assistant Professor at Purdue University; then Professor at University of British Columbia and University of Miami; Now Chairman, Department of Statistics, University of Nebraska]
  1. Chyong Hwa Sheu (1989). Density Estimation with Kullback-Leibler Loss. University of Illinois at Urbana-Champaign.
  1. Yuhong Yang (1996). Minimax Optimal Density Estimation. Yale University. [Was Assistant Professor at Iowa State University; Now Professor at University of Minnesota, Department of Statistics.]
  1. Qun (Trent) Xie (1997). Minimax Coding and Prediction. Yale University. [Was at GE Capital, Inc., Fairfield, CT. Then an Assistant Professor at Tsinghua Univ.]
  1. Gerald Cheang (1998). Neural Net Approximation and Estimation of Functions. Yale University. [Now at Singapore Technical and Education University.]
  1. Jason Cross (1999). Universal Portfolios for Target Classes having a Continuous Form of Dependence on Side Information. Yale University. [Was at an investment start-up firm with Myron Scholes in New York. Now runs an investment firm in Minneapolis, Minnesota.]
  1. Qiang (Jonathan) Li (1999). Estimation of Mixture Models. Yale University. [Was at KPMG Financial Services, New York. Then at Stanford Research Institute, Palo Alto. Now at Radar Networks, Inc., San Francisco.]
  1. Feng Liang (2002). Exact Minimax Predictive Density Estimation. Yale University. [Was Assistant Professor, Department of Statistics, Duke University. Now Associate Professor, Department of Statistics, University of Illinois at Urbana-Champaign]
  1. Gilbert Leung (2004). Information Theory and Mixing Least Squares Regression. Yale University. [At Qualcomm, first in San Diego, now near San Jose.]
  1. Wei (David) Qiu (2007). Maximum Wealth Portfolios. Yale University. [J.P. Morgan Chase, Columbus, Ohio.]
  1. Cong Huang (2008). Risk of Penalized Least Squares, Greedy Selection and L1-Penalization for Flexible Function Libraries. Yale University. [Was at Columbia University, Department of Statistics. Now in China.]
  1. Xi Luo (Rossi) (2009). Penalized Likelihoods: Fast Algorithms and Risk Bounds. Yale University. [Was at University of Pennsylvania, postdoc with Tony Cai, Department of Statistics. Now Assistant Professor at Brown University, Department of Biostatistics.]
  1. Antony Joseph (2012). Achieving Information Theoretic Limits with High Dimensional Regression. Yale University. Co-developer at Yale of capacity-achieving sparse superposition codes. [Was at University of California, Berkeley, postdoc with Bin Yu, Department of Statistics. Now at Walmart Research.]

  1. Sanghee Cho (2014). High-Dimensional Regression with Random Design, including Sparse Superposition Codes. Yale University. [At GE Research, Schenectady, New York]
  1. Sabyasachi Chatterjee (2014). Adaptation in Estimation and Annealing. Yale University. [Was Postdoc with John Lafferty, University of Chicago, Department of Statistics. Now Assistant Professor, University of Illinois at Urbana-Champaign, Department of Statistics.]
  1. Xiao (Grace) Yang (2015). Compression and Predictive Distributions for Large Alphabets. Yale University. [At Apple, Inc, Cupertino, California.]
  1. Cynthia Rush (May 2016). Iterative Algorithms for Inference and Optimization, with Applications in Communications and Compressed Sensing. Yale University. [Assistant Professor at Columbia University, Department of Statistics]
  1. William David Brinda (May 2018). Adaptive Estimation with Gaussian Radial Basis Mixtures. Yale University. [Was Lecturer, Yale University, Department of Statistics and Data Science]
  1. Jason Klusowski (May 2018). Density, Function, and Parameter Estimation with High-Dimensional Data. Yale University. [Assistant Professor at Rutgers University, Department of Statistics]

Associate Editorship:

  • 1993 - 1995 IEEE Transactions on Information Theory. A.E. for nonparametric estimation, classification, and neural nets.
  • 1995 - 1997 Annals of Statistics.
  • 1994 - 1997 Neural Networks.

Yale Departmental and Divisional Responsibilities:

  • Department of Statistics

   Director or Co-Director of Graduate Studies (Fall 1993 - Spring 1999; Fall 2017 - Spring 2020).
   Director of Undergraduate Studies (Fall 2010 - Spring 2016, Spring 2017).
   Acting Chair (Spring 1998).
   Chair (Spring 2001 - Fall 2006).

  • Division of Social Sciences

Senior Appointments Committee (Fall 1994 - Spring 1995; Fall 1999 - Spring 2000)

  • Program of Applied Mathematics

   Director of Undergraduate Studies (Fall 1999 - Spring 2001; Fall 2004 - Spring 2006)
   Senior Coordinator of Undergraduate Studies (Fall 2006 - Spring 2015)

Other Activities and Honors:

  • FAI free flight model glider designer and competitor, F1A class (Andrew Barron and family):


   Andrew is a five-time U.S. National Champion: 1984, 1987, 1992, 2007, and 2009.
   Three-time U.S. National Team Member at the World Championships:
       1995 in Hungary, 2001 in California, and 2013 in France.
   Member of the nine-man U.S. Team winning the Challenge France World Championships 2013.
   Alternate to the U.S. National Teams in 1977, 2015, 2019.
   America's Cup Champion in 1998 and 2009.

   Son Peter was a U.S. National Team Member at the World Championships 2015 in Mongolia.

   Son Peter was a U.S. National Team Member at the Junior World Championships 1998 in Romania.

   Son John was a U.S. National Team Member at the Junior World Championships 2000 in the Czech Republic.

   Son Timothy was the U.S. National Champion in 2010 and 2012.
   Timothy was a U.S. National Team Member at the Junior World Championships 2008 in Ukraine.
   Timothy finished as the third-place individual, part of the second-place F1A team (USA), and part of
   the first-place U.S. team (F1A, F1B, F1P combined).

   Daughters Michelle and Gina earned places on the U.S. National Team for the Junior World Championships 2012 in Slovenia.
   Gina finished as part of the second-place F1A team and the second-place team overall (F1A, F1B, F1P combined).

Co-owner and manager of Barron Field, LLC, which owns a 284-acre sod farm in Orange County, NY. It is the primary flying site used in the northeastern US region to host free-flight meets and competitions, sanctioned by the Academy of Model Aeronautics (AMA) and the Fédération Aéronautique Internationale (FAI).