Andrew R. Barron
Yale University
Professor of Statistics (100% Appointment)
Professor of Electrical Engineering (Courtesy Appointment)
Co-Director of Graduate Studies in Statistics and Data Science
Address: Department of Statistics and Data Science, 24 Hillhouse Avenue, New Haven, CT 06511
Phone: +1 203-997-5229, 203-432-0666
Fax: 203-432-0633
E-mail: Andrew.Barron@yale.edu
Education:
- Ph.D., Electrical Engineering, Stanford University, 1985.
- M.S., Electrical Engineering, Stanford University, 1982.
- B.S. (Magna Cum Laude), E.E. and Math Science, Rice University, 1981.
- W. T. Woodson H.S., Fairfax, Virginia, 1977.
Experience:
- 2016 – present: Professor, Department of Statistics and Data Science, Yale University
- 1992 – 2016: Professor, Department of Statistics, Yale University
- 1990 – 1992: Associate Professor of Statistics and Electrical & Computer Engineering, University of Illinois
- 1984 – 1998: Consultant, Barron Associates, Inc., Stanardsville, Virginia
- 1992 (Spring): Visiting Research Scholar, Barron Associates, Inc., Stanardsville, Virginia
- 1991 (Fall): Visiting Scholar, Mathematical Sciences Research Institute, Berkeley, California
- 1985 – 1990: Assistant Professor of Statistics and Electrical & Computer Engineering, University of Illinois
- 1982 – 1985: Research Assistant, Stanford University
- 1981 – 1983: Consultant, Adaptronics, Inc., McLean, Virginia
- 1977 – 1980: Engineer, Adaptronics, Inc., McLean, Virginia (Summers)
Honors:
- Fellow of the IEEE, for contributions to information theory and statistics.
- Best paper prize for all IEEE journals in 1990–1991, Browder J. Thompson Memorial Prize, for authors of age 30 or under at time of submission.
- IMS Medallion Award winner. Presented at the 2005 Joint ASA-IMS Annual Meetings, Minneapolis, MN.
- Best paper prize, National Aerospace Electronics Conference, 1990.
- Finalist for the best paper prize, Information Theory Society, IEEE, 1987.
- Nominated for the Marconi Young Scientist Award, 1990.
- Board of Governors, IEEE Information Theory Society, four terms: 1995–1997, 1998–2000, 2013–2016, 2017–2019.
- Appointed Secretary, Board of Governors, IEEE Information Theory Society, 1989–1990.
- Keynote speaker at several conferences.
- Chairman, AMS Summer Research Conference, Adaptive Selection of Models and Procedures, 1996.
- Program Committee, IMS-ASA Joint Statistical Meetings, 1991.
- Program Committees, IEEE International Symposium on Information Theory, numerous times.
- Program Committees, IEEE Workshop on Information Theory, 1989, 2008.
- Program Committee, World Congress on Neural Networks, 1995.
- Program Committee, Neural Information Processing Systems: Natural and Synthetic, 1995.
- Program Committee, ACM Workshop on Computational Learning Theory, 1991, 1997.
- James Waters Creativity Award for best undergraduate research at Rice, 1981.
- Houston Telephone Engineers scholarship, top student in communication theory, 1981.
- Top award for leadership, scholarship and service, Woodson High School, Fairfax, VA, 1977.
Research Interests:
- Entropy Power Inequalities and Central Limit Theorems.
- Capacity-achieving Sparse Superposition Codes for the Gaussian Channel. Communication by Regression.
- Entropy Rates, Likelihood Stabilization, and Inference for Dependent Processes.
- Foundations of the Minimum Description Length Principle of Inference and Universal Data Compression.
- Statistical Risk Analysis for Penalized Criteria for Model Selection.
- Statistical Risk Analysis for Bayes Procedures.
- Statistical Perspectives and Analysis of Artificial Neural Networks.
- Nonlinear Approximation and Estimation for High-dimensional Libraries of Functions.
- Greedy Algorithms for Subset Selection, Mixture Density Estimation, and L1 Penalty Optimization.
- Maximum Wealth Stock Indices and Growth Rate Optimal Portfolio Estimation.
PUBLICATIONS: Papers available electronically at www.stat.yale.edu/~arb4
Ph.D. Dissertation:
Monograph:
- R. Venkataramanan, S. Tatikonda and A. R. Barron (2019). Sparse Regression Codes. Foundations and Trends in Communications and Information Theory, Vol. 15, No. 1–2, pp. 1–195.
Journal Publications:
- D. Cleveland, A. R. Barron, A. N. Mucciardi (1980). Methods for determining the depth of near-surface defects. Journal of Nondestructive Evaluation, Vol. 1, pp. 21–36.
- B. Clarke and A. R. Barron (1990). Information-theoretic asymptotics of Bayes methods. IEEE Transactions on Information Theory, Vol. IT-36, pp. 453–471. (Winner of the 1992 Browder J. Thompson Memorial Prize award for the best paper in all IEEE journals for authors of age 30 or under at time of submission).
Book Chapters:
- R. L. Barron, A. N. Mucciardi, F. J. Cook, J. N. Craig, and A. R. Barron (1984). Adaptive learning networks. Chapter 2 in Self-Organizing Methods in Modeling, S. J. Farlow (Editor), Marcel Dekker, New York, pp. 25–65.
- J. Q. Li and A. R. Barron (2000). Mixture Density Estimation. In Advances in Neural Information Processing Systems, Vol. 12, S. A. Solla, T. K. Leen and K.-R. Mueller (Editors). MIT Press, Cambridge, Massachusetts, pp. 279–285.
- A. R. Barron, C. Huang, J. Q. Li and Xi Luo (2008). MDL Principle, Penalized Likelihood, and Statistical Risk. In Festschrift for Jorma Rissanen. Peter Grunwald, Petri Myllymaki, Ioan Tabus, Marcelo Weinberger & Bin Yu (Editors). Tampere International Center for Signal Processing, TICSP series #38, Tampere University of Technology, Tampere, Finland.
Publications in Conference Proceedings: (4 to 27 pages)
- A. R. Barron, F. W. van Straten, and R. L. Barron (1977). Adaptive learning network approach to weather forecasting: a summary. Proceedings of the IEEE International Conference on Cybernetics and Society, Washington, DC, September 19–21. Published by IEEE, New York, pp. 724–727.
- A. R. Barron and R. L. Barron (1988). Statistical learning networks: a unifying view. In Computing Science and Statistics: Proceedings of the 20th Symposium on the Interface, Reston, Virginia, April 20–23. E. Wegman (Editor), published by the American Statistical Association, Alexandria, Virginia, pp. 192–203. (Invited presentation).
- R. L. Barron, R. L. Cellucci, P. R. Jordan, N. E. Beam, P. Hess, and A. R. Barron (1990). Applications of polynomial neural networks to fault detection, isolation, and estimation (FDIE) and reconfigurable flight control. Proceedings of the National Aerospace Electronics Conference, Dayton, Ohio, May 23–25, Vol. 2, pp. 507–519. (Winner of the best paper prize, 1990 NAECON). Republished in Proceedings 1998 NAECON, pp. 348–360.
- A. R. Barron (1991). Approximation and estimation bounds for artificial neural networks. In Computational Learning Theory: Proceedings of the Fourth Annual ACM Workshop, Santa Cruz, CA, August 5–7. L. Valiant (Editor), Morgan Kaufmann Publishers, Inc., San Mateo, California, pp. 243–249. (Honored as one of the four papers that appeared by invitation in expanded form in the special issue of Machine Learning, representing the top presentations at the workshop.)
- A. R. Barron (1992). Neural Net Approximation. Proceedings of the 7th Yale Workshop on Adaptive and Learning Systems, May 20–22, K. S. Narendra (Editor), Center for Systems Science, Yale University, pp. 69–72.
- J. Takeuchi and A. R. Barron (2001). Properties of Jeffreys mixture for Markov sources. Proc. Workshop on Information Based Induction Sciences (IBIS), pp. 327–333.
- A. R. Barron and Xi Luo (2007). Adaptive Annealing. Proceedings of the 45th Annual Allerton Conference on Communication, Control, and Computing, Allerton House, UIUC, Illinois, September 26–28, pp. 665–673.
Newsletter Article:
- R. Venkataramanan, S. Tatikonda, A. Barron (2016). Sparse Regression Codes. IEEE Information Theory Society Newsletter, December 2016, pp. 7–15. [Based on ISIT Tutorial by R. Venkataramanan and A. Barron, Barcelona, July 2016.]
Patents:
Technical Reports: (with details not in subsequent publications)
- A. R. Barron (1984). Monotonic central limit theorem for densities. Department of Statistics Technical Report #50, Stanford University, Stanford, California.
- A. R. Barron (1988). The exponential convergence of posterior probabilities with implications for Bayes estimators of density functions. Department of Statistics Technical Report #7, University of Illinois, Champaign, Illinois.
- B. Clarke and A. R. Barron (1990). Entropy risk and the Bayesian central limit theorem. Department of Statistics Technical Report, Purdue University, West Lafayette, Indiana.
- A. R. Barron (1991). Information theory and martingales. Presented at the 1991 IEEE International Symposium on Information Theory (recent results session), Budapest, Hungary, June 23–29.
- A. R. Barron, Y. Yang and B. Yu (1994). Asymptotically optimal function estimation by minimum complexity criteria. Seven-page submission. Presented at the IEEE International Symposium on Information Theory, Trondheim, Norway, June 27 – July 1.
- A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. Department of Statistics, Yale University. Working paper distributed at the plenary presentation of the Tenth Annual ACM Workshop on Computational Learning Theory.
- J. I. Takeuchi and A. R. Barron (1997). Asymptotically minimax regret for exponential and curved exponential families. Fourteen-page original. Presented at the 1998 International Symposium on Information Theory, Cambridge, Massachusetts.
- A. R. Barron (1999). Limits of information, Markov chains, and projection. Eight-page original submission. Presented at the 2000 IEEE International Symposium on Information Theory, Sorrento, Italy.
- J. Yu and A. R. Barron (2003). Maximal compounded wealth for portfolios of stocks and options. Working paper, some of which was presented at the Workshop on Complexity and Inference, DIMACS, Rutgers University, June 25.
- W. Qiu and A. R. Barron (2007). A maximum wealth asset index and mixture strategies for universal portfolios on subsets of stocks. See also the Yale dissertation of Wei (David) Qiu.
- C. Huang, G. L. H. Cheang and A. R. Barron (2008). Risk of Penalized Least Squares, Greedy Selection and L1 Penalization for Flexible Function Libraries. Yale Department of Statistics Technical Report. [Too long for journal publication, yet still has some of our best results.]
- A. R. Barron and A. Joseph (2011). Sparse Superposition Codes are Fast and Reliable at Rates Approaching Capacity with Gaussian Noise. June 10, 2011. [This is an expanded version. A shorter version was completed in 2012, appearing in 2014 in the IEEE Transactions on Information Theory, per the publication list above.]
- S. Chatterjee and A. R. Barron (2014). Information Theory of Penalized Likelihoods and its Statistical Implications. arXiv:1401.6714v2, April 27. [A shorter version is in ISIT 2014.]
Other Conference Presentations: (proceedings containing no more than one-page abstracts). Links to recent presentation files are available at www.stat.yale.edu/~arb4
- A. R. Barron (1983). Convergence of logically simple estimates of unknown probability densities. IEEE International Symposium on Information Theory, Saint Jovite, Canada, September 26–30.
- A. R. Barron (1985). Entropy and the central limit theorem. IEEE International Symposium on Information Theory, Brighton, England, June 23–28.
- A. R. Barron (1985). Ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. IEEE International Symposium on Information Theory, Brighton, England, June 23–28.
- A. R. Barron (1985). Logically smooth density estimation. Joint IMS-ASA Annual Meeting, Las Vegas, Nevada, August 5–8.
- A. R. Barron (1987). Applications of large deviations in statistics. Conference on Asymptotic Methods for Stochastic Systems: Large Deviations Theory and Practice, University of Maryland, October 25–27. (Invited Presentation).
- A. R. Barron (1988). The convergence in information of probability density estimators. IEEE International Symposium on Information Theory, Kobe, Japan, June 19–24.
- T. M. Cover and A. R. Barron (1988). A bound on the financial value of information. IEEE International Symposium on Information Theory, Kobe, Japan, June 19–24.
- A. R. Barron (1989). Minimum complexity density estimation. IMS Regional Meeting, Lexington, Kentucky, March 19–22. (Invited Presentation).
- A. R. Barron (1989). Portfolio selection based on nonparametric density estimates for the stock market. 21st Symposium on the Interface: Computing Science and Statistics, Orlando, Florida, April 9–12. (Invited Presentation).
- A. R. Barron (1989). Minimum complexity estimation. IEEE Workshop on Information Theory, Center for Applied Math, Cornell University, June 26–30. (Session Organizer).
- A. R. Barron (1989). Some statistical properties of polynomial networks and other artificial neural networks. Conference on Neural Information Processing Systems, Denver, Colorado, November 27–30. (Invited Plenary Presentation).
- A. R. Barron (1990). An index of resolvability of probability density estimators. IEEE International Symposium on Information Theory, San Diego, California, January 14–19.
- A. R. Barron (1990). Some statistical convergence properties of artificial neural networks. IEEE International Symposium on Information Theory, San Diego, California, January 14–19.
- A. R. Barron (1990). The index of resolvability: statistical convergence of minimum complexity estimation. AAAI Symposium on Minimal-Length Encoding, Stanford, California, March 27–29. (Invited Presentation).
- A. R. Barron (1990). Some approximation and estimation theorems for artificial neural networks. IEEE Information Theory Workshop, Veldhoven, The Netherlands, June 10–15. (Invited Presentation).
- A. R. Barron (1990). Statistical properties of artificial neural networks. SIAM Annual Meeting, Chicago, Illinois, July 20. (Invited Presentation).
- A. R. Barron (1990). Complexity Regularization. NATO Advanced Study Institute on Nonparametric Functional Estimation and Related Topics, Spetses, Greece, July 29 – August 11. (Invited Presentation).
- A. R. Barron (1991). Information theory and the stock market: the effect of side information. Workshop on Coordination of Distributed Information and Decisions, Cornell University, April 11–13. (Invited Presentation).
- A. R. Barron (1991). Approximation and estimation results for adaptively synthesized neural network architectures. Workshop on Theoretical Issues in Neural Nets, Center for Discrete Mathematics and Theoretical Computer Science, Rutgers University, May 20–23. (Invited Presentation).
- A. R. Barron and R. L. Barron (1991). Artificial neural networks in industry: some developments using statistical techniques. IMS Special Topics Meeting on Statistics in Industry, Philadelphia, Pennsylvania, June 9–12. (Invited Presentation).
- A. R. Barron (1991). Universal approximation bounds for superpositions of a sigmoidal function. IEEE International Symposium on Information Theory, Budapest, Hungary, June 23–29.
- A. R. Barron (1991). Approximation and estimation bounds for sigmoidal and polynomial networks. Joint Statistical Meetings, Atlanta, Georgia, August 19–22. (Session Organizer).
- A. R. Barron (1991). Approximation and estimation bounds for neural networks and projection pursuit. Workshop on Neural Information Processing Systems, Vail, Colorado, December 10–11. (Invited Presentation).
- A. R. Barron (1991). Risk estimation, risk optimality, regularization, and neural nets. Workshop on Neural Information Processing Systems, Vail, Colorado, December 10–11.
- A. R. Barron (1992). Approximation, estimation, and computation results for artificial neural networks. 24th Symposium on the Interface: Computing Science and Statistics, College Station, Texas, March 19–21. (Invited Presentation).
- A. R. Barron (1992). Artificial neural networks: stochastic analysis and engineering applications? Symposium on Stochastic Processes in Engineering Applications, Otaniemi, Finland, April 7–9. (Invited Plenary Presentation).
- A. R. Barron, D. Olive, and Y. Yang (1992). Asymptotically optimal complexity-based model selection. IMS Regional Meeting, Corvallis, Oregon, June 14–15. (Invited Presentation).
- A. R. Barron (1992). Statistical accuracy of neural nets. Conference on Neural Information Processing Systems: Natural and Synthetic, Denver, Colorado, November 30 – December 2. (Invited Tutorial).
- A. R. Barron (1993). Neural net approximation. Annual Meeting of the American Math Society, San Antonio, Texas, January 13–16. (Invited Presentation).
- A. R. Barron, L. Gyorfi, and E. C. van der Meulen (1993). Universal coding of nondiscrete sources based on distribution estimation consistent in expected information divergence. IEEE International Symposium on Information Theory, San Antonio, Texas, January 17–22, p. 51.
- A. R. Barron, B. Clarke, and D. Haussler (1993). Information bounds for the risk of Bayesian predictions and the redundancy of universal codes. IEEE International Symposium on Information Theory, San Antonio, Texas, January 17–22.
- A. R. Barron (1993). Statistical accuracy of neural nets. Workshop on Information and Geometry, Hakone, Japan, March. (Invited Presentation).
- A. R. Barron (1993). Optimal rate properties of minimum complexity estimation. Workshop on Descriptional Complexity, Schloss Dagstuhl, Wadern, Germany, May. (Invited Presentation).
- A. R. Barron (1993). Do neural nets avoid the curse of dimensionality? Congress on Statistics, Vannes, France, May. (Invited Presentation).
- A. R. Barron (1993). Performance bounds for neural net estimation and classification. Annual Meeting of the Classification Society of North America, Pittsburgh, June. (Invited Presentation).
- A. R. Barron (1993). Do neural nets avoid the curse of dimensionality? NATO ASI on Statistics and Neural Nets, Les Arcs, France, June. (Invited Presentation).
- A. R. Barron (1994). The accuracy of Bayes estimates of neural nets. 26th Symposium on the Interface: Computing Science and Statistics, Research Triangle Park, NC, June 15–18.
- A. R. Barron, Y. Yang and B. Yu (1994). Asymptotically optimal function estimation by minimum complexity criteria. IEEE International Symposium on Information Theory, Trondheim, Norway, June 27 – July 1. (One-page proceeding and seven-page original submission available).
- A. R. Barron (1994). Neural net approximation and estimation. IEEE Workshop on Information Theory, Moscow, Russia, July 3–6. (Invited Presentation).
- A. R. Barron (1994). Minimum complexity estimation. ML/COLT: Workshop on Applications of Descriptional Complexity to Induction, Statistical and Visual Inference, New Brunswick, New Jersey, July 8–9. (Invited Presentation).
- A. R. Barron (1995). Statistics and neural nets. New England Statistics Conference, Storrs, Connecticut, April 22. (Invited Plenary Presentation).
- A. R. Barron, Y. Yang (1995). Information-theoretic development of minimax rates of convergence. IEEE International Symposium on Information Theory, Whistler, British Columbia, September 17–22. (Added to session at recommendation of program chair; not in proceedings).
- N. Hengartner and A. R. Barron (1995). Information theory and superefficiency. IMS Regional Meeting, Stanford, California. (Invited Presentation).
- B. S. Clarke and A. R. Barron (1995). Jeffreys' prior yields the asymptotic minimax redundancy. IEEE-IMS Workshop on Information Theory and Statistics, Alexandria, Virginia, October 27–29. (Invited Presentation).
- A. R. Barron (1995). Asymptotically optimal model selection and neural nets. IEEE-IMS Workshop on Information Theory and Statistics, Alexandria, Virginia, October 27–29. (Invited Presentation).
- Q. Xie and A. R. Barron (1996). Asymptotic minimax regret for data compression, gambling, and prediction. Workshop on Sequence Prediction, Santa Cruz, California, May 3–5. (Invited, jointly presented).
- A. R. Barron (1996). The fundamental role of Kullback information in large sample statistics. Solomon Kullback Memorial Conference, Washington, DC, May 23–24. (Invited Presentation).
- A. R. Barron (1996). Adaptation and model selection. AMS-IMS-SIAM Summer Research Conference on Adaptive Selection of Models and Statistical Procedures, Mount Holyoke, Massachusetts, June 22–28. (Conference Chairmen: A. Barron, P. Bickel, I. Johnstone, D. Donoho).
- A. R. Barron and Y. Yang (1996). Information theory in nonparametric estimation. Nonparametric Estimation: The Road Ahead, Canberra, Australia, July 24. (Invited Presentation).
- A. R. Barron (1996). Adaptive model selection and neural networks. Sydney International Statistics Congress, IMS Regional Meeting, Sydney, Australia, July 8–12. (Invited Presentation).
- A. R. Barron (1996). Asymptotics of Bayes estimators. International Society of Bayesian Analysis, Regional Meeting, Chicago, Illinois, August 23. (Invited Presentation).
- A. R. Barron and Y. Yang (1996). Adaptive model selection and the index of resolvability. Joint Statistical Meetings of the ASA and IMS, Chicago, Illinois, August 4–8. (Invited Presentation).
- A. R. Barron and Q. Xie (1997). Asymptotic minimax regret for data compression, gambling and prediction. IEEE International Symposium on Information Theory, Ulm, Germany, June 29 – July 4.
- A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. Computational Learning Theory: Tenth Annual ACM Workshop, Nashville, Tennessee, July 6–9. (Invited Plenary Presentation).
- A. R. Barron (1997). Information theory in probability, statistics, learning, and neural nets. International Conference on Combinatorics, Information Theory, and Statistics, University of Southern Maine, Portland, Maine, July 18–20. (Invited Presentation).
- A. R. Barron and Y. Yang (1997). Information-theoretic determination of minimax rates of convergence. Symposium on Nonparametric Functional Estimation, Centre de Recherches Mathematiques, University of Montreal, October 16–18. (Invited Presentation).
- A. R. Barron (1998). Nonlinear approximation, greedy algorithms, and neural networks. Ninth International Conference on Approximation Theory, January 4–6. (Invited Presentation).
- A. R. Barron (1998). How information theory illuminates the behavior of risk functions of Bayes procedures. Purdue Workshop on the Interface between Paradigms of Statistics, June 17–19. (Invited Presentation).
- A. R. Barron and J. I. Takeuchi (1998). Mixture models achieving optimal coding regret. IEEE Information Theory Workshop, Killarney, Ireland, June 22–26.
- J. Takeuchi and A. R. Barron (1998). Asymptotic Minimax Regret by Bayes Mixtures. International Symposium on Information Theory, Cambridge, MA, August 16–21.
- A. R. Barron (1998). Information theory in probability and statistics; approximation and estimation bounds for Gaussian mixtures. CIRM Workshop on Information Theory, Statistics, and Image Analysis, Marseille, France, December 7–11. (Two Invited Presentations).
- A. R. Barron (1999). Information theory in probability and statistics. IEEE Information Theory Workshop on Detection, Estimation, Classification, and Imaging, Santa Fe, New Mexico, February 24–26. (Invited Presentation).
- A. R. Barron (1999). Decision theory of regret for universal coding, gambling, and prediction. DIMACS Workshop: Online Decision Making, Rutgers University, New Brunswick, NJ, July 12–15. (Invited Presentation).
- A. R. Barron (2000). Limits of information, Markov chains and projection. IEEE International Symposium on Information Theory, Sorrento, Italy, June 26–30.
- A. R. Barron, Laszlo Gyorfi, Michael Nussbaum (2000). Nonparametric Estimation, Neural Nets and Risk Asymptotics. Short course, Mathematical Research Institute, Oberwolfach, Germany, June 10–17.
- A. R. Barron (2000). Information theory in probability; information theory in statistics. J. Bolyai Society Conference on Information Theory in Mathematics, honoring the 50th anniversary of the formation of what is now known as the Renyi Institute of Mathematics of the Hungarian Academy of Sciences, Balatonlelle, Hungary, July 3–7. (Two Invited Presentations).
- A. R. Barron (2000). Prediction, data compression, gambling, and model selection: do Bayes procedures nearly minimize the maximum of regret over all possible data sequences? AMS-IMS-SIAM Summer Research Conference on Bayes, Frequentist, and Likelihood Inference: a Synthesis, Mount Holyoke College, South Hadley, Massachusetts, July 9–13. (Invited Presentation).
- A. R. Barron (2001). Information-theoretic bounds for mixture modeling, model selection, and data compression. Workshop on Information Theory and Statistics, DIMACS, Rutgers University, March 2001. (Invited Presentation).
- A. R. Barron (2001). Information theory in probability and statistics. 23rd European Meeting of Statisticians, Funchal, Madeira, Portugal, August 13–18. (Invited Keynote Plenary Presentation).
- F. Liang and A. R. Barron (2001). Minimax optimal predictive density estimation, data compression, and model selection. Workshop on MDL at Conference on Neural Information Processing Systems, Whistler, British Columbia, December 10. (Invited Presentation).
- F. Liang and A. R. Barron (2002). Exact minimax strategies for predictive density estimation, data compression and model selection. IEEE International Symposium on Information Theory, Lausanne, Switzerland, July 15.
- A. R. Barron, Jiangfeng Yu, and Wei Qiu (2003). Maximum compounded wealth: portfolio estimation, option pricing, and stock selection. Workshop on Complexity and Inference, DIMACS, Rutgers University, June 25. (Invited Presentation).
- A. R. Barron (2003). The role of information in the central limit problem. Symposium on Information Theory and Some Friendly Neighbors – Ein Wunschkonzert, ZIF, Center for Interdisciplinary Research, Bielefeld, August 11–13. (Invited Presentation).
- A. R. Barron (2003). Interplay of statistics and information theory in formulation and selection of models. Workshop on Model Building, Dortmund, Germany, November 13–14. (Invited Presentation).
- G. Leung and A. R. Barron (2004). Information theory, model selection and model mixing for regression. Conference on Information Sciences and Systems, Princeton, NJ, March 17–19. (Invited Presentation in session on Information Theory, Computer Science, and Statistics).
- A. R. Barron and G. Leung (2004). Risk assessment for Bayes procedures and model mixing in regression. IVth Workshop on Bayesian Nonparametrics, Universita di Roma La Sapienza, Rome, Italy, June 12–16. (Invited Presentation).
- A. R. Barron and G. Leung (2004). Risk assessment for model mixing. Workshop on Mathematical Foundations of Learning Theory, Barcelona, Spain, June 18–23. (Invited Presentation).
- A. R. Barron (2004). Relative entropy in probability theory and mathematical statistics. Workshop on Entropy in the Mathematical, Physical, and Engineering Sciences, Padova, Italy, June 24–27. (Two Invited Presentations).
- A. R. Barron (2004). Fitting functions of many variables: neural networks and beyond. 16th Conference on Computational Statistics (COMPSTAT 2004), Prague, Czech Republic, August 23–28. (Invited Keynote Plenary Presentation).
- A. R. Barron (2005). Neural nets, mixture models, and adaptive kernel machines. Yale Workshop on Adaptive and Learning Systems, May 29–31, Center for Systems Science, Yale University. (Invited Presentation).
- A. R. Barron (2005). Challenges in high-dimensional function estimation and attempted solutions. Congress on Statistics, Pau, France, June 6–10. (Invited Presentation).
- A. R. Barron (2005). Information theory and risk analysis. Medallion Lecture. Joint Statistical Meetings of the IMS and ASA, Minneapolis, Minnesota, August 7–11. (Presented with IMS Medallion Award; one-hour special invited presentation on Aug. 7).
- A. R. Barron (2005). Information theory and statistics for machine learning. IEEE Workshop on Machine Learning for Signal Processing XV, Mystic, CT, October 28–30. (Invited Keynote Plenary Presentation).
- M. Madiman and A. R. Barron (2006). Monotonicity of information in the central limit theorem. Workshop on Information Theory and its Applications, University of California, San Diego, February 6–9. (Invited Presentation).
- A. R. Barron (2006). Simple risk bounds for mixing least squares regressions. Journees: Model Selection in Statistics: Different Approaches, University de Nice, Sophia-Antipolis, Nice, France, March 14–19. (Two one-hour invited presentations).
- A. R. Barron (2006). Simple risk bounds for mixing least squares regressions. International Workshop on Applied Probability, University of Connecticut, Storrs, CT, May 18. (Invited Presentation).
- A. R. Barron and Wei Qiu (2007). Maximum wealth portfolios. Workshop on Information Theory and its Applications, University of California, San Diego, January 29 – February 2. (Invited Presentation).
- A. R. Barron, Cong Huang, and Xi Luo (2008). Penalized squared error and likelihood: risk bounds and fast algorithms. Workshop on Sparsity in High Dimensional Statistics and Learning Theory, Georgia Institute of Technology, Atlanta, Georgia, March 22–24. (Invited Three-Part Presentation).
 A. R. Barron
(2008).Principles of Information
Theory in Probability and Statistics. Elements of
Information Theory Workshop: CoverFest. On the
Occasion of the 70th birthday of Tom Cover. Stanford University, May 16. (Invited Presentation).
 A. R. Barron (2008). MDL Procedures with
L_1 Penalty and their Statistical Risk Information and Communication Conference,
Renyi Institute, Budapest, August 2528, on
the occasion of the 70th birthday of Imre Csiszar. (Invited Presentation).
 A. R. Barron (2011). Information
Theory and Flexible HighDimensional NonLinear Function Estimation.
InfoMetrics Institute Workshop, American University,
Wash, DC, November 12. (Invited Presentation). Disclaimer: The proposed
solution on page 19 to the differential equation for Adaptive Annealing
is problematic due to discontinuity of the gradient at the origin.
 A. R. Barron (2011). Information
and Statistics and Practical Achievement of Shannon Capacity.
(Threepart Invited Tutorial Presentation.) Workshop on Information
Theory and its Applications, U.C. San Diego, February 9.
 A. R. Barron (2011).
Communication by Regression: Practical Achievement of Shannon
Capacity, Workshop Infusing Statistics and Engineering, Harvard
University, June 56. (Invited Presentation).
 A. R. Barron (2011).
Sparse Superposition Codes: low complexity and exponentially small error probability at all rates below
capacity, Workshop on Information Theory Methods in Science and
Engineering, Helsinki, Finland, August 8. (Invited Presentation.)
 A. R. Barron (2014). Overview of Recent
Developments in Penalized Likelihood and MDL.
Workshop on Information Theory Methods in Science and Engineering, Wakiki,
HI, July 5. (Invited Presentation). Includes results obtained with Teemu Roos and Kazuho
Watanabe in visit to Helsinkii, Finland, September 2013.
 A. R. Barron (2015). Information and
Statistics, IMA Workshop on Information and Concentration Phenomena,
Minneapolis, MN, April 13. (Invited Presentation)
 A. R. Barron (2015). Information
and Statistics (Invited Plenary Presentation). IEEE Information Theory
Workshop, Jerusalem, Israel, April 30.
 A. R. Barron (2015). Computationally Feasible
Greedy Algorithms for Neural Nets NonConvex
Optimization Workshop, NIPS, Montreal, Canada, December 12. (Invited Presentation).
 A. R. Barron (2016). HighDimensional
Neural Networks: Statistical and Computational Properties. (Invited Plenary
presentation). Conference on Operational Research, Osijek, Department of
Economics, September 27.
 A. R. Barron (2016).
Probability for Normalized Maximum Likelihood
Calculations. Workshop on Information Theory Methods in Science & Engineering.
Helsinki, September 19. (Invited Presentation.) Based on joint work with X. Yang, T. Roos, K. Watanabe.
 A.R. Barron (2017). Stochastic Diffusion for
Gaussian Mixtures, Workshop on InformationTheoretic Inequalities.
Newark, Delaware, April 2223. (Invited Presentation).
 A.R. Barron (2017).
Central Limit Theory and Convergence
to Invariance via Information Theory Inequalities, AIM Workshop on Entropy
Power Inequalities. San Jose, May 15, 2017. (Coorganizer of Workshop with
O. Johnson, M. Madiman, I. Kontoyiannis.)
 A. R. Barron (2018). Approximating
Multilayer Learning Networks. Conference on Channels, Statistics, Information, Secrecy,
Zero-error and Randomness: In Honor of Imre Csiszár's 80th Birthday. Budapest, June 4-5.
(Invited Presentation).
 A. R. Barron (2018).
Discussion of a Link Between
Brown and Jordan, Lawrence D Brown Memorial Workshop,
Philadelphia, December 1. (Invited Discussion).
 A. R. Barron (2018).
Approximating Deep Nets. Two Days in Honor of Pascal Massart and
Lucien Birgé, Paris, June 28-29. (Invited Presentation).
 A. R. Barron (2018).
Risk Bounds for Neural Nets in High Dimensions. Workshop on
High-Dimensional Statistics, Columbia University, New York, September 14-15.
(Invited Presentation).
 A. R. Barron (2019).
Gaussian Complexity,
Metric Entropy & Risk of Deep Nets. New England Statistics Symposium.
Session in Celebration of Rick Vitale. May 16, 2019. (Invited Presentation).
 A. R. Barron (2019).
Gaussian Complexity, Metric Entropy,
and the Statistical Learning of Deep Nets. Recent Progress in Foundational
Data Science. Institute of Mathematics and Its Applications and the Institute for
Research in Statistics and Its Applications, Minneapolis, September 16-17.
(Invited Presentation.)
 A. R. Barron (2019).
Approximation, Complexity & Risk Properties of Deep Nets. Approximation and
Data Analysis School & Conference, Nizhny Novgorod, 3-4 October.
(Two Invited Presentations).
 A. R. Barron (2020).
Deep Network Approximation. International Zurich Seminar on Information
and Communication. Zurich, 26-28 February. (Invited Plenary Presentation).
Invited Departmental
Seminar Presentations: (These had short abstract announcements).
[Recent presentation files are included at www.stat.yale.edu.]
 Purdue
University, Joint Statistics Colloquium, October 3, 1985. Topic: Entropy
and the central limit theorem.
 Michigan State
University, Department of Statistics and Probability, January 28, 1986.
Topic: Generalized Shannon-McMillan-Breiman
theorem.
 University of
Chicago, Department of Statistics, October 20, 1986. Topic: Uniformly
powerful tests.
 University of
Virginia, Department of Mathematics, March 5, 1987. Topic: Convergence
of Bayes estimators of probability density functions.
 Stanford
University, Department of Statistics, October 20, 1987. Topic:
Convergence of Bayes estimators of probability density functions.
 University of
Chicago, Department of Statistics, March 7, 1988. Topic: Convergence of
Bayes estimators of probability density functions.
 McGill
University, Joint Statistics Seminar for Montreal universities, March
31, 1988. Topic: Approximation of densities by sequences of exponential
families.
 Dupont Research
Center, Dover, Delaware, April 26, 1988. Topic: Statistical learning
networks.
 IBM T. J.
Watson Research Center, Yorktown Heights, New York, August 10, 1988.
Topic: Statistical learning networks.
 Stanford
University, Information Systems Laboratory, November 3, 1988. Topic:
Minimum complexity density estimation.
 IBM Technical
Education Center, Thornwood, New York, January 11-12, 1989. Topic: Statistical
learning networks, in the short course on Knowledge Acquisition from
Data.
 Cornell
University, Department of Economics and Program of Statistics
(Cohosts), February 1, 1989. Topic: Convergence of Bayes estimators of
probability density functions.
 Purdue University,
Department of Statistics, September 7, 1989. Topic: Minimum complexity
density estimation.
 University of
Lowell, Massachusetts, Joint Seminar, Department of Mathematics and
Department of Electrical Engineering, March 14, 1990. Topic: Statistical
properties of polynomial networks and other artificial neural networks.
 Carnegie
Mellon University, Department of Statistics, April 4, 1990. Topic:
Statistical properties of polynomial networks and other artificial
neural networks.
 University of
Chicago, Department of Statistics, October 15, 1990. Topic: Statistical
properties of artificial neural networks.
 University of
California, San Diego, Department of Mathematics, January 7, 1991.
Topic: Approximation bounds for artificial neural networks.
 University of
California, San Diego, Department of Mathematics, January 8, 1991.
Topic: Complexity regularization for nonlinear model selection.
 Siemens
Corporation, Princeton, New Jersey, February 28, 1991. Topic: Universal
approximation bounds for superpositions of a
sigmoidal function.
 University of
Wisconsin, Department of Statistics, April 3, 1991. Topic: Complexity
regularization for nonlinear model selection.
 University of
Wisconsin, Department of Mathematics, April 4, 1991. Topic:
Approximation bounds for artificial neural networks.
 Technical
University of Budapest, Department of Electrical Engineering, July 2,
1991. Topic: Universal approximation bounds for superpositions
of a sigmoidal function.
 Mathematical
Sciences Research Institute, Berkeley, California, September 25, 1991.
Topic: Empirical process bounds for artificial neural networks.
 Stanford
University, Department of Statistics, October 15, 1991. Topic:
Approximation and estimation bounds for artificial neural networks.
 University of
California, Santa Cruz, Department of Computer and Information Sciences,
October 17, 1991. Topic: Computationally efficient approximation and
estimation of functions using artificial neural networks.
 Yale
University, Department of Statistics, January 13, 1992. Topic: Neural
network estimation.
 University of
Virginia, Department of Electrical Engineering, Eminent Speaker Series,
February 21, 1992. Topic: Estimation of functions of several variables:
neural networks, Fourier decomposition, and Bayes methods.
 North Carolina
State University, Department of Statistics, February 29, 1992. Topic:
Estimation of functions of several variables: neural networks, Fourier
decomposition, and Bayes methods.
 Cornell
University, Center for Applied Mathematics, March 6, 1992. Topic:
Estimation of functions of several variables: neural networks, Fourier
decomposition, and Bayes methods.
 University of
North Carolina, Department of Statistics, March 30, 1992. Topic:
Estimation of functions of several variables: neural networks, Fourier
decomposition, and Bayes methods.
 University of Joensuu, Finland, Department of Statistics, April 9,
1992. Topic: Introduction to artificial neural networks.
 University of
Paris VI, Department of Statistics, April 15, 1992, and University of
Paris, Orsay, Department of Statistics,
April 16, 1992. Topic: Estimation of functions of several variables:
neural networks, Fourier decomposition, and Bayes methods.
 University of
Paris VI, Department of Statistics, April 22, 1992, and University of
Paris, Orsay, Department of Statistics,
April 23, 1992. Topic: Performance bounds for complexity-based model
selection.
 Princeton
University, Department of Electrical Engineering, May 14, 1992. Topic:
Overview of approximation results for sigmoidal networks.
 University of
Massachusetts at Lowell, Joint Seminar, Department of Mathematics and
Department of Electrical Engineering, October 21, 1992. Topic:
Statistical accuracy of neural nets.
 University of
Tokyo, Japan, Department of Information, Physics and Engineering, March
1993. Topic: Information theory and model selection.
 University of
Paris VI, Department of Statistics, May 1993. Topic: Optimal rate
properties of minimum complexity estimation.
 University of
Paris VI, Department of Statistics, May 1993. Topic:
Informationtheoretic proof of martingale convergence.
 University of
Pennsylvania, Wharton School, October 21, 1993. Topic: Neural networks
and statistics.
 Massachusetts
Institute of Technology, Center for Biological and Computational
Learning, October 27, 1993. Topic: Neural networks and statistics.
 Rutgers
University, Department of Statistics, October 5, 1994, Topic:
Statistical accuracy of neural nets.
 University of
South Carolina, Department of Mathematics, Spring 1995. Topic: Neural
net approximation.
 Carnegie
Mellon University, Department of Statistics, Fall 1995. Topic:
Information risk and superefficiency.
 Massachusetts
Institute of Technology, Department of Applied Mathematics, March 1996.
Topic: Consistent and uniformly consistent classification.
 Columbia
University, Department of Statistics, Fall 1996. Topic: Consistency of
posterior distributions in nonparametric problems.
 Northeastern
University, Joint Mathematics Colloquium with MIT, Harvard, and Brandeis, February 27, 1997. Topic: Information
theory in probability and statistics.
 Iowa State
University, Department of Statistics, March 28, 1997. Topic: Information
theory in probability and statistics: The fundamental role of Kullback divergence.
 Washington
University, St. Louis, Department of Electrical Engineering, Center for
Imaging Systems, April 16, 1997. Topic: Universal data compression,
prediction, and gambling.
 Massachusetts
Institute of Technology, LIDS Colloquium, May 5, 1998. Topic: Simple
universal portfolio selection.
 University of
California, Santa Cruz, Baskin Center for Computer Engineering, October
1998. Topic: Approximation bounds for Gaussian mixtures.
 Lucent, Bell
Laboratories, Murray Hill, New Jersey, March 1999. Topic: Approximation
and estimation bounds for mixture density estimation.
 Stanford
University, Department of Statistics, Probability Seminar, May 24, 1999.
Topic: Information, martingales, Markov chains, convex projections, and
the CLT.
 Stanford
University, Department of Statistics, Statistics Seminar, May 25, 1999.
Topic: Mixture density estimation.
 Rice
University, Departments of Statistics and Electrical Engineering,
November 4, 2000. Topic: Information theory and statistics: best
invariant predictive density estimators.
 University of
Chicago, Department of Statistics, November 21, 2000. Topic:
Information theory and statistics: best invariant predictive density
estimators.
 Yale University,
Department of Computer Science, Alan J. Perlis Seminar, April 26, 2001.
Topic: Neural nets, Gaussian mixtures, and statistical information
theory.
 Brown
University, Department of Applied Mathematics, May 9, 2001. Topic: I do
not recall.
 University of
Massachusetts at Lowell, Department of Mathematics, September 19, 2001.
Topic: Mixture density estimation.
 University of
California at Los Angeles, Department of Statistics, May 21, 2002.
Topic: Nonlinear approximation, estimation, and neural nets (I do not
recall the specific title).
 Columbia
University, Department of Statistics, October 28, 2002. Topic:
Information inequalities in probability and statistics.
 University of
Georgia, Department of Statistics, November 26, 2002. Topic: Information
inequalities in probability and statistics.
 University of
Pennsylvania, Wharton School, October 29, 2003. Topic: Portfolio
estimation for compounding wealth.
 University of
North Carolina (in conjunction with Duke University), Departments of
Statistics, November 3, 2004. Topic: Risk assessment and advantages of
model mixing for regression.
 University of South
Carolina, Department of Mathematics, April 7, 2005. IMI Distinguished Lecture. Topic: Statistical theory for
nonlinear function approximation: neural nets, mixture models, and
adaptive kernel machines.
 Helsinki
University and Helsinki Institute of Information Technology, Helsinki,
Finland. August 22-25, 2005. Two talks: (1) Statistical foundations and
analysis of the minimum description length principle. (2) Consequences
of MDL for neural nets and Gaussian mixtures.
 Princeton
University, Department of Operations Research and Financial Engineering,
October 4, 2005. Topic: Statistical perspectives on growth rate optimal
portfolio estimation.
 IBM Research
Laboratories, Yorktown Heights, September 22, 2006. Topic: Generalized
entropy power inequalities and the central limit theorem.
 Purdue
University, Department of Computer Science, February 26, 2007. Prestige
Lecture Series on the Science of Information. Topic: The interplay of
information theory, probability, and statistics.
 University of
Illinois, Joint Seminar, Department of Statistics and Department of
Electrical and Computer Engineering, February 27, 2007. Prestige
Conference Series. Topic: The interplay of information theory and
probability.
 Boston
University, Department of Statistics, March 1, 2007. Prestige Conference
Series. Topic: Information inequalities and the central limit theorem.
 University of
California at Berkeley, Department of Computer Science, October 4, 2007.
Topic: A Simple Algorithm for
L1-Penalized Least Squares. Primarily presented by Cong Huang.
 Rutgers
University, Department of Statistics, December 12, 2007. Topic: Fast and Accurate
L1-Penalized Estimation. Co-presented with Cong Huang.
 Harvard University,
Department of Statistics. Topic: Information
Theory and Flexible High-Dimensional Non-Linear Function Estimation, Oct. 2011.
 University of Michigan, Department of Statistics,
Topic: Communication by Regression. March 9, 2012.
 Rice University, Department of Statistics, Topic: Communication
by Regression. Rice University, March 12, 2012.
 Cambridge University, Statistics Laboratory, Topic: Communication by
Regression: Sparse Superposition Codes, November 14, 2013.
 University of Osijek, Croatia, Department of Mathematics.
Topic: Neural Net
Approximation and Estimation of Functions. March 12, 2015.
 Massachusetts Institute of Technology, Department of
Electrical Engineering, Laboratory for Information and Decision Sciences. April 8, 2014.
 Princeton University, Department of Electrical Engineering.
Topic: Computationally Feasible
Greedy Algorithms for Neural Nets, February 22, 2016.
 University of Osijek, Department of Mathematics, Topic: Computationally
Feasible Greedy Algorithms for Sigmoidal and Polynomial Networks, March
16, 2016.
 Yale University,
Department of Statistics and Data Science, Topic:
Accuracy of High-Dimensional Deep Learning Networks. September 17, 2019.
 University of Southern California,
Department of Statistics, Topic: Complexity Theory for Deep Learning. October 25, 2019.
Ph.D. Dissertations
Supervised:
 Yuhong Yang (1996). Minimax Optimal Density
Estimation. Yale University. [Was Assistant Professor at Iowa State
University; Now Professor at University of Minnesota, Department of
Statistics.]
 Qun (Trent) Xie (1997). Minimax Coding and Prediction.
Yale University. [Was at GE Capital, Inc., Fairfield, CT. Then an
Assistant Professor at Tsinghua University.]
 Qiang (Jonathan) Li
(1999). Estimation of
Mixture Models. Yale University. [Was at KPMG Financial Services,
New York. Then at Stanford Research Institute, Palo Alto. Now at Radar
Networks, Inc., San Francisco.]
 Feng Liang
(2002). Exact Minimax
Predictive Density Estimation. Yale University. [Was Assistant
Professor, Department of Statistics, Duke University. Now Associate
Professor, Department of Statistics, University of Illinois at Urbana-Champaign.]
 Sabyasachi Chatterjee
(2014). Adaptation in
Estimation and Annealing. Yale University. [Was Postdoc with John
Lafferty, University of Chicago, Department of Statistics. Now Assistant Professor, University of Illinois at Urbana-Champaign, Department of Statistics.]
Associate
Editorship:
 1993-1995 IEEE Transactions on
Information Theory. A.E. for nonparametric estimation,
classification, and neural nets.
 1995-1997 Annals of Statistics.
 1994-1997 Neural Networks.
Yale
Departmental and Divisional Responsibilities:
Director or Co-Director of Graduate
Studies (Fall 1993-Spring 1999; Fall 2017-Spring 2020).
Director of Undergraduate Studies (Fall 2010-Spring
2016, Spring 2017).
Acting Chair (Spring 1998).
Chair (Spring 2001-Fall 2006).
 Division of
Social Sciences
Senior Appointments Committee (Fall 1994-
Spring 1995; Fall 1999-Spring 2000)
 Program of
Applied Mathematics
Director of Undergraduate
Studies (Fall 1999-Spring 2001; Fall 2004-Spring 2006)
Senior Coordinator of Undergraduate Studies (Fall 2006-
Spring 2015)
Other
Activities and Honors:
 FAI free-flight model glider designer and competitor, F1A class (Andrew Barron
and family):
Andrew is a five-time U.S. National Champion: 1984, 1987,
1992, 2007 and 2009.
Three-time U.S. National Team Member at the World
Championships:
1995 in Hungary, 2001 in California, and 2013
in France.
Member of the nine-man U.S. Team
winning the Challenge France World Championships 2013.
Alternate to the U.S. National Teams in 1977, 2015, 2019.
America's Cup Champion in 1998 and 2009.
Son Peter was a U.S. National Team Member at the World
Championships 2015 in Mongolia.
Son Peter was a U.S. National Team Member at the Junior
World Championships 1998 in Romania.
Son John was a U.S. National Team Member at the Junior
World Championships 2000 in the Czech Republic.
Son Timothy was the U.S. National Champion in 2010 and
2012.
Timothy was a US National Team Member at the Junior World
Championships 2008 in Ukraine.
Timothy finished as the third-place individual, part of the second-place
F1A team (USA), and part of the
first-place U.S. team (F1A, F1B, F1P combined).
Daughters Michelle and Gina earned places on the U.S. National
Team for the Junior WC 2012 in Slovenia.
Gina finished as part of the second-place F1A team and second-place
team overall (F1A, F1B, F1P combined).
Co-owner and manager of Barron Field, LLC, which owns a 284-acre sod farm in
Orange County, NY. It is the primary flying site used in the northeastern US
region to host free-flight meets and competitions, sanctioned by the Academy
of Model Aeronautics (AMA) and the Fédération Aéronautique
Internationale (FAI).