Information Inequalities in Combinatorics.

Entropy inequalities for joint distributions.

There is a long history of entropy inequalities for joint distributions, going back to Shannon's development of his chain rule and subadditivity, Fujishige's discovery that joint entropy is a submodular set function, and Han's study of entropic "correlation measures". The closely related lemma of Shearer, first developed for combinatorial applications, was sharpened and put to powerful use by Kahn, Friedgut and others. In joint work with Prasad Tetali [MT10], we synthesize and generalize these inequalities, giving a clear and unified development of all of them. As applications, we obtain new bounds on the determinant of a positive-definite matrix in terms of its minors, as well as new bounds for counting graph homomorphisms.
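For concreteness, here is a minimal numerical sketch (not code from [MT10]; the helper functions entropy and marginal are illustrative) of the simplest instance of Shearer's lemma for three variables, 2 H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3), together with the determinant inequality it yields when applied to Gaussian vectors: for a positive-definite 3x3 matrix A, (det A)^2 <= det(A_12) det(A_13) det(A_23), where A_S is the principal minor indexed by S.

import itertools
import numpy as np

def entropy(p):
    # Shannon entropy (in bits) of a flattened pmf; zero-probability cells are skipped.
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def marginal(joint, keep):
    # Marginal pmf of the coordinates listed in keep.
    drop = tuple(ax for ax in range(joint.ndim) if ax not in keep)
    return joint.sum(axis=drop)

rng = np.random.default_rng(1)

# Shearer's lemma: each coordinate lies in exactly 2 of the 3 pairs, so
# 2 H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3).
joint = rng.random((2, 3, 2))
joint /= joint.sum()                      # random joint pmf for (X1, X2, X3)
lhs = 2 * entropy(joint.ravel())
rhs = sum(entropy(marginal(joint, pair).ravel())
          for pair in itertools.combinations(range(3), 2))
assert lhs <= rhs + 1e-9

# Determinant analogue, obtained by evaluating the same covering inequality
# on the entropy of a Gaussian vector with covariance A:
# (det A)^2 <= det(A_12) * det(A_13) * det(A_23).
B = rng.standard_normal((3, 3))
A = B @ B.T + 3 * np.eye(3)               # a random positive-definite matrix
lhs_det = np.linalg.det(A) ** 2
rhs_det = np.prod([np.linalg.det(A[np.ix_(S, S)])
                   for S in itertools.combinations(range(3), 2)])
assert lhs_det <= rhs_det * (1 + 1e-9)
print("Shearer and minor inequalities verified on random instances")

The Gaussian step is what converts covering inequalities for entropy into inequalities for determinants in terms of minors; [MT10] develops this in far greater generality.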

Entropy inequalities for sums and more general partition-determined functions.

Description forthcoming…

Entropy of random structures.

Description forthcoming…

References

[KM12]    I. Kontoyiannis and M. Madiman. Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information. Submitted, 2012.
[MT10]    M. Madiman and P. Tetali. Information Inequalities for Joint Distributions, with Interpretations and Applications. IEEE Trans. Inform. Theory, vol. 56, no. 6, pp. 2699-2713, June 2010.
[MMT12]    M. Madiman, A. Marcus and P. Tetali. Entropy and Set Cardinality Inequalities for Partition-Determined Functions. Random Structures and Algorithms, vol. 40, pp. 399-424, 2012.

Read the relevant publications for more details.