A Mutual Information Inequality and Some Applications
Publication in refereed journal
Other information
Abstract: In this paper we derive an inequality relating linear combinations of mutual information between subsets of mutually independent random variables and an auxiliary random variable. One choice of a family of auxiliary random variables leads to a new proof of a Stam-type inequality for the Fisher information of sums of independent random variables. Another choice leads to new results, as well as new proofs of known results, on strong data processing constants and maximal correlation between sums of independent random variables. Further results include the convexity of the Kullback–Leibler divergence along a parameterized path of pairs of binomial and Poisson distributions, as well as a new duality-based argument relating the Stam-type inequality to the entropy power inequality.
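For context, the classical Stam inequality and the entropy power inequality referenced in the abstract read, for independent random variables X and Y with sufficiently smooth densities,

\[
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},
\qquad
N(X+Y) \;\ge\; N(X) + N(Y),
\]

where \(J(\cdot)\) denotes Fisher information, \(N(X) = \frac{1}{2\pi e}\, e^{2h(X)}\) is the entropy power, and \(h(\cdot)\) is differential entropy; equality holds in both when X and Y are independent Gaussians. The Stam-type inequality established in the paper is a variant of the first statement; its precise form is given in the full text.

As a rough numerical companion to the binomial/Poisson convexity result, the sketch below evaluates D(Bin(n, t/n) || Poi(t)) along the common-mean parameter t and checks second differences. The choice of path parameterization here is an assumption made for illustration and may differ from the one used in the paper.

# Numerical sketch: KL divergence between Bin(n, t/n) and Poi(t) as a
# function of the common mean t, with a second-difference convexity check.
# NOTE: the paper's exact path parameterization may differ; this path is
# an assumption for illustration only.
import numpy as np
from scipy.stats import binom, poisson

def kl_binomial_poisson(n, t):
    """D( Bin(n, t/n) || Poi(t) ), summed over the binomial support 0..n."""
    k = np.arange(n + 1)
    p_bin = binom.pmf(k, n, t / n)
    p_poi = poisson.pmf(k, t)
    mask = p_bin > 0  # 0 * log 0 = 0 by convention
    return float(np.sum(p_bin[mask] * np.log(p_bin[mask] / p_poi[mask])))

n = 20
ts = np.linspace(0.5, 10.0, 200)
d = np.array([kl_binomial_poisson(n, t) for t in ts])
# Nonnegative second differences on a uniform grid are consistent with convexity.
second_diff = d[:-2] - 2 * d[1:-1] + d[2:]
print("min second difference:", second_diff.min())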
All Author(s) List: Chin Wa Lau, Chandra Nair, David Ng
Journal name: IEEE Transactions on Information Theory
Year: 2023
Month: 10
Volume Number: 69
Issue Number: 10
Publisher: IEEE
Pages: 6210–6220
ISSN: 0018-9448
eISSN: 1557-9654
Languages: English (United States)