A Mutual Information Inequality and Some Applications
Publication in refereed journal


Times Cited (Web of Science, as at 19/08/2024): 0
Other information
Abstract: In this paper, we derive an inequality relating linear combinations of mutual information between subsets of mutually independent random variables and an auxiliary random variable. One choice of a family of auxiliary random variables leads to a new proof of a Stam-type inequality regarding the Fisher information of sums of independent random variables. Another choice of a family of auxiliary random variables leads to new results, as well as new proofs of known results, concerning strong data processing constants and maximal correlation between sums of independent random variables. Further results include convexity of the Kullback–Leibler divergence along a parameterized path through pairs of binomial and Poisson distributions, as well as a new duality-based argument relating the Stam-type inequality to the entropy power inequality.
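For background, the classical inequalities that the abstract's "Stam-type inequality" and "entropy power inequality" generalize are, in their textbook forms for independent random variables X and Y with sufficiently smooth densities (the paper's generalization may differ in form):

  \[ \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)} \qquad \text{(Stam's inequality, with Fisher information } J\text{)}, \]

  \[ N(X+Y) \;\ge\; N(X) + N(Y), \quad \text{where } N(X) = \frac{1}{2\pi e}\, e^{2h(X)} \qquad \text{(entropy power inequality)}. \]

The abstract does not specify the parameterized binomial–Poisson path, so the following sketch is purely illustrative: it evaluates the divergence D(Bin(n, λ/n) || Poi(λ)) along the natural path indexed by n, a hypothetical choice that should not be read as the paper's construction.

  # Hypothetical illustration only: the paper's parameterized path is not given
  # in the abstract; here we take the natural path n -> (Bin(n, lam/n), Poi(lam)).
  from math import comb, exp, factorial, log

  def kl_binomial_poisson(n: int, lam: float, tol: float = 1e-15) -> float:
      """D( Bin(n, lam/n) || Poi(lam) ), summed over the binomial's support {0, ..., n}."""
      p = lam / n
      total = 0.0
      for k in range(n + 1):
          b = comb(n, k) * p**k * (1.0 - p)**(n - k)   # binomial pmf at k
          if b < tol:                                   # skip negligible terms
              continue
          q = exp(-lam) * lam**k / factorial(k)         # Poisson pmf at k
          total += b * log(b / q)
      return total

  for n in (5, 10, 20, 40, 80):
      print(f"n = {n:2d}:  D(Bin(n, 2/n) || Poi(2)) = {kl_binomial_poisson(n, 2.0):.6e}")

As n grows the binomial converges to the Poisson, so the divergence shrinks along this path; convexity statements of the kind proved in the paper concern the shape of such a divergence curve along the chosen parameterization.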
All Author(s) List: Chin Wa Lau, Chandra Nair, David Ng
Journal name: IEEE Transactions on Information Theory
Year: 2023
Month: 10
Volume Number: 69
Issue Number: 10
Publisher: IEEE
Pages: 6210–6220
ISSN: 0018-9448
eISSN: 1557-9654
Languages: English (United States)

Last updated on 2024-08-20 at 00:34