f-Divergences and Their Applications in Lossy Compression and Bounding Generalization Error
Publication in refereed journal
Other information
Abstract: In this paper, we provide three applications of f-divergences: (i) we introduce a Sanov-type upper bound on the tail probability of the sum of independent random variables based on the super-modular f-divergence, and show that this generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length; we extend the rate-distortion function using mutual f-information and provide new, strictly better bounds on achievable rates in the finite-blocklength regime using super-modular f-divergences; and (iii) we provide a connection between the generalization error of algorithms with bounded input/output mutual f-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms using lower bounds on the f-rate-distortion function. Our bound is based on a new lower bound on the rate-distortion function that (for some examples) strictly improves over the previously best-known bounds.
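For context, two textbook definitions underlying the abstract (standard forms recalled here for orientation, not quoted from the paper; the super-modular f-divergence and mutual f-information studied in the paper generalize these):

  % f-divergence of P from Q, for a convex function f with f(1) = 0:
  D_f(P \,\|\, Q) = \mathbb{E}_Q\!\left[ f\!\left( \frac{dP}{dQ} \right) \right]
  % (KL divergence is the special case f(t) = t \log t.)

  % Classical rate-distortion function for source X, distortion measure d, distortion level D:
  R(D) = \min_{P_{\hat{X}\mid X}\,:\, \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X})
  % The paper's f-rate-distortion function replaces I(X;\hat{X}) with mutual f-information.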
Acceptance Date: 30/03/2023
All Author(s) List: S. Masiha, A. Gohari, M. H. Yassaee
Journal name: IEEE Transactions on Information Theory
Year: 2023
Month: 12
Volume Number: 69
Issue Number: 12
Publisher: IEEE
Pages: 7538-7564
ISSN: 0018-9448
eISSN: 1557-9654
Languages: English (United States)