Report on Current Developments in Information Theory and Related Fields

General Trends and Innovations

The recent literature in information theory and related fields shows a significant push toward refining and generalizing existing theoretical frameworks, particularly for entropy, mutual information, and their applications to coding and optimization problems. A common thread across several papers is the derivation of new bounds and convergence guarantees for information-theoretic measures, often leveraging advanced mathematical tools such as integral norms, projective metrics, and probabilistic representations.

One key direction is the development of tighter bounds for divergence measures such as the Kullback-Leibler (KL) divergence and the Rényi divergence. Beyond their theoretical interest, these bounds bear directly on convergence-rate analyses in information-theoretic settings, such as the entropic conditional central limit theorem and rate-distortion-perception coding. The use of Hilbert's projective metric to establish linear convergence of iterative algorithms for computing information measures is another notable advance, providing a more efficient computational framework for these quantities.
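
The flavor of such sandwich relations can be previewed with two classical inequalities (standard results, not the paper's new integral-norm bounds): Pinsker's inequality lower-bounds the KL-divergence by half the squared L1 distance, while Jensen's inequality gives the upper bound D(P||Q) <= log(1 + chi^2(P||Q)), where the chi-squared divergence is a weighted squared L2 distance. A minimal numerical check in Python:

```python
import numpy as np

def kl(p, q):
    """KL-divergence D(P||Q) in nats for discrete distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def l1(p, q):
    """L1 distance ||P - Q||_1 (twice the total variation distance)."""
    return float(np.sum(np.abs(p - q)))

def chi2(p, q):
    """Chi-squared divergence, a weighted squared-L2 distance."""
    return float(np.sum((p - q) ** 2 / q))

rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.dirichlet(np.ones(8))
    q = rng.dirichlet(np.ones(8))
    lower = 0.5 * l1(p, q) ** 2        # Pinsker: D >= ||P - Q||_1^2 / 2
    upper = np.log1p(chi2(p, q))       # Jensen:  D <= log(1 + chi^2)
    assert lower <= kl(p, q) <= upper
    print(f"{lower:.4f} <= {kl(p, q):.4f} <= {upper:.4f}")
```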

Another significant trend is the application of submodularity theory to information-theoretic measures, particularly in the context of sensor placement and network optimization. The recognition of mutual information as a submodular function under certain conditions opens up new avenues for optimizing information flow in complex systems, especially in the presence of additive noise.
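
To see why submodularity pays off computationally, consider the standard Gaussian sensor-placement setup (a minimal sketch under assumed dimensions and covariances, not taken from the paper): with observations Y = AX + N and independent additive noise, I(X; Y_S) has a log-determinant closed form, and because the sensors are conditionally independent given X the objective is monotone submodular, so greedy selection carries the classical (1 - 1/e) approximation guarantee.

```python
import numpy as np

def mutual_information(S, A, Sigma_x, noise_var):
    """I(X; Y_S) = 0.5 * (logdet(Sigma_Y[S,S]) - logdet(Sigma_N[S,S]))
    for Y = A X + N with independent Gaussian noise N."""
    if not S:
        return 0.0
    idx = np.array(sorted(S))
    Sigma_y = A @ Sigma_x @ A.T + np.diag(noise_var)  # covariance of Y
    _, logdet_y = np.linalg.slogdet(Sigma_y[np.ix_(idx, idx)])
    logdet_n = np.sum(np.log(noise_var[idx]))         # diagonal noise block
    return 0.5 * (logdet_y - logdet_n)

def greedy_select(k, A, Sigma_x, noise_var):
    """Greedily maximize I(X; Y_S) subject to |S| <= k; for monotone
    submodular objectives this achieves a (1 - 1/e) approximation."""
    S = set()
    for _ in range(k):
        gains = {i: mutual_information(S | {i}, A, Sigma_x, noise_var)
                    - mutual_information(S, A, Sigma_x, noise_var)
                 for i in range(A.shape[0]) if i not in S}
        S.add(max(gains, key=gains.get))
    return S

# Illustrative problem data: 6 candidate sensors observing 3 sources.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
L = rng.standard_normal((3, 3))
Sigma_x = L @ L.T + 0.1 * np.eye(3)   # positive-definite source covariance
noise_var = rng.uniform(0.5, 2.0, size=6)
print(greedy_select(3, A, Sigma_x, noise_var))
```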

The field is also witnessing a deeper exploration of the connections between information theory and measure theory. This includes the development of a logarithmic decomposition for entropy, which provides a finer and more intuitive characterization of information quantities. This approach not only enriches the theoretical underpinnings of information theory but also offers new perspectives on classical problems such as the Gács-Körner and Wyner common information.
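
The signed-measure viewpoint can be previewed with the classical inclusion-exclusion construction (a standard example, not the paper's finer logarithmic decomposition): extending entropies of variable subsets by inclusion-exclusion yields a signed measure whose triple "atom" can carry negative mass. The XOR example below, where I(X;Y;Z) = -1 bit, makes this concrete:

```python
import numpy as np
from itertools import product

# Joint pmf of (X, Y, Z) with X, Y uniform bits and Z = X XOR Y.
pmf = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

def H(axes):
    """Shannon entropy (bits) of the marginal on the given coordinates."""
    marg = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * np.log2(p) for p in marg.values() if p > 0)

# Inclusion-exclusion atom shared by all three variables:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(XY) - H(XZ) - H(YZ) + H(XYZ)
co_info = (H([0]) + H([1]) + H([2])
           - H([0, 1]) - H([0, 2]) - H([1, 2])
           + H([0, 1, 2]))
print(co_info)  # -1.0: the measure assigns negative mass to this atom
```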

Noteworthy Contributions

  • New Upper Bounds for KL-Divergence: The paper introduces novel upper bounds on the KL-divergence based on integral norms, showing that convergence in KL-divergence is sandwiched between convergence in the L1 and L2 norms. This work has potential applications in analyzing convergence rates in the entropic conditional central limit theorem.

  • Gaussian Rate-Distortion-Perception Coding: The study provides nondegenerate bounds on the distortion-rate-perception function for Gaussian sources, revealing new insights into the limitations of existing bounds in the weak perception constraint regime.

  • Linear Convergence in Hilbert's Projective Metric: The paper establishes linear convergence of the iterative algorithms used to compute the Augustin information and a Rényi information measure, significantly improving their computational efficiency (a generic illustration of Hilbert-metric contraction follows this list).

  • Submodularity of Mutual Information: The proof that mutual information is submodular under certain conditions for multivariate Gaussian sources with additive noise extends the applicability of submodularity theory to information-theoretic optimization problems.

  • Logarithmic Decomposition and Signed Measure Space for Entropy: The introduction of a logarithmic decomposition for entropy provides a new geometric perspective on information quantities, offering a deeper understanding of common information measures and their properties.
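
To make the Hilbert-metric contraction idea concrete, the sketch below uses generic power iteration rather than the paper's algorithm for the Augustin information: by Birkhoff's theorem, multiplication by an entrywise-positive matrix strictly contracts Hilbert's projective metric, so the iterates converge linearly (the printed ratios settle near a constant below one).

```python
import numpy as np

def hilbert_metric(u, v):
    """Hilbert projective metric on the positive orthant:
    d(u, v) = log max_i(u_i / v_i) - log min_i(u_i / v_i)."""
    r = u / v
    return float(np.log(r.max()) - np.log(r.min()))

rng = np.random.default_rng(2)
M = rng.uniform(0.1, 1.0, size=(5, 5))  # entrywise-positive matrix
v = np.ones(5)

prev = None
for it in range(15):
    w = M @ v
    w /= w.sum()          # normalization is free: the metric is projective
    d = hilbert_metric(w, v)
    ratio = "" if prev is None else f"  contraction ratio = {d / prev:.4f}"
    print(f"iter {it:2d}  d = {d:.3e}{ratio}")
    prev, v = d, w
```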

Sources

New Upper Bounds for KL-Divergence Based on Integral Norms

Gaussian Rate-Distortion-Perception Coding and Entropy-Constrained Scalar Quantization

Linear Convergence in Hilbert's Projective Metric for Computing Augustin Information and a Rényi Information Measure

Strong Converse Inequalities for Bernstein Polynomials with Explicit Asymptotic Constants

Submodularity of Mutual Information for Multivariate Gaussian Sources with Additive Noise

A Logarithmic Decomposition and a Signed Measure Space for Entropy

The Kneser–Poulsen Phenomena for Entropy

A Generalization of Axiomatic Approach to Information Leakage