Showing 33 results

Search Results

1. Settling the Sharp Reconstruction Thresholds of Random Graph Matching.

2. Bias for the Trace of the Resolvent and Its Application on Non-Gaussian and Non-Centered MIMO Channels.

3. The Gray-Wyner Network and Wyner’s Common Information for Gaussian Sources.

4. On Sampling Continuous-Time AWGN Channels.

5. Generalized Submodular Information Measures: Theoretical Properties, Examples, Optimization Algorithms, and Applications.

6. Bits Through Queues With Feedback.

7. An Upgrading Algorithm With Optimal Power Law.

8. One-Shot Variable-Length Secret Key Agreement Approaching Mutual Information.

9. Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels.

10. Information-Distilling Quantizers.

11. The Secrecy Capacity of Cost-Constrained Wiretap Channels.

12. From Log-Determinant Inequalities to Gaussian Entanglement via Recoverability Theory.

13. Proving and Disproving Information Inequalities: Theory and Scalable Algorithms.

14. Convergence of Smoothed Empirical Measures With Applications to Entropy Estimation.

15. Analysis of KNN Information Estimators for Smooth Distributions.

16. On the Relation Between Identifiability, Differential Privacy, and Mutual-Information Privacy.

17. Minimum-Entropy Couplings and Their Applications.

18. The Replica-Symmetric Prediction for Random Linear Estimation With Gaussian Matrices Is Exact.

19. Rényi Resolvability and Its Applications to the Wiretap Channel.

20. Comparison of Channels: Criteria for Domination by a Symmetric Channel.

21. Demystifying Fixed $k$-Nearest Neighbor Information Estimators.

22. $\Phi$-Entropic Measures of Correlation.

23. Operational Interpretation of Rényi Information Measures via Composite Hypothesis Testing Against Product and Markov Distributions.

24. IT Formulae for Gamma Target: Mutual Information and Relative Entropy.

25. Relations Between Information and Estimation in Discrete-Time Lévy Channels.

26. Extensions of the I-MMSE Relationship to Gaussian Channels With Feedback and Memory.

27. Security Analysis of $\varepsilon$-Almost Dual Universal$_2$ Hash Functions: Smoothing of Min Entropy Versus Smoothing of Rényi Entropy of Order 2.

28. A Preadapted Universal Switch Distribution for Testing Hilberg’s Conjecture.

29. Monotone Measures for Non-Local Correlations.

30. Asymptotic Mutual Information Statistics of MIMO Channels and CLT of Sample Covariance Matrices.

31. Information Equals Amortized Communication.

32. Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information.

33. Comments on “Canalizing Boolean Functions Maximize Mutual Information”.