Search Results

Showing 49 results

1. A Unified Framework for One-Shot Achievability via the Poisson Matching Lemma.

2. A Bound on Undirected Multiple-Unicast Network Information Flow.

3. Source Resolvability and Intrinsic Randomness: Two Random Number Generation Problems With Respect to a Subclass of $f$-Divergences.

4. The Gray-Wyner Network and Wyner’s Common Information for Gaussian Sources.

5. Generalized Submodular Information Measures: Theoretical Properties, Examples, Optimization Algorithms, and Applications.

6. Reverse Euclidean and Gaussian Isoperimetric Inequalities for Parallel Sets With Applications.

7. Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels.

8. An Achievable Rate-Distortion Region for Multiple Descriptions Source Coding Based on Coset Codes.

9. Compute-Forward for DMCs: Simultaneous Decoding of Multiple Combinations.

10. Proving and Disproving Information Inequalities: Theory and Scalable Algorithms.

11. Capacity of Gaussian Many-Access Channels.

12. On the Relation Between Identifiability, Differential Privacy, and Mutual-Information Privacy.

13. Determining the Number of Samples Required to Estimate Entropy in Natural Sequences.

14. Empirical Lipschitz Constants for the Rényi Entropy Maximum Likelihood Estimator.

15. The Optimal Sub-Packetization of Linear Capacity-Achieving PIR Schemes With Colluding Servers.

16. Strong Functional Representation Lemma and Applications to Coding Theorems.

17. Preserving Data-Privacy With Added Noises: Optimal Estimation and Privacy Analysis.

18. Demystifying Fixed $k$-Nearest Neighbor Information Estimators.

19. A Numerical Study on the Wiretap Network With a Simple Network Topology.

20. Conditioning of Random Block Subdictionaries With Applications to Block-Sparse Recovery and Regression.

21. On the Two-User Interference Channel With Lack of Knowledge of the Interference Codebook at One Receiver.

22. Achievability Proof via Output Statistics of Random Binning.

23. Intrinsic Entropies of Log-Concave Distributions.

24. Entropy Bounds on Abelian Groups and the Ruzsa Divergence.

25. Capacity Bounds for Additive Symmetric $\alpha$-Stable Noise Channels.

26. Relations Between Information and Estimation in Discrete-Time Lévy Channels.

27. Capacity Bounds for Networks With Correlated Sources and Characterisation of Distributions by Entropies.

28. Simulation of a Channel With Another Channel.

29. On Rényi Entropy Power Inequalities.

30. Extensions of the I-MMSE Relationship to Gaussian Channels With Feedback and Memory.

31. Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach.

32. Weak Convergence Analysis of Asymptotically Optimal Hypothesis Tests.

33. Security Analysis of $\varepsilon$-Almost Dual Universal$_2$ Hash Functions: Smoothing of Min Entropy Versus Smoothing of Rényi Entropy of Order 2.

34. On State-Dependent Degraded Broadcast Channels With Cooperation.

35. A Preadapted Universal Switch Distribution for Testing Hilberg’s Conjecture.

36. Decentralized Wireless Networks With Asynchronous Users and Burst Transmissions.

37. A Note on the Broadcast Channel With Stale State Information at the Transmitter.

38. $H$-Transforms for Wireless Communication.

39. Cut-Set Bounds for Networks With Zero-Delay Nodes.

40. $k$-Connectivity in Random Key Graphs With Unreliable Links.

41. Measures of Entropy From Data Using Infinitely Divisible Kernels.

42. On the Conditional Rényi Entropy.

43. Book Inequalities.

44. On Cooperative Multiple Access Channels With Delayed CSI at Transmitters.

45. Information Equals Amortized Communication.

46. Infeasibility Proof and Information State in Network Information Theory.

47. The Capacity Region of a Class of Z Channels With Degraded Message Sets.

48. The Capacity Region of the Source-Type Model for Secret Key and Private Key Generation.

49. The Entropy Power Inequality and Mrs. Gerber's Lemma for Groups of Order $2^n$.