Search

Your search for the keyword "*DIFFERENTIAL entropy" returned 38 results.

Search Constraints

Descriptor: "*DIFFERENTIAL entropy"
Journal: IEEE Transactions on Information Theory

Search Results

1. Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality.

2. Compressibility Measures for Affinely Singular Random Vectors.

3. Entropic Compressibility of Lévy Processes.

4. Variations on a Theme by Massey.

5. On the Consistency of the Kozachenko-Leonenko Entropy Estimate.

6. Sharp Variance-Entropy Comparison for Nonnegative Gaussian Quadratic Forms.

7. Convergence of Smoothed Empirical Measures With Applications to Entropy Estimation.

8. Discrete Modulation for Interference Mitigation.

9. Remote Source Coding Under Gaussian Noise: Dueling Roles of Power and Entropy Power.

10. Equivalence of Additive-Combinatorial Linear Inequalities for Shannon Entropy and Differential Entropy.

11. Information Measures, Inequalities and Performance Bounds for Parameter Estimation in Impulsive Noise Environments.

12. Intrinsic Entropies of Log-Concave Distributions.

13. Bounds on Variance for Unimodal Distributions.

14. Yet Another Proof of the Entropy Power Inequality.

15. Wasserstein Continuity of Entropy and Outer Bounds for Interference Channels.

16. Arbitrarily Tight Bounds on Differential Entropy of Gaussian Mixtures.

17. The Gaussian Multiple Access Diamond Channel.

18. Higher Order Derivatives in Costa’s Entropy Power Inequality.

19. Generalized Cut-Set Bounds for Broadcast Networks.

20. Vector Gaussian Multiterminal Source Coding.

21. A New Entropy Power Inequality for Integer-Valued Random Variables.

22. The Entropy Power Inequality for Quantum Systems.

23. On the Equivalence Between Stein and De Bruijn Identities.

24. Complex-Valued Random Vectors and Channels: Entropy, Divergence, and Capacity.

25. Information Theoretic Proofs of Entropy Power Inequalities.

26. Hessian and Concavity of Mutual Information, Differential Entropy, and Entropy Power in Linear Vector Gaussian Channels.

27. A Probabilistic Upper Bound on Differential Entropy.

28. On the Estimation of Differential Entropy From Data Located on Embedded Manifolds.

29. An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems.

30. Maximizing the Entropy of a Sum of Independent Bounded Random Variables.

31. Complex Random Vectors and ICA Models: Identifiability, Uniqueness, and Separability.

32. Causal Coding of Stationary Sources and Individual Sequences With High Resolution.

33. On the Existence and Characterization of the Maxent Distribution Under General Moment Inequality Constraints.

34. Convergence of Differential Entropies.

35. Information Properties of Order Statistics and Spacings.

36. Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof.

37. Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation.

38. A Simple Proof of the Entropy-Power Inequality.
