6 results for "Rioul, Olivier"
Search Results
2. Yet Another Proof of the Entropy Power Inequality.
- Author
- Rioul, Olivier
- Subjects
- Mathematical proofs; Entropy (information theory); Gaussian processes; Perturbation theory; Young's modulus; Mathematical variables
- Abstract
Yet another simple proof of the entropy power inequality is given, which avoids both the integration over a path of Gaussian perturbation and the use of Young’s inequality with sharp constant or Rényi entropies. The proof is based on a simple change of variables, is formally identical in one and several dimensions, and easily settles the equality case. [ABSTRACT FROM AUTHOR]
- Published
- 2017
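For reference, the inequality at issue in this abstract, in its standard textbook formulation (not quoted from the paper itself), reads:

```latex
% Shannon's entropy power inequality: for independent random vectors
% X, Y in R^n with densities, where h(X) is differential entropy,
\[
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
  \qquad
  N(X+Y) \;\ge\; N(X) + N(Y),
\]
% with equality if and only if X and Y are Gaussian with proportional
% covariance matrices -- the equality case the abstract says the proof settles.
```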
3. Information Theoretic Proofs of Entropy Power Inequalities.
- Author
- Rioul, Olivier
- Subjects
- Information theory; Entropy (information theory); Mathematical inequalities; Mathematical proofs; Shannon's model (communication); Communications research; Mathematical transformations
- Abstract
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. [ABSTRACT FROM AUTHOR]
- Published
- 2011
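The de Bruijn identity mentioned in this abstract, in its standard form (for $X$ with a density, $Z$ a standard Gaussian independent of $X$, and $J$ the Fisher information), links differential entropy to Fisher information along the path of Gaussian perturbation referred to in ingredient 2):

```latex
% de Bruijn's identity: the entropy of a Gaussian-perturbed variable
% grows at a rate given by half its Fisher information.
\[
  \frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\;
  \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
  \qquad t > 0.
\]
```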
4. Rényi Entropy Power Inequalities via Normal Transport and Rotation.
- Author
- Rioul, Olivier
- Subjects
- Entropy (information theory); Mathematical equivalence; Uncertainty (information theory); Mathematical variables; Mathematical bounds
- Abstract
Following a recent proof of Shannon's entropy power inequality (EPI), a comprehensive framework for deriving various EPIs for the Rényi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation. Simple arguments are given to recover the previously known Rényi EPIs and derive new ones, by unifying a multiplicative form with constant c and a modification with exponent α of previous works. In particular, for log-concave densities, we obtain a simple transportation proof of a sharp varentropy bound. [ABSTRACT FROM AUTHOR]
- Published
- 2018
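For readers unfamiliar with the notion, the Rényi entropy of order $\alpha$ of a density $f$ is (standard definition, not specific to the paper):

```latex
% Renyi differential entropy of order alpha:
\[
  h_\alpha(X) = \frac{1}{1-\alpha} \log \int f(x)^\alpha \, dx,
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
% which recovers Shannon's differential entropy
% h(X) = -\int f \log f in the limit alpha -> 1.
```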
5. This is IT: A Primer on Shannon’s Entropy and Information
- Author
- Olivier Rioul (Laboratoire Traitement et Communication de l'Information (LTCI), Télécom Paris, Institut Mines-Télécom, Institut Polytechnique de Paris); Bertrand Duplantier; Vincent Rivasseau
- Subjects
- Shannon's entropy; Shannon's source coding theorem; Shannon's capacity formula; Channel coding theorem; Entropy (information theory); Entropy power; Entropy power inequality; Information theory; Information inequality; Logarithm; Exponential function; Notation; Computer science; Mathematical economics; [MATH.MATH-IT] Mathematics/Information Theory; [INFO.INFO-IT] Computer Science/Information Theory
- Abstract
What is Shannon's information theory (IT)? Despite its continued impact on our digital society, Claude Shannon's life and work are still unknown to many people. In this tutorial, we review many aspects of the concept of entropy and information from a historical and mathematical point of view. The text is structured into small, mostly independent sections, each covering a particular topic. For simplicity, we restrict our attention to one-dimensional variables and use the logarithm and exponential notations log and exp without specifying the base. We culminate with a simple exposition of a recent (2017) proof of the entropy power inequality (EPI), one of the most fascinating inequalities in the theory.
- Published
- 2021
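As a companion to the tutorial's subject matter, here is a minimal sketch of discrete Shannon entropy (my own illustration, not code from the primer; the primer deliberately leaves the logarithm base unspecified, whereas this sketch defaults to base 2, i.e. bits):

```python
import math

def shannon_entropy(p, base=2.0):
    """Shannon entropy H(p) = -sum_i p_i log(p_i) of a discrete distribution.

    Terms with p_i == 0 contribute nothing, consistent with the
    convention lim_{p->0} p log p = 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin carries exactly one bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))            # -> 1.0
# A uniform choice among four outcomes carries two bits:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0
```

Entropy is maximized by the uniform distribution and vanishes for a deterministic outcome, which is the basic intuition the tutorial builds on before moving to differential entropy.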
6. Challenge codes for physically unclonable functions with Gaussian delays: A maximum entropy problem
- Author
- Jean-Luc Danger, Alexander Schaub, Olivier Rioul, Joseph J. Boutros, Sylvain Guilley (LTCI, Télécom Paris, Institut Mines-Télécom; Secure-IC S.A.S.; Texas A&M University at Qatar)
- Subjects
- Gaussian; Multivariate normal distribution; Binary number; Entropy (information theory); Principle of maximum entropy; Probability distribution; Random variable; Discrete mathematics; Algebra and Number Theory; Applied Mathematics; Computer Networks and Communications; Discrete Mathematics and Combinatorics; [MATH.MATH-PR] Probability; [MATH.MATH-ST] Statistics; [MATH.MATH-IT] Information Theory; [MATH.MATH-GM] General Mathematics; [INFO.INFO-DM] Discrete Mathematics; [INFO.INFO-CR] Cryptography and Security; [INFO.INFO-IT] Information Theory
- Abstract
Motivated by a security application on physically unclonable functions, we evaluate the probability distributions and Rényi entropies of signs of scalar products of i.i.d. Gaussian random variables against binary codewords in $\{\pm 1\}^n$. The exact distributions are determined for small values of $n$, and upper bounds are provided by linking this problem to the study of Boolean threshold functions. Finally, Monte-Carlo simulations are used to approximate entropies up to $n = 10$.
- Published
- 2018
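A minimal sketch of the Monte-Carlo approach described in the abstract (my own illustration under stated assumptions, not the authors' code; the function name, the small `n`, and the trial count are all illustrative): sample an i.i.d. Gaussian vector, record the vector of signs of its scalar products with every codeword in $\{\pm 1\}^n$, and estimate the entropy of the resulting sign-pattern distribution from empirical frequencies.

```python
import itertools
import math
import random
from collections import Counter

def estimate_sign_pattern_entropy(n=4, trials=10000, seed=0):
    """Monte-Carlo estimate (in bits) of the entropy of the sign pattern
    (sign<w, c>)_{c in {+-1}^n}, where w has i.i.d. standard normal entries.

    Illustrative sketch only: the paper computes exact distributions for
    small n and bounds via Boolean threshold functions for larger n."""
    rng = random.Random(seed)
    codewords = list(itertools.product((-1, 1), repeat=n))
    counts = Counter()
    for _ in range(trials):
        w = [rng.gauss(0.0, 1.0) for _ in range(n)]
        # Sign pattern of w against all 2^n codewords (ties broken as +1).
        pattern = tuple(
            1 if sum(wi * ci for wi, ci in zip(w, c)) >= 0 else -1
            for c in codewords
        )
        counts[pattern] += 1
    # Plug-in entropy estimate from empirical pattern frequencies.
    return -sum((k / trials) * math.log2(k / trials) for k in counts.values())

H = estimate_sign_pattern_entropy()
print(round(H, 2))
```

The plug-in estimator is biased downward for small sample sizes, which is one reason the paper works out exact distributions for small $n$ rather than relying on simulation alone.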