8 results on '"Rioul, Olivier"'
Search Results
2. The Interplay between Error, Total Variation, Alpha-Entropy and Guessing: Fano and Pinsker Direct and Reverse Inequalities
- Author
-
Rioul, Olivier
- Subjects
entropy ,Rényi entropy ,guessing entropy ,guessing moments ,total variation distance ,error probability ,data processing inequality ,majorization ,Schur concavity ,Fano inequality ,Pinsker inequality - Abstract
Using majorization theory via “Robin Hood” elementary operations, optimal lower and upper bounds are derived on Rényi and guessing entropies with respect to either error probability (yielding reverse-Fano and Fano inequalities) or total variation distance to the uniform (yielding reverse-Pinsker and Pinsker inequalities). This gives a general picture of how the notion of randomness can be measured in many areas of computer science.
- Published
- 2023
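As a hedged illustration of the abstract's central tool: a "Robin Hood" operation transfers probability mass from a more likely outcome to a less likely one, and Schur-concavity of entropy guarantees such a step can only increase randomness. A minimal Python sketch (function names and numbers are our own, not taken from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def robin_hood(p, i, j, delta):
    """Move mass delta from the richer outcome i to the poorer outcome j.

    Requires p[i] - delta >= p[j] + delta, so the transfer only evens out
    the distribution (a classical majorization step)."""
    assert p[i] - delta >= p[j] + delta >= 0
    q = list(p)
    q[i] -= delta
    q[j] += delta
    return q

p = [0.75, 0.125, 0.125]        # dyadic values, exactly representable
q = robin_hood(p, 0, 1, 0.25)   # evens out the two largest masses
assert shannon_entropy(q) >= shannon_entropy(p)  # entropy never decreases
```

The same mechanism, applied to posterior distributions of a secret, underlies the Fano/Pinsker bounds the abstract describes.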
3. Not Just Pointing: Shannon's Information Theory as a General Tool for Performance Evaluation of Input Techniques
- Author
-
Guiard, Yves, Gori, Julien, Roy, Quentin, Rioul, Olivier, Guiard, Yves, Laboratoire Traitement et Communication de l'Information (LTCI), Télécom ParisTech-Institut Mines-Télécom [Paris] (IMT)-Centre National de la Recherche Scientifique (CNRS), Institut Mines-Télécom [Paris] (IMT)-Télécom Paris, and Télécom ParisTech
- Subjects
computer input ,Fitts' law ,communication ,input techniques ,Shannon's information theory ,dynamics of information gain ACM Classification Keywords H52 Information interfaces and presenta- tion: Evaluation/methodology ,[SCCO.PSYC]Cognitive science/Psychology ,[SCCO.PSYC] Cognitive science/Psychology ,entropy ,interaction techniques - Abstract
This article was submitted to the ACM CHI conference in September 2017 and rejected in December 2017. It is currently under revision.; Since input techniques serve, quite literally, to let users send information to the computer, the information-theoretic approach seems tailor-made for their quantitative evaluation. Shannon's framework makes it straightforward to measure the performance of any technique as an effective information transmission rate, in bits/s. Apart from pointing, however, evaluators of input techniques have generally ignored Shannon, contenting themselves with less rigorous speed and accuracy measurements borrowed from psychology. We plead for serious consideration in HCI of Shannon's information theory as a tool for evaluating all sorts of input techniques. We start with a primer on Shannon's basic quantities and the theoretical entities of his communication model. We then discuss how these concepts should be applied to the problem of evaluating input techniques. Finally, we outline two concrete methodologies, one focused on the discrete timing and the other on the continuous time course of information gain by the computer.
- Published
- 2017
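As a sketch of the "effective information transmission rate" idea (under our own assumptions, not the paper's methodology): the information an input technique actually conveys can be estimated as the mutual information I(X;Y) between intended and recognized commands, computed from a confusion matrix, divided by the mean trial time. The counts and timing below are hypothetical.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint-count matrix (rows: intended, cols: recognized)."""
    total = sum(sum(row) for row in joint)
    px = [sum(row) / total for row in joint]
    py = [sum(joint[i][j] for i in range(len(joint))) / total
          for j in range(len(joint[0]))]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, n in enumerate(row):
            if n > 0:
                pxy = n / total
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Hypothetical confusion counts for a 3-command input technique.
counts = [[45, 3, 2],
          [4, 40, 6],
          [1, 5, 44]]
mean_trial_time_s = 0.8  # assumed average selection time per trial
rate_bits_per_s = mutual_information(counts) / mean_trial_time_s
```

A perfectly recognized, equiprobable command set of size n would transmit log2(n) bits per trial; confusions reduce the rate below that ceiling.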
4. Challenge codes for physically unclonable functions with Gaussian delays: A maximum entropy problem.
- Author
-
Schaub, Alexander, Rioul, Olivier, Danger, Jean-Luc, Guilley, Sylvain, and Boutros, Joseph
- Subjects
GAUSSIAN function ,MAXIMUM entropy method ,RANDOM variables ,ENTROPY (Information theory) ,BOOLEAN functions - Abstract
Motivated by a security application on physically unclonable functions, we evaluate the probability distributions and Rényi entropies of signs of scalar products of i.i.d. Gaussian random variables against binary codewords in {±1}^n. The exact distributions are determined for small values of n, and upper bounds are provided by linking this problem to the study of Boolean threshold functions. Finally, Monte-Carlo simulations are used to approximate entropies up to n = 10. [ABSTRACT FROM AUTHOR]
- Published
- 2020
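The Monte-Carlo step the abstract mentions can be sketched as follows: draw i.i.d. Gaussian vectors, record the sign pattern of their scalar products with each codeword, and estimate the entropy of that pattern from the empirical frequencies. This is an illustrative toy (our own parameters, not the paper's experiment):

```python
import math
import random

def sign_vector_entropy(codewords, trials=20000, seed=0):
    """Monte-Carlo estimate (bits) of the Shannon entropy of the vector of
    signs of <X, c> over all codewords c, with X i.i.d. standard Gaussian."""
    rng = random.Random(seed)
    n = len(codewords[0])
    counts = {}
    for _ in range(trials):
        x = [rng.gauss(0.0, 1.0) for _ in range(n)]
        signs = tuple(1 if sum(xi * ci for xi, ci in zip(x, c)) >= 0 else -1
                      for c in codewords)
        counts[signs] = counts.get(signs, 0) + 1
    return -sum((k / trials) * math.log2(k / trials) for k in counts.values())

# Toy example: n = 2 with two orthogonal codewords in {+1, -1}^2.
# Orthogonality makes the two signs independent fair bits, so the
# estimate should approach 2 bits.
codes = [(1, 1), (1, -1)]
h = sign_vector_entropy(codes)
```

For non-orthogonal codewords the sign bits become correlated and the entropy drops below the number of codewords, which is the effect the paper quantifies.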
5. Yet Another Proof of the Entropy Power Inequality.
- Author
-
Rioul, Olivier
- Subjects
- *
MATHEMATICAL proofs , *ENTROPY (Information theory) , *GAUSSIAN processes , *PERTURBATION theory , *YOUNG'S modulus , *MATHEMATICAL variables - Abstract
Yet another simple proof of the entropy power inequality is given, which avoids both the integration over a path of Gaussian perturbation and the use of Young’s inequality with sharp constant or Rényi entropies. The proof is based on a simple change of variables, is formally identical in one and several dimensions, and easily settles the equality case. [ABSTRACT FROM AUTHOR]
- Published
- 2017
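For reference, the entropy power inequality discussed in this abstract can be stated as follows (standard formulation, our transcription rather than a quotation from the paper):

```latex
% Entropy power of an R^n-valued random vector X with density f:
N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},
\qquad h(X) = -\int f(x)\log f(x)\,dx .
% Shannon's entropy power inequality, for independent X and Y:
N(X + Y) \;\ge\; N(X) + N(Y),
% with equality if and only if X and Y are Gaussian
% with proportional covariances.
```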
6. Information Theoretic Proofs of Entropy Power Inequalities.
- Author
-
Rioul, Olivier
- Subjects
- *
INFORMATION theory , *ENTROPY (Information theory) , *MATHEMATICAL inequalities , *MATHEMATICAL proofs , *SHANNON'S model (Communication) , *COMMUNICATIONS research , *MATHEMATICAL transformations - Abstract
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon's entropy power inequality (EPI) is an exception: Existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. [ABSTRACT FROM AUTHOR]
- Published
- 2011
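The de Bruijn identity invoked in this abstract links differential entropy to Fisher information along a Gaussian perturbation path. In standard notation (our transcription, with Z a standard Gaussian vector independent of X):

```latex
% de Bruijn's identity along the perturbation path X + sqrt(t) Z:
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right)
  = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
% where J is the Fisher information of a density f:
J(X) = \int \frac{\|\nabla f(x)\|^{2}}{f(x)}\, dx .
```

Integrating this identity over t is the "path of Gaussian perturbation" ingredient that the paper's unified view identifies in earlier EPI proofs.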
7. Be My Guesses: The interplay between side-channel leakage metrics.
- Author
-
Béguinot, Julien, Cheng, Wei, Guilley, Sylvain, and Rioul, Olivier
- Subjects
- *
ADDITIVE white Gaussian noise , *ENTROPY , *HAMMING weight , *LEAKAGE - Abstract
In a theoretical context of side-channel attacks, optimal bounds between success rate, guessing entropy and statistical distance are derived with a simple majorization (Schur-concavity) argument. They are further theoretically refined for different versions of the classical Hamming weight leakage model, in particular assuming a priori equiprobable secret keys and additive white Gaussian measurement noise. Closed-form expressions and numerical computations are given. A study of the impact of the choice of the substitution box with respect to side-channel resistance reveals that its nonlinearity tends to homogenize the expressivity of success rate, guessing entropy and statistical distance. The intriguing approximate relation GE ≈ 1/SR between guessing entropy and success rate is observed in the case of 8-bit bytes and low noise. The exact relation GE = (M+1)/2 − (M/2)·SD between guessing entropy, statistical distance and alphabet size M is proved for deterministic leakages and equiprobable keys. [ABSTRACT FROM AUTHOR]
- Published
- 2024
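The three metrics compared in this abstract have simple definitions on a key posterior, which makes the exact relation GE = (M+1)/2 − (M/2)·SD easy to check numerically. A hedged Python sketch (standard definitions, our own naming; the example posterior is illustrative):

```python
def success_rate(p):
    """Probability that the best single guess is correct: max of the posterior."""
    return max(p)

def guessing_entropy(p):
    """Expected number of guesses when keys are tried in decreasing posterior order."""
    ranked = sorted(p, reverse=True)
    return sum((i + 1) * pi for i, pi in enumerate(ranked))

def statistical_distance(p):
    """Total variation distance to the uniform distribution on M keys."""
    m = len(p)
    return 0.5 * sum(abs(pi - 1 / m) for pi in p)

# Deterministic leakage narrowing M = 256 keys to a single candidate:
m = 256
point_mass = [1.0] + [0.0] * (m - 1)
ge = guessing_entropy(point_mass)      # one guess suffices
sd = statistical_distance(point_mass)  # far from uniform
# The abstract's exact relation for deterministic leakages and equiprobable keys:
assert abs(ge - ((m + 1) / 2 - (m / 2) * sd)) < 1e-9
```

The relation also holds when the leakage narrows the keys to any uniform subset, the general case covered by the paper's deterministic-leakage result.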
8. Information theory: An analysis and design tool for HCI
- Author
-
Wanyu Liu, Antti Oulasvirta, Olivier Rioul, Michel Beaudouin-Lafon, Yves Guiard, Communications Numériques (COMNUM), Laboratoire Traitement et Communication de l'Information (LTCI), Institut Mines-Télécom [Paris] (IMT)-Télécom Paris-Institut Mines-Télécom [Paris] (IMT)-Télécom Paris, Département Communications & Electronique (COMELEC), Télécom ParisTech, Institut de Recherche et Coordination Acoustique/Musique (IRCAM), Aalto University, Laboratoire de Recherche en Informatique (LRI), Université Paris-Sud - Paris 11 (UP11)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS), Extreme Interaction (EX-SITU), Université Paris-Sud - Paris 11 (UP11)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-Université Paris-Sud - Paris 11 (UP11)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-Inria Saclay - Ile de France, Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria), Design, Interaction, Visualization & Applications (DIVA), Département Informatique et Réseaux (INFRES), Human-Centered Computing (LRI) (HCC - LRI), Université Paris-Sud - Paris 11 (UP11)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-Université Paris-Sud - Paris 11 (UP11)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS), European Project: 695464,ERC,ONE(2016), European Project: 637991,H2020,ERC-2014-STG,COMPUTED(2015), Rioul, Olivier, Unified Principles of Interaction - ONE - - ERC2016-10-01 - 2021-09-30 - 695464 - VALID, and Computational User Interface Design - COMPUTED - - H20202015-04-01 - 2020-03-31 - 637991 - VALID
- Subjects
Optimization ,[MATH.MATH-PR] Mathematics [math]/Probability [math.PR] ,Information theory ,[INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing ,Entropy ,Performance ,[MATH.MATH-CA]Mathematics [math]/Classical Analysis and ODEs [math.CA] ,[INFO.INFO-DM]Computer Science [cs]/Discrete Mathematics [cs.DM] ,[MATH.MATH-FA]Mathematics [math]/Functional Analysis [math.FA] ,[MATH.MATH-IT] Mathematics [math]/Information Theory [math.IT] ,[INFO.INFO-CR]Computer Science [cs]/Cryptography and Security [cs.CR] ,InformationSystems_MODELSANDPRINCIPLES ,[MATH.MATH-GM]Mathematics [math]/General Mathematics [math.GM] ,[INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing ,[MATH.MATH-ST]Mathematics [math]/Statistics [math.ST] ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,[MATH.MATH-ST] Mathematics [math]/Statistics [math.ST] ,[SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing ,[INFO.INFO-CR] Computer Science [cs]/Cryptography and Security [cs.CR] ,HCI ,[MATH.MATH-FA] Mathematics [math]/Functional Analysis [math.FA] ,[MATH.MATH-IT]Mathematics [math]/Information Theory [math.IT] ,[MATH.MATH-GM] Mathematics [math]/General Mathematics [math.GM] ,[MATH.MATH-CA] Mathematics [math]/Classical Analysis and ODEs [math.CA] ,Mutual information ,[MATH.MATH-PR]Mathematics [math]/Probability [math.PR] ,[INFO.INFO-DM] Computer Science [cs]/Discrete Mathematics [cs.DM] ,[INFO.INFO-IT]Computer Science [cs]/Information Theory [cs.IT] ,[INFO.INFO-IT] Computer Science [cs]/Information Theory [cs.IT] ,[INFO.INFO-HC] Computer Science [cs]/Human-Computer Interaction [cs.HC] ,[SPI.SIGNAL]Engineering Sciences [physics]/Signal and Image processing ,Model - Abstract
Position Paper; International audience; Shannon's information theory, since its introduction in 1948, has received much attention and found successful applications in many domains. Apart from Fitts' law and Hick's law, which emerged when experimental psychologists were enthusiastic about applying information theory to various areas of psychology, the relation of information theory to human-computer interaction (HCI) has not been clear. Even these two “laws” remain controversial in both psychology and HCI. In recent years, however, information theory has started to directly inspire or contribute to HCI research.