8 results for "Rioul, Olivier"
Search Results
2. Information-Theoretic Analysis of Human Performance for Command Selection
- Author
-
Liu, Wanyu; Rioul, Olivier; Beaudouin-Lafon, Michel; Guiard, Yves. In: Bernhaupt, Regina; Dalvi, Girish; Joshi, Anirudha; K. Balkrishan, Devanuj; O'Neill, Jacki; Winckler, Marco (eds.)
- Published
- 2017
- Full Text
- View/download PDF
3. Glass+Skin: An Empirical Evaluation of the Added Value of Finger Identification to Basic Single-Touch Interaction on Touch Screens
- Author
-
Roy, Quentin; Guiard, Yves; Bailly, Gilles; Lecolinet, Éric; Rioul, Olivier. In: Abascal, Julio; Barbosa, Simone; Fetter, Mirko; Gross, Tom; Palanque, Philippe; Winckler, Marco (eds.)
- Published
- 2015
- Full Text
- View/download PDF
4. Contributions aux théories des ondelettes, du codage conjoint source-canal et de l'information (Contributions to the theories of wavelets, joint source-channel coding, and information)
- Author
-
Rioul, Olivier. Affiliations: Département Communications & Electronique (COMELEC), Télécom ParisTech; Communications Numériques (COMNUM), Laboratoire Traitement et Communication de l'Information (LTCI), Institut Mines-Télécom [Paris] (IMT), Télécom Paris; Université Pierre et Marie Curie. With Georges Alquié.
- Subjects
Information theory, Wavelets, Joint source-channel coding, [MATH.MATH-PR] Mathematics/Probability, [MATH.MATH-CA] Mathematics/Classical Analysis and ODEs, [MATH.MATH-FA] Mathematics/Functional Analysis, [MATH.MATH-GM] Mathematics/General Mathematics, [MATH.MATH-ST] Mathematics/Statistics, [MATH.MATH-IT] Mathematics/Information Theory, [INFO.INFO-TS] Computer Science/Signal and Image Processing, [INFO.INFO-DM] Computer Science/Discrete Mathematics, [INFO.INFO-CR] Computer Science/Cryptography and Security, [INFO.INFO-IT] Computer Science/Information Theory, [INFO.INFO-HC] Computer Science/Human-Computer Interaction, [SPI.SIGNAL] Engineering Sciences/Signal and Image Processing
- Published
- 2009
5. Information Theoretic Proofs of Entropy Power Inequalities.
- Author
-
Rioul, Olivier
- Subjects
INFORMATION theory; ENTROPY (Information theory); MATHEMATICAL inequalities; MATHEMATICAL proofs; SHANNON'S model (Communication); COMMUNICATIONS research; MATHEMATICAL transformations
- Abstract
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon's entropy power inequality (EPI) is an exception: Existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
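For reference, the entropy power inequality discussed in this record can be stated compactly in its standard form (this is the textbook statement, not a formula quoted from the paper itself):

```latex
% Entropy power of a random variable X with differential entropy h(X):
N(X) = \frac{1}{2\pi e}\, e^{2h(X)}
% Shannon's EPI, for independent random variables X and Y:
N(X+Y) \;\ge\; N(X) + N(Y)
```

Equality holds when X and Y are Gaussian with proportional covariances, which is why the EPI is often read as "Gaussians are extremal for entropy under addition."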
6. This is IT: A Primer on Shannon’s Entropy and Information
- Author
-
Olivier Rioul. Affiliations: Institut Polytechnique de Paris (IP Paris); Département Communications & Electronique (COMELEC), Télécom ParisTech; Communications Numériques (COMNUM), Laboratoire Traitement et Communication de l'Information (LTCI), Institut Mines-Télécom [Paris] (IMT), Télécom Paris. With Bertrand Duplantier and Vincent Rivasseau.
- Subjects
Shannon's entropy, Information theory, Shannon's source coding theorem, Source coding theorem, Channel coding theorem, Shannon's capacity formula, Entropy power inequality, Information inequality, Entropy (information theory), Entropy power, Logarithm, Exponential function, Notation, Computer science, Mathematical economics, [MATH.MATH-IT] Mathematics/Information Theory, [INFO.INFO-IT] Computer Science/Information Theory - Abstract
What is Shannon's information theory (IT)? Despite its continued impact on our digital society, Claude Shannon's life and work are still unknown to many. In this tutorial, we review many aspects of the concept of entropy and information from a historical and mathematical point of view. The text is structured into small, mostly independent sections, each covering a particular topic. For simplicity we restrict our attention to one-dimensional variables and use the logarithm and exponential notations log and exp without specifying the base. We culminate with a simple exposition of a recent (2017) proof of the entropy power inequality (EPI), one of the most fascinating inequalities in the theory.
- Published
- 2021
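Two of the primer's central objects, listed in its subject terms, can be stated for orientation (standard definitions, base of the logarithm left unspecified as in the text; not formulas quoted from the primer):

```latex
% Shannon entropy of a discrete random variable X with distribution p:
H(X) = -\sum_{x} p(x) \log p(x)
% Shannon's capacity formula for the power-constrained Gaussian channel,
% with signal power P and noise power N:
C = \tfrac{1}{2} \log\!\left(1 + \frac{P}{N}\right)
```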
7. Geometria de canais de comunicação (Geometry of communication channels)
- Author
-
Lucas D'Oliveira, Rafael Gregorio, 1988; Firer, Marcelo, 1961 (advisor); Lavor, Carlile Campos; Alves, Marcelo Muniz Silva; Dutour Sikiric, Mathieu; Rioul, Olivier. Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica, Programa de Pós-Graduação em Matemática Aplicada.
- Subjects
Information theory, Error-correcting codes (Information theory), Maximum likelihood decoding, Minimum distance decoding, Embeddings (Mathematics)
Advisor: Marcelo Firer. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica. Abstract: We approach communication channels from a geometrical viewpoint. We show that maximum likelihood decoding and minimum distance decoding are a particular case of a more general form of decoding which can be defined for any matrix. Based on this we define a decoding equivalence and show that it partitions the space of matrices into equivalence classes which are generalized regions of a well-known hyperplane arrangement: the braid arrangement. We then define a distance between these regions which measures the probability of a random code being decoded incorrectly. It is shown that this distance is a weighted variation of the Kendall tau distance. With this, we obtain a distance between channels. If for a channel there exists a metric such that the maximum likelihood and minimum distance decoders coincide, the channel is metrizable. We give characterizations for a channel to be metrizable and present an algorithm which constructs a metric in such a case. We also show that any metric, up to decoding equivalence, can be isometrically embedded into the hypercube with the Hamming metric, and thus, in terms of decoding, the Hamming metric is universal. We then present an algorithm which for any translation-invariant metric gives an upper bound on the minimum dimension of such an embedding. We also give lower and upper bounds for this embedding dimension over the set of all such metrics. In the appendix we present the theoretical contribution made to a work on multi-scale navigation. (Doutorado em Matemática Aplicada; funded by CAPES.)
- Published
- 2017
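The thesis's notion of a metrizable channel has a textbook instance: for a binary symmetric channel with crossover probability p < 1/2, maximum likelihood decoding coincides with minimum Hamming distance decoding. A minimal sketch of that fact (the codebook below is an arbitrary illustrative example, not taken from the thesis):

```python
from itertools import product

def hamming(u, v):
    """Hamming distance between two equal-length binary tuples."""
    return sum(a != b for a, b in zip(u, v))

def ml_decode(y, codebook, p):
    """Maximum likelihood decoding over a binary symmetric channel
    with crossover probability p: argmax over codewords of P(y | x)."""
    def likelihood(x):
        d = hamming(x, y)
        return (p ** d) * ((1 - p) ** (len(y) - d))
    return max(codebook, key=likelihood)

def md_decode(y, codebook):
    """Minimum Hamming distance decoding: argmin over codewords of d_H(x, y)."""
    return min(codebook, key=lambda x: hamming(x, y))

# Since the likelihood p^d (1-p)^(n-d) is strictly decreasing in d for
# p < 1/2, both decoders agree on every received word: the BSC is
# "metrizable" by the Hamming metric in the sense of the thesis.
codebook = [(0, 0, 0, 0), (1, 1, 1, 0), (0, 1, 1, 1)]
for y in product((0, 1), repeat=4):
    assert ml_decode(y, codebook, p=0.1) == md_decode(y, codebook)
```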
8. Information theory: An analysis and design tool for HCI
- Author
-
Wanyu Liu, Antti Oulasvirta, Olivier Rioul, Michel Beaudouin-Lafon, Yves Guiard. Affiliations: Communications Numériques (COMNUM), Laboratoire Traitement et Communication de l'Information (LTCI), Institut Mines-Télécom [Paris] (IMT), Télécom Paris; Département Communications & Electronique (COMELEC), Télécom ParisTech; Institut de Recherche et Coordination Acoustique/Musique (IRCAM); Aalto University; Laboratoire de Recherche en Informatique (LRI) and Human-Centered Computing (HCC - LRI), Université Paris-Sud (UP11), CentraleSupélec, CNRS; Extreme Interaction (EX-SITU), Inria Saclay - Ile de France. Funding: ERC projects ONE (695464) and COMPUTED (637991).
- Subjects
Information theory, HCI, Entropy, Mutual information, Optimization, Performance, Model, InformationSystems_MODELSANDPRINCIPLES, [INFO.INFO-HC] Computer Science/Human-Computer Interaction, [INFO.INFO-IT] Computer Science/Information Theory, [MATH.MATH-IT] Mathematics/Information Theory, [MATH.MATH-PR] Mathematics/Probability, [MATH.MATH-CA] Mathematics/Classical Analysis and ODEs, [MATH.MATH-FA] Mathematics/Functional Analysis, [MATH.MATH-GM] Mathematics/General Mathematics, [MATH.MATH-ST] Mathematics/Statistics, [INFO.INFO-TS] Computer Science/Signal and Image Processing, [INFO.INFO-DM] Computer Science/Discrete Mathematics, [INFO.INFO-CR] Computer Science/Cryptography and Security, [SPI.SIGNAL] Engineering Sciences/Signal and Image Processing - Abstract
Position paper. Shannon's information theory, since its first introduction in 1948, has received much attention and found successful applications in many domains. Apart from Fitts' law and Hick's law, which emerged when experimental psychologists were enthusiastic about applying information theory to various areas of psychology, the relation of information theory to human-computer interaction (HCI) has not been clear. Even these two "laws" remain controversial in both psychology and HCI. In recent years, however, information theory has started to directly inspire or contribute to HCI research.
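The two "laws" named in this abstract are themselves information-theoretic in form: Fitts' law expresses pointing difficulty in bits, and the Hick-Hyman law relates choice reaction time to the entropy of the choice set. A minimal sketch of the standard formulations (the constants a and b are illustrative placeholders, not fitted values from this or any paper):

```python
import math

def fitts_id(distance, width):
    """Fitts' index of difficulty in bits (classic Fitts form,
    log2(2D/W); variants such as log2(D/W + 1) also exist)."""
    return math.log2(2 * distance / width)

def hick_rt(n, a=0.2, b=0.15):
    """Hick-Hyman law: reaction time (seconds) grows linearly in the
    entropy log2(n + 1) of an n-alternative equiprobable choice.
    a and b are illustrative constants, not fitted values."""
    return a + b * math.log2(n + 1)

print(fitts_id(8, 2))  # 3.0 bits: target 2 units wide, 8 units away
print(hick_rt(3))      # reaction time for a 3-alternative choice
```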