Search Results (6 results)
2. Benefits of Zero-Phase or Linear Phase Filters to Design Multiscale Entropy: Theory and Application.
- Author: Grivel, Eric, Berthelot, Bastien, Colin, Gaetan, Legrand, Pierrick, and Ibanez, Vincent
- Subjects: Pink noise; Signal filtering; Entropy; Topological entropy; Tunnel design & construction
- Abstract
In various applications, multiscale entropy (MSE) is often used as a feature to characterize the complexity of signals in order to classify them. It consists of estimating the sample entropies (SEs) of the signal under study and of its coarse-grained (CG) versions, where the CG process amounts to (1) filtering the signal with an average filter whose order is the scale and (2) decimating the filter output by a factor equal to the scale. In this paper, we derive a new variant of the MSE. Its novelty lies in how the sequences at the different scales are obtained: distortions during the decimation step are avoided. To this end, a linear-phase or null-phase low-pass filter whose cutoff frequency is suited to the scale is used. Interpretations of how the MSE behaves and illustrations with a sum of sinusoids, as well as with white and pink noise, are given. Then, an application to detecting attentional tunneling is presented. It shows the benefit of the new approach in terms of p-value when differentiating the set of MSEs obtained in the attentional tunneling state from the set obtained in the nominal state. It should be noted that the CG versions can be replaced in this way not only for the MSE but also for its other variants. [ABSTRACT FROM AUTHOR]
- Published: 2024
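The standard CG process this abstract describes (an average filter whose order is the scale, followed by decimation by the scale) reduces to non-overlapping block averaging. A minimal numpy sketch, with function name and toy signals chosen here for illustration and not taken from the paper:

```python
import numpy as np

def coarse_grain(x, scale):
    """Standard MSE coarse-graining: average filter of order `scale`
    followed by decimation by `scale` (block averaging)."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    # Reshaping into non-overlapping blocks and averaging is equivalent
    # to moving-average filtering, then keeping every `scale`-th sample.
    return x[:n * scale].reshape(n, scale).mean(axis=1)

# Toy checks: a constant signal is unchanged at any scale,
# and a ramp is averaged pairwise at scale 2.
print(coarse_grain(np.ones(10), 3))   # [1. 1. 1.]
print(coarse_grain(np.arange(6), 2))  # [0.5 2.5 4.5]
```

The paper's variant replaces this average filter with a zero-phase or linear-phase low-pass filter whose cutoff is tied to the scale before decimating, to avoid distortions in the decimation step.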
3. Entropy-Based Methods for Motor Fault Detection: A Review.
- Author: Aguayo-Tapia, Sarahi, Avalos-Almazan, Gerardo, and Rangel-Magdaleno, Jose de Jesus
- Subjects: Manufacturing processes; Artificial intelligence; Signal processing; Motors; Entropy; Topological entropy; Monitoring of machinery
- Abstract
In the signal analysis context, the entropy concept can characterize signal properties for detecting anomalies or non-representative behaviors in physical systems. In motor fault detection, entropy can measure disorder or uncertainty, aiding in detecting and classifying faults or abnormal operating conditions. This is especially relevant in industrial processes, where early motor fault detection can prevent progressive damage, operational interruptions, or potentially dangerous situations. The study of motor fault detection based on entropy theory also holds significant academic relevance, effectively bridging theoretical frameworks with industrial needs. As industrial sectors progress, entropy-based methodologies become indispensable for ensuring machinery integrity through control and monitoring systems. This work enhances the understanding of signal processing methodologies and accelerates progress in artificial intelligence and other modern knowledge areas. A wide variety of entropy-based methods have been employed for motor fault detection. The process involves assessing the complexity of signals measured from electrical motors, such as vibrations or stator currents, to form feature vectors. These vectors are then fed into artificial-intelligence-based classifiers to distinguish between healthy and faulty motor signals. This paper discusses recent entropy-based methods and summarizes the most relevant fault detection results reported over the last 10 years. [ABSTRACT FROM AUTHOR]
- Published: 2024
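As a concrete illustration of the entropy features such reviews cover, here is a minimal numpy sketch of sample entropy (SampEn), one of the most common complexity measures applied to motor signals; the defaults (m = 2, tolerance 0.2·std) are conventional choices, not values prescribed by this review:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A / B), where B counts template pairs of length m
    within Chebyshev tolerance r * std(x), and A does the same for m + 1."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def pair_count(mm):
        # All overlapping templates of length mm.
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=-1)
        # Count matching pairs, excluding self-matches on the diagonal.
        return np.sum(d <= tol) - len(t)

    a, b = pair_count(m + 1), pair_count(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Regular signals score low, irregular ones high, which is what
# makes SampEn usable as a fault-detection feature.
```

A feature vector for the AI-based classifiers the abstract mentions would typically stack such entropy values computed over several signals or scales.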
4. It Ain't Necessarily So: Ludwig Boltzmann's Darwinian Notion of Entropy.
- Author: Gimbel, Steven
- Subjects: Second law of thermodynamics; History of physics; Physical laws; Topological entropy
- Abstract
Ludwig Boltzmann's move in his seminal paper of 1877, introducing a statistical understanding of entropy, was a watershed moment in the history of physics. The work not only introduced quantization and provided a new understanding of entropy; it also challenged the understanding of what a law of nature could be. Traditionally, nomological necessity, that is, specifying the way in which a system must develop, was considered an essential element of proposed physical laws. Yet here was a new understanding of the Second Law of Thermodynamics that no longer possessed this property. While it was a new direction in physics, in other important scientific discourses of that time, specifically Huttonian geology and Darwinian evolution, similar approaches were taken in which a system's development followed principles, but did so in a way that both provided a direction of time and allowed for non-deterministic, though rule-based, time evolution. Boltzmann referred to both of these theories, especially the work of Darwin, frequently. The possibility that Darwin influenced Boltzmann's thought in physics can be seen as supported by Boltzmann's later writings. [ABSTRACT FROM AUTHOR]
- Published: 2024
5. Vulnerability Analysis Method Based on Network and Copula Entropy.
- Author: Chen, Mengyuan, Liu, Jilan, Zhang, Ning, and Zheng, Yichao
- Subjects: Financial security; Curvature; Graph theory; Topological entropy
- Abstract
With the deepening diversification and openness of financial systems, financial vulnerability, as an endogenous attribute of financial systems, has become an important measure of financial security. Based on a network analysis, we introduce a network curvature indicator improved by Copula entropy (CE) as an innovative metric of financial vulnerability. Compared with previous network curvature analysis methods, the CE-based curvature proposed in this paper can measure market vulnerability and systematic risk with significant advantages. [ABSTRACT FROM AUTHOR]
- Published: 2024
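Copula entropy itself is a well-defined quantity: it is the differential entropy of the copula density of the variables, which equals the negative of their mutual information. A crude histogram-based sketch of its estimation follows; the rank transform and bin count are illustrative choices, and the paper's actual estimator and curvature construction are not reproduced here:

```python
import numpy as np

def copula_entropy(x, y, bins=10):
    """Estimate the copula entropy of (x, y): rank-transform each series to
    its empirical copula (uniform marginals), then take the differential
    entropy of a histogram estimate of the joint copula density."""
    u = np.argsort(np.argsort(x)) / (len(x) - 1)
    v = np.argsort(np.argsort(y)) / (len(y) - 1)
    h, _, _ = np.histogram2d(u, v, bins=bins,
                             range=[[0, 1], [0, 1]], density=True)
    p = h / bins**2                 # cell probability = density * cell area
    p = p[p > 0]
    # -sum p * log(density); equals -MI, so it is <= 0 and near 0
    # for independent variables.
    return -np.sum(p * np.log(p * bins**2))

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
print(copula_entropy(x, x + 0.1 * rng.standard_normal(2000)))  # strongly negative
print(copula_entropy(x, rng.standard_normal(2000)))            # near zero
```

The more negative the copula entropy, the stronger the dependence between the two series, which is what makes it usable as an edge weight in a financial network.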
6. Multi-Additivity in Kaniadakis Entropy.
- Author: Scarfone, Antonio M. and Wada, Tatsuaki
- Subjects: Real numbers; Entropy; Topological entropy
- Abstract
It is known that Kaniadakis entropy, a generalization of the Shannon–Boltzmann–Gibbs entropic form, is always super-additive for any bipartite statistically independent distributions. In this paper, we show that, when imposing a suitable constraint, there exist classes of maximal-entropy distributions labeled by a positive real number ℵ > 0 that make Kaniadakis entropy multi-additive, i.e., S_κ[p_{A∪B}] = (1 + ℵ) S_κ[p_A] + S_κ[p_B], under the composition of two statistically independent and identically distributed distributions p_{A∪B}(x, y) = p_A(x) p_B(y), with reduced distributions p_A(x) and p_B(y) belonging to the same class. [ABSTRACT FROM AUTHOR]
- Published: 2024
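For reference, the Kaniadakis entropy of a discrete distribution is S_κ[p] = −Σ_i p_i ln_κ(p_i), with the κ-logarithm ln_κ(x) = (x^κ − x^(−κ))/(2κ), which recovers the Shannon form as κ → 0. A small numpy sketch checking the super-additivity stated in the abstract for an independent bipartite product distribution; the distributions and κ = 0.5 are arbitrary illustrative choices:

```python
import numpy as np

def kaniadakis_entropy(p, kappa=0.5):
    """S_kappa[p] = -sum_i p_i * ln_kappa(p_i), with
    ln_kappa(x) = (x**kappa - x**(-kappa)) / (2 * kappa)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    ln_k = (p**kappa - p**(-kappa)) / (2 * kappa)
    return -np.sum(p * ln_k)

# Independent bipartite composition p_{A u B}(x, y) = p_A(x) * p_B(y).
pA = np.array([0.5, 0.5])
pB = np.array([0.2, 0.8])
pAB = np.outer(pA, pB).ravel()
lhs = kaniadakis_entropy(pAB)
rhs = kaniadakis_entropy(pA) + kaniadakis_entropy(pB)
print(lhs >= rhs)  # True: super-additive for independent parts
```

The multi-additive classes the paper constructs pin down the gap between these two sides via the parameter ℵ.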