Essential Number of Principal Components and Nearly Training-Free Model for Spectral Analysis.
- Source :
- IEEE transactions on pattern analysis and machine intelligence [IEEE Trans Pattern Anal Mach Intell] 2024 Aug 02; Vol. PP. Date of Electronic Publication: 2024 Aug 02.
- Publication Year :
- 2024
- Publisher :
- Ahead of Print
Abstract
- Learning-enabled spectroscopic analysis, promising for automated real-time analysis of chemicals, faces several challenges. Firstly, a typical machine learning model requires a large number of training samples, which physical systems cannot provide. Secondly, it requires the testing samples to fall within the range of the training samples, which is often not the case in the real world. Further, a spectroscopy device is limited by its memory size, computing power, and battery capacity, which demands highly efficient learning models for on-site analysis. In this paper, by analyzing multi-gas mixtures and multi-molecule suspensions, we first show that an orders-of-magnitude reduction in data dimension can be achieved, because the number of principal components that need to be retained equals the number of independent constituents in the mixture. From this principle, we designed highly compact models in which the essential principal components can be extracted directly from the interrelations between the individual chemical properties and the principal components, so that only a few training samples are required. Our model can predict constituent concentrations that have not been seen in the training dataset and provide estimates of measurement noise. This approach can be extended into an effectively standardized method for principal component extraction.
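The core observation in the abstract, that a linear mixture's spectra occupy a subspace whose dimension equals the number of independent constituents, can be illustrated with a small synthetic example. The sketch below is not the paper's model; the constituent spectra, sample counts, and noise level are all assumed for illustration. It shows the singular-value spectrum collapsing after as many components as there are constituents, and a least-squares map from the retained component scores back to concentrations fit on only a few training samples.

```python
# Illustrative sketch (assumed synthetic data, not the authors' implementation):
# for a near-linear mixture, only as many principal components as there are
# independent constituents carry signal, so concentration prediction can be
# done in that low-dimensional score space with very few training samples.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 500      # spectral channels (hypothetical)
n_constituents = 3    # independent constituents in the mixture
n_train = 8           # only a few training samples

# Hypothetical constituent reference spectra (Gaussian absorption bands).
x = np.linspace(0.0, 1.0, n_channels)
centers = rng.uniform(0.2, 0.8, size=n_constituents)
components = np.exp(-((x[None, :] - centers[:, None]) ** 2) / 0.002)

def measure(concentrations, noise=1e-3):
    """Simulate mixture spectra as a linear combination plus measurement noise."""
    clean = concentrations @ components
    return clean + noise * rng.standard_normal(clean.shape)

# Small training set with known concentrations.
c_train = rng.uniform(0.0, 1.0, size=(n_train, n_constituents))
s_train = measure(c_train)

# PCA via SVD of the mean-centered spectra: the singular values drop sharply
# after n_constituents components, the "essential" number to retain.
mean = s_train.mean(axis=0)
U, S, Vt = np.linalg.svd(s_train - mean, full_matrices=False)
print("singular values:", np.round(S[:6], 3))

# Keep only the essential components and fit an affine map from PC scores
# to concentrations by least squares on the few training samples.
k = n_constituents
scores_train = (s_train - mean) @ Vt[:k].T
A_train = np.hstack([scores_train, np.ones((n_train, 1))])
W, *_ = np.linalg.lstsq(A_train, c_train, rcond=None)

# Predict concentrations outside the training range; the map is linear,
# so extrapolation beyond seen concentrations still works here.
c_test = rng.uniform(0.0, 1.5, size=(5, n_constituents))
scores_test = (measure(c_test) - mean) @ Vt[:k].T
c_pred = np.hstack([scores_test, np.ones((5, 1))]) @ W
print("max abs concentration error:", np.abs(c_pred - c_test).max())
```

In this toy setting the extrapolation succeeds because the mixture model is linear; the paper's contribution concerns doing this compactly and nearly training-free on real multi-gas and multi-molecule spectra, which this sketch does not reproduce.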
Details
- Language :
- English
- ISSN :
- 1939-3539
- Volume :
- PP
- Database :
- MEDLINE
- Journal :
- IEEE transactions on pattern analysis and machine intelligence
- Publication Type :
- Academic Journal
- Accession number :
- 39093672
- Full Text :
- https://doi.org/10.1109/TPAMI.2024.3436860