Signal denoising based on bias-variance of intersection of confidence interval.
- Source :
- Signal, Image & Video Processing; Nov2024, Vol. 18 Issue 11, p8089-8103, 15p
- Publication Year :
- 2024
Abstract
- Parameter estimation through bias-to-variance ratio optimization has been suggested for various applications in the literature. The explicit values of bias or variance for a near-optimal solution are generally not needed; however, additive noise components perturb the bias and variance randomly. This work proposes an adaptive Intersection of Confidence Intervals (ICI)-based method that mitigates the noise component by balancing the bias-variance trade-off. The presented method is a non-linear, non-parametric local polynomial regression (LPR). Unlike curve fitting by shrinkage or adaptation to higher-harmonic representations in wavelet-transform methods, the proposed technique optimizes the bias and variance by estimating the signal with a smoothing parameter. Optimizing the bias or variance of the estimate has a direct impact on the removal of noise components. A well-denoised signal improves the precision of the local curve fit in a kernel regression problem that uses the parameter obtained from the ICI method. The Nadaraya-Watson kernel is used in this approach, where the value of the point-wise smoothing parameter is kept constant throughout the regressions. The results of the proposed method are compared with the latest wavelet denoising to demonstrate its superior performance. The implementation complexity and memory requirements are also discussed in detail. [ABSTRACT FROM AUTHOR]
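To make the idea concrete, the following is a minimal sketch of the general ICI principle combined with a Nadaraya-Watson estimator, not the authors' exact algorithm: for each point, estimates are computed over a grid of increasing bandwidths, and the largest bandwidth whose confidence interval still intersects all previous ones is retained (the point where growing bias starts to outweigh shrinking variance). The kernel choice (Gaussian), the bandwidth grid `hs`, and the threshold `gamma` are illustrative assumptions.

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def ici_bandwidth(x0, x, y, sigma, hs, gamma=2.0):
    """ICI rule: keep enlarging the bandwidth while the confidence
    intervals of successive estimates still have a common intersection."""
    lo, hi = -np.inf, np.inf
    h_best = hs[0]
    for h in hs:
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        w /= w.sum()
        est = np.sum(w * y)
        std = sigma * np.sqrt(np.sum(w ** 2))  # std of the weighted mean
        lo = max(lo, est - gamma * std)
        hi = min(hi, est + gamma * std)
        if lo > hi:          # intervals no longer intersect: bias dominates
            break
        h_best = h
    return h_best

def ici_denoise(x, y, sigma, hs, gamma=2.0):
    """Denoise y by NW regression with an ICI-selected bandwidth per point."""
    return np.array([nw_estimate(x0, x, y,
                                 ici_bandwidth(x0, x, y, sigma, hs, gamma))
                     for x0 in x])

# Usage: a noisy sine wave (sigma assumed known for the sketch)
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200)
clean = np.sin(x)
noisy = clean + 0.3 * rng.standard_normal(x.size)
denoised = ici_denoise(x, noisy, sigma=0.3, hs=[0.05, 0.1, 0.2, 0.4, 0.8])
```

In practice the noise level `sigma` would itself be estimated (e.g. from a median absolute deviation of first differences), and the bandwidth grid is typically geometric, as assumed here.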
Details
- Language :
- English
- ISSN :
- 1863-1703
- Volume :
- 18
- Issue :
- 11
- Database :
- Complementary Index
- Journal :
- Signal, Image & Video Processing
- Publication Type :
- Academic Journal
- Accession number :
- 179636368
- Full Text :
- https://doi.org/10.1007/s11760-024-03453-1