26 results for "Deledalle, Charles-Alban"
Search Results
2. Block-Based Refitting in ℓ12 Sparse Regularization
- Author
- Deledalle, Charles-Alban, Papadakis, Nicolas, Salmon, Joseph, and Vaiter, Samuel
- Published
- 2021
- Full Text
- View/download PDF
3. NORCAMA: Change analysis in SAR time series by likelihood ratio change matrix clustering
- Author
- Su, Xin, Deledalle, Charles-Alban, Tupin, Florence, and Sun, Hong
- Published
- 2015
- Full Text
- View/download PDF
4. Local behavior of sparse analysis regularization: Applications to risk estimation
- Author
- Vaiter, Samuel, Deledalle, Charles-Alban, Peyré, Gabriel, Dossal, Charles, and Fadili, Jalal
- Published
- 2013
- Full Text
- View/download PDF
5. Edge-Based Multi-modal Registration and Application for Night Vision Devices
- Author
- Sutour, Camille, Aujol, Jean-François, Deledalle, Charles-Alban, and de Senneville, Baudouin Denis
- Published
- 2015
- Full Text
- View/download PDF
6. Poisson Noise Reduction with Non-local PCA
- Author
- Salmon, Joseph, Harmany, Zachary, Deledalle, Charles-Alban, and Willett, Rebecca
- Published
- 2014
- Full Text
- View/download PDF
7. Supervised classification of solar features using prior information
- Author
- De Visscher, Ruben, Delouille, Véronique, Dupont, Pierre, and Deledalle, Charles-Alban
- Subjects
- Solar image processing, Corona, Statistics and probability, Classification, Meteorology. Climatology, QC851-999
- Abstract
Context: The Sun as seen by Extreme Ultraviolet (EUV) telescopes exhibits a variety of large-scale structures. Of particular interest for space-weather applications is the extraction of active regions (AR) and coronal holes (CH). The next generation of GOES-R satellites will provide continuous monitoring of the solar corona in six EUV bandpasses that are similar to the ones provided by the SDO-AIA EUV telescope since May 2010. Supervised segmentations of EUV images that are consistent with manual segmentations by, for example, space-weather forecasters help in extracting useful information from the raw data. Aims: We present a supervised segmentation method that is based on the Maximum A Posteriori rule. Our method allows integrating manually segmented images as well as other types of information. It is applied to SDO-AIA images to segment them into AR, CH, and the remaining Quiet Sun (QS) part. Methods: A Bayesian classifier is applied on training masks provided by the user. The noise structure in EUV images is non-trivial, which suggests the use of a non-parametric kernel density estimator to fit the intensity distribution within each class. Under the Naive Bayes assumption, we can add information such as the latitude distribution and total coverage of each class in a consistent manner. This information can be prescribed by an expert or estimated with an Expectation-Maximization algorithm. Results: The segmentation masks are in line with the training masks given as input and show consistency over time. Introducing additional information besides pixel intensity improves the quality of the final segmentation. Conclusions: Such a tool can aid in building automated segmentations that are consistent with some 'ground truth' defined by the users.
- Published
- 2015
- Full Text
- View/download PDF
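The MAP rule described in this abstract (class likelihood from a density estimate, multiplied by a class prior) can be sketched in a few lines. This is an illustrative toy, not the authors' code: a histogram density stands in for the paper's kernel density estimator, and the class names, bin count, and priors are assumptions.

```python
import numpy as np

def map_classify(pixels, class_samples, priors, n_bins=32):
    """Toy MAP pixel classifier: argmax over classes of
    log p(intensity | class) + log p(class).

    class_samples : dict mapping class name -> 1-D array of training
                    intensities (e.g. taken from user-provided masks)
    priors        : dict mapping class name -> prior probability
                    (could encode e.g. expected total class coverage)
    """
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    names = list(class_samples)
    log_post = []
    for name in names:
        # histogram density as a crude stand-in for a kernel density estimate
        hist, _ = np.histogram(class_samples[name], bins=bins, density=True)
        idx = np.clip(np.digitize(pixels, bins) - 1, 0, n_bins - 1)
        log_post.append(np.log(hist[idx] + 1e-12) + np.log(priors[name]))
    return [names[k] for k in np.argmax(np.stack(log_post), axis=0)]
```

Under the naive Bayes assumption of the abstract, extra per-pixel features (such as latitude) would simply contribute additional additive log-likelihood terms.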
8. How to Compare Noisy Patches? Patch Similarity Beyond Gaussian Noise
- Author
- Deledalle, Charles-Alban, Denis, Loïc, and Tupin, Florence
- Published
- 2012
- Full Text
- View/download PDF
9. Non-local Methods with Shape-Adaptive Patches (NLM-SAP)
- Author
- Deledalle, Charles-Alban, Duval, Vincent, and Salmon, Joseph
- Published
- 2012
- Full Text
- View/download PDF
10. Blind atmospheric turbulence deconvolution.
- Author
- Deledalle, Charles-Alban and Gilles, Jérôme
- Abstract
A new blind image deconvolution technique is developed for atmospheric turbulence deblurring to overcome the limitations of 'generic' blind deconvolution algorithms that do not take into account the complicated physics of the turbulence. The originality of the proposed approach relies on an actual physical model, known as the Fried kernel, that quantifies the impact of atmospheric turbulence on the optical resolution of images. While the original expression of the Fried kernel can seem cumbersome at first sight, the authors show that it can be reparameterised in a much simpler form. This simple expression makes it possible to efficiently embed the kernel in the proposed blind atmospheric turbulence deconvolution (BATUD) algorithm. BATUD is an iterative algorithm that alternately performs deconvolution and estimates the Fried kernel by jointly relying on a Gaussian mixture model prior on natural image patches and controlling the squared Euclidean norm of the Fried kernel. Numerical experiments show that the proposed blind deconvolution algorithm behaves well in different simulated turbulence scenarios, as well as on real images. Not only does BATUD outperform state-of-the-art approaches used in atmospheric turbulence deconvolution in terms of image quality metrics, but it is also faster. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
11. Machine learning in acoustics: Theory and applications.
- Author
- Bianco, Michael J., Gerstoft, Peter, Traer, James, Ozanich, Emma, Roch, Marie A., Gannot, Sharon, and Deledalle, Charles-Alban
- Subjects
- MACHINE learning, ACOUSTICS, ARCHITECTURAL acoustics, EARTH sciences, MARINE sciences, SOUND reverberation, THEORY-practice relationship, AUTOMATIC speech recognition
- Abstract
Acoustic data provide scientific and engineering insights in fields ranging from biology and communications to ocean and Earth science. We survey the recent advances and transformative potential of machine learning (ML), including deep learning, in the field of acoustics. ML is a broad family of techniques, which are often based in statistics, for automatically detecting and utilizing patterns in data. Relative to conventional acoustics and signal processing, ML is data-driven. Given sufficient training data, ML can discover complex relationships between features and desired labels or actions, or between features themselves. With large volumes of training data, ML can discover models describing complex acoustic phenomena such as human speech and reverberation. ML in acoustics is rapidly developing with compelling results and significant future promise. We first introduce ML, then highlight ML developments in four acoustics research areas: source localization in speech processing, source localization in ocean acoustics, bioacoustics, and environmental sounds in everyday scenes. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
12. Ratio-Based Multitemporal SAR Images Denoising: RABASAR.
- Author
- Zhao, Weiying, Deledalle, Charles-Alban, Denis, Loïc, Maître, Henri, Nicolas, Jean-Marie, and Tupin, Florence
- Subjects
- IMAGE denoising, SYNTHETIC aperture radar, STRUCTURE-activity relationships, FASTING, TIME series analysis
- Abstract
In this paper, we propose a fast and efficient multitemporal despeckling method. The key idea of the proposed approach is the use of the ratio image, provided by the ratio between an image and the temporal mean of the stack. This ratio image is easier to denoise than a single image thanks to its improved stationarity. Besides, temporally stable thin structures are well preserved thanks to the multitemporal mean. The proposed approach can be divided into three steps: 1) estimation of a “superimage” by temporal averaging and possibly spatial denoising; 2) denoising of the ratio between the noisy image of interest and the “superimage”; and 3) computation of the denoised image by remultiplying the denoised ratio by the “superimage.” Because of the improved spatial stationarity of the ratio images, denoising these ratio images with a speckle-reduction method is more effective than denoising images from the original multitemporal stack. The amount of data that is jointly processed is also reduced compared to other methods through the use of the “superimage” that sums up the temporal stack. The comparison with several state-of-the-art reference methods shows better results numerically (peak signal-to-noise ratio and structural similarity index) as well as visually on simulated and synthetic aperture radar (SAR) time series. The proposed ratio-based denoising framework successfully extends single-image SAR denoising methods to time series by exploiting the persistence of many geometrical structures. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
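The three-step recipe in the RABASAR abstract maps almost line-for-line onto code. A minimal sketch, assuming a (T, H, W) intensity stack and any single-image denoiser; this is illustrative, not the authors' implementation, and the spatial denoising of the superimage is omitted:

```python
import numpy as np

def rabasar_sketch(stack, denoise, t):
    """Ratio-based multitemporal despeckling, following the abstract's
    three steps.

    stack   : (T, H, W) array of co-registered SAR intensity images
    denoise : any single-image denoiser, applied to the ratio image
    t       : index of the image in the stack to restore
    """
    # Step 1: "superimage" from the temporal average of the stack
    super_image = stack.mean(axis=0)
    # Step 2: denoise the ratio image, which is more spatially stationary
    ratio = stack[t] / np.maximum(super_image, 1e-12)
    ratio_denoised = denoise(ratio)
    # Step 3: remultiply the denoised ratio by the superimage
    return ratio_denoised * super_image
```

Any single-image speckle-reduction method slots in as `denoise`, which is how the abstract's claim of "extending single-image SAR denoising methods to time series" works in practice.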
13. Accelerating GMM-Based Patch Priors for Image Restoration: Three Ingredients for a 100× Speed-Up.
- Author
- Parameswaran, Shibin, Deledalle, Charles-Alban, Denis, Loïc, and Nguyen, Truong Q.
- Subjects
- IMAGE reconstruction, IMAGE processing, GAUSSIAN mixture models, IMAGE quality analysis, SIGNAL denoising
- Abstract
Image restoration methods aim to recover the underlying clean image from corrupted observations. The expected patch log-likelihood (EPLL) algorithm is a powerful image restoration method that uses a Gaussian mixture model (GMM) prior on the patches of natural images. Although it is very effective for restoring images, its high runtime complexity makes the EPLL ill-suited for most practical applications. In this paper, we propose three approximations to the original EPLL algorithm. The resulting algorithm, which we call the fast-EPLL (FEPLL), attains a dramatic speed-up of two orders of magnitude over EPLL while incurring a negligible drop in the restored image quality (less than 0.5 dB). We demonstrate the efficacy and versatility of our algorithm on a number of inverse problems, such as denoising, deblurring, super-resolution, inpainting, and devignetting. To the best of our knowledge, the FEPLL is the first algorithm that can competitively restore a 512 × 512 pixel image in under 0.5 s for all the degradations mentioned earlier without specialized code optimizations, such as CPU parallelization or GPU implementation. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
14. Image Denoising with Generalized Gaussian Mixture Model Patch Priors.
- Author
- Deledalle, Charles-Alban, Parameswaran, Shibin, and Nguyen, Truong Q.
- Subjects
- GAUSSIAN mixture models, IMAGE denoising, IMAGE reconstruction, IMAGE processing, ALPHA rhythm
- Abstract
Patch priors have become an important component of image restoration. A powerful approach in this category of restoration algorithms is the popular expected patch log-likelihood (EPLL) algorithm. EPLL uses a Gaussian mixture model (GMM) prior learned on clean image patches as a way to regularize degraded patches. In this paper, we show that a generalized Gaussian mixture model (GGMM) captures the underlying distribution of patches better than a GMM. Even though the GGMM is a powerful prior to combine with EPLL, the non-Gaussianity of its components presents major computational challenges when applied to the intensive process of image restoration. Specifically, each patch has to undergo a patch classification step and a shrinkage step. These two steps can be efficiently solved with a GMM prior but are computationally impractical when using a GGMM prior. We provide approximations and computational recipes for fast evaluation of these two steps, so that EPLL can embed a GGMM prior on an image with more than tens of thousands of patches. Our main contribution is a thorough theoretical analysis of the accuracy of these approximations. Our evaluations indicate that the GGMM prior is consistently a better fit for modeling the image patch distribution and performs better on average in the image denoising task. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
15. PARISAR: Patch-Based Estimation and Regularized Inversion for Multibaseline SAR Interferometry.
- Author
- Ferraioli, Giampaolo, Deledalle, Charles-Alban, Denis, Loïc, and Tupin, Florence
- Subjects
- ANTENNAS (Electronics), SYNTHETIC aperture radar, IMAGING systems, INTERFEROMETRY, DIFFRACTION patterns, REGULARIZATION parameter
- Abstract
Reconstruction of elevation maps from a collection of synthetic aperture radar (SAR) images obtained in interferometric configuration is a challenging task. Reconstruction methods must overcome two difficulties: the strong interferometric noise that contaminates the data and the 2π phase ambiguities. Interferometric noise requires some form of smoothing among pixels of identical height. Phase ambiguities can be solved, up to a point, by combining linkage to the neighbors and a global optimization strategy to avoid being trapped in local minima. This paper introduces a reconstruction method, PARISAR, that achieves both resolution-preserving denoising and robust phase unwrapping (PhU) by combining nonlocal denoising methods based on patch similarities with total-variation regularization. The optimization algorithm, based on graph cuts, identifies the global optimum. Combining patch-based speckle reduction methods and regularization-based PhU requires solving several issues: 1) computational complexity, since the inclusion of nonlocal neighborhoods strongly increases the number of terms involved during the regularization, and 2) adaptation to varying neighborhoods, since patch comparison leads to large neighborhoods in homogeneous regions and much sparser neighborhoods in some geometrical structures. PARISAR solves both issues. We compare PARISAR with other reconstruction methods on both numerical simulations and satellite images and show a qualitative and quantitative improvement over state-of-the-art reconstruction methods for multibaseline SAR interferometry. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
16. Texture Reconstruction Guided by a High-Resolution Patch.
- Author
- El Gheche, Mireille, Aujol, Jean-François, Berthoumieu, Yannick, and Deledalle, Charles-Alban
- Subjects
- HIGH resolution imaging, TEXTURE analysis (Image processing), IMAGE reconstruction, MATHEMATICAL regularization, HISTOGRAMS
- Abstract
In this paper, we aim at super-resolving a low-resolution texture under the assumption that a high-resolution patch of the texture is available. To do so, we propose a variational method that combines two approaches, namely texture synthesis and image reconstruction. The resulting objective function is a nonconvex energy that involves a quadratic distance to the low-resolution image, a histogram-based distance to the high-resolution patch, and a nonlocal regularization that links the missing pixels with the patch pixels. As for the histogram-based measure, we use a sum of Wasserstein distances between the histograms of some linear transformations of the textures. The resulting optimization problem is efficiently solved with a primal-dual proximal method. Experiments show that our method leads to a significant improvement, both visually and numerically, with respect to state-of-the-art algorithms for solving similar problems. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
17. Image Zoom Completion.
- Author
- Hidane, Moncef, El Gheche, Mireille, Aujol, Jean-François, Berthoumieu, Yannick, and Deledalle, Charles-Alban
- Subjects
- PROGRAM transformation, IMAGE reconstruction, DIGITAL images, SIGNAL convolution, HIGH resolution imaging
- Abstract
We consider the problem of recovering a high-resolution image from a pair consisting of a complete low-resolution image and a high-resolution but incomplete one. We refer to this task as the image zoom completion problem. After discussing possible contexts in which this setting may arise, we introduce a nonlocal regularization strategy, giving full details concerning the numerical optimization of the corresponding energy and discussing its benefits and shortcomings. We also derive two total variation-based algorithms and evaluate the performance of the proposed methods on a set of natural and textured images. We compare our results with those obtained with two recent state-of-the-art single-image super-resolution algorithms. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
18. Estimation of the Noise Level Function Based on a Nonparametric Detection of Homogeneous Image Regions.
- Author
- Sutour, Camille, Deledalle, Charles-Alban, and Aujol, Jean-François
- Subjects
- ACOUSTIC transients, APPROXIMATION theory, NONPARAMETRIC estimation, NOISE, LOUDNESS
- Abstract
We propose a two-step algorithm that automatically estimates the noise level function of stationary noise from a single image, i.e., the noise variance as a function of the image intensity. First, the image is divided into small square regions and a nonparametric test is applied to decide whether each region is homogeneous or not. Based on Kendall's τ coefficient (a rank-based measure of correlation), this detector has a nondetection rate independent of the unknown distribution of the noise, provided that it is at least spatially uncorrelated. Moreover, we prove, on a toy example, that its overall detection error vanishes with respect to the region size as soon as the signal-to-noise ratio level is nonzero. Once homogeneous regions are detected, the noise level function is estimated as a second-order polynomial minimizing the ℓ1 error on the statistics of these regions. Numerical experiments show the efficiency of the proposed approach in estimating the noise level function, with a relative error under 10% obtained on a large data set. We illustrate the interest of the approach for an image denoising application. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
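The two steps of this abstract can be sketched directly: detect homogeneous blocks with a Kendall-τ test, then fit the (mean, variance) statistics with a polynomial. This is an illustrative reading, not the authors' code: the exact pairing used inside the τ test, the block size, the threshold, and the least-squares fit (the paper minimizes an ℓ1 error) are all simplifying assumptions.

```python
import numpy as np

def kendall_tau(x, y):
    """Kendall's rank correlation (O(n^2) textbook form)."""
    n = len(x)
    s = sum(np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
            for i in range(n) for j in range(i + 1, n))
    return 2.0 * s / (n * (n - 1))

def estimate_nlf(image, block=8, tau_thresh=0.1, degree=2):
    """Two-step noise-level-function sketch: keep blocks flagged as
    homogeneous, then fit variance as a polynomial of the mean."""
    means, variances = [], []
    h, w = image.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            region = image[i:i + block, j:j + block].ravel()
            # in a homogeneous (pure-noise) block, intensity is
            # uncorrelated with pixel position, so tau is near zero
            if abs(kendall_tau(region, np.arange(region.size))) < tau_thresh:
                means.append(region.mean())
                variances.append(region.var(ddof=1))
    # the paper minimizes the l1 error; plain least squares is used here
    return np.polyfit(means, variances, degree)
```

The returned coefficients describe the noise level function: evaluating the polynomial at an intensity predicts the noise variance at that intensity.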
19. NL-SAR: A Unified Nonlocal Framework for Resolution-Preserving (Pol)(In)SAR Denoising.
- Author
- Deledalle, Charles-Alban, Denis, Loïc, Tupin, Florence, Reigber, Andreas, and Jäger, Marc
- Subjects
- IMAGING systems, SYNTHETIC aperture radar, INTERFEROMETRY, POLARIMETRY, REMOTE sensing
- Abstract
Speckle noise is an inherent problem in coherent imaging systems such as synthetic aperture radar. It creates strong intensity fluctuations and hampers the analysis of images and the estimation of local radiometric, polarimetric, or interferometric properties. Synthetic aperture radar (SAR) processing chains thus often include a multilooking (i.e., averaging) filter for speckle reduction, at the expense of a strong resolution loss. Preserving point-like and fine structures and textures requires locally adapting the estimation. Nonlocal (NL) means successfully adapt smoothing by deriving data-driven weights from the similarity between small image patches. The generalization of nonlocal approaches offers a flexible framework for resolution-preserving speckle reduction. We describe a general method, NL-SAR, that builds extended nonlocal neighborhoods for denoising amplitude, polarimetric, and/or interferometric SAR images. These neighborhoods are defined on the basis of pixel similarity as evaluated by multichannel comparison of patches. Several nonlocal estimations are performed, and the best one is locally selected to form a single restored image with good preservation of radar structures and discontinuities. The proposed method is fully automatic and handles single-look and multilook images, with or without interferometric or polarimetric channels. Efficient speckle reduction with very good resolution preservation is demonstrated on numerical experiments using simulated data as well as on airborne and spaceborne radar images. The source code of a parallel implementation of NL-SAR is released with this paper. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
20. Two-Step Multitemporal Nonlocal Means for Synthetic Aperture Radar Images.
- Author
- Su, Xin, Deledalle, Charles-Alban, Tupin, Florence, and Sun, Hong
- Subjects
- SYNTHETIC aperture radar, IMAGING systems, ESTIMATION theory, INFORMATION storage & retrieval systems, REDUNDANCY in engineering
- Abstract
This paper presents a denoising approach for multitemporal synthetic aperture radar (SAR) images based on the concept of nonlocal means (NLM). It exploits the information redundancy existing in multitemporal images through a two-step strategy. The first step performs a nonlocal weighted estimation driven by the redundancy in time, whereas the second step makes use of nonlocal estimation in space. Using patch-similarity-based mis-registration estimation, we also adapt this approach to the case of unregistered SAR images. The experiments illustrate the efficiency of the proposed method in denoising multitemporal images while preserving new information. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
21. Stein Unbiased GrAdient estimator of the Risk (SUGAR) for Multiple Parameter Selection.
- Author
- Deledalle, Charles-Alban, Vaiter, Samuel, Fadili, Jalal, and Peyré, Gabriel
- Subjects
- ALGORITHM research, INVERSE problems, MATHEMATICAL optimization, IMAGE reconstruction, NONSMOOTH optimization
- Abstract
Algorithms for solving variational regularization of ill-posed inverse problems usually involve operators that depend on a collection of continuous parameters. When the operators enjoy some (local) regularity, these parameters can be selected using the so-called Stein Unbiased Risk Estimator (SURE). While this selection is usually performed by an exhaustive search, we address in this work the problem of using the SURE to efficiently optimize for a collection of continuous parameters of the model. When considering nonsmooth regularizers, such as the popular ℓ1-norm corresponding to the soft-thresholding mapping, the SURE is a discontinuous function of the parameters, preventing the use of gradient descent optimization techniques. Instead, we focus on an approximation of the SURE based on finite differences, as proposed by Ramani and Unser for the Monte-Carlo SURE approach. Under mild assumptions on the estimation mapping, we show that this approximation is a weakly differentiable function of the parameters and that its weak gradient, coined the Stein Unbiased GrAdient estimator of the Risk (SUGAR), provides an asymptotically (with respect to the data dimension) unbiased estimate of the gradient of the risk. Moreover, in the particular case of soft-thresholding, it is proved to also be a consistent estimator. This gradient estimate can then be used as a basis for performing a quasi-Newton optimization. The computation of the SUGAR relies on the closed-form (weak) differentiation of the nonsmooth function. We provide its expression for a large class of iterative methods, including proximal splitting methods, and apply our strategy to regularizations involving nonsmooth convex structured penalties. Illustrations on various image restoration and matrix completion problems are given. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
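For the simplest case named in this abstract, soft-thresholding, SURE has a well-known closed form, and the finite-difference idea can be illustrated numerically. This is a toy stand-in for the paper's weak-derivative SUGAR construction, not the authors' method; the step size `eps` is an illustrative choice.

```python
import numpy as np

def soft(y, lam):
    """Soft-thresholding mapping."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def sure_soft(y, lam, sigma):
    """Closed-form SURE for soft thresholding of y ~ N(x, sigma^2 I):
    ||y - soft(y, lam)||^2 - n*sigma^2 + 2*sigma^2 * #{|y_i| > lam}."""
    residual = np.sum(np.minimum(y ** 2, lam ** 2))
    divergence = np.count_nonzero(np.abs(y) > lam)
    return residual - y.size * sigma ** 2 + 2 * sigma ** 2 * divergence

def sugar_fd(y, lam, sigma, eps=0.05):
    """Finite-difference surrogate for dSURE/dlam. Because the divergence
    term is piecewise constant in lam, eps must not be taken too small."""
    return (sure_soft(y, lam + eps, sigma) - sure_soft(y, lam, sigma)) / eps
```

The gradient surrogate could then drive a descent (or quasi-Newton) update of `lam` instead of the exhaustive grid search mentioned in the abstract.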
22. Exploiting Patch Similarity for SAR Image Processing: The nonlocal paradigm.
- Author
- Deledalle, Charles-Alban, Denis, Loïc, Poggi, Giovanni, Tupin, Florence, and Verdoliva, Luisa
- Abstract
Most current synthetic aperture radar (SAR) systems offer high-resolution images featuring polarimetric, interferometric, multifrequency, multiangle, or multidate information. SAR images, however, suffer from strong fluctuations due to the speckle phenomenon inherent to coherent imagery. Hence, all derived parameters display strong signal-dependent variance, preventing the full exploitation of such a wealth of information. Even with the abundance of despeckling techniques proposed over the last three decades, there is still a pressing need for new methods that can handle this variety of SAR products and efficiently eliminate speckle without sacrificing the spatial resolution. Recently, patch-based filtering has emerged as a highly successful concept in image processing. By exploiting the redundancy between similar patches, it succeeds in suppressing most of the noise with good preservation of texture and thin structures. Extensions of patch-based methods to speckle reduction and joint exploitation of multichannel SAR images (interferometric, polarimetric, or PolInSAR data) have led to the best denoising performance in radar imaging to date. We give a comprehensive survey of patch-based nonlocal filtering of SAR images, focusing on the two main ingredients of the methods: measuring patch similarity and estimating the parameters of interest from a collection of similar patches. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
23. NL-InSAR: Nonlocal Interferogram Estimation.
- Author
- Deledalle, Charles-Alban, Denis, Loïc, and Tupin, Florence
- Subjects
- SYNTHETIC aperture radar, INTERFEROMETRY, IMAGE reconstruction, ESTIMATION theory, REFLECTANCE, IMAGING systems, NOISE measurement, PIXELS
- Abstract
Interferometric synthetic aperture radar (SAR) data provide reflectivity, interferometric phase, and coherence images, which are paramount to scene interpretation or low-level processing tasks such as segmentation and 3-D reconstruction. These images are estimated in practice from a Hermitian product on local windows. These windows lead to biases and resolution losses due to the local heterogeneity caused by edges and textures. This paper proposes a nonlocal approach for the joint estimation of the reflectivity, the interferometric phase, and the coherence images from an interferometric pair of coregistered single-look complex (SLC) SAR images. Nonlocal techniques are known to efficiently reduce noise while preserving structures by performing the weighted averaging of similar pixels. Two pixels are considered similar if the surrounding image patches are “resembling.” Patch similarity is usually defined as the Euclidean distance between the vectors of graylevels. In this paper, a statistically grounded patch-similarity criterion suitable to SLC images is derived. A weighted maximum likelihood estimation of the SAR interferogram is then computed with weights derived in a data-driven way. Weights are defined from the intensity and interferometric phase and are iteratively refined based both on the similarity between noisy patches and on the similarity of patches from the previous estimate. The efficiency of this new interferogram construction technique is illustrated both qualitatively and quantitatively on synthetic and true data. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
24. Iterative Weighted Maximum Likelihood Denoising With Probabilistic Patch-Based Weights.
- Author
- Deledalle, Charles-Alban, Denis, Loïc, and Tupin, Florence
- Subjects
- NOISE control, IMAGE processing, PIXELS, SIGNAL-to-noise ratio, SYNTHETIC aperture radar, RANDOM noise theory
- Abstract
Image denoising is an important problem in image processing since noise may interfere with visual or automatic interpretation. This paper presents a new approach for image denoising in the case of a known uncorrelated noise model. The proposed filter is an extension of the nonlocal means (NL means) algorithm introduced by Buades et al., which performs a weighted average of the values of similar pixels. Pixel similarity is defined in NL means as the Euclidean distance between patches (rectangular windows centered on the two pixels). In this paper, a more general and statistically grounded similarity criterion is proposed which depends on the noise distribution model. The denoising process is expressed as a weighted maximum likelihood estimation problem where the weights are derived in a data-driven way. These weights can be iteratively refined based on both the similarity between noisy patches and the similarity of patches extracted from the previous estimate. We show that this iterative process noticeably improves the denoising performance, especially in the case of low signal-to-noise-ratio images such as synthetic aperture radar (SAR) images. Numerical experiments illustrate that the technique can be successfully applied to the classical case of additive Gaussian noise but also to cases such as multiplicative speckle noise. The proposed denoising technique seems to improve on state-of-the-art performance in the latter case. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
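The NL-means baseline that this paper extends (Euclidean patch distance mapped through an exponential kernel) can be sketched as follows. Patch size, search window, and bandwidth `h` are illustrative choices, and the paper's noise-model-dependent similarity criterion and iterative weight refinement are deliberately not reproduced.

```python
import numpy as np

def nl_means(image, patch=3, search=7, h=1.0):
    """Baseline NL-means: each pixel becomes a weighted average of pixels
    in a search window, weighted by the similarity of their patches."""
    pad = patch // 2
    padded = np.pad(image, pad, mode='reflect')
    rows, cols = image.shape
    half = search // 2
    out = np.zeros_like(image)
    for i in range(rows):
        for j in range(cols):
            ref = padded[i:i + patch, j:j + patch]
            num = den = 0.0
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        cand = padded[ii:ii + patch, jj:jj + patch]
                        # Euclidean patch distance -> exponential weight
                        w = np.exp(-np.sum((ref - cand) ** 2) / h ** 2)
                        num += w * image[ii, jj]
                        den += w
            out[i, j] = num / den
    return out
```

The paper's contribution amounts to replacing the Euclidean distance here with a likelihood-based similarity matched to the noise model, and iterating the weights against the previous estimate.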
25. MuLoG, or How to Apply Gaussian Denoisers to Multi-Channel SAR Speckle Reduction?
- Author
- Deledalle, C.-A., Denis, L., Tabti, S., and Tupin, F.
- Abstract
Speckle reduction is a longstanding topic in synthetic aperture radar (SAR) imaging. Since most current and planned SAR imaging satellites operate in polarimetric, interferometric, or tomographic modes, SAR images are multi-channel and speckle reduction techniques must jointly process all channels to recover polarimetric and interferometric information. The distinctive nature of SAR signal (complex-valued, corrupted by multiplicative fluctuations) calls for the development of specialized methods for speckle reduction. Image denoising is a very active topic in image processing with a wide variety of approaches and many denoising algorithms available, almost always designed for additive Gaussian noise suppression. This paper proposes a general scheme, called MuLoG (MUlti-channel LOgarithm with Gaussian denoising), to include such Gaussian denoisers within a multi-channel SAR speckle reduction technique. A new family of speckle reduction algorithms can thus be obtained, benefiting from the ongoing progress in Gaussian denoising, and offering several speckle reduction results often displaying method-specific artifacts that can be dismissed by comparison between results.
- Published
- 2017
- Full Text
- View/download PDF
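For a single channel, the core idea in the MuLoG abstract reduces to a homomorphic transform; a caricature sketch follows. The actual method handles multi-channel covariance matrices, couples the Gaussian denoiser to the speckle statistics through an iterative plug-and-play scheme, and includes debiasing corrections, all omitted here.

```python
import numpy as np

def mulog_style(intensity, gaussian_denoiser):
    """Single-channel caricature of the MuLoG scheme: the log transform
    turns multiplicative speckle into (approximately) additive noise, so
    any off-the-shelf Gaussian denoiser can be plugged in, then map back."""
    log_img = np.log(np.maximum(intensity, 1e-12))
    return np.exp(gaussian_denoiser(log_img))
```

Any Gaussian denoiser slots in as `gaussian_denoiser`, which is how the abstract's "new family of speckle reduction algorithms" arises: each choice of denoiser yields a different speckle-reduction result.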
26. Adaptive regularization of the NL-means: application to image and video denoising.
- Author
- Sutour, C., Deledalle, C.-A., and Aujol, J.-F.
- Subjects
- Computer Simulation, Image Interpretation, Computer-Assisted methods, Reproducibility of Results, Sensitivity and Specificity, Signal Processing, Computer-Assisted, Signal-To-Noise Ratio, Algorithms, Artifacts, Image Enhancement methods, Models, Statistical, Photography methods, Video Recording methods
- Abstract
Image denoising is a central problem in image processing and is often a necessary step prior to higher-level analysis such as segmentation, reconstruction, or super-resolution. The nonlocal means (NL-means) perform denoising by exploiting the natural redundancy of patterns inside an image; they perform a weighted average of pixels whose neighborhoods (patches) are close to each other. This reduces the noise significantly while preserving most of the image content. While it performs well on flat areas and textures, it suffers from two opposite drawbacks: it might over-smooth low-contrasted areas or leave residual noise around edges and singular structures. Denoising can also be performed by total variation minimization (the Rudin, Osher, and Fatemi model), which restores regular images but is prone to over-smoothing textures, staircasing effects, and contrast losses. We introduce in this paper a variational approach that corrects the over-smoothing and reduces the residual noise of the NL-means by adaptively regularizing nonlocal methods with the total variation. The proposed regularized NL-means algorithm combines these methods and reduces both of their respective defects by minimizing an adaptive total variation with a nonlocal data fidelity term. Besides, this model adapts to different noise statistics, and a fast solution can be obtained in the general case of the exponential family. We develop this model for image denoising and adapt it to video denoising with 3D patches.
- Published
- 2014
- Full Text
- View/download PDF