23 results on '"bilateral filtering"'
Search Results
2. An effective image annotation using self-attention based stacked bidirectional capsule network
- Author
- Palekar, Vikas and Kumar L, Sathish
- Published
- 2025
- Full Text
- View/download PDF
3. MuSwin-Mob: An automated person identification system using periocular images based on hybrid deep learning model.
- Author
- Bhamare, Deepali R. and Patil, Pravin S.
- Subjects
- GRAPH neural networks, TRANSFORMER models, OPTIMIZATION algorithms, IMAGE recognition (Computer vision), FEATURE extraction
- Abstract
Highlights:
• Introduces an efficient automated periocular recognition model for person identification based on hybrid deep learning.
• Pre-processes images with Quad Enhanced Histogram Equalization (QuEhe) and Extended Bilateral Filtering (ExBiF) for image enhancement.
• Extracts effective features using the Mutual Conversion Swin Patch Transformer Assisted Coati Depth Wise Mobile Net (MuSwin-Mob).
• Identifies the person using a Graph Neural Network Based Super Glue Matching Algorithm (GNN_SGM) with enhanced recognition accuracy.
Periocular images have become quite popular for security purposes because identification can be performed from the eye region of the original image. Periocular recognition uses a variety of strategies and techniques to identify a person based on certain eye features and their surroundings. Several techniques have been developed to provide efficient periocular image recognition, but they suffer from problems such as poor performance, high computational complexity, and restricted feature extraction. To overcome these issues, the proposed technique provides efficient periocular image recognition. Initially, images are collected from the dataset and pre-processed to improve image contrast and reduce noise. The features of the periocular and query images, divided into local and global features, are extracted using MuSwin-Mob. The most appropriate features are selected using the Exaggerated Archer Fish optimization algorithm (ExAFo). Finally, the GNN_SGM model matches features between the query and periocular images based on similar periocular features. Performance is evaluated independently on four datasets: a periocular recognition dataset, the UBIPr dataset, a face-mask detection dataset, and a glasses-vs-no-glasses dataset, on which the proposed model attains accuracies of 98.83%, 99.58%, 99.16%, and 99.21%, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
4. Multi-scale deep feature learning network with bilateral filtering for SAR image classification.
- Author
- Geng, Jie, Jiang, Wen, and Deng, Xinyang
- Subjects
- DEEP learning, SYNTHETIC aperture radar, PLURALITY voting, IMAGE sensors, CLASSIFICATION
- Abstract
Synthetic aperture radar (SAR) image classification using deep neural networks has drawn great attention, as feature learning generally requires many layers in a deep model. However, a deeper neural network will overfit when training samples are limited. In this paper, a multi-scale deep feature learning network with bilateral filtering (MDFLN-BF) is proposed for SAR image classification, which aims to extract discriminative features and reduce the requirement for labeled samples. In the proposed framework, the MDFLN extracts features from the SAR image on multiple scales: the image is stratified into different scales, and a fully convolutional network extracts features from each scale sub-image. Features of the multiple scales are then classified by multiple softmax classifiers and combined by a majority-vote algorithm. Further, bilateral filtering is applied to optimize the classification map based on spatial relations, which improves its spatial smoothness. Experiments are conducted on three SAR images with different sensors, bands, resolutions, and polarizations to demonstrate generalization ability. The proposed MDFLN-BF is shown to yield superior results to other related deep networks. [ABSTRACT FROM AUTHOR]
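The edge-preserving smoothing step this record describes can be sketched generically. Below is a minimal brute-force bilateral filter, not the paper's MDFLN-BF implementation: it assumes a single-channel map and Gaussian spatial and range kernels, and the bandwidths are illustrative placeholders.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter: each output pixel is a weighted average
    of its neighbours, weighted by spatial distance AND intensity difference,
    so flat regions are smoothed while strong edges survive."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    # Precompute the spatial (domain) kernel once.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    pad = np.pad(img.astype(float), radius, mode='edge')
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: penalise neighbours with different intensity.
            rng_k = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rng_k
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out

# A noisy step edge: the filter smooths each side but keeps the jump.
step = np.concatenate([np.zeros((8, 8)), np.ones((8, 8))], axis=1)
noisy = step + 0.05 * np.random.default_rng(0).standard_normal(step.shape)
smooth = bilateral_filter(noisy)
```

Because a unit step dwarfs `sigma_r`, cross-edge weights are effectively zero, which is exactly why the filter can clean up a classification map without blurring class boundaries.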
- Published
- 2020
- Full Text
- View/download PDF
5. An image processing approach to feature-preserving B-spline surface fairing.
- Author
- Kawasaki, Taro, Jayaraman, Pradeep Kumar, Shida, Kentaro, Zheng, Jianmin, and Maekawa, Takashi
- Subjects
- IMAGE processing, FEATURE extraction, REVERSE engineering, CLOUD computing, IMAGE reconstruction
- Abstract
Reverse engineering of 3D industrial objects such as automobiles and electric appliances is typically performed by fitting B-spline surfaces to scanned point cloud data with a fairing term to ensure smoothness, which often smooths out sharp features. This paper proposes a radically different approach to constructing fair B-spline surfaces, which consists of fitting a surface without a fairing term to capture sharp edges, smoothing the normal field of the constructed surface with feature preservation, and reconstructing the B-spline surface from the smoothed normal field. The core of our method is an image processing based feature-preserving normal field fairing technique. This is inspired by the success of many recent research works on the use of normal field for reconstructing mesh models, and makes use of the impressive simplicity and effectiveness of bilateral-like filtering for image denoising. In particular, our approach adaptively partitions the B-spline surface into a set of segments such that each segment has approximately uniform parameterization, generates an image from each segment in the parameter space whose pixel values are the normal vectors of the surface, and then applies a bilateral filter in the parameter domain to fair the normal field. As a result, our approach inherits the advantages of image bilateral filtering techniques and is able to effectively smooth B-spline surfaces with feature preservation as demonstrated by various examples. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
6. Image denoising feedback framework using split Bregman approach.
- Author
- Kim, Jeong Heon, Akram, Farhan, and Choi, Kwang Nam
- Subjects
- IMAGE denoising, SPLIT Bregman method, IMAGE processing, PIXELS, FILTERING software
- Abstract
In this paper, an image denoising feedback framework is proposed for both color and range images. The proposed method works on an error-minimization principle using the split Bregman method. At first, the image is denoised by computing means over local neighborhoods. Pixels whose difference from the center of the local neighborhood is large relative to the noise variance are then extracted from the denoised image; such pixels have low correlation with their local neighborhood. This information is fed to the feedback function and denoising is performed again, iteratively, to minimize the error. In most cases, the proposed framework yields the best results both qualitatively and quantitatively. It denoises better than bilateral filtering when the edge information in the input images is affected by intense noise. Moreover, during the denoising process the feedback function ensures that edges are not over-smoothed. The proposed framework is applied to denoise both color and range images, showing that, unlike the evaluated state-of-the-art denoising methods, it works effectively on a wide variety of images. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
7. A fast 3D adaptive bilateral filter for ultrasound volume visualization.
- Author
- Kwon, Koojoo, Kim, Min-Su, and Shin, Byeong-Seok
- Subjects
- ULTRASONIC imaging, VISUALIZATION, IMAGINATION, SIGNAL filtering, SPATIAL filters
- Abstract
Background and objective: This paper introduces an effective noise-removal method for medical ultrasound volume data. Ultrasound data usually need to be filtered because they contain significant noise. Conventional two-dimensional (2D) filtering methods cannot use the implicit information between adjacent layers, and existing 3D filtering methods are slow because of complicated filter kernels. One method gains speed by using simple filters, but it removes noise inefficiently and does not take the characteristics of ultrasound sampling into account. To solve this problem, we introduce a fast filtering method using parallel bilateral filtering, and we adjust the filter window size in proportion to its position. Methods: We devised parallel bilateral filtering by obtaining a 3D summed-area table of a quantized spatial filter. The filtering is made adaptive by changing the kernel window size according to the distance from the ultrasound signal transmission point. Results: Experiments compared the noise removal and loss of original data for anisotropic diffusion filtering, bilateral filtering, and adaptive bilateral filtering of ultrasound volume-rendered images. The results show that the adaptive filter correctly accounts for the sampling characteristics of ultrasound volumes. Conclusions: The proposed method removes noise more efficiently and with less distortion of the ultrasound data than existing simple or non-adaptive filtering methods. [ABSTRACT FROM AUTHOR]
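The 3D summed-area-table idea behind this record's speed-up is simple to sketch: after one cumulative-sum pass, any axis-aligned box sum costs eight lookups regardless of window size, which is what makes a position-dependent (adaptive) window affordable. A minimal sketch, omitting the paper's quantized range kernel and the adaptive window-sizing rule:

```python
import numpy as np

def summed_area_table_3d(vol):
    """3D summed-area table: cumulative sums along each axis, padded with a
    leading zero plane per axis so box queries need no boundary checks."""
    sat = vol.cumsum(0).cumsum(1).cumsum(2)
    padded = np.zeros(tuple(s + 1 for s in vol.shape))
    padded[1:, 1:, 1:] = sat
    return padded

def box_sum(sat, z0, y0, x0, z1, y1, x1):
    """Sum of vol[z0:z1+1, y0:y1+1, x0:x1+1] in O(1) via 3D
    inclusion-exclusion over the eight corners of the box."""
    return (sat[z1 + 1, y1 + 1, x1 + 1]
            - sat[z0, y1 + 1, x1 + 1] - sat[z1 + 1, y0, x1 + 1] - sat[z1 + 1, y1 + 1, x0]
            + sat[z0, y0, x1 + 1] + sat[z0, y1 + 1, x0] + sat[z1 + 1, y0, x0]
            - sat[z0, y0, x0])

rng = np.random.default_rng(0)
vol = rng.random((5, 6, 7))
sat = summed_area_table_3d(vol)
```

With this table, growing the kernel window with depth (as the abstract describes for ultrasound sampling) adds no per-voxel cost beyond the eight lookups.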
- Published
- 2016
- Full Text
- View/download PDF
8. Comparison of different image denoising algorithms for Chinese calligraphy images.
- Author
- Huang, Zhi-Kai, Li, Zhi-Hong, Huang, Han, Li, Zhi-Biao, and Hou, Ling-Ying
- Subjects
- COMPARATIVE studies, CHINESE calligraphy, IMAGE denoising, IMAGE reconstruction, NEURAL computers, ALGORITHMS
- Abstract
Rubbing is one of the most universal and perhaps the oldest of the techniques used in printmaking. A carefully made rubbing provides an accurate, full-scale facsimile of the surface reproduced. However, many rubbings have been destroyed by certain events or lack a good means of identification, while others contain a large white background or have become illegible due to erosion. To interpret these images correctly, image restoration techniques are employed; image denoising is one of the important fields in the restoration arena. A great challenge of image denoising is how to preserve the edges and all fine details of a rubbing image while reducing the noise. This paper presents a comprehensive comparative study of image denoising techniques based on the anisotropic diffusion filter, the Wiener filter, total variation (TV), non-local means (NLM), and bilateral filtering. A quantitative comparison is provided by the PSNR, MSE, SNR, UQI, and SSIM of the images. Finally, the paper analyzes the denoising effect of the various algorithms on rubbings and points out their advantages and disadvantages in application. [ABSTRACT FROM AUTHOR]
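Two of the metrics used in comparisons like this one are easy to state precisely. As a reference point, here is a minimal MSE/PSNR implementation (SSIM and UQI are more involved and omitted); the 8-bit peak value of 255 is the usual convention, not something specific to this paper:

```python
import numpy as np

def mse(ref, test):
    """Mean squared error between two equally sized images."""
    return float(np.mean((ref.astype(float) - test.astype(float)) ** 2))

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to ref.
    PSNR = 10 * log10(peak^2 / MSE)."""
    e = mse(ref, test)
    return float('inf') if e == 0 else 10.0 * np.log10(peak ** 2 / e)

clean = np.zeros((8, 8))
noisy = clean + 16.0  # constant error of 16 gray levels
```

For the constant-error example above, MSE is 16² = 256, so PSNR is 10·log10(255²/256) ≈ 24 dB, a typical magnitude for visibly noisy 8-bit images.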
- Published
- 2016
- Full Text
- View/download PDF
9. Real Time Visibility Enhancement for Single Image Haze Removal.
- Author
- Kumari, Apurva and Sahoo, S.K.
- Subjects
- HAZE, METEOROLOGICAL optics, ATMOSPHERIC brown clouds, WEATHER, ATMOSPHERIC pressure
- Abstract
In this paper, we propose an efficient method to remove haze from a single input image. We present an approach based on the Fast Fourier Transform: the transmission map is refined by the dark channel prior method and the Fast Fourier Transform, and finally the scene radiance is recovered using the visibility restoration model. Qualitative and quantitative results demonstrate that this method can effectively remove the effects of bad weather, enhance the contrast of input images, and perform well in comparison with bilateral filtering. Moreover, the proposed method significantly reduces computational complexity; the use of the Fast Fourier Transform makes our approach 88% faster than the bilateral filtering method. A further advantage of the proposed approach is that it is suitable for images dominated by sky background. Owing to its speed and ability to improve visibility, the proposed method may be used in many systems such as surveillance, consumer electronics, and remote sensing. [ABSTRACT FROM AUTHOR]
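The dark channel prior mentioned above rests on a simple observation: in haze-free outdoor images, most local patches contain some pixel that is dark in at least one color channel, so a bright dark channel signals haze. A minimal sketch of the dark channel and the standard transmission estimate t(x) = 1 − ω·dark(I/A); the FFT refinement and visibility-restoration steps of this paper are not shown, and the patch size and ω = 0.95 are conventional choices rather than the paper's settings:

```python
import numpy as np

def dark_channel(img, patch=3):
    """Per-pixel minimum over RGB channels followed by a local minimum
    filter (a grayscale erosion) over a patch x patch window."""
    h, w, _ = img.shape
    mins = img.min(axis=2)
    pad = np.pad(mins, patch // 2, mode='edge')
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = pad[i:i + patch, j:j + patch].min()
    return out

def transmission(img, atmo, omega=0.95, patch=3):
    """Standard dark-channel transmission estimate for atmospheric light
    `atmo` (one value per channel)."""
    return 1.0 - omega * dark_channel(img / atmo, patch)

# A clear scene (one channel near zero) vs. a fully hazed white frame.
clear = np.stack([np.zeros((4, 4)), np.full((4, 4), 0.5), np.full((4, 4), 0.8)], axis=2)
white = np.ones((4, 4, 3))
A = np.array([1.0, 1.0, 1.0])
```

On the clear scene the dark channel is zero everywhere, so estimated transmission is 1 (no haze); on the white frame it collapses to 1 − ω = 0.05 (dense haze).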
- Published
- 2015
- Full Text
- View/download PDF
10. Bilateral filtering inspired locality preserving projections for hyperspectral images.
- Author
- Li, Xinrong, Pan, Jing, He, Yuqing, and Liu, Changshu
- Subjects
- HYPERSPECTRAL imaging systems, DATA analysis, INFORMATION theory, REMOTE sensing, EUCLIDEAN distance
- Abstract
As high-dimensional data, hyperspectral images contain rich information for agricultural remote-sensing classification. Locality preserving projections (LPP) have been widely used for extracting compact and discriminative information from such high-dimensional data. The objective function of LPP is formulated as a sum of the differences between transformed low-dimensional vectors, weighted by a function of the difference between images. The weights are crucial for LPP: they force the reduced feature vectors to preserve the locality property of the original high-dimensional space. In this paper, we borrow the weight design of bilateral filtering to redesign the weights in LPP. The weights in bilateral filtering depend not only on the Euclidean distance between pixels (i.e., the spatial weight) but also on their intensity differences (i.e., the range weight). Analogously, we design the weights in our improved LPP (called bilateral LPP and abbreviated BLPP) as the product of a function of the Euclidean distance ‖x_i − x_j‖ between the original images (the spatial weight) and a function of the Euclidean distance ‖f(x_i) − f(x_j)‖ between the features extracted from the images (the range weight, a.k.a. feature weight). The spatial weight measures similarity in the spatial space, whereas the feature weight measures similarity in the feature space, which reflects the content of the images. Thus, the proposed BLPP utilizes both spatial information and image-content information, which results in a higher recognition rate. Experimental results on the Salinas and Indian Pines hyperspectral databases demonstrate the effectiveness of BLPP. [ABSTRACT FROM AUTHOR]
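The bilateral-style affinity described above is just a product of two Gaussian kernels, one over raw-sample distances and one over feature distances. A minimal sketch; the toy samples, the feature vectors `F`, and the bandwidths are illustrative placeholders, not the paper's settings:

```python
import numpy as np

def blpp_weights(X, F, sigma_s=1.0, sigma_r=1.0):
    """W[i, j] = exp(-||x_i - x_j||^2 / (2 sigma_s^2))        (spatial weight)
               * exp(-||f(x_i) - f(x_j)||^2 / (2 sigma_r^2))  (range weight)."""
    d_x = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    d_f = ((F[:, None, :] - F[None, :, :]) ** 2).sum(-1)
    return np.exp(-d_x / (2 * sigma_s ** 2)) * np.exp(-d_f / (2 * sigma_r ** 2))

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])  # raw samples
F = np.array([[1.0], [1.0], [9.0]])                 # extracted features
W = blpp_weights(X, F)
```

A pair that is close both spatially and in feature space (samples 0 and 1) gets a weight near 1, while a pair that differs in either factor (sample 2) is suppressed, which is exactly the locality the projection is asked to preserve.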
- Published
- 2015
- Full Text
- View/download PDF
11. Enhancement of range data using a structure-aware filter.
- Author
- Park, Min Ki, Lee, Yong Yi, Jang, In Yeop, and Lee, Kwan H.
- Subjects
- INFRARED detectors, DATA acquisition systems, CAMERAS, STATISTICAL smoothing, INFORMATION filtering systems
- Abstract
Range data acquired by affordable IR depth sensors are significantly noisy and lack shape detail, making it difficult to obtain high-quality scans using depth cameras. Most methods to enhance noisy data do not preserve the original structure of the measured scene during data refinement, particularly at sharp edges and object boundaries. We present a novel approach to enhance the range data in a structure-aware manner based on the normal information in a scene. We first computed reliable normals by smoothing out noise and sharpening the edges of individual points using color-guided normal smoothing. Based on the normal information, the noisy range data were refined by a 3D bilateral filtering technique while preserving ridges, valleys, and depth discontinuities. In a comparison, our method outperformed previous techniques in terms of noise suppression and faithful structure reconstruction. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
12. An implicit sliding-motion preserving regularisation via bilateral filtering for deformable image registration.
- Author
- Papież, Bartłomiej W., Heinrich, Mattias P., Fehrenbach, Jérome, Risser, Laurent, and Schnabel, Julia A.
- Subjects
- PLEURA diseases, IMAGE processing, MEDICAL imaging systems, MATHEMATICAL regularization, KERNEL (Mathematics), COMPUTED tomography
- Abstract
Several biomedical applications require accurate image registration that can cope effectively with complex organ deformations. This paper addresses this problem by introducing a generic deformable registration algorithm with a new regularization scheme, which is performed through bilateral filtering of the deformation field. The proposed approach is primarily designed to handle smooth deformations both between and within body structures, and also more challenging deformation discontinuities exhibited by sliding organs. The conventional Gaussian smoothing of deformation fields is replaced by a bilateral filtering procedure, which compromises between the spatial smoothness and local intensity similarity kernels, and is further supported by a deformation field similarity kernel. Moreover, the presented framework does not require any explicit prior knowledge about the organ motion properties (e.g. segmentation) and therefore forms a fully automated registration technique. Validation was performed using synthetic phantom data and publicly available clinical 4D CT lung data sets. In both cases, the quantitative analysis shows improved accuracy when compared to conventional Gaussian smoothing. In addition, we provide experimental evidence that masking the lungs in order to avoid the problem of sliding motion during registration performs similarly in terms of the target registration error when compared to the proposed approach, however it requires accurate lung segmentation. Finally, quantification of the level and location of detected sliding motion yields visually plausible results by demonstrating noticeable sliding at the pleural cavity boundaries. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
13. Bilateral two-dimensional least mean square filter for infrared small target detection.
- Author
- Zhao, Yao, Pan, Haibin, Du, Changping, Peng, Yanrong, and Zheng, Yao
- Subjects
- INFRARED detectors, MEAN square algorithms, GAUSSIAN processes, ANALYSIS of variance, PREDICTION models
- Abstract
Highlights:
• The variance of the Gaussian filter and the step size are adjusted adaptively.
• A leftward filter is added.
• The prediction error is separated by its plus–minus sign.
• The four images of prediction error are fused.
• Clouds and noise are significantly suppressed. [Copyright Elsevier]
- Published
- 2014
- Full Text
- View/download PDF
14. A multiresolution framework for local similarity based image denoising
- Author
- Rajpoot, Nasir and Butt, Irfan
- Subjects
- IMAGE processing, RANDOM noise theory, WHITE noise theory, SIMILARITY transformations, IMAGE reconstruction, WAVELETS (Mathematics), PIXELS
- Abstract
Abstract: In this paper, we present a generic framework for denoising of images corrupted with additive white Gaussian noise based on the idea of regional similarity. The proposed framework employs a similarity function using the distance between pixels in a multidimensional feature space, whereby multiple feature maps describing various local regional characteristics can be utilized, giving higher weight to pixels having similar regional characteristics. An extension of the proposed framework into a multiresolution setting using wavelets and scale space is presented. It is shown that the resulting multiresolution multilateral (MRM) filtering algorithm not only eliminates the coarse-grain noise but can also faithfully reconstruct anisotropic features, particularly in the presence of high levels of noise. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
15. HDR Image Generation based on Intensity Clustering and Local Feature Analysis
- Author
- Jo, Kang-Hyun and Vavilin, Andrey
- Subjects
- HIGH dynamic range imaging, INFORMATION filtering, ALGORITHMS, PIXELS, ARBITRARY constants, ENTROPY, STATISTICAL weighting, IMAGE analysis
- Abstract
Abstract: This paper describes a cluster-based method for combining differently exposed images in order to increase their dynamic range. Initially, an image is decomposed into a set of arbitrarily shaped regions. For each region we compute a utility function based on the amount of information present and an entropy measure. This function is used to select the most appropriate exposure for each region. After the exposures are selected, bilateral filtering is applied to smooth the inter-region transitions. As a result we obtain weighting coefficients for each exposure and pixel, and each pixel of the output image is calculated as a weighted sum of exposures over the clusters of the input images. The proposed method recovers details from overexposed and underexposed parts of the image without introducing additional noise. Our experiments show the effectiveness of the algorithm for high-dynamic-range scenes. It requires no information about shutter speed or camera parameters, and it remains robust even when the exposure difference between input images is 2 stops or more. [Copyright Elsevier]
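The per-pixel weighted-sum combination can be illustrated with a much simpler well-exposedness weight (closeness to mid-gray) standing in for the paper's entropy-based utility function; the region decomposition and the bilateral smoothing of the weight maps are omitted, so this is only a sketch of the final fusion step:

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Per-pixel weighted average of an exposure stack; each pixel is
    weighted by how close it is to mid-gray (0.5), a common proxy for
    'well exposed'. Inputs are floats in [0, 1]."""
    stack = np.stack([im.astype(float) for im in images])
    w = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2)) + 1e-8
    return (w * stack).sum(0) / w.sum(0)

dark = np.full((4, 4), 0.1)     # underexposed frame
bright = np.full((4, 4), 0.9)   # overexposed frame
fused = fuse_exposures([dark, bright])
```

Two frames symmetric around mid-gray fuse to 0.5, while a clipped white frame paired with a well-exposed one is almost entirely ignored, mirroring how the paper's utility function discards over- and underexposed regions.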
- Published
- 2011
- Full Text
- View/download PDF
16. Feasibility of Dose Reduction Using Novel Denoising Techniques for Low kV (80 kV) CT Enterography: Optimization and Validation.
- Author
- Guimarães, Luís S., Fletcher, Joel G., Yu, Lifeng, Huprich, James E., Fidler, Jeff L., Manduca, Armando, Ramirez-Giraldo, Juan Carlos, Holmes, David R., and McCollough, Cynthia H.
- Abstract
Rationale and Objectives: The aim of this study was to optimize and validate projection-space denoising (PSDN) strategies for application to 80-kV computed tomographic (CT) data to achieve 50% dose reduction. Materials and Methods: Image data obtained at 80 kV (mean CT dose index volume, 7.9 mGy) from dual-source, dual-energy CT enterographic (CTE) exams in 42 patients were used. For each exam, nine 80 kV image data sets were reconstructed using PSDN (three levels of intensity) with or without image-based denoising and compared to commercial reconstruction kernels. For optimization, qualitative analysis selected optimal denoising strategies, with quantitative analysis measuring image contrast, noise, and sharpness (full width at half maximum bowel wall thickness, maximum CT number gradient). For validation, two radiologists examined image quality, comparing low-dose 80-kV optimally denoised images to full-dose mixed-voltage images. Results: PSDN algorithms generated the best 80-kV image quality (41 of 42 patients), while the commercial kernels produced the worst (39 of 42) (P < .001). Overall, 80-kV PSDN approaches resulted in higher contrast (mean, 332 vs 290 Hounsfield units), slightly less noise (mean, 20 vs 26 Hounsfield units), but slightly decreased image sharpness (relative bowel wall thickness, 1.069 vs 1.000) compared to full-dose mixed-voltage images. Mean image quality scores for full-dose CTE images were 4.9 compared to 4.5 for optimally denoised half-dose 80-kV CTE images and 3.1 for nondenoised 80-kV CTE images (P < .001). Conclusion: Optimized denoising strategies improve the quality of 80-kV CTE images such that CT data obtained at 50% of routine dose levels approaches the image quality of full-dose exams. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
17. Wavelet domain non-linear filtering for MRI denoising
- Author
- Anand, C. Shyam and Sahambi, Jyotinder S.
- Subjects
- MAGNETIC resonance imaging, IMAGE processing, DIAGNOSTIC imaging, MEDICAL imaging systems, MEDICAL equipment
- Abstract
Abstract: Feature-preserved denoising is of great interest in medical image processing. This article presents a wavelet-based bilateral filtering scheme for noise reduction in magnetic resonance images. Undecimated wavelet transform is employed to provide effective representation of the noisy coefficients. Bilateral filtering of the approximate coefficients improves the denoising efficiency and effectively preserves the edge features. Denoising is done in the square magnitude domain, where the noise tends to be signal independent and is additive. The proposed method has been adapted specifically to Rician noise. The visual and the diagnostic quality of the denoised image is well preserved. The quantitative and the qualitative measures used as the quality metrics demonstrate the ability of the proposed method for noise suppression. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
18. A principled approach to image denoising with similarity kernels involving patches
- Author
- De Decker, Arnaud, Lee, John A., and Verleysen, Michel
- Subjects
- IMAGE analysis, KERNEL functions, PIXELS, MATHEMATICAL convolutions, IMAGE quality analysis, COMPUTATIONAL complexity
- Abstract
Abstract: Denoising is a cornerstone of image analysis and remains a very active research field. This paper deals with image filters that rely on similarity kernels to compute weighted pixel averages. Whereas similarities have been based on the comparison of isolated pixel values until recently, modern filters extend the paradigm to groups of pixels called patches. Significant quality improvements result from the mere replacement of pixel differences with patch-to-patch comparisons directly into the filter. Our objective is to cast this generalization within the framework of mode estimation. Starting from objective functions that are extended to patches, this leads us to slightly different formulations of filters proposed in the literature, such as the local M-smoothers, bilateral filters, and the nonlocal means. A fast implementation of these new filters relying on separable linear-time convolutions is detailed. Experiments show that this principled approach further improves the denoising quality without increasing the computational complexity. [Copyright Elsevier]
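The pixel-to-patch generalization this record discusses is easy to see in one dimension: replace the squared difference of two pixel values with the mean squared difference of the patches around them. A minimal non-local-means sketch built on that idea (a brute-force O(n²) version, not the separable linear-time implementation the paper develops; patch size and bandwidth are illustrative):

```python
import numpy as np

def nlm_1d(signal, patch=3, sigma=0.1):
    """Non-local means on a 1D signal: each sample becomes a weighted
    average of ALL samples, with weights from patch-to-patch (rather
    than value-to-value) squared distances."""
    half = patch // 2
    pad = np.pad(signal, half, mode='edge')
    patches = np.array([pad[i:i + patch] for i in range(len(signal))])
    d2 = ((patches[:, None, :] - patches[None, :, :]) ** 2).mean(-1)
    w = np.exp(-d2 / (sigma ** 2))
    return (w @ signal) / w.sum(axis=1)

rng = np.random.default_rng(1)
step = np.concatenate([np.zeros(20), np.ones(20)])
noisy = step + 0.05 * rng.standard_normal(40)
out = nlm_1d(noisy)
```

Because patches straddling the step differ by roughly a full unit, their weights vanish, so the two plateaus are averaged separately: noise drops on each side while the discontinuity survives.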
- Published
- 2010
- Full Text
- View/download PDF
19. Circular spatial filtering under high-noise-variance conditions
- Author
- Bhoi, Nilamani and Meher, Sukadev
- Subjects
- RANDOM noise theory, COMPUTER graphics, SIGNAL-to-noise ratio, STOCHASTIC processes, DIGITAL image processing, GRAPHICAL user interfaces
- Abstract
Abstract: This paper introduces a novel circular spatial filtering scheme for suppressing additive white Gaussian noise (AWGN) under high-noise-variance conditions. In this method, a circular spatial-domain window, whose weights are derived from two independent functions, (i) spatial distance and (ii) gray-level distance, is employed for filtering. The proposed filter differs from the bilateral filter [Tomasi C, Manduchi R. Bilateral filtering for gray and color images. In: Proceedings of the IEEE international conference on computer vision; 1998. p. 836–46] and performs well under high-noise conditions. It is capable of smoothing Gaussian noise while retaining detailed image information. It performs strongly in terms of peak signal-to-noise ratio (PSNR) and universal quality index (UQI) and outperforms many known spatial-domain and wavelet-domain filters. The filtered image also has better visual quality than those of the existing methods. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
20. Adaptive bilateral filtering of image signals using local phase characteristics
- Author
- Wong, Alexander
- Subjects
- ADAPTIVE filters, SIGNAL-to-noise ratio, SIGNAL processing, SHANNON'S model (Communication)
- Abstract
Abstract: This paper presents a novel perceptually based method for noise reduction of image signals characterized by low signal to noise ratios. The proposed method exploits the local phase characteristics of an image signal to perform bilateral filtering in an adaptive manner. The proposed method takes advantage of the human perception system to preserve perceptually significant signal detail while suppressing perceptually significant noise in the image signal. Experimental results show that the proposed method is effective at removing signal noise while enhancing perceptual quality both quantitatively and qualitatively. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
21. A common framework for nonlinear diffusion, adaptive smoothing, bilateral filtering and mean shift
- Author
- Barash, Danny and Comaniciu, Dorin
- Subjects
- DIFFUSION, SMOOTHING (Numerical analysis), MATHEMATICAL analysis
- Abstract
In this paper, a common framework is outlined for nonlinear diffusion, adaptive smoothing, bilateral filtering and mean shift procedure. Previously, the relationship between bilateral filtering and the nonlinear diffusion equation was explored by using a consistent adaptive smoothing formulation. However, both nonlinear diffusion and adaptive smoothing were treated as local processes applying a 3×3 window at each iteration. Here, these two approaches are extended to an arbitrary window, showing their equivalence and stressing the importance of using large windows for edge-preserving smoothing. Subsequently, it follows that bilateral filtering is a particular choice of weights in the extended diffusion process that is obtained from geometrical considerations. We then show that kernel density estimation applied in the joint spatial–range domain yields a powerful processing paradigm—the mean shift procedure, related to bilateral filtering but having additional flexibility. This establishes an attractive relationship between the theory of statistics and that of diffusion and energy minimization. We experimentally compare the discussed methods and give insights on their performance. [Copyright Elsevier]
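The mean shift procedure this record relates to bilateral filtering can be sketched in one dimension: iterate each point toward the kernel-weighted mean of the data until it settles at a mode of the density. A minimal sketch, assuming a Gaussian kernel and a fixed iteration count (real implementations stop on convergence and merge nearby modes):

```python
import numpy as np

def mean_shift(points, bandwidth=1.0, iters=100):
    """Gaussian-kernel mean shift: each point repeatedly moves to the
    weighted mean of all points, climbing toward a mode of the kernel
    density estimate of `points`."""
    modes = points.astype(float).copy()
    for _ in range(iters):
        d2 = (modes[:, None] - points[None, :]) ** 2
        w = np.exp(-d2 / (2 * bandwidth ** 2))
        modes = (w @ points) / w.sum(axis=1)
    return modes

# Two well-separated clusters collapse onto two modes.
rng = np.random.default_rng(2)
pts = np.concatenate([rng.normal(0.0, 0.1, 20), rng.normal(10.0, 0.1, 20)])
modes = mean_shift(pts)
```

A single mean-shift step is structurally the same weighted average as one pass of a (range-only) bilateral filter; the extra flexibility the paper notes comes from iterating it to convergence.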
- Published
- 2004
- Full Text
- View/download PDF
22. Real-time bas-relief generation from a 3D mesh.
- Author
- Zhang, Yu-Wei, Zhou, Yi-Qi, Zhao, Xiao-Feng, and Yu, Gang
- Subjects
- REAL-time computing, ALGORITHMS, FEATURE extraction, CONTROL theory (Engineering), MATHEMATICAL mappings, IMAGE analysis, SUBROUTINES (Computer programs)
- Abstract
Abstract: Most existing approaches to bas-relief generation operate in image space, which is quite time-consuming in practice. This paper presents a different bas-relief generation algorithm based on geometric compression and starting from a 3D mesh input. The feature details are first extracted from the original objects using a spatial bilateral filtering technique. Then, a view-dependent coordinate mapping method is applied to build the height domain for the current view. After fitting the compression datum plane, the algorithm uses an adaptive compression function to scale and combine the Z values of the base mesh and the fine details. This approach offers control over the level of detail, making it flexible for adjusting the appearance of details. For a typical input mesh with 100k triangles, this algorithm computes a bas-relief in 0.214 s. [Copyright Elsevier]
- Published
- 2013
- Full Text
- View/download PDF
23. Development of a coupled aerosol lidar data quality assurance and control scheme with Monte Carlo analysis and bilateral filtering.
- Author
- Wang, Haibo, Yang, Ting, and Wang, Zifa
- Abstract
Mie-scatter lidar can capture the vertical distribution of aerosols, and highly quantitative lidar data could be coupled with a chemical transport model (CTM). We therefore develop a data quality assurance and control scheme for aerosol lidar (TRANSFER) that mainly comprises a Monte Carlo uncertainty analysis (MCA) and bilateral filtering (BF). The AErosol RObotic NETwork (AERONET) aerosol optical depth (AOD) is used as the ground truth to evaluate the validity of TRANSFER, and the result exhibits a sharp 41% (0.36) decrease in root mean square error (RMSE), indicating an acceptable overall performance. The largest removal of uncertainty occurs in the MCA, with an RMSE of 0.08 km−1, followed by denoising (DN) at 50% of the MCA's RMSE. BF can smooth interior data without destroying the edges of the structure. The most noteworthy correction occurs in summer, with an RMSE of 0.15 km−1 and a Pearson correlation coefficient of 0.8; the least correction occurs in winter, with values of 0.07 km−1 and 0.93, respectively. Overestimations of the raw data are mostly identified, and representative cases occur with weak southerly winds, low visibility, high relative humidity (RH), and high concentrations of both ground fine particulate matter (PM 2.5) and ozone. Apart from long-term variations, the variation in a typical overestimated pollution episode, especially as represented by vertical profiles, shows a favorable performance of TRANSFER during stages of transport and local accumulation, as verified by backward trajectories. The few underestimation cases are mainly attributed to BF smoothing data with a sudden decrease. The main limitation of TRANSFER is the zigzag profiles found in a few cases with very small extinction coefficients. As a supplement to the aerosol lidar research community and an exploration under complicated pollution conditions in China, TRANSFER can aid the preprocessing of lidar-data-powered applications.
Highlights:
• TRANSFER sharply decreases the absolute deviation of raw lidar data by 41%.
• The maximum removal of uncertainty occurs in the Monte Carlo uncertainty analysis; denoising is a supplement.
• Bilateral filtering can smooth interior data without destroying the edges of the structure.
• The most noteworthy correction occurs in summer, the least in winter.
• Representative overestimation episodes occur with weak southerly winds, low visibility, high RH, and high concentrations of both ground PM 2.5 and ozone. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
Discovery Service for Jio Institute Digital Library