3,437 results for "data smoothing"
Search Results
2. Smoothing and approximation of grassland fire loading data for engineering structures by Capped Extended Support Vector Regression
- Author
-
Wang, Qihan, Li, Junxing, Gao, Wei, Li, Guoyin, Liu, Xinpei, and Bradford, Mark A.
- Published
- 2024
- Full Text
- View/download PDF
3. Application of data smoothing and principal component analysis to develop a parameter ranking system for the anaerobic digestion process
- Author
-
Kim, Moonil, Chul, Park, Kim, Wan, and Cui, Fenghao
- Published
- 2022
- Full Text
- View/download PDF
4. Predicting Heart Diseases by Selective Machine Learning Algorithms.
- Author
-
UMAR, N., HASSAN, S. K., UMAR, A., and AHMED, S. S.
- Abstract
Heart disease is among the leading causes of mortality worldwide. As a result, it is critical to diagnose patients appropriately and promptly. Consequently, the objective of this paper was to predict heart diseases using selective machine learning algorithms. The leverage technique was evaluated using the Cleveland heart disease dataset. In this study, five classifiers were trained and tested with both the unsmoothed and the smoothed Cleveland dataset. The results showed that all the classifiers performed better when tested with the smoothed dataset (98.11% accuracy) than with the unsmoothed dataset (89.71% accuracy). The leverage technique also outperformed the works found in the literature reviewed. These results show that feature engineering using data smoothing is effective for improved heart disease prediction. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
5. Identifying Cyclical Patterns of Behavior Using a Moving-Average, Data-Smoothing Manipulation.
- Author
-
Retzlaff, Billie J., Craig, Andrew R., Owen, Todd M., Greer, Brian D., O'Donnell, Alex, and Fisher, Wayne W.
- Subjects
SLEEP, STATISTICAL smoothing, MOVING average process, DATA analysis, MEDICAL personnel
- Abstract
For some individuals, rates of destructive behavior change in a predictable manner, irrespective of the contingencies programmed. Identifying such cyclical patterns can lead to better prediction of destructive behavior and may allow for the identification of relevant biological processes. However, identifying cyclical patterns of behavior can be difficult when using traditional methods of visual analysis. We describe a data-manipulation method, called data smoothing, in which one averages the data across time points within a specified window (e.g., 3, 5, or 7 days). This approach minimizes variability in the data and can increase the saliency of cyclical behavior patterns. We describe two cases for which we identified cyclical patterns in daily occurrences of destructive behavior, and we demonstrate the importance of analyzing smoothed data across various windows when using this approach. We encourage clinicians to analyze behavioral data in this way when rates vary independently of programmed contingencies and other potentially controlling variables have been ruled out (e.g., behavior variability related to sleep behavior). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
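As an editorial aside on the method this entry describes: the windowed averaging is a centered moving average, sketched below in Python with pandas over the 3-, 5-, and 7-day windows the abstract mentions. The daily rates here are invented for illustration, not clinical data.

```python
import pandas as pd

# Hypothetical daily rates of a target behavior (values invented for illustration).
daily_rates = pd.Series(
    [12, 3, 15, 4, 18, 6, 20, 5, 14, 2, 16, 7, 19, 4],
    index=pd.date_range("2024-01-01", periods=14, freq="D"),
)

# Centered moving averages over the window sizes the abstract mentions.
for window in (3, 5, 7):
    smoothed = daily_rates.rolling(window=window, center=True).mean()
    print(f"{window}-day window:\n{smoothed.round(2)}\n")
```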
6. Score filtering for contextualized noise suppression of Poisson‐distributed geophysical signals.
- Author
-
Altdorff, Daniel and Schrön, Martin
- Subjects
STATISTICAL smoothing, ELECTROMAGNETIC induction, MOVING average process, REMOTE sensing, TIME series analysis
- Abstract
Geophysical and remote sensing products that rely on Poisson‐distributed measurement signals, such as cosmic‐ray neutron sensing (CRNS) and gamma spectrometry, often face challenges due to inherent Poisson noise. Common techniques to enhance signal stability include data aggregation or smoothing (e.g., moving averages and interpolation). However, these methods typically reduce the ability to resolve detailed temporal (stationary data) and spatial (mobile data) features. In this study, we introduced a method for contextual noise suppression tailored to Poisson‐distributed data, utilizing a discrete score attribution system. This score filter evaluates each observation against eight different criteria to assess its consistency with surrounding values, assigning a score between 0 (very unlikely) and 8 (very likely) to indicate whether the observation is likely to act as noise. These scores can then be used to flag or remove data points based on user‐defined thresholds. We tested the score filter's effectiveness on both stationary and mobile CRNS data, as well as on gamma‐ray spectrometry and electromagnetic induction (EMI) recordings. In our examples, the score filter consistently outperformed established filters, for example Savitzky–Golay and Kalman, in direct competition when applied to CRNS time series data. Additionally, the score filter substantially reduced Poisson noise in mobile CRNS, gamma‐ray spectrometry and EMI data. The scoring system also provides a context‐sensitive evaluation of individual observations or aggregates, assessing their conformity within the dataset. Given its general applicability, customizable criteria and very low computational demands, the proposed filter is easy to implement and holds promise as a valuable tool for denoising geophysical data and applications in other fields. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
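The abstract does not spell out the eight criteria, so the following Python sketch only illustrates the general idea of a discrete score attribution system: each observation earns one point per neighbourhood-consistency check it passes, and low-scoring points are flagged as likely noise. All checks and thresholds here are invented, not the paper's.

```python
import numpy as np

def score_filter(x, half_window=5, max_dev=2.0):
    """Toy score filter: count how many neighbourhood-consistency checks
    each observation passes (0..8); low scores mark likely noise.
    The checks and thresholds are illustrative, not the paper's criteria."""
    x = np.asarray(x, dtype=float)
    scores = np.zeros(len(x), dtype=int)
    for i in range(len(x)):
        lo, hi = max(0, i - half_window), min(len(x), i + half_window + 1)
        window = np.delete(x[lo:hi], i - lo)          # neighbours only
        med = np.median(window)
        spread = np.std(window) + 1e-12
        checks = [
            abs(x[i] - med) < 1.0 * spread,
            abs(x[i] - med) < 2.0 * spread,
            abs(x[i] - window.mean()) < 1.0 * spread,
            abs(x[i] - window.mean()) < 2.0 * spread,
            x[i] <= window.max(),
            x[i] >= window.min(),
            abs(x[i] - x[max(0, i - 1)]) < max_dev * spread,
            abs(x[i] - x[min(len(x) - 1, i + 1)]) < max_dev * spread,
        ]
        scores[i] = sum(checks)
    return scores

rng = np.random.default_rng(0)
signal = rng.poisson(lam=100, size=200).astype(float)  # Poisson-like counts
signal[50] = 400                                       # inject an outlier
scores = score_filter(signal)
keep = scores >= 5                                     # user-defined threshold
print("flagged indices:", np.where(~keep)[0])
```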
7. Testing protocols for smoothing datasets of hydraulic variables acquired during unsteady flows.
- Author
-
Baydaroğlu, Özlem, Muste, Marian, Cikmaz, Atiye Beyza, Kim, Kyeongdong, Meselhe, Ehab, and Demir, Ibrahim
- Subjects
UNSTEADY flow, THEORY of wave motion, STREAMFLOW, STATISTICAL smoothing, STREAM measurements
- Abstract
Flood wave propagation involves complex flow variable dependencies. Continuous in situ hydrograph peak magnitude and timing data provide the most relevant information for understanding these dependencies. New acoustic instruments can produce experimental evidence by extracting usable signals from noisy datasets. This study presents a new screening protocol for smoothing streamflow data to remove unwanted influences and noise generated by flow perturbations and observational fluctuations. The protocol combines quantitative (statistical fitness parameters) and qualitative (domain expert judgments) evaluations. It is tested with 18 smoothing methods to identify optimal data-conditioning candidates. Sensitivity analyses assess the validity, generality, and scalability of the procedures. The goal of this analysis is to set a mathematical foundation for empirical results that can lead to unified, general conclusions on principles or protocols for unsteady flows propagating in open channels, formulating practical guidance for future data acquisition and processing, and using in situ data to better support data-driven modeling efforts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Improving Radio Signal from Baghdad University Radio Telescope Using the Savitzky-Golay Filter.
- Author
-
Hussein, Zahraa A. and Mahdi, Hareth S.
- Subjects
COLLEGE radio stations, RADIO astronomy, ASTRONOMICAL observations, GEODETIC astronomy, STATISTICAL smoothing
- Published
- 2024
- Full Text
- View/download PDF
9. A filter calibration method for laser-scanned weld toe geometries
- Author
-
Finn Renken, Matthias Jung, Sören Ehlers, and Moritz Braun
- Subjects
Weld toe measurement, Laser scanning, Filter calibration, Data smoothing, Universal filter, Engineering (General). Civil engineering (General)
- Abstract
The scanning of weld seams can be used to evaluate the local weld toe geometry for fatigue assessments. Laser scanned weld seam profiles often contain noise which complicates the accurate measurement of the weld toe geometry. For that reason, filtering of the scanned data is necessary. The issue at hand is that a filtering method can significantly affect the measurement results. Therefore, a calibration of the filter input parameters is needed. In this study, a calibration method for filtered laser-scanned weld profiles is presented by using artificial weld toe geometries. The adjustment of different filter functions is achieved by using an optimization method on predefined weld toes with an artificial noise. The resulting input data for the filter functions is tested on a real specimen to verify the method. Through the calibration method it is possible to achieve satisfactory measurement results with precisely set input parameters for the filter functions. The most suitable filter functions for the measurement of the weld toe are the Gaussian and the Lowpass filter. Both functions are adequate as a universally applicable filter. For the evaluation of the measurement results of the radii and angles, a tolerance range is introduced, which is defined by the theoretically minimum measurable radii and angles. Using an adjusted Lowpass filter and a point distance of 0.07 mm set by the laser scanner, a measurement within the tolerance range of 0.2 mm is achievable for the weld toe radius. For the weld toe angle, the tolerance range of 1.5° is achieved for the majority of measurements.
- Published
- 2024
- Full Text
- View/download PDF
10. Improving Radio Signal from Baghdad University Radio Telescope Using the Savitzky-Golay Filter
- Author
-
Zahraa A. Hussein and Hareth S. Mahdi
- Subjects
Baghdad University Radio Telescope BURT, Data Smoothing, Radio Astronomy, Savitzky Golay Filter, Signal Processing, Physics
- Abstract
Astronomical radio observations from a small radio telescope suffer from various types of noise. Hence, astronomers continuously search for new techniques to eliminate or reduce such noise and obtain more accurate results. This research investigates the impact of implementing the Savitzky-Golay filter on enhancing radio observation signals retrieved from the Baghdad University Radio Telescope (BURT). Observations from BURT were carried out for different Galactic coordinates, and then a MATLAB code was written and used to implement the Savitzky-Golay filter for the collected data. This process provides an assessment of the ability of the filter to reduce noise and improve the quality of the signal. The results of this research clearly showed that applying the Savitzky-Golay filter reduces the noise and enhances the signal of astronomical radio observations. However, the filter should be used appropriately to preserve the original features of the signal. In conclusion, the filter is considered an efficient tool for enhancing the radio signal by reducing the noise and smoothing the signal. Therefore, the filter provides a substantial contribution and improvement to the field of radio astronomy.
- Published
- 2024
- Full Text
- View/download PDF
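The paper implements the filter in MATLAB; purely to illustrate the same operation, here is an equivalent call in Python with SciPy. The window length and polynomial order are arbitrary choices, not those tuned for BURT, and the signal is synthetic.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 500)
# Stand-in for a noisy drift-scan radio signal (synthetic, for illustration).
clean = np.exp(-((t - 0.5) / 0.08) ** 2)
raw = clean + 0.15 * rng.standard_normal(t.size)

# window_length must be odd and greater than polyorder.
smooth = savgol_filter(raw, window_length=31, polyorder=3)
print("noise std before/after:",
      np.std(raw - clean).round(3), np.std(smooth - clean).round(3))
```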
11. Modeling Infectious Disease Trend using Sobolev Polynomials
- Author
-
Rolly Czar Joseph Castillo, Victoria May Mendoza, Jose Ernie Lope, and Renier Mendoza
- Subjects
data smoothing, sobolev polynomials, covid-19, mpox, schistosomiasis, whittaker-henderson method, Biology (General), Mathematics
- Abstract
Trend analysis plays an important role in infectious disease control. An analysis of the underlying trend in the number of cases or the mortality of a particular disease allows one to characterize its growth. Trend analysis may also be used to evaluate the effectiveness of an intervention to control the spread of an infectious disease. However, trends are often not readily observable because of noise in the data, commonly caused by random factors, short-term repeated patterns, or measurement error. In this paper, a smoothing technique that generalizes the Whittaker-Henderson method to infinite dimension, and whose solution is represented by a polynomial, is applied to extract the underlying trend in infectious disease data. The solution is obtained by projecting the problem to a finite-dimensional space using an orthonormal Sobolev polynomial basis obtained from the Gram-Schmidt orthogonalization procedure and a smoothing parameter computed using the Philippine Eagle Optimization Algorithm, which is more efficient and consistent than the hybrid model used in earlier work. Because the trend is represented by the polynomial solution, extreme points, concavity, and periods when infectious disease cases are increasing or decreasing can be easily determined. Moreover, one can easily generate forecasts of cases using the polynomial solution. This approach is applied in the analysis of trends and in forecasting cases of different infectious diseases.
- Published
- 2023
- Full Text
- View/download PDF
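A hedged illustration of why a polynomial trend is convenient for this kind of analysis: once the trend is a polynomial, extrema and rising/falling periods follow from its derivative. The sketch below uses a plain least-squares polynomial fit on invented case counts; the paper's Sobolev-basis solution with an optimized smoothing parameter is more sophisticated than this.

```python
import numpy as np

# Synthetic daily case counts (invented), fitted with an ordinary
# least-squares polynomial trend.
days = np.arange(120)
rng = np.random.default_rng(1)
cases = 500 + 400 * np.sin(days / 20) + 30 * rng.standard_normal(days.size)

trend = np.polynomial.Polynomial.fit(days, cases, deg=6)
slope = trend.deriv()

# Because the trend is a polynomial, extrema are just the real roots of the
# derivative, and the sign of the slope gives rising/falling periods.
crit = [r.real for r in slope.roots() if abs(r.imag) < 1e-9 and 0 <= r.real <= 119]
print("turning points near days:", np.round(crit, 1))
print("cases rising on day 10:", bool(slope(10) > 0))
```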
12. A New Approach for Electromagnetic Log Prediction Using Electrical Logs, South California.
- Author
-
Aftab, S., Leisi, A., Manaman, N. Shad, and Moghadam, R. Hamidzadeh
- Subjects
ELECTRIC logging, DATA logging, ROCK properties, MACHINE learning, STATISTICAL smoothing, PROPERTIES of fluids
- Abstract
Well logging data show the change of physical properties of rocks and fluids in lithology units with depth. Well logging is one of the main parts of natural resources exploration. But in some cases, due to the lack of geophysical equipment or high exploration costs, it is not possible to record some geophysical logs. In such cases, estimating the desired log from other geophysical logs is a suitable solution; in this paper, an electromagnetic log is predicted using electrical logs for the first time. For the estimation of geophysical logs, machine learning algorithms are used in most cases. In this research, a new strategy is developed for the processing and preparation of geophysical logs. This strategy consists of three parts: data smoothing, a correlation intensifier, and MLR (Multiple Linear Regression) or ANN (Artificial Neural Network). The purpose of the data smoothing and correlation intensifier steps is to remove outliers and identify the pattern of the main changes in the log data, and as a result, the accuracy in estimating the target log increases. In this article, the electromagnetic log is determined using electric logs. The well logging data were recorded in Southern California and the Central Valley. A total of six wells were selected: four wells for MLR and ANN training and two wells for testing. By applying data smoothing and the correlation intensifier to these data, the correlation between the electrical and electromagnetic data increased significantly, raising the estimation accuracy of the electromagnetic log above 95%. The use of this strategy is not limited to the estimation of electromagnetic logs and can be applied to all well logging data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Application of Artificial Intelligence Tools, Data Processing, and Analysis in the Forecasting of Level and Flow Variables in Wells with Little Data from the Morroa Aquifer
- Author
-
Manrique, Carlos Cohen, Villa, J. L., Month, A. A., Velilla, G. Perez, Filipe, Joaquim, Editorial Board Member, Ghosh, Ashish, Editorial Board Member, Prates, Raquel Oliveira, Editorial Board Member, Zhou, Lizhu, Editorial Board Member, Figueroa-García, Juan Carlos, editor, Hernández, German, editor, Villa Ramirez, Jose Luis, editor, and Gaona García, Elvis Eduardo, editor
- Published
- 2023
- Full Text
- View/download PDF
14. Extended Smoothing Methods for Sparse Test Data Based on Zero-Padding.
- Author
-
Zhou, Pan, Shi, Tuo, Xin, Jianghui, Li, Yaowei, Lv, Tian, and Zang, Liguo
- Subjects
DISCRETE Fourier transforms, WAVENUMBER, STATISTICAL smoothing, TEST methods
- Abstract
To address the problem of sparse measurement points caused by test conditions in engineering, a smoothing method based on zero-padding in the wavenumber domain is proposed to increase data density. First, the principle of data extension and smoothing is introduced. The core idea is to extend the discrete data series by zero-padding in the wavenumber domain; the conversion between the spatial and wavenumber domains is achieved using the Discrete Fourier Transform (DFT) and the Inverse Discrete Fourier Transform (IDFT). Then, two sets of two-dimensional discrete random data are extended and smoothed, and the results verify the effectiveness and feasibility of the algorithm. The method can effectively increase the density of test data in engineering tests, achieve smoothing, and extends to related data-processing applications. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
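A minimal 1-D sketch of the zero-padding principle the abstract describes (the paper works with two-dimensional test data): transform with the DFT, insert zeros at the high-wavenumber end of the spectrum, and inverse-transform to obtain a denser series that still passes through the original samples. The data and upsampling factor are invented.

```python
import numpy as np

def zero_pad_density(x, factor=4):
    """Increase sample density by zero-padding the DFT spectrum.
    Zeros are inserted at the high-wavenumber end (spectrum midpoint),
    and the amplitude is rescaled by the length ratio."""
    n = len(x)
    X = np.fft.fft(x)
    m = n * factor
    Xp = np.zeros(m, dtype=complex)
    half = n // 2
    Xp[:half] = X[:half]
    Xp[-(n - half):] = X[half:]
    # .real drops the small imaginary residue from the unsplit Nyquist bin.
    return np.fft.ifft(Xp).real * (m / n)

x = np.array([0.0, 1.0, 0.5, 2.0, 1.2, 0.3, 1.8, 0.9])   # sparse test data
dense = zero_pad_density(x, factor=4)
print(len(x), "->", len(dense), "points; originals preserved:",
      np.allclose(dense[::4], x))
```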
15. Searching for Alternatives to the Savitzky–Golay Filter in the Spectral Processing Domain.
- Author
-
Kałka, Andrzej J. and Turek, Andrzej M.
- Subjects
FAST Fourier transforms, SIGNAL processing, FOURIER transforms, NUMERICAL differentiation, STATISTICAL smoothing
- Abstract
The elegant, well-established data filter concept proposed originally by Abraham Savitzky and Marcel J.E. Golay is undoubtedly a very effective tool, yet it is not free from limitations and drawbacks. Despite the latter, over the years it has become a "monopolist" in many fields of spectra processing, claiming a "commercial" superiority over alternative approaches, which would potentially allow one to obtain equivalent or, in some cases, even more reliable results. In order to show that basic operations performed on spectral datasets, like smoothing or differentiation, do not have to be equated with the application of one particular algorithm, several such alternatives are briefly presented within this paper and discussed with regard to their practical realization. A special emphasis is put on the fast Fourier methodology (FFT), which is widespread in the general domain of signal processing. Finally, a user-friendly Matlab routine, in which the outlined algorithms are implemented, is shared, so that one can select and apply the technique of spectral data processing most adequate for their individual requirements without the need to code it prior to use. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
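As a sketch of the fast-Fourier alternative the paper emphasizes (their routine is in Matlab; the cutoff fraction here is an arbitrary placeholder): smooth by zeroing high-frequency DFT coefficients.

```python
import numpy as np

def fft_lowpass(y, keep_fraction=0.1):
    """Smooth a signal by keeping only the lowest-frequency DFT
    coefficients; rfft/irfft keep the result real-valued."""
    Y = np.fft.rfft(y)
    cutoff = max(1, int(len(Y) * keep_fraction))
    Y[cutoff:] = 0.0
    return np.fft.irfft(Y, n=len(y))

rng = np.random.default_rng(3)
x = np.linspace(0, 2 * np.pi, 512)
noisy = np.sin(3 * x) + 0.3 * rng.standard_normal(x.size)   # invented spectrum
smoothed = fft_lowpass(noisy, keep_fraction=0.05)
print("residual std:", np.std(smoothed - np.sin(3 * x)).round(3))
```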
16. On smoothing of data using Sobolev polynomials
- Author
-
Rolly Czar Joseph Castillo and Renier Mendoza
- Subjects
data smoothing, whittaker-henderson method, sobolev polynomials, high-frequency data, approximation, generalized cross validation score, Mathematics
- Abstract
Data smoothing is a method that involves finding a sequence of values that exhibits the trend of a given set of data. This technique has useful applications in dealing with time series data with underlying fluctuations or seasonality and is commonly carried out by solving a minimization problem with a discrete solution that takes into account data fidelity and smoothness. In this paper, we propose a method to obtain the smooth approximation of data by solving a minimization problem in a function space. The existence of the unique minimizer is shown. Using polynomial basis functions, the problem is projected to a finite dimension. Unlike the standard discrete approach, the complexity of our method does not depend on the number of data points. Since the calculated smooth data is represented by a polynomial, additional information about the behavior of the data, such as rate of change, extreme values, concavity, etc., can be drawn. Furthermore, interpolation and extrapolation are straightforward. We demonstrate our proposed method in obtaining smooth mortality rates for the Philippines, analyzing the underlying trend in COVID-19 datasets, and handling incomplete and high-frequency data.
- Published
- 2022
- Full Text
- View/download PDF
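For contrast with the paper's function-space formulation, here is a sketch of the standard discrete Whittaker-Henderson smoother it generalizes: minimize data fidelity plus a lambda-weighted sum of squared second differences, which reduces to a single sparse linear solve. Lambda and the test series are invented.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def whittaker_henderson(y, lam=100.0):
    """Discrete Whittaker-Henderson smoothing:
    minimize ||y - z||^2 + lam * ||D2 z||^2, solved as
    (I + lam * D2^T D2) z = y with a sparse second-difference matrix D2."""
    n = len(y)
    D2 = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    A = (sparse.identity(n) + lam * (D2.T @ D2)).tocsc()
    return spsolve(A, y)

rng = np.random.default_rng(7)
y = np.cumsum(rng.standard_normal(300)) + rng.standard_normal(300)
z = whittaker_henderson(y, lam=1000.0)
print("roughness before/after:",
      np.sum(np.diff(y, 2) ** 2).round(1), np.sum(np.diff(z, 2) ** 2).round(1))
```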
17. Data smoothing with applications to edge detection
- Author
-
Al-Jamal Mohammad F., Baniabedalruhman Ahmad, and Alomari Abedel-Karrem
- Subjects
data smoothing, numerical differentiation, noisy data, diffusion equation, regularization, edge detection, Mathematics
- Abstract
The aim of this paper is to present a new stable method for smoothing and differentiating noisy data defined on a bounded domain $\Omega \subset \mathbb{R}^N$ with $N \ge 1$. The proposed method stems from the smoothing properties of the classical diffusion equation; the smoothed data are obtained by solving a diffusion equation with the noisy data imposed as the initial condition. We analyze the stability and convergence of the proposed method and we give optimal convergence rates. One of the main advantages of this method lies in multivariable problems, where some of the other approaches are not easily generalized. Moreover, this approach does not require strong smoothness assumptions on the underlying data, which makes it appealing for detecting data corners or edges. Numerical examples demonstrate the feasibility and robustness of the method even in the presence of a large amount of noise.
- Published
- 2022
- Full Text
- View/download PDF
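A minimal sketch of the core idea under simple assumptions (1-D data, an explicit finite-difference scheme, Neumann boundaries): impose the noisy data as the initial condition of the heat equation and integrate, with diffusion time acting as the smoothing parameter. The paper's actual analysis covers bounded domains in $\mathbb{R}^N$ with convergence rates.

```python
import numpy as np

def diffusion_smooth(y, steps=200, r=0.25):
    """Smooth data by evolving u_t = u_xx with the noisy data as the
    initial condition (explicit scheme; r = dt/dx^2 <= 0.5 for stability).
    More steps = more diffusion time = heavier smoothing."""
    u = np.asarray(y, dtype=float).copy()
    for _ in range(steps):
        interior = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])
        u[1:-1] = interior
        u[0], u[-1] = u[1], u[-2]        # homogeneous Neumann boundaries
    return u

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 400)
noisy = np.abs(x - 0.5) + 0.05 * rng.standard_normal(x.size)  # data with a corner
smoothed = diffusion_smooth(noisy, steps=300)
print("residual std:", np.std(smoothed - np.abs(x - 0.5)).round(3))
```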
18. An Automated Flight Parameter Estimation Technique Using Genetic Algorithm
- Author
-
Anand, Nikhil, Sabarinath, A., Mettu, Ramesh, Mohan, Chindu, Nisha, S., Geetha, S., Ayyappan, G., Chaari, Fakher, Series Editor, Haddar, Mohamed, Series Editor, Kwon, Young W., Series Editor, Gherardini, Francesco, Series Editor, Ivanov, Vitalii, Series Editor, Cavas-Martínez, Francisco, Series Editor, Trojanowska, Justyna, Series Editor, Saran, V. H., editor, and Misra, Rakesh Kumar, editor
- Published
- 2021
- Full Text
- View/download PDF
19. Deterministic Prediction Theory
- Author
-
Daras, Nicholas J. and Rassias, Themistocles M., editor
- Published
- 2021
- Full Text
- View/download PDF
20. On smoothing of data using Sobolev polynomials.
- Author
-
Castillo, Rolly Czar Joseph and Mendoza, Renier
- Subjects
SOBOLEV spaces, APPROXIMATION theory, GENERALIZATION, INTERPOLATION, SMOOTHNESS of functions
- Abstract
Data smoothing is a method that involves finding a sequence of values that exhibits the trend of a given set of data. This technique has useful applications in dealing with time series data with underlying fluctuations or seasonality and is commonly carried out by solving a minimization problem with a discrete solution that takes into account data fidelity and smoothness. In this paper, we propose a method to obtain the smooth approximation of data by solving a minimization problem in a function space. The existence of the unique minimizer is shown. Using polynomial basis functions, the problem is projected to a finite dimension. Unlike the standard discrete approach, the complexity of our method does not depend on the number of data points. Since the calculated smooth data is represented by a polynomial, additional information about the behavior of the data, such as rate of change, extreme values, concavity, etc., can be drawn. Furthermore, interpolation and extrapolation are straightforward. We demonstrate our proposed method in obtaining smooth mortality rates for the Philippines, analyzing the underlying trend in COVID-19 datasets, and handling incomplete and high-frequency data. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
21. Data filtering methods for SARS-CoV-2 wastewater surveillance
- Author
-
Rezgar Arabzadeh, Daniel Martin Grünbacher, Heribert Insam, Norbert Kreuzinger, Rudolf Markt, and Wolfgang Rauch
- Subjects
data smoothing, pandemic management, sars-cov-2, signal filtering, virus monitoring, wastewater-based epidemiology, Environmental technology. Sanitary engineering
- Abstract
In the case of SARS-CoV-2 pandemic management, wastewater-based epidemiology aims to derive information on the infection dynamics by monitoring virus concentrations in the wastewater. However, due to the intrinsic random fluctuations of the viral signal in wastewater, caused by several influencing factors that cannot be determined in detail (e.g. dilutions; number of people discharging; variations in virus excretion; water consumption per day; transport and fate processes in the sewer system), the subsequent prevalence analysis may result in misleading conclusions. It is thus helpful to apply data filtering techniques to reduce the noise in the signal. In this paper we investigate 13 smoothing algorithms applied to the virus signals monitored in four wastewater treatment plants in Austria. The parameters of the algorithms have been defined by an optimization procedure targeting performance metrics. The results are further investigated by means of a cluster analysis. While all algorithms are in principle applicable, SPLINE, Generalized Additive Model and Friedman's Super Smoother are recognized as superior methods in this context (with the latter two having a tendency to over-smoothing). A first analysis of the resulting datasets indicates the positive effect of filtering on the correlation of the viral signal with monitored incidence values. HIGHLIGHTS: The random component in the timeline of SARS-CoV-2 virus concentration makes data filtering necessary. Thirteen common filtering techniques are investigated for their potential to smooth the virus signals. SPLINE, GAM and Friedman's Super Smoother are seen as superior algorithms for smoothing SARS-CoV-2 signals.
- Published
- 2021
- Full Text
- View/download PDF
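Of the thirteen filters compared, SPLINE is one of the three rated superior; here is a sketch with SciPy's smoothing spline on an invented noisy series. The smoothing factor s is set by eye, whereas the paper tunes parameters against performance metrics.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(11)
day = np.arange(90.0)
# Invented stand-in for a noisy viral-load time series.
signal = 1e5 * np.exp(-((day - 45) / 18.0) ** 2)
observed = signal * rng.lognormal(mean=0.0, sigma=0.4, size=day.size)

# s controls the fidelity/smoothness trade-off (arbitrary choice here).
spline = UnivariateSpline(day, observed, k=3, s=len(day) * observed.var() * 0.05)
smoothed = spline(day)
print("correlation with true curve:", np.corrcoef(smoothed, signal)[0, 1].round(3))
```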
22. Modelling Norm Scores with the cNORM Package in R
- Author
-
Sebastian Gary, Wolfgang Lenhard, and Alexandra Lenhard
- Subjects
regression-based norming, continuous norming, inferential norming, data smoothing, curve fitting, percentile estimation, Psychology
- Abstract
In this article, we explain and demonstrate how to model norm scores with the cNORM package in R. This package is designed specifically to determine norm scores when the latent ability to be measured covaries with age or other explanatory variables such as grade level. The mathematical method used in this package draws on polynomial regression to model a three-dimensional hyperplane that smoothly and continuously captures the relation between raw scores, norm scores and the explanatory variable. By doing so, it overcomes the typical problems of classical norming methods, such as overly large age intervals, missing norm scores, large amounts of sampling error in the subsamples or huge requirements with regard to the sample size. After a brief introduction to the mathematics of the model, we describe the individual methods of the package. We close the article with a practical example using data from a real reading comprehension test.
- Published
- 2021
- Full Text
- View/download PDF
23. Comparison of Different Smoothing Methods for Initial Data of the DSN-PC Sensor Network
- Author
-
Vas, Ádám, Tóth, László, Filipe, Joaquim, Editorial Board Member, Ghosh, Ashish, Editorial Board Member, Prates, Raquel Oliveira, Editorial Board Member, Zhou, Lizhu, Editorial Board Member, Vishnevskiy, Vladimir M., editor, Samouylov, Konstantin E., editor, and Kozyrev, Dmitry V., editor
- Published
- 2020
- Full Text
- View/download PDF
24. Constrained data smoothing via optimal control.
- Author
-
Dontchev, Assen L., Kolmanovsky, Ilya V., and Tran, Trung B.
- Subjects
STATISTICAL smoothing, CONTINUOUS functions, SMOOTHING (Numerical analysis), PROBLEM solving, SPLINES, INTEGRATORS
- Abstract
The article considers a problem of best smoothing in a strip, where the objective is to find a function $f : [0,1] \to \mathbb{R}$ that satisfies bilateral constraints on its values, $d(t) \le f(t) \le e(t)$ for all $0 \le t \le 1$, and minimizes a weighted sum of the $L_2$-norm of the second derivative and the squared deviations from specified values, $y_i$, at discrete points $0 = t_1 < \cdots < t_N = 1$.
- Published
- 2022
- Full Text
- View/download PDF
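A discrete analogue of the strip problem, sketched with CVXPY (assumed available): minimize squared deviations plus a weighted squared second-difference penalty subject to d <= z <= e on a grid. The paper itself solves the continuous problem via optimal control; the data, strip, and weight below are invented.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
n = 100
t = np.linspace(0, 1, n)
y = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(n)        # noisy values
d = np.sin(2 * np.pi * t) - 0.3                                 # lower bound
e = np.sin(2 * np.pi * t) + 0.3                                 # upper bound

z = cp.Variable(n)
lam = 50.0
roughness = cp.sum_squares(z[2:] - 2 * z[1:-1] + z[:-2])        # discrete f''
objective = cp.Minimize(cp.sum_squares(z - y) + lam * roughness)
problem = cp.Problem(objective, [z >= d, z <= e])
problem.solve()
ok = np.all((z.value >= d - 1e-6) & (z.value <= e + 1e-6))
print("within the strip everywhere:", bool(ok))
```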
25. A filter calibration method for laser-scanned weld toe geometries
- Author
-
Renken, Finn, Jung, Matthias, Ehlers, Sören, and Braun, Moritz
- Abstract
The scanning of weld seams can be used to evaluate the local weld toe geometry for fatigue assessments. Laser scanned weld seam profiles often contain noise which complicates the accurate measurement of the weld toe geometry. For that reason, filtering of the scanned data is necessary. The issue at hand is that a filtering method can significantly affect the measurement results. Therefore, a calibration of the filter input parameters is needed. In this study, a calibration method for filtered laser-scanned weld profiles is presented by using artificial weld toe geometries. The adjustment of different filter functions is achieved by using an optimization method on predefined weld toes with an artificial noise. The resulting input data for the filter functions is tested on a real specimen to verify the method. Through the calibration method it is possible to achieve satisfactory measurement results with precisely set input parameters for the filter functions. The most suitable filter functions for the measurement of the weld toe are the Gaussian and the Lowpass filter. Both functions are adequate as a universally applicable filter. For the evaluation of the measurement results of the radii and angles, a tolerance range is introduced, which is defined by the theoretically minimum measurable radii and angles. Using an adjusted Lowpass filter and a point distance of 0.07 mm set by the laser scanner, a measurement within the tolerance range of 0.2 mm is achievable for the weld toe radius. For the weld toe angle, the tolerance range of 1.5° is achieved for the majority of measurements.
- Published
- 2024
26. A binary search algorithm for univariate data approximation and estimation of extrema by piecewise monotonic constraints.
- Author
-
Demetriou, Ioannis C.
- Subjects
SEARCH algorithms, APPROXIMATION algorithms, LEAST squares, CONVEX functions, GEOPHYSICS, FORTRAN
- Abstract
The piecewise monotonic approximation problem makes the least changes to n univariate noisy data so that the piecewise linear interpolant to the new values is composed of at most k monotonic sections. The term "least changes" is defined in the sense of a global sum of strictly convex functions of changes. The main difficulty in this calculation is that the extrema of the interpolant have to be found automatically, but the number of all possible combinations of extrema can be $O(n^{k-1})$, which makes it impracticable to test each one separately. It is known that the case $k = 1$ is straightforward, and that the case $k > 1$ reduces to partitioning the data into at most k disjoint sets of adjacent data and solving a $k = 1$ problem for each set. Some ordering relations of the extrema are studied that establish three quite efficient algorithms by using a binary search method for partitioning the data. In the least squares case the total work is only $O(n\sigma + k\sigma \log_2 \sigma)$ computer operations when $k \ge 3$ and is only $O(n)$ when $k = 1$ or 2, where $\sigma - 2$ is the number of sign changes in the sequence of the first differences of the data. Fortran software has been written for this case and the numerical results indicate superior performance to existing algorithms. Some examples with real data illustrate the method. Many applications of the method arise from bioinformatics, energy, geophysics, medical imaging, and peak finding in spectroscopy, for instance. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
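For the straightforward k = 1 case mentioned in the abstract, the least-squares solution is classical isotonic regression (pool-adjacent-violators); a sketch with scikit-learn follows. The paper's contribution, the binary-search partitioning for k > 1, is not reproduced here, and the data are invented.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(4)
x = np.arange(50.0)
y = np.log1p(x) + 0.3 * rng.standard_normal(x.size)   # noisy increasing trend

# k = 1 piecewise monotonic approximation under least squares:
# the best nondecreasing fit (pool-adjacent-violators algorithm).
fit = IsotonicRegression(increasing=True).fit_transform(x, y)
print("monotone:", bool(np.all(np.diff(fit) >= 0)))
print("sum of squared changes:", float(np.sum((fit - y) ** 2)))
```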
27. Principal component analysis-assisted selection of optimal denoising method for oil well transient data
- Author
-
Bing Zhang, Khafiz Muradov, and Akindolu Dada
- Subjects
Intelligent well, Downhole gauge, Pressure and temperature transient analysis (PTTA), Data smoothing, Wavelet threshold denoising, Principal component analysis (PCA), Petroleum refining. Petroleum products, Petrology
- Abstract
Oil and gas production wells are often equipped with modern, permanent or temporary in-well monitoring systems, either electronic or fiber-optic, typically for measurement of downhole pressure and temperature. Consequently, novel methods of pressure and temperature transient analysis (PTTA) have emerged in the past two decades, able to interpret subtle thermodynamic effects. Such analysis demands high-quality data, and a high level of noise reduction is often needed to ensure sufficient reliability of the PTTA. This paper considers the case of a state-of-the-art intelligent well equipped with fiber-optic, high-precision, permanent downhole gauges, followed by screening, development, verification and application of data denoising methods that overcome the limitations of existing noise reduction methods. First, the specific types of noise contained in the original data are analyzed by wavelet transform, and the corresponding denoising methods are selected on the basis of the wavelet analysis. Then, the wavelet threshold denoising method is used for the data with white noise and white Gaussian noise, while a data smoothing method is used for the data with impulse noise. The paper further proposes a comprehensive evaluation index as a denoising success metric for selecting the optimal combination of noise reduction methods. This metric comprises a weighted combination of the signal-to-noise ratio and a smoothness value, where principal component analysis is used to determine the weights. Thus the workflow proposed here can be defined solely by the data via its processing and analysis. Finally, the effectiveness of the optimal selection methods is confirmed by the robustness of the PTTA results derived from the denoised measurements from the above-mentioned oil wells.
- Published
- 2020
- Full Text
- View/download PDF
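A loose sketch of the composite-index idea in Python: score candidate denoisers by signal-to-noise ratio and smoothness, and weight the two metrics by the first principal component. The metric formulas, candidate filters, and data below are illustrative stand-ins, not the paper's definitions.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.ndimage import gaussian_filter1d, median_filter

def snr(clean_est, residual):
    # Common textbook SNR in dB; not necessarily the paper's formula.
    return 10 * np.log10(np.sum(clean_est ** 2) / np.sum(residual ** 2))

def smoothness(z):
    # Lower second-difference energy = smoother; negate so higher is better.
    return -np.sum(np.diff(z, 2) ** 2)

rng = np.random.default_rng(9)
t = np.linspace(0, 1, 600)
noisy = np.sin(6 * t) + 0.2 * rng.standard_normal(t.size)   # invented data

candidates = {
    "gaussian": gaussian_filter1d(noisy, sigma=5),
    "median": median_filter(noisy, size=11),
    "moving_avg": np.convolve(noisy, np.ones(11) / 11, mode="same"),
}

# Metric matrix: one row per candidate, columns = (SNR, smoothness).
M = np.array([[snr(z, noisy - z), smoothness(z)] for z in candidates.values()])
M = (M - M.mean(axis=0)) / M.std(axis=0)          # standardize columns
w = np.abs(PCA(n_components=1).fit(M).components_[0])
w /= w.sum()                                      # PCA-derived weights
scores = M @ w
best = max(zip(scores, candidates), key=lambda p: p[0])[1]
print("selected method:", best)
```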
28. Extended Smoothing Methods for Sparse Test Data Based on Zero-Padding
- Author
-
Pan Zhou, Tuo Shi, Jianghui Xin, Yaowei Li, Tian Lv, and Liguo Zang
- Subjects
two-dimensional discrete data, data extensions, data smoothing, zero-padding, Technology, Engineering (General). Civil engineering (General), Biology (General), Physics, Chemistry
- Abstract
To address the problem of sparse measurement points caused by test conditions in engineering, a smoothing method based on zero-padding in the wavenumber domain is proposed to increase data density. First, the principle of data extension and smoothing is introduced. The core idea is to extend the discrete data series by zero-padding in the wavenumber domain; the conversion between the spatial and wavenumber domains is achieved using the Discrete Fourier Transform (DFT) and the Inverse Discrete Fourier Transform (IDFT). Then, two sets of two-dimensional discrete random data are extended and smoothed, and the results verify the effectiveness and feasibility of the algorithm. The method can effectively increase the density of test data in engineering tests, achieve smoothing, and extends to related data-processing applications.
- Published
- 2023
- Full Text
- View/download PDF
29. A hidden Markov space–time model for mapping the dynamics of global access to food.
- Author
-
Bartolucci, Francesco and Farcomeni, Alessio
- Subjects
HIDDEN Markov models, STATISTICAL smoothing, MARKOV chain Monte Carlo, UNITS of time, DATA augmentation
- Abstract
In order to analyse worldwide data about access to food, coming from a series of Gallup's world polls, we propose a hidden Markov model with both a spatial and a temporal component. This model is estimated by an augmented data MCMC algorithm in a Bayesian framework. The data refer to a sample of more than 750 thousand individuals in 166 countries, spread across more than two thousand areas, and cover the period 2007–2014. The model is based on a discrete latent space, with the latent state corresponding to a certain area and time occasion depending on the states of neighbouring areas at the same time occasion, and on the previous state for the same area. The latent model also accounts for area‐time‐specific covariates. Moreover, the binary response variable (access to food, in our case) observed at individual level is modelled on the basis of individual‐specific covariates through a logistic model with a vector of parameters depending on the latent state. Model selection, in particular for the number of latent states, is based on the Watanabe–Akaike information criterion. The application shows the potential of the approach in terms of clustering the areas, data smoothing and prediction of prevalence for areas without sample units and over time. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
30. APU Turbine Remaining Useful Life Prediction Based on BAS_RVM.
- Author
-
吴 帅, 李艳军, 曹愈远, and 张博文
- Subjects
WASTE gases, SEARCH algorithms, DATA recorders & recording, STATISTICAL smoothing, SENSITIVITY analysis, KERNEL functions, EXHAUST gas recirculation
- Abstract
An optimized relevance vector machine (RVM) life prediction method is proposed for the remaining life of an auxiliary power unit (APU). Firstly, an improved kernel function is proposed by taking into account both efficiency and accuracy. Furthermore, the beetle antennae search (BAS) algorithm is applied to optimize the kernel parameters of the RVM. Secondly, by analyzing the historical A13 message and maintenance record data of an airline's APU, the exhaust gas temperature (EGT) parameters are extracted, corrected, and noise-reduced, and a turbine performance degradation pattern library for EGT is established with polynomial regression. Finally, it is shown that, compared with the traditional RVM, the efficiency and accuracy of the proposed algorithm in APU turbine life prediction are improved by 40% and 20%, respectively. In addition, the optimal initial step size and input dimension of the model are determined based on sensitivity analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
31. A machine learning model for quickly predicting the inner states of ironmaking blast furnaces.
- Author
-
Wu, Wenbo, Kuang, Shibo, Jiao, Lulu, and Yu, Aibing
- Subjects
MACHINE learning, BLAST furnaces, STATISTICAL smoothing, COMPARATIVE studies, ALGORITHMS
- Abstract
The inner states of ironmaking blast furnaces (BFs) govern their overall performance and thus are crucial for efficient and reliable BF production. However, the current control methods cannot directly consider the inner states because of the difficulty of accessing them. This paper introduces a machine learning (ML) model designed to promptly predict the inner states from injection parameters. The model employs a modified ensemble learning method using data from a well-established mechanistic model. Two key modifications are implemented: a preprocessing method addresses the low prediction accuracy caused by large-gradient data, and a stack-based structure improves robustness across various inner states. Comparative analysis shows the proposed model predicts inner states with higher accuracy than existing ML models. Furthermore, the model outputs consistent resolutions while maintaining identical change trends for some key variables. The developed model offers a promising approach for implementing real-time BF prediction.
• A stack-based model is developed for fast prediction of blast furnace (BF) inner states.
• It trains three individual learners and combines them to form a stronger predictor.
• CFD data with the slip averaging algorithm applied is used to train the model.
• The model's reliability in simulating BF performance is demonstrated. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Finding Patterns in Time Series
- Author
-
Gentle, James E., Wilson, Seunghye J., Gentle, James E., Series Editor, Härdle, Wolfgang Karl, Series Editor, Mori, Yuichi, Series Editor, Lu, Henry Horng-Shing, editor, and Shen, Xiaotong, editor
- Published
- 2018
- Full Text
- View/download PDF
33. Estimating functions and derivatives via adaptive penalized splines.
- Author
-
Yang, Lianqiang, Ding, Mengzhen, Hong, Yongmiao, and Wang, Xuejun
- Subjects
SPLINES, STATISTICAL smoothing
- Abstract
Adaptive penalized splines via radial basis are constructed to estimate regression functions and their derivatives. A weight vector based on the range of observations is embedded into the penalty matrix, which remarkably improves the adaptability of the penalized spline smoothing model. Fast computation and comparison with traditional spline models are studied, and the empirical results and simulations show that the new method outperforms smoothing splines, traditional penalized splines and local polynomial smoothing when estimating regression functions and their derivatives, particularly when the observations have inhomogeneous variation. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
34. Improvement of Norm Score Quality via Regression-Based Continuous Norming.
- Author
-
Lenhard, Wolfgang and Lenhard, Alexandra
- Subjects
PSYCHOMETRICS, REGRESSION analysis, STATISTICAL sampling, STATISTICAL models
- Abstract
The interpretation of psychometric test results is usually based on norm scores. We compared semiparametric continuous norming (SPCN) with conventional norming methods by simulating results for test scales with different item numbers and difficulties via an item response theory approach. Subsequently, we modeled the norm scores based on random samples with varying sizes either with a conventional ranking procedure or SPCN. The norms were then cross-validated by using an entirely representative sample of N = 840,000 for which different measures of norming error were computed. This process was repeated 90,000 times. Both approaches benefitted from an increase in sample size, with SPCN reaching optimal results with much smaller samples. Conventional norming performed worse on data fit, age-related errors, and the number of missing values in the norm tables. The data fit in conventional norming of fixed subsample sizes varied with the granularity of the age brackets, calling into question general recommendations for sample sizes in test norming. We recommend that test norms should be based on statistical models of the raw score distributions instead of simply compiling norm tables via conventional ranking procedures. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
35. A New Method for Estimating Small Area Demographics and Its Application to Long-Term Population Projection
- Author
-
Inoue, Takashi and Swanson, David A., editor
- Published
- 2017
- Full Text
- View/download PDF
36. LineSmooth: An Analytical Framework for Evaluating the Effectiveness of Smoothing Techniques on Line Charts.
- Author
-
Rosen, Paul and Quadri, Ghulam Jilani
- Subjects
VISUAL analytics, STATISTICAL smoothing, FREQUENCY-domain analysis, VISUALIZATION
- Abstract
We present a comprehensive framework for evaluating line chart smoothing methods under a variety of visual analytics tasks. Line charts are commonly used to visualize a series of data samples. When the number of samples is large, or the data are noisy, smoothing can be applied to make the signal more apparent. However, there is a wide variety of smoothing techniques available, and the effectiveness of each depends upon both the nature of the data and the visual analytics task at hand. To date, the visualization community lacks a summary work for analyzing and classifying the various smoothing methods available. In this paper, we establish a framework based on 8 measures of line smoothing effectiveness tied to 8 low-level visual analytics tasks. We then analyze 12 methods coming from 4 commonly used classes of line chart smoothing: rank filters, convolutional filters, frequency domain filters, and subsampling. The results show that while no method is ideal for all situations, certain methods, such as Gaussian filters and TOPOLOGY-based subsampling, perform well in general. Other methods, such as low-pass CUTOFF filters and DOUGLAS-PEUCKER subsampling, perform well for specific visual analytics tasks. Almost as importantly, our framework demonstrates that several methods, including the commonly used UNIFORM subsampling, produce low-quality results and should therefore be avoided, if possible. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
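Of the four classes analyzed, subsampling is the least standard-library-friendly; below is a sketch of the Douglas-Peucker subsampling the paper evaluates, with an arbitrarily chosen tolerance and invented data.

```python
import numpy as np

def douglas_peucker(x, y, eps):
    """Recursive Douglas-Peucker subsampling: keep the point farthest from
    the chord between the endpoints and recurse while it exceeds eps."""
    def dist(i, a, b):
        # Perpendicular distance of point i from the line through a and b.
        num = abs((y[b] - y[a]) * (x[i] - x[a]) - (x[b] - x[a]) * (y[i] - y[a]))
        return num / np.hypot(x[b] - x[a], y[b] - y[a])
    def rec(a, b):
        if b <= a + 1:
            return [a]
        i = max(range(a + 1, b), key=lambda k: dist(k, a, b))
        if dist(i, a, b) > eps:
            return rec(a, i) + rec(i, b)
        return [a]
    return np.array(rec(0, len(x) - 1) + [len(x) - 1])

x = np.linspace(0, 4 * np.pi, 500)
y = np.sin(x)
keep = douglas_peucker(x, y, eps=0.05)
print(f"kept {keep.size} of {x.size} samples")
```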
37. Separation theorems for the extrema of best piecewise monotonic approximations to successive data.
- Author
-
Demetriou, I. C.
- Subjects
DIOPHANTINE approximation, ERROR functions, CONVEX functions, SUNSPOTS, COMBINATORICS, STATISTICAL smoothing, MONOTONE operators
- Abstract
Separation properties of the local extrema of best piecewise monotonic approximations to measurements of a real univariate function are of fundamental importance to the development of efficient algorithms that calculate these approximations. Piecewise monotonic approximation to n data is expressed as the minimization of some strictly convex function of the data errors subject to the restriction that the piecewise linear interpolant to the approximated values consists of at most k monotonic sections, where k is a prescribed positive integer. The major task is to determine automatically the positions of the joins of these sections, which is a combinatorial problem that can require about $O(n^{k-1})$ combinations in order to find an optimal one. We state theorems which prove the remarkable property that the local maxima of optimal approximations with k−1 monotonic sections are separated by the local maxima of optimal approximations with k monotonic sections, and local minima are separated similarly. We describe briefly a suitable technique that makes use of this property and gives the global solution in $O(n^2 + kn \log_2 n)$ computer operations. Some numerical results show large gains in efficiency over existing methods. Further, as an illustration, we apply the technique to 39,082 observations of daily sunspots. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
38. An entropy-based approach for a robust least squares spline approximation.
- Author
-
Brugnano, Luigi, Giordano, Domenico, Iavernaro, Felice, and Rubino, Giorgia
- Subjects
DISTRIBUTION (Probability theory), SPLINES, LEAST squares, OUTLIER detection, OUTLIERS (Statistics), STATISTICAL smoothing
- Abstract
We consider the weighted least squares spline approximation of a noisy dataset. By interpreting the weights as a probability distribution, we maximize the associated entropy subject to the constraint that the mean squared error is prescribed to a desired (small) value. Acting on this error yields a robust regression method that automatically detects and removes outliers from the data during the fitting procedure, by assigning them a very small weight. We discuss the use of both spline functions and spline curves. A number of numerical illustrations have been included to disclose the potentialities of the maximal-entropy approach in different application fields. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Curve Fitting
- Author
-
Zielesny, Achim, Kacprzyk, Janusz, Series editor, Jain, Lakhmi C., Series editor, and Zielesny, Achim
- Published
- 2016
- Full Text
- View/download PDF
40. Comparison of Methods for Smoothing Environmental Data with an Application to Particulate Matter PM10
- Author
-
Martina Čampulová
- Subjects
data smoothing, trend filtering, environmental data, particulate matter PM10, Agriculture, Biology (General)
- Abstract
Data smoothing is often required within environmental data analysis. A number of methods and algorithms that can be applied for data smoothing have been proposed. This paper gives an overview and compares the performance of different smoothing procedures that estimate the trend in the data from the surrounding noisy observations and that can be applied to environmental data. The considered methods include kernel regression with both global and local bandwidth, moving average, exponential smoothing, robust repeated median regression, trend filtering, and approaches based on the discrete Fourier and discrete wavelet transforms. The methods are applied to real data obtained by measurement of PM10 concentrations and compared in a simulation study.
- Published
- 2018
- Full Text
- View/download PDF
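Among the compared procedures, kernel regression with a global bandwidth is the simplest to sketch: a Nadaraya-Watson estimator with a Gaussian kernel. The bandwidth and hourly PM10-like series below are invented; the paper also considers local bandwidths and real measurements.

```python
import numpy as np

def nadaraya_watson(x, y, grid, bandwidth):
    """Gaussian-kernel Nadaraya-Watson regression with a global bandwidth:
    each trend estimate is a kernel-weighted mean of the noisy observations."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(8)
hours = np.arange(0.0, 240.0)                       # ten days, hourly
pm10 = 40 + 15 * np.sin(hours / 24 * 2 * np.pi) + 8 * rng.standard_normal(hours.size)
trend = nadaraya_watson(hours, pm10, hours, bandwidth=6.0)
print("trend range:", trend.min().round(1), "-", trend.max().round(1))
```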
41. Functional approach to analysis of daily tax revenues
- Author
-
Jovita Gudan and Alfredas Račkauskas
- Subjects
functional data analysis, data smoothing, registration, prediction, Mathematics
- Abstract
We present a functional data analysis approach to modeling and analyzing daily tax revenues. The main features of daily tax revenue we need to extract are patterns within calendar months that can be used for prediction. As standard seasonal time series techniques cannot be used, due to the varying number of banking days per calendar month and the presence of seasonality between and within months, we interpret monthly tax revenues as curves obtained from daily data. Standard smoothing techniques, and registration taking time variability into account, are used for data preparation.
- Published
- 2019
- Full Text
- View/download PDF
42. Functional QTL mapping and genomic prediction of canopy height in wheat measured using a robotic field phenotyping platform.
- Author
-
Lyra, Danilo H, Virlet, Nicolas, Sadeghi-Tehran, Pouria, Hassall, Kirsty L, Wingen, Luzie U, Orford, Simon, Griffiths, Simon, Hawkesford, Malcolm J, and Slavov, Gancho T
- Subjects
FORECASTING, GROWING season, STATISTICAL power analysis, ALTITUDES, PREDICTION models, WHEAT
- Abstract
Genetic studies increasingly rely on high-throughput phenotyping, but the resulting longitudinal data pose analytical challenges. We used canopy height data from an automated field phenotyping platform to compare several approaches to scanning for quantitative trait loci (QTLs) and performing genomic prediction in a wheat recombinant inbred line mapping population based on up to 26 sampled time points (TPs). We detected four persistent QTLs (i.e. expressed for most of the growing season), with both empirical and simulation analyses demonstrating superior statistical power of detecting such QTLs through functional mapping approaches compared with conventional individual TP analyses. In contrast, even very simple individual TP approaches (e.g. interval mapping) had superior detection power for transient QTLs (i.e. expressed during very short periods). Using spline-smoothed phenotypic data resulted in improved genomic predictive abilities (5–8% higher than individual TP prediction), while the effect of including significant QTLs in prediction models was relatively minor (<1–4% improvement). Finally, although QTL detection power and predictive ability generally increased with the number of TPs analysed, gains beyond five or 10 TPs chosen based on phenological information had little practical significance. These results will inform the development of an integrated, semi-automated analytical pipeline, which will be more broadly applicable to similar data sets in wheat and other crops. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
43. Improving the simultaneous application of the DSN-PC and NOAA GFS datasets.
- Author
-
Vas, Ádám, Owino, Oluoch Josphat, and Tóth, László
- Subjects
DISTRIBUTED sensors, SENSOR networks, WEATHER forecasting, MEASUREMENT errors, STATISTICAL smoothing
- Abstract
Our surface-based sensor network, called Distributed Sensor Network for Prediction Calculations (DSN-PC), obviously has limitations in terms of vertical atmospheric data. While efforts are being made to approximate these upper-air parameters from surface level, as a first step it was necessary to test the network's capability of making distributed computations by applying a hybrid approach. We accessed public databases like the NOAA Global Forecast System (GFS), and the initial values for the 2-dimensional computational grid were produced by using both DSN-PC measurements and NOAA GFS data for each grid point. However, though the latter consists of assimilated and initialized (smoothed) data, the stations of the DSN-PC network provide raw measurements, which can cause numerical instability due to measurement errors or local weather phenomena. Previously we interpolated both DSN-PC and GFS data simultaneously. As a step forward, we wanted our network to have a more significant role in the production of the initial values. It was therefore necessary to apply 2D smoothing algorithms to the initial conditions. We found a significant difference in numerical stability between calculating with raw and with smoothed initial data. Applying the smoothing algorithms greatly improved the prediction reliability compared to the cases when raw data were used. The size of the grid portion used for smoothing has a significant impact on the goodness of the forecasts and is worth further investigation. We could verify the viability of direct integration of DSN-PC data since it provided forecast errors similar to those of the previous approach. In this paper we present one simple method for smoothing our initial data and the results of the weather prediction calculations. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
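As an illustration of this kind of preprocessing (not the authors' exact algorithm), here is a Gaussian smoothing of a gridded 2-D initial field with SciPy; the pressure field, grid size, and sigma are all invented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(10)
# Invented 2-D initial condition: a smooth pressure field (hPa) plus
# station-level measurement noise on a 20 x 30 grid.
lat, lon = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 30), indexing="ij")
field = 1013 + 8 * np.sin(2 * np.pi * lat) * np.cos(2 * np.pi * lon)
raw = field + 1.5 * rng.standard_normal(field.shape)

# Smoothing the initial data before integration can stabilize the numerics;
# sigma (in grid cells) controls the smoothing footprint.
smoothed = gaussian_filter(raw, sigma=1.5)
print("rms error raw vs smoothed:",
      np.sqrt(np.mean((raw - field) ** 2)).round(2),
      np.sqrt(np.mean((smoothed - field) ** 2)).round(2))
```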
44. W.F. Sheppard's Smoothing Method: A Precursor to Local Polynomial Regression.
- Author
-
Murray, Lori and Bellhouse, David
- Subjects
POLYNOMIALS, STATISTICAL smoothing, TWENTIETH century, MOMENTS method (Statistics), ACTUARIES, SMOOTHING (Numerical analysis)
- Abstract
Summary: W.F. Sheppard has been much overlooked in the history of statistics although his work produced significant contributions. He developed a polynomial smoothing method and corrections of moment estimates for grouped data as well as extensive normal probability tables that have been widely used since the 20th century. Sheppard presented his smoothing method for actuaries in a series of publications during the early 20th century. Population data consist of irregularities, and some adjustment or smoothing of the data is often necessary. Simple techniques, such as Spencer's summation formulae involving arithmetic operations and moving averages, were commonly practised by actuaries to smooth out equally spaced data. Sheppard's method, however, is a polynomial smoothing method based on central differences. We will show how Sheppard's smoothing method was a significant milestone in the development of smoothing techniques and a precursor to local polynomial regression. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
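The local polynomial regression that the article presents Sheppard's method as a precursor to can be sketched directly: at each target point, fit a low-degree polynomial by kernel-weighted least squares and evaluate it there. Bandwidth, degree, and the mortality-like data below are illustrative assumptions.

```python
import numpy as np

def local_poly(x, y, grid, bandwidth=1.0, deg=2):
    """Local polynomial regression: at each grid point, fit a degree-`deg`
    polynomial by weighted least squares with Gaussian kernel weights,
    then evaluate the fit at that point."""
    out = np.empty_like(grid, dtype=float)
    for j, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / bandwidth) ** 2)
        # np.polyfit weights the unsquared residuals, so pass sqrt(kernel).
        coeffs = np.polyfit(x - g, y, deg, w=np.sqrt(w))
        out[j] = coeffs[-1]           # value of the local fit at x = g
    return out

rng = np.random.default_rng(12)
ages = np.linspace(20, 90, 120)
rate = 0.001 * np.exp((ages - 20) / 18) + 0.002 * rng.standard_normal(ages.size)
smooth = local_poly(ages, rate, ages, bandwidth=4.0, deg=2)
print(smooth[:3])
```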
45. Research on the Scale Effect of Erosion Slope Length Based on DEM Resolution.
- Author
-
樊 宇, 郭伟玲, 吴 江, 土 祥, and 赵 令
- Abstract
In this paper, the influence of DEM resolution on distributed erosion slope length was divided into two components: data smoothing and sampling interval. The study explored how these two components affect the extraction of distributed erosion slope length, and how horizontal resolution affects the spatial distribution structure of erosion slope length. The case study site was the Xiannangou watershed, located in Ansai County, Shaanxi, China. It was found that as DEM resolution decreased from 2.5 m to 100 m, the average slope length increased logarithmically. Using resampling to build a series of DEMs, the influence of DEM resolution on distributed erosion slope length was separated into a data smoothing effect and a sampling interval effect. The main conclusion is that, for slope lengths extracted from 2.5 m to 100 m resolution DEMs, the effect of the sampling interval on slope length is greater than that of data smoothing. The spatial distribution structure of slope length extracted from 10 m to 100 m resolution DEMs was analyzed with a semivariogram, which showed that the slope length semivariogram agrees well with a spherical model. As DEM resolution decreases, the spatial structural variation and the random variation of slope length decrease gradually, while the range of spatial correlation increases. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
46. Fast prediction and control of air core in hydrocyclone by machine learning to stabilize operations.
- Author
-
Ye, Qing, Kuang, Shibo, Duan, Peibo, Zou, Ruiping, and Yu, Aibing
- Abstract
Operation stability significantly impacts hydrocyclone separation performance during wastewater treatment, sludge processing, and microplastic removal from water. The air core inside a hydrocyclone is an important indicator of operation stability. This paper presents a machine learning model designed for fast prediction and control of air core profiles. The model is built upon a modified graph neural network (GNN). It is trained on data generated from a well-established and validated computational fluid dynamics (CFD) model. This GNN-based surrogate model has undergone two modifications to enhance its prediction accuracy. One is data smoothing, to mitigate the adverse effects of drastic data changes in spatial distributions. The other is a loss function modification to incorporate the air core information acquired by the CFD model. The predicted air cores are compared with those of the original GNN and a random forest (RF) against the CFD results. The new surrogate model reproduces air profiles and has higher accuracy than the other models in predicting spatial distribution results across different error metrics. Furthermore, this surrogate model is combined with the genetic algorithm to optimize the air core. The proposed machine learning framework offers a promising avenue for the prediction and control of hydrocyclones.
• CFD simulation results are used to form the database of machine learning models.
• A Graph Neural Network (GNN) is accordingly trained for air core prediction.
• Data smoothing and loss function modification are introduced to improve the GNN.
• The modified GNN outperforms the random forest (RF) and the original GNN.
• The new GNN and the genetic algorithm are combined to optimize the air core. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Wavelet-Based Kalman Smoothing Method for Uncertain Parameters Processing: Applications in Oil Well-Testing Data Denoising and Prediction
- Author
-
Xin Feng, Qiang Feng, Shaohui Li, Xingwei Hou, Mengqiu Zhang, and Shugui Liu
- Subjects
low-distortion processing, oil well-testing data, wavelet analysis, Kalman prediction, data smoothing, data compression, Chemical technology
- Abstract
The low-distortion processing of well-testing geological parameters is a key way to provide decision-making support for oil and gas field development. However, the classical processing methods face many problems, such as the stochastic nature of the data, the randomness of initial parameters, poor denoising ability, and the lack of data compression and prediction mechanisms. These problems result in poor real-time predictability of oil operation status and difficulty in interpreting played-back data offline. Given these problems, we propose a wavelet-based Kalman smoothing method for processing uncertain oil well-testing data. First, we use correlation and reconstruction errors as analysis indicators and determine the optimal combination of decomposition scale and vanishing moments suitable for wavelet analysis of oil data. Second, we build a ground pressure measuring platform and use a pressure gauge equipped with the optimal combination of parameters to complete downhole online wavelet decomposition, filtering, Kalman prediction, and data storage. After the stored data are played back, the optimal Kalman parameters obtained by particle swarm optimization are used to complete the data smoothing for each sample. The experiments compare the signal-to-noise ratio and the root mean square error before and after using different classical processing models. In addition, a robustness analysis is added. The proposed method, on the one hand, decorrelates and compresses the data, which provides technical support for real-time uploading of downhole data; on the other hand, it can perform minimum-variance unbiased estimates of the data, filter out interference and noise, reduce the reconstruction error, and yield data with high resolution and strong robustness.
- Published
- 2020
- Full Text
- View/download PDF
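The PSO-tuned Kalman stage is beyond a snippet, but the wavelet-decomposition-plus-threshold step can be sketched with PyWavelets (assumed installed). The wavelet family, level, and universal-threshold rule below are textbook placeholders, whereas the paper selects its combination via correlation and reconstruction-error analysis.

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients of a discrete wavelet
    decomposition; the universal threshold sigma*sqrt(2 ln n) is a
    textbook rule, not necessarily the paper's choice."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale estimate
    thresh = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(13)
t = np.linspace(0, 1, 1024)
# Invented stand-in for a noisy downhole pressure decline curve.
pressure = 30 * np.exp(-3 * t) + 0.5 * rng.standard_normal(t.size)
denoised = wavelet_denoise(pressure)
print("residual std:", np.std(denoised - 30 * np.exp(-3 * t)).round(3))
```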
48. A Tool for Comparing Outbreak Detection Algorithms
- Author
-
Şahin, Yasin, Nagamalai, Dhinaharan, editor, Kumar, Ashok, editor, and Annamalai, Annamalai, editor
- Published
- 2013
- Full Text
- View/download PDF
49. Curve Fitting
- Author
-
Zielesny, Achim, Kacprzyk, Janusz, editor, Jain, Lakhmi C., editor, and Zielesny, Achim
- Published
- 2011
- Full Text
- View/download PDF
50. Analyzing the Squared Distance-to-Measure Gradient Flow System with k-Order Voronoi Diagrams.
- Author
-
O'Neil, Patrick and Wanner, Thomas
- Subjects
VORONOI polygons, OPTICAL scanners, STATISTICAL smoothing, COMPUTED tomography, LEAST squares, DATA analysis, AIRBORNE-based remote sensing, WORK design
- Abstract
Point cloud data arise naturally from 3-dimensional scanners, LiDAR sensors, and industrial computed tomography (i.e. CT scans), among other sources. Most point clouds obtained through experimental means exhibit some level of noise, inhibiting mesh reconstruction algorithms and topological data analysis techniques. To alleviate the problems caused by noise, smoothing algorithms are often employed as a preprocessing step. Moving least squares is one such technique; however, many of these techniques are designed to work on surfaces in $\mathbb{R}^3$. As interesting point clouds can naturally live in higher dimensions, we seek a method for smoothing higher dimensional point clouds. To this end, we turn to the distance-to-measure function. In this paper, we provide a theoretical foundation for studying the gradient flow induced by the squared distance-to-measure function, as introduced by Chazal, Cohen-Steiner, and Mérigot. In particular, we frame the gradient flow as a Filippov system and find a method for solving the squared distance-to-measure gradient flow, induced by the uniform empirical measure, using higher order Voronoi diagrams. In contrast to some existing techniques, this gradient flow provides a smoothing algorithm that scales computationally with dimensionality. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF