94 results for '"resample"'
Search Results
2. An Effective Methodology for Diabetes Prediction in the Case of Class Imbalance.
- Author
-
Toleva, Borislava, Atanasov, Ivan, Ivanov, Ivan, and Hooper, Vincent
- Subjects
- *
MACHINE learning , *ETIOLOGY of diabetes , *BLOOD sugar , *HUMAN body , *DATA modeling , *DEEP learning - Abstract
Diabetes causes an increase in the level of blood sugar, which leads to damage to various parts of the human body. Diabetes data are used not only for providing a deeper understanding of the treatment mechanisms but also for predicting the probability that one might become sick. This paper proposes a novel methodology to perform classification in the case of heavy class imbalance, as observed in the PIMA diabetes dataset. The proposed methodology uses two novel steps, namely resampling and random shuffling, prior to defining the classification model. The methodology is tested with two versions of cross-validation that are appropriate in cases of class imbalance—k-fold cross-validation and stratified k-fold cross-validation. Our findings suggest that when data are imbalanced, randomly shuffling them prior to a train/test split can improve estimation metrics. Our methodology can outperform existing machine learning algorithms and complex deep learning models. Applying our proposed methodology is a simple and fast way to predict labels with class imbalance. It does not require additional techniques to balance classes, and it does not involve preselecting important variables, which saves time and makes the model easy to analyze. This makes it an effective methodology for initial and further modeling of data with class imbalance. Moreover, it shows how to make machine learning models built on standard approaches more effective and reliable. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
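A minimal sketch of the shuffle-then-split idea from the abstract above, assuming scikit-learn; the synthetic feature matrix, the logistic regression model, and the F1 metric are illustrative stand-ins, not the paper's exact setup:

```python
import numpy as np
from sklearn.utils import shuffle
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.normal(size=(768, 8))             # stand-in for the PIMA feature matrix
y = (rng.rand(768) < 0.35).astype(int)    # imbalanced labels, ~35% positives

# Step 1: shuffle the rows randomly before any train/test split.
X, y = shuffle(X, y, random_state=0)

# Step 2: stratified k-fold keeps the class ratio identical in every fold.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv, scoring="f1")
print(scores.mean())
```

Stratified folds preserve the minority-class ratio in every split, which is why the abstract pairs random shuffling with stratified k-fold cross-validation.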
3. Data Preprocessing
- Author
-
Ramdani, Fatwa
- Published
- 2023
- Full Text
- View/download PDF
4. An Evaluation of Lidar, EU-DEM and SRTM-Derived Terrain Parameters for Hydrologic Applications in Țibleș and Rodnei Mountains (Romania)
- Author
-
Strapazan, Carina, Kocsis, István, Irimuș, Ioan-Aurel, and Bálint-Bálint, Lóránt
- Subjects
digital elevation models ,lidar ,srtm ,eu-dem ,resample ,accuracy ,morphometry ,rstudio ,geospatial analysis ,Geography. Anthropology. Recreation ,Geography (General) ,G1-922 - Abstract
Over the years, numerous geospatial data sets have become accessible to users in the form of various types of digital elevation models (DEMs) at different resolutions. DEMs are often used to study the behavior and hydrological response of watersheds, and have come to be considered a reflection of their physiographic characteristics. Accurate determination of a catchment's morphometric parameters plays a crucial role in distributed hydrological modelling and river flow estimation. This study has two parts and objectives: the first part examines the accuracy of DEMs from different sources (EU-DEM, SRTM and LIDAR) in deriving terrain attributes by comparison, and the second investigates whether resampling the 3 m LIDAR DEM to coarser cell resolutions can accurately represent the extracted hydrological features. In order to evaluate the quality and precision of SRTM and EU-DEM, the high-resolution 3 m LIDAR DEM was used as the reference data set due to its higher degree of accuracy. Firstly, this data set was resampled to 25 m and 30 m to match the EU-DEM and SRTM cell sizes, and all of them were re-projected to the same Stereo 70 coordinate system for Romania. A comparison was carried out between the derived hydrologic and terrain variables of the different DEMs. For the second part of this research, the LIDAR DEM was also resampled to 10 m, and a similar evaluation was made across the different cell resolutions (3 m, 10 m, 25 m and 30 m). Several catchments of various drainage areas (Țibleș, Runc, Sălăuța and Valea Caselor) located in the Țibleș and Rodnei Mountains were chosen as study areas. Several resampling techniques available in ArcMap were evaluated, and the comparative analyses were carried out using the R software. Results revealed not only the LIDAR data's superior accuracy compared to the other data sets, but also the possibilities the coarser data sets offer for deriving the hydrological characteristics of a mountainous area, contingent upon what the user aims to achieve.
- Published
- 2023
- Full Text
- View/download PDF
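A toy illustration of the coarsening step above: aggregating a fine DEM to larger cells by block averaging in plain NumPy. ArcMap's resampling tools use nearest-neighbor, bilinear, or cubic methods, so this simple mean aggregation is a stand-in for the idea rather than the study's exact procedure:

```python
import numpy as np

def block_mean_resample(dem, factor):
    """Aggregate a fine DEM to a coarser cell size by averaging factor x factor blocks."""
    rows, cols = dem.shape
    rows, cols = rows - rows % factor, cols - cols % factor   # trim any ragged edge
    blocks = dem[:rows, :cols].reshape(rows // factor, factor, cols // factor, factor)
    return blocks.mean(axis=(1, 3))

fine = np.random.rand(300, 300)          # stand-in for a 3 m LIDAR DEM tile
coarse = block_mean_resample(fine, 10)   # roughly 30 m cells
print(coarse.shape)                      # (30, 30)
```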
5. Fault Diagnosis of Rolling Bearings Based on Two-step Transfer Learning and EfficientNetV2
- Author
-
Du Kangning and Ning Shaohui
- Subjects
Rolling bearing ,Two-step transfer learning ,EfficientNetV2 ,Class-imbalance ,Resample ,Mechanical engineering and machinery ,TJ1-1570 - Abstract
A rolling bearing fault diagnosis model based on two-step transfer learning and EfficientNetV2 (TSTE) is proposed for real engineering fault diagnosis environments, where the scarcity of available data leads to low accuracy of intelligent diagnosis models in bearing health status diagnosis. Firstly, the model is trained on a full-lifetime bearing data set, and then its shallow weights are frozen while it is trained on a multi-condition bearing data set for the first transfer learning step. Secondly, by constructing a class-imbalanced dataset, the impact of data scarcity on fault diagnosis performance in actual fault environments is studied. Then, the synthetic minority oversampling technique (SMOTE) and the edited nearest neighbors (ENN) undersampling method are applied to expand the fault data, reconstructing the class-imbalanced dataset into a class-balanced one. Finally, the model is trained on the class-balanced dataset, freezing the bottom weights and training the deeper layers for the second transfer learning step, enabling the model to capture the failure characteristics of the balanced dataset. The experiments are evaluated with a variety of metrics, the method is compared with other methods, and the Grad-CAM method is used for feature visualization. The results show that the proposed method is able to transfer the fault diagnosis knowledge accumulated by the model in a laboratory environment to actual engineering equipment. It is suitable for the diagnosis of rolling bearing faults in situations where test data are scarce.
- Published
- 2023
- Full Text
- View/download PDF
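The SMOTE-plus-ENN rebalancing described above has a direct counterpart in the imbalanced-learn library; the sketch below applies it to synthetic stand-in features (the feature matrix and 9:1 class ratio are assumptions for illustration):

```python
import numpy as np
from collections import Counter
from imblearn.combine import SMOTEENN

rng = np.random.RandomState(0)
X = rng.normal(size=(500, 16))                    # stand-in for bearing features
y = np.r_[np.zeros(450, int), np.ones(50, int)]   # 9:1 class imbalance

# SMOTE synthesizes minority samples, then ENN edits away noisy neighbors.
X_bal, y_bal = SMOTEENN(random_state=0).fit_resample(X, y)
print(Counter(y), Counter(y_bal))
```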
6. Particle Filter Based on Harris Hawks Optimization Algorithm for Underwater Visual Tracking.
- Author
-
Yang, Junyi, Yao, Yutong, and Yang, Donghe
- Subjects
OPTIMIZATION algorithms ,TRACKING algorithms ,UNDERWATER construction ,DATABASES ,ENERGY consumption ,NEUTRINO detectors - Abstract
Due to the complexity of the underwater environment, tracking underwater targets via traditional particle filters is a challenging task. To resolve the problem that the tracking accuracy of a traditional particle filter is low due to the sample impoverishment caused by resampling, a new tracking algorithm using Harris-hawks-optimized particle filters (HHOPF) is proposed. The problems of particle filter underwater target feature construction and target scale transformation are also addressed: the corrected background-weighted histogram method is introduced into underwater target feature recognition, and a scale filter is combined with it to realize target scale transformation during tracking. In addition, to enhance the computational speed of underwater target tracking, a nonlinear escape energy is constructed within the Harris hawks algorithm to balance the exploration and exploitation processes. Based on the proposed HHOPF tracker, we performed detection and evaluation using the Underwater Object Tracking (UOT100) vision database. The proposed method is compared with evolution-based tracking algorithms and particle filters, as well as with recent correlation-filter-based trackers and other state-of-the-art tracking methods. The tracking results on the test data sets show that the presented algorithm improves overlap accuracy and tracking accuracy by 11% compared with the other algorithms. The experiments demonstrate that the presented HHOPF visual tracker provides better tracking results. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
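For context on the resampling step that causes the sample impoverishment mentioned above, here is the standard systematic resampling routine for a particle filter (a textbook variant, not the HHOPF algorithm itself):

```python
import numpy as np

def systematic_resample(weights, rng=np.random):
    """Systematic resampling: one uniform offset, then a regular comb of positions."""
    n = len(weights)
    positions = (rng.rand() + np.arange(n)) / n
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                              # guard against round-off
    return np.searchsorted(cumsum, positions)     # surviving particle indices

w = np.array([0.10, 0.05, 0.60, 0.20, 0.05])      # normalized particle weights
print(systematic_resample(w))
```

Because every surviving index is a copy of an existing particle, repeated resampling concentrates the set on a few ancestors, which is the depletion problem the HHOPF tracker targets.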
7. Interpolators
- Author
-
Dickman, Arie
- Published
- 2022
- Full Text
- View/download PDF
8. A Multiple Interpolation Algorithm to Improve Resampling Accuracy in Data Triggers.
- Author
-
Cao, Mengtao, Xu, Fangyuan, Jia, Hanbo, Zhou, Lei, Ji, Eryou, and Wu, Jin
- Subjects
FIELD programmable gate arrays ,GALERKIN methods ,INTERPOLATION ,INTERPOLATION algorithms ,ANALOG-to-digital converters - Abstract
To address the problem of low trigger accuracy during trigger resampling, and of variable-sampling-rate trigger resampling using a fixed-sampling-rate analog-to-digital converter (ADC), this paper proposes an interpolation method combining sinc interpolation and linear interpolation, built on a digital trigger, to improve accuracy. After behavioral simulation and actual field programmable gate array (FPGA) test verification, the data collected by two 3 GSps 12-bit ADCs were subjected to 8-times sinc interpolation followed by 16-times linear interpolation, after which the original trigger resampling accuracy was increased by a factor of 128 and the sampling rate could be varied between 100 MHz and 1 GHz. A signal-to-noise ratio (SNR) of 46.80 dBFS, a spurious-free dynamic range (SFDR) of 45.91 dB, and an effective number of bits (ENOB) of 7.48 bits were obtained by direct trigger resampling without algorithm processing in the behavioral simulation, whereas an SNR of 58.98 dBFS, an SFDR of 60.96 dB, and an ENOB of 9.42 bits were obtained by trigger resampling after algorithm processing. Due to analog link signal loss and signal interference on the development board, an SNR, SFDR and ENOB of 51.97 dBFS, 61.26 dB, and 8.32 bits, respectively, were obtained from the trigger resampling in the FPGA test. The experimental results show that the algorithm improves not only the triggering accuracy but also the SNR, SFDR, and ENOB. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
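The two-stage interpolation is easy to prototype offline: FFT-based (ideal, sinc-like) 8x upsampling followed by linear interpolation onto a 16x finer grid yields the 128x denser trigger timeline. The SciPy sketch below runs on a synthetic tone and does not reproduce the paper's FPGA filter design:

```python
import numpy as np
from scipy.signal import resample

fs = 3.0e9                                     # 3 GSps ADC clock
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 100e6 * t)              # stand-in for captured ADC samples

x8 = resample(x, 8 * len(x))                   # FFT-based (sinc-like) 8x upsampling
t8 = np.arange(len(x8)) / (8 * fs)

t128 = np.arange(128 * len(x)) / (128 * fs)    # 16x further refinement, linear this time
x128 = np.interp(t128, t8, x8)
print(len(x128) // len(x))                     # 128x denser grid for the trigger
```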
9. An Upscaling–Downscaling Optimal Seamline Detection Algorithm for Very Large Remote Sensing Image Mosaicking.
- Author
-
Chai, Xuchao, Chen, Jianyu, Mao, Zhihua, and Zhu, Qiankun
- Subjects
- *
REMOTE sensing , *GRAPH algorithms , *SAMPLING (Process) , *ALGORITHMS , *OPTICAL remote sensing - Abstract
For the mosaicking of multiple remote sensing images, obtaining the optimal stitching line in the overlapping region is a key step in creating a seamless mosaic image. However, for very large remote sensing images, finding seamlines involves a huge number of image pixels. To handle this issue, we propose a stepwise strategy to obtain pixel-level optimal stitching lines for large remote sensing images via an upscaling–downscaling image sampling procedure. First, the resolution of the image is reduced and the graph cut algorithm is applied to find an energy-optimal seamline in the reduced image. Then, a stripe along the preliminary seamline is identified in the overlap area to remove the other, inefficient nodes. Finally, the graph cut algorithm is applied within the identified stripe to seek the pixel-level optimal seamline of the original image. Compared to existing algorithms, the proposed method produces fewer spectral differences between stitching lines and fewer crossed features in the experiments. For a wide range of remote sensing images involving large data, the new method uses less than 10 percent of the time needed by the SLIC + graph cut method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
10. Evaluation of Flood Hazard Potency in Jakarta based on Multi-criteria Analysis.
- Author
-
Mashita Fauzia Hannum, Raden Roro, Santikayasa, I Putu, and Dasant, Bambang Dwi
- Subjects
- *
FLOODS , *METEOROLOGICAL precipitation , *SOCIOECONOMICS , *ANALYTIC hierarchy process - Abstract
The frequency of flood events in Indonesia has increased since 1990, especially in the capital city of Jakarta. Flood events have affected socioeconomic activities and have threatened community health in flood-prone areas. Although many efforts have been made to reduce flood impacts, flood hazard assessment remains a research challenge. This study aims to map the level of flood hazard in Jakarta and to determine the factors that most strongly drive flooding. First, we defined the factors that influence flooding and combined an analytic hierarchy process (AHP), to determine their weights, with a GIS approach, to determine their scores. The combination of weights and scores determined the flood hazard index (FHI). Sensitivity analysis and validation were then applied to determine the robustness of the approach. Our results show that the most influential factors determining flood hazard were rainfall intensity, land use, and slope, whereas geology was the least influential. Based on the sensitivity analysis and FHI validation, our approach was able to represent 59% of flood events in Jakarta. The FHI values were high in northern areas and low in southern areas, indicating that the north is more flood-prone than the south. Further, this research contributes to improved flood mitigation approaches in Jakarta. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
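The FHI construction described above is a weighted overlay: per-cell factor scores multiplied by AHP weights and summed. A worked toy example (the weights and scores below are invented for illustration, not the study's values):

```python
import numpy as np

# Hypothetical AHP weights for five factors (they must sum to 1) and
# per-cell factor scores on a 1-5 scale.
weights = np.array([0.35, 0.25, 0.20, 0.12, 0.08])  # rainfall, land use, slope, ...
scores = np.array([
    [5, 4, 3, 2, 1],   # cell A
    [2, 3, 1, 4, 2],   # cell B
])
fhi = scores @ weights                              # flood hazard index per cell
print(fhi)                                          # higher value = more hazardous
```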
12. Improve the Efficiency of the Classifiers Using Resample Technique on Image Segmentation Dataset
- Author
-
Naga RamaDevi, G., Janga Reddy, M., Baswaraj, D., Kumar, Amit, editor, Paprzycki, Marcin, editor, and Gunjan, Vinit Kumar, editor
- Published
- 2020
- Full Text
- View/download PDF
13. Impact of the Structure of Data Pre-processing Pipelines on the Performance of Classifiers When Applied to Imbalanced Network Intrusion Detection System Dataset
- Author
-
Al-Mandhari, I., Guan, L., Edirisinghe, E. A., Bi, Yaxin, editor, Bhatia, Rahul, editor, and Kapoor, Supriya, editor
- Published
- 2020
- Full Text
- View/download PDF
14. PYTAF: A Python Tool for Spatially Resampling Earth Observation Data.
- Author
-
Zhao, Guangyu, Yang, Muqun, Gao, Yizhao, Zhan, Yizhe, Lee, H. Joe, and Di Girolamo, Larry
- Subjects
- *
PYTHON programming language , *EARTH (Planet) , *SOFTWARE development tools , *EARTH sciences , *SUPPLY & demand - Abstract
Earth observation data have revolutionized Earth science and significantly enhanced the ability to forecast weather, climate and natural hazards. The storage format of the majority of Earth observation data can be classified into swath, grid or point structures. Earth science studies frequently involve resampling between swath, grid and point data when combining measurements from multiple instruments, which can provide more insight into geophysical processes than any single instrument alone. As the amount of Earth observation data increases each day, the demand for a computationally efficient tool to resample and fuse Earth observation data has never been greater. We present a software tool, called pytaf, that resamples Earth observation data stored in swath, grid or point structures using a novel block indexing algorithm. This tool is specially designed to process large-scale datasets. The core functions of pytaf were implemented in C with OpenMP to enable parallel computation in a shared-memory environment. A user-friendly Python interface was also built. The tool has been extensively tested on supercomputers and successfully used to resample the data from five instruments on the EOS-Terra platform at a mission-wide scale. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
15. Resampling algorithm for imbalanced data based on their neighbor relationship
- Author
-
Rui-feng LI, Wen-hai LI, Yan-li SUN, and Yang-yong WU
- Subjects
imbalanced data ,neighbor relationship ,resample ,local density ,classification ,Mining engineering. Metallurgy ,TN1-997 ,Environmental engineering ,TA170-171 - Abstract
The classification of imbalanced data has become a crucial and significant research issue in many data-intensive applications. The minority samples in such applications usually contain important information, which plays an important role in data analysis. At present, two approaches are used in machine learning and data mining to address data set imbalance: improved algorithms and data set reconstruction. Data set reconstruction, also known as the resampling method, can modify the proportion of every class in the training data set without modifying the classification algorithm and has been widely used. Artificially increasing or reducing samples, however, inevitably adds noise and loses original data information, reducing classification accuracy, so a reasonable oversampling and undersampling algorithm is the core of the resampling method. To improve the classification accuracy of imbalanced data sets, a resampling algorithm based on the neighbor relationship of the sample space was proposed. This method first evaluates a security level from the spatial neighbor relations of minority samples and oversamples them with the synthetic minority oversampling technique, guided by that security level. Then, the local density of majority samples is calculated from their spatial neighbor relations to undersample the majority samples in sample-dense areas. By these two means, the data set can be balanced and the data size controlled to prevent overfitting, equalizing the classification of the two categories. The training and test sets were generated via 5 × 10-fold cross-validation. After resampling the training set, the kernel extreme learning machine (KELM) was used as the classifier for training, and the test set was used for verification. The experimental results on a UCI imbalanced data set and measured circuit fault diagnosis data show that the proposed method is superior to other resampling algorithms.
- Published
- 2021
- Full Text
- View/download PDF
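A rough sketch of the "security level" idea above, scoring each minority sample by the fraction of its nearest neighbors that share its class (assuming scikit-learn; the paper's exact scoring, guided SMOTE, and local-density undersampling are not reproduced):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def minority_safety(X_min, X_all, y_all, k=5):
    """Fraction of each minority sample's k neighbors that are also minority.
    Samples surrounded by their own class are safer seeds for oversampling."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_all)
    _, idx = nn.kneighbors(X_min)
    neigh_labels = y_all[idx[:, 1:]]     # drop each point itself (distance 0)
    return (neigh_labels == 1).mean(axis=1)

rng = np.random.RandomState(0)
X = np.vstack([rng.normal(0, 1, (90, 2)), rng.normal(1.5, 1, (10, 2))])
y = np.r_[np.zeros(90, int), np.ones(10, int)]
print(minority_safety(X[y == 1], X, y))
```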
16. A Fractional-N Digitally Intensive PLL Achieving 428-fs Jitter and <−54-dBc Spurs Under 50-mVpp Supply Ripple.
- Author
-
Chen, Yue, Gong, Jiang, Staszewski, Robert Bogdan, and Babaie, Masoud
- Subjects
DIGITAL-to-analog converters ,SUCCESSIVE approximation analog-to-digital converters ,ANALOG-to-digital converters ,DC-to-DC converters ,PHASE-locked loops - Abstract
In this article, we present a 4.5–5.1-GHz fractional-N digitally intensive phase-locked loop (DPLL) capable of maintaining its performance in the face of a large supply ripple, thus enabling a direct connection to a switched-mode dc–dc converter. Supply pushing of its inductor–capacitor (LC) oscillator is suppressed by properly replicating the supply ripple onto the gate of its tail current transistor, while the optimum replication gain is determined by a new on-chip calibration loop tolerant of supply variations. A proposed configuration cascading a supply-insensitive slope generator with the output of a current digital-to-analog converter (DAC) linearly converts the phase error timing into a corresponding voltage, which is then quantized by a successive approximation register (SAR) analog-to-digital converter (ADC) to generate a digital phase error. We also introduce a low-power ripple pattern estimation and cancellation algorithm to remove the phase error component due to the supply-induced delay variations of loop components. Implemented in 40-nm CMOS, the DPLL prototype achieves 428-fs rms jitter, <−55-dBc fractional spurs, and <−54-dBc maximum spurs while consuming 3.25 mW and being subjected to a sinusoidal or sawtooth supply ripple of 50 mVpp at the 50-MHz reference divided by 3, 6, or 12. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
17. A proposed method for handling an imbalance data in classification of blood type based on Myers-Briggs type indicator
- Author
-
Ahmad Taufiq Akbar, Rochmat Husaini, Bagus Muhammad Akbar, and Shoffan Saifullah
- Subjects
imbalance data ,blood type ,resample ,k-nearest neighbor ,mbti ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
Blood type is still popularly assumed to relate to certain personality aspects. This study examines preprocessing methods for improving the classification accuracy of MBTI data used to determine blood type. The training and testing data comprise 250 records from MBTI questionnaires answered by 250 respondents. The classification uses the k-nearest neighbor (k-NN) algorithm. Without preprocessing, k-NN achieves only about 32% accuracy, so preprocessing is needed to handle the data imbalance before classification. The proposed preprocessing consists of two stages: the first is an unsupervised resample and the second a supervised resample. Validation uses ten-fold cross-validation. With these preprocessing stages, the k-NN classification accuracy, F-score, and recall all increase significantly.
- Published
- 2020
- Full Text
- View/download PDF
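The two-stage resample can be approximated with scikit-learn utilities: a plain bootstrap (the unsupervised stage) followed by per-class balancing (the supervised stage). The data below are synthetic stand-ins, and the two stages are an interpretation of the Weka-style filters the abstract names:

```python
import numpy as np
from sklearn.utils import resample
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(0)
X = rng.normal(size=(250, 16))                            # stand-in for MBTI answers
y = rng.choice(4, size=250, p=[0.55, 0.25, 0.12, 0.08])   # imbalanced blood types

# Stage 1 (unsupervised): plain bootstrap of the whole training set.
X1, y1 = resample(X, y, replace=True, random_state=0)

# Stage 2 (supervised): resample every class up to a uniform size.
target = max(np.bincount(y1))
parts = [resample(X1[y1 == c], y1[y1 == c], replace=True,
                  n_samples=target, random_state=c) for c in np.unique(y1)]
X2 = np.vstack([p[0] for p in parts])
y2 = np.concatenate([p[1] for p in parts])

print(cross_val_score(KNeighborsClassifier(), X2, y2, cv=10).mean())
```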
18. Comparison of Sampling Techniques for Imbalanced Data Classification
- Author
-
Karn Nasritha, Kittisak Kerdprasop, and Nittaya Kerdprasop
- Subjects
imbalance data ,smote ,resample ,classification ,ensemble technique ,Information technology ,T58.5-58.64 - Abstract
Imbalanced data are a problem in machine learning for data classification, resulting in low classification performance, and random sampling techniques are used in several ways to address it. This research aims to compare sampling techniques for imbalanced data classification. The research was conducted on three data sets, using the synthetic minority over-sampling technique (SMOTE), under-sampling, and resample techniques for imbalanced data preprocessing. Decision tree, CART, random forest, support vector machine, and artificial neural network algorithms were ensembled with AdaBoost and bagging to create models for data classification. Ten-fold cross-validation was used to measure model performance, with precision, recall, and F-measure as metrics. The results showed that the resample technique could correct the data imbalance better than SMOTE. In addition, the random forest model, the AdaBoost ensemble with random forest, and the bagging ensemble with random forest were the most efficient for data classification in this research.
- Published
- 2018
- Full Text
- View/download PDF
19. Portfolio Optimization and Diversification in China: Policy Implications for Vietnam and Other Emerging Markets.
- Author
-
Hong Vo, Duc
- Subjects
PORTFOLIO management (Investments) ,PORTFOLIO diversification ,EMERGING markets ,PORTFOLIO performance ,STOCK exchanges - Abstract
This article examines risk, return, and portfolio optimization at the industry level in China over the period 2007–2016. Building on the classical Markowitz framework for portfolio optimization, a mean-semivariance optimization framework is established for China's stock market at the industry level. Findings from this study indicate that the healthcare sector plays a significant role among the 10 industries in China on a stand-alone basis. In addition, a significant change in the risk rankings of the sectors is found when the mean-semivariance optimization framework is used. We also find that the new framework improves the optimal portfolios in terms of performance, measured by the Sortino ratio, and diversification. A simulation technique generally known as the resampling method is also utilized to check the robustness of the estimates. While resampling appears not to improve the performance of optimal portfolios compared with the mean-semivariance framework for China, it yields a remarkable advance in the diversification of the optimal portfolios. Implications for investors and governments in Vietnam and other emerging markets emerge from the study. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
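The quantity behind the mean-semivariance framework is the downside (semi)deviation, which also defines the Sortino ratio cited above. A small NumPy illustration on fabricated daily returns:

```python
import numpy as np

def sortino(returns, target=0.0):
    """Sortino ratio: mean excess return over downside (semi)deviation."""
    downside = np.minimum(returns - target, 0.0)   # only below-target returns count
    semi_dev = np.sqrt(np.mean(downside ** 2))
    return (returns.mean() - target) / semi_dev

r = np.random.RandomState(0).normal(0.0008, 0.012, 2500)  # fake daily returns
print(sortino(r))
```

Unlike the Markowitz variance, the semideviation penalizes only losses, which is why sector risk rankings can change when switching frameworks.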
20. Preprocessing method based on sample resampling for imbalanced data of electronic circuits.
- Author
-
LI Ruifeng, XU Aiqiang, SUN Weichao, and WU Yangyong
- Subjects
ELECTRONIC circuits ,MACHINE learning ,FAULT diagnosis ,SAMPLING methods ,ELECTRONIC data processing - Abstract
In order to address the deficiency of fault-state data and the imbalance of the overall test data in airborne electronic circuits, a data preprocessing method based on sample resampling is proposed. Firstly, an extreme learning machine is trained on the original data set to select the correctly classified samples. Secondly, the synthetic minority oversampling technique (SMOTE) is applied to oversample the minority of the correctly classified samples, local-density under-sampling is applied to the majority, and the misclassified majority samples are deleted as interference. In this way, the data set can be equalized, the data size controlled to prevent over-fitting, and the detection rate of fault samples improved. Compared with other data resampling methods, the test results show that the proposed method has a good and stable overall effect and has application value for electronic circuit fault diagnosis. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
21. Improving the QRS detection for one-channel ECG sensor.
- Author
-
Domazet, Ervin and Gusev, Marjan
- Subjects
- *
ELECTROCARDIOGRAPHY , *DETECTORS , *ANALOG data , *ARRHYTHMIA , *DATA encryption - Abstract
We analyzed several QRS detection algorithms in order to build a quality industrial beat detector intended for a small, wearable, one-channel electrocardiogram sensor with a sampling rate of 125 Hz and 10-bit analog-to-digital conversion. The research was a lengthy process that included building several hundred rules to cope with QRS detection problems and finding optimal threshold values for several parameters. We obtained 99.90% QRS sensitivity and a 99.90% QRS positive predictive rate, measured on the first channel of the rescaled and resampled MIT-BIH Arrhythmia ECG database. Even more, our solution works better than the same algorithms applied to the original signals, which have a sampling rate of 360 Hz and 11-bit analog-to-digital conversion. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
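Rescaling the 360 Hz MIT-BIH records to the sensor's 125 Hz, as described above, is a rational-ratio resampling job (125/360 reduces to 25/72). A minimal SciPy sketch on a synthetic record; the amplitude rescaling to the sensor's 10-bit range is omitted:

```python
import numpy as np
from scipy.signal import resample_poly

fs_in, fs_out = 360, 125                  # MIT-BIH rate -> sensor rate
ecg = np.random.RandomState(0).standard_normal(fs_in * 10)   # fake 10 s record

# resample_poly applies an anti-aliasing FIR filter around the rate change.
ecg_125 = resample_poly(ecg, up=25, down=72)
print(len(ecg), "->", len(ecg_125))       # 3600 -> 1250 samples
```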
22. YOLO-based marine organism detection using two-terminal attention mechanism and difficult-sample resampling.
- Author
-
Zhou, Zhiyu, Hu, Yanjun, Yang, Xingfan, and Yang, Junyi
- Abstract
The presence of various types of noise in images of marine-life datasets, as well as class imbalance in underwater datasets, can exacerbate the difficulty of effective object detection. To address this problem, we propose you-only-look-once (YOLO)-based marine organism detection using a two-terminal attention mechanism and difficult-sample resampling. First, a residual building unit (RBU) module with a two-terminal attention mechanism (RBU-TA) is proposed, incorporating a reinforced channel attention mechanism into a shortcut of the residual structure. The proposed method adaptively compresses noisy feature map channels, providing rich shallow image information for high-level deep convolutional features while avoiding shallow noise pollution. To address the imbalance of marine biological image classes, difficult-sample resampling is combined with a focal loss function to suppress excessive background negative samples and retrain targets that are difficult to distinguish, improving their detection accuracy. Finally, the proposed method was validated on the underwater robot professional competition (URPC) and real-world underwater object detection (RUOD) datasets, and the mean average precision (MAP) values improved by 10% and 7%, respectively. The proposed method greatly improves the detection accuracy of organisms in complex marine environments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
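The focal loss mentioned above down-weights easy examples so that scarce, hard targets dominate the gradient. A compact NumPy version of the standard binary form (the detector-specific variant used in the paper may differ):

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss. p: predicted probability of the positive class; y: 0/1."""
    p_t = np.where(y == 1, p, 1.0 - p)              # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)  # class-balance factor
    return -(alpha_t * (1.0 - p_t) ** gamma * np.log(p_t + 1e-9)).mean()

p = np.array([0.90, 0.20, 0.70, 0.05])
y = np.array([1, 0, 1, 1])
print(focal_loss(p, y))   # the confident p=0.90 positive contributes almost nothing
```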
23. Sub-Pixel Water-Sky-Line Detection Based on a Curve Fitting Method
- Author
-
Li, Linyang, Li, Chonghui, Zheng, Yong, Zhang, Chao, Sun, Jiadong, editor, Jiao, Wenhai, editor, Wu, Haitao, editor, and Shi, Chuang, editor
- Published
- 2013
- Full Text
- View/download PDF
24. Rapid Water-Sky-Line Detecting Algorithm in Marine Celestial Navigation
- Author
-
Li, Chonghui, Zheng, Yong, Yuan, Yulei, Yang, Yufei, Sun, Jiadong, editor, Liu, Jingnan, editor, Yang, Yuanxi, editor, and Fan, Shiwei, editor
- Published
- 2012
- Full Text
- View/download PDF
26. Using adaptive line-transect sampling in airborne geophysics studies.
- Author
-
Moradi, M.
- Subjects
- *
GEOPHYSICS , *COPPER mining , *DATA mining , *CLUSTER sampling , *SAMPLING methods - Abstract
We introduce a design that combines elements from distance sampling and adaptive cluster sampling designs. We propose a line-transect sampling method where the sample strips are selected with unequal selection probabilities, detectability of clusters is assumed imperfect, and detectability of sample units belonging to each detected cluster is assumed perfect. Here, the application of distance sampling is broadened to airborne geophysics studies. We introduce efficient estimators for this new sampling design and conduct two simulation studies: one population is the Sar Cheshmeh copper mine, with data from an airborne geophysics study, and the other is an artificial population. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
27. A Novel Watershed Method Using Reslice and Resample Image
- Author
-
Peng, Shengcai, Gu, Lixu, Hommel, G., editor, and Huanye, Sheng, editor
- Published
- 2006
- Full Text
- View/download PDF
30. Modeling the Influence of River Cross-Section Data on a River Stage Using a Two-Dimensional/ Three-Dimensional Hydrodynamic Model.
- Author
-
Wei-Bo Chen and Wen-Cheng Liu
- Subjects
RIVERS ,HYDRODYNAMICS ,FLUID dynamics ,DATA analysis ,VISCOSITY - Abstract
A large amount of accurate river cross-section data is indispensable for predicting river stages. However, measured river cross-section data are usually sparse in the transverse direction at each cross-section as well as in the longitudinal direction along the river channel. This study presents three algorithms to resample the river cross-section data points in both the transverse and longitudinal directions from the original data. A two-dimensional (2D) high-resolution unstructured-grid hydrodynamic model was used to assess the performance of the original and resampled cross-section data on a simulated river stage under low-flow and high-flow conditions. The simulated river stages are significantly improved using the resampled cross-section data based on linear interpolation in the tidal and non-tidal river segments. The resampled cross-section data based on linear interpolation satisfactorily maintain the topographic and morphological features of the river channel, especially in the meandering river segment. Furthermore, the performance of the 2D and three-dimensional (3D) models on the simulated river stage was also evaluated using the resampled cross-section data. The results indicate that the 2D and 3D models reproduce similar river stages in both tidal and non-tidal river segments under the low-flow condition. However, the 2D model overestimates the river stages in both segments compared to the 3D model under the high-flow condition. A model sensitivity analysis was performed to investigate the influence of the bottom drag coefficient and the vertical eddy viscosity on the river stage, using the 2D and 3D models with cross-sections resampled by linear interpolation. The results reveal that changes in the bottom drag coefficient have a minor impact on the river stage, while the river stage is insensitive to changes in the vertical eddy viscosity. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
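The transverse resampling by linear interpolation amounts to densifying each surveyed cross-section onto a uniform station spacing. A one-cross-section sketch with made-up survey points:

```python
import numpy as np

# Sparse surveyed points across one cross-section: station (m) vs. bed level (m).
station = np.array([0.0, 12.0, 40.0, 85.0, 120.0])
bed = np.array([4.1, 1.2, -0.8, 0.5, 3.9])

# Densify to a uniform 2 m spacing with linear interpolation.
station_new = np.arange(station[0], station[-1] + 1e-9, 2.0)
bed_new = np.interp(station_new, station, bed)
print(len(station), "->", len(station_new), "points")
```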
31. Fast GNSS satellite signal acquisition method based on multiple resamplings.
- Author
-
Wang, Yong and Mao, Gang
- Subjects
GLOBAL Positioning System ,RESAMPLING (Statistics) ,DATA acquisition systems - Abstract
A fast Global Navigation Satellite System (GNSS) satellite signal acquisition method based on resampling is presented. In contrast to traditional approaches, which perform a single-round search at a high data rate, the proposed method introduces a signal acquisition mechanism that uses data resampling. Starting from a resampled data rate slightly above the Nyquist frequency, the proposed method conducts multiple rounds of searches with an increasing sampling rate. After each round of searching, the satellites are sorted according to their relative signal strengths. By removing satellites at the bottom of each sorted list, the search space for satellite acquisition is continuously pruned. If a sufficient number of satellites has not been acquired when the original data rate is reached, the method switches to a weak-signal detection mode and uses non-coherent integration for the satellites at the top of the list. The non-coherent integration process continues until either a sufficient number of satellites is acquired or the maximum number of steps is reached. The experimental results show that the proposed method acquires the same set of satellites as traditional methods but at a considerably lower computational cost. The proposed method was implemented in a software-based GNSS receiver and can also be used in hardware-based receivers. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
32. Horizontal accuracy assessment of a novel algorithm for approximate a surface to a DEM
- Author
-
Domingo Barrera, María José Ibáñez, Salah Eddargani, Rocio Romero, Francisco J. Ariza-López, and Juan F. Reinoso-Gordo
- Subjects
Tensor product ,Horizontal accuracy ,Bernstein basis ,DEM ,Resample ,Control points ,Bézier ordinates ,Electrical and Electronic Engineering ,Spline ,Atomic and Molecular Physics, and Optics - Abstract
This study evaluates the horizontal positional accuracy of a new algorithm that defines a surface approximating DEM data by means of a spline function. The algorithm allows evaluating the surface at any point in its definition domain and analytically estimating other parameters of interest, such as slopes and orientations. To evaluate the accuracy achieved with the algorithm, we use a 2 m × 2 m reference DEM (DEMref) from which derived DEMs (DEMder) are obtained at 4 m × 4 m, 8 m × 8 m and 16 m × 16 m. For each DEMder, its spline approximant is calculated and evaluated at the same points occupied by the DEMref cells, yielding a resampled 2 m × 2 m DEM (DEMrem). The horizontal accuracy is obtained by computing the area between the homologous contour lines derived from DEMref and DEMrem, respectively. The planimetric errors of the proposed algorithm turn out to be very small, even in flat areas, where major differences could be expected. Therefore, this algorithm could be used to evaluate the horizontal positional accuracy of a lower-resolution DEM product (DEMpro) from a different producing source than the higher-resolution DEMref.
- Published
- 2021
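The paper's Bernstein-Bézier spline construction is not reproduced here, but the evaluate-anywhere idea can be illustrated with an off-the-shelf bicubic spline from SciPy, fitted on a coarse grid and evaluated back at fine cell centers:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Coarse 16 m DEM on a regular grid (stand-in elevation values).
x = np.arange(0, 512, 16.0)
y = np.arange(0, 512, 16.0)
z = np.random.RandomState(0).rand(len(x), len(y))

spline = RectBivariateSpline(x, y, z)   # bicubic surface by default
xf = np.arange(0, 497, 2.0)             # evaluate back at 2 m spacing
zf = spline(xf, xf)                     # resampled DEM, analogous to DEMrem
print(zf.shape)
```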
33. On sample diversity in Particle Filter based robot SLAM.
- Author
-
Li, Xiuzhi, Jia, Songmin, and Cui, Wei
- Abstract
This article proposes a simple and effective methodology for improving the diversity of samples in a Particle Filter (PF). The motivation is that the resampling procedure, which aims to remedy particle degeneracy, often leads to particle depletion, and reduced particle diversity has severe consequences for filter estimation accuracy. Various resampling approaches have been developed in recent years, including multinomial, residual, stratified and systematic resampling. All of them, however, inevitably lose diversity in the scatter of particles because they simply replace lower-weighted particles with higher-weighted ones. In this paper, we develop practical MCMC solutions for drawing particles for the PF, with careful consideration given to the selection of the proposal distribution and the convergent chain node. Simulations and a real experiment reveal that the proposed resampling method is capable of improving the performance of the Particle Filter. [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
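Of the four classical schemes listed above, residual resampling is representative (systematic resampling appears in an earlier sketch): it keeps the deterministic integer part of each particle's expected copy count and draws only the remainder at random:

```python
import numpy as np

def residual_resample(weights, rng=np.random):
    """Residual resampling: floor(N*w) deterministic copies per particle,
    then a multinomial draw from the leftover (residual) weights."""
    n = len(weights)
    counts = np.floor(n * weights).astype(int)
    idx = np.repeat(np.arange(n), counts)
    n_rest = n - counts.sum()
    if n_rest > 0:
        resid = n * weights - counts
        resid /= resid.sum()
        idx = np.r_[idx, rng.choice(n, size=n_rest, p=resid)]
    return idx

w = np.array([0.45, 0.05, 0.30, 0.20])
print(residual_resample(w))
```

Like the other three schemes, it only copies existing particles, which is the diversity loss the MCMC move in this paper is designed to repair.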
34. Imbalanced classification using support vector machine ensemble.
- Author
-
Tian, Jiang, Gu, Hong, and Liu, Wenqi
- Subjects
- *
CLASSIFICATION , *SUPPORT vector machines , *SET theory , *DATA analysis , *MATHEMATICAL decomposition , *STATISTICAL sampling , *CLUSTER analysis (Statistics) - Abstract
Imbalanced data sets often have detrimental effects on the performance of a conventional support vector machine (SVM). To solve this problem, we adopt both strategies of modifying the data distribution and adjusting the classifier. Both minority and majority classes are resampled to increase the generalization ability. For the minority class, a one-class support vector machine model combined with the synthetic minority oversampling technique is used to oversample the support vector instances. For the majority class, we propose a new method that decomposes the class into clusters and removes two clusters using a distance measure to lessen the effect of outliers. The remaining clusters are used to build an SVM ensemble with the oversampled minority patterns; the ensemble can achieve better performance by considering potentially suboptimal solutions. Experimental results on benchmark data sets illustrate the effectiveness of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
35. Tests of Random Walk: A Comparison of Bootstrap Approaches.
- Author
-
Lima, Eduardo J. A. and Tabak, Benjamin M.
- Subjects
ESTIMATION theory ,MONTE Carlo method ,MATHEMATICAL statistics ,STOCHASTIC processes ,NUMERICAL calculations - Abstract
This paper compares different versions of the multiple variance ratio test based on bootstrap techniques for the construction of empirical distributions. It also analyzes the crucial issue of selecting optimal block sizes when block bootstrap procedures are used. The comparison of the different approaches using Monte Carlo simulations leads to the conclusion that methodologies using block bootstrap methods present better performance for the construction of empirical distributions of the variance ratio test. Moreover, the results are highly sensitive to methods employed to test the null hypothesis of random walk. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
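The variance ratio statistic and a moving-block bootstrap of its null distribution can be sketched in a few lines; the block length and the interval below are illustrative choices, whereas the paper's point is precisely how such choices (bootstrap version, block size) affect the test:

```python
import numpy as np

def variance_ratio(r, q):
    """VR(q): variance of overlapping q-period sums over q times the 1-period variance."""
    rq = np.convolve(r, np.ones(q), mode="valid")
    return rq.var(ddof=1) / (q * r.var(ddof=1))

def block_bootstrap(r, block, rng):
    """Moving-block bootstrap: concatenate randomly chosen contiguous blocks."""
    n = len(r)
    starts = rng.randint(0, n - block + 1, size=int(np.ceil(n / block)))
    return np.concatenate([r[s:s + block] for s in starts])[:n]

rng = np.random.RandomState(0)
r = rng.normal(0.0, 0.01, 1000)                  # i.i.d. returns: random walk holds
null = [variance_ratio(block_bootstrap(r, 25, rng), 4) for _ in range(500)]
print(np.percentile(null, [2.5, 97.5]))          # empirical band for VR(4)
```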
36. The jackknife’s edge: Inference for censored regression quantiles.
- Author
-
Portnoy, Stephen
- Subjects
- *
JACKKNIFE (Statistics) , *MATHEMATICAL statistics , *REGRESSION quantiles , *DATA analysis , *RESAMPLING (Statistics) , *STATISTICAL bootstrapping - Abstract
For censored data, it is very common for the tail of the survival function to be non-identifiable because of the abundance of censored observations in the tail. This is especially prominent in censored regression quantile analysis, and it introduces a serious problem with inference, especially near the point of non-identifiability. The lack of readily estimable formulas for asymptotic variances requires the use of resampling methods. Unfortunately, the bootstrap (in any of its versions) generates samples for which the point of non-identifiability has sufficient variability over the pseudo-samples that, in theory and in practice, an appreciable number of the bootstrap replicates can no longer estimate a quantile that the original data could estimate. This leads to very poor coverage probabilities. Thus, resampling methods that provide less variability over the pseudo-samples may be very helpful. The jackknife is one such method, though it is necessary to use a delete-d jackknife with d of larger order than the square root of the sample size. Another alternative is to use randomly reweighted "bootstrap" samples with suitably chosen weights. These approaches can be justified asymptotically. Furthermore, a simulation experiment shows that randomly sampling a relatively modest number of delete-d jackknifed samples provides quite excellent coverage probabilities, substantially outperforming bootstrap methods near the point of non-identifiability. This provides a counterexample to the commonly held notion that bootstrap methods are better than jackknife methods, and suggests the possible superiority of jackknife methods in a variety of situations with missing data or other cases of partial identifiability. [Copyright Elsevier]
- Published
- 2014
- Full Text
- View/download PDF
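A schematic of the randomly sampled delete-d jackknife described above, with d of larger order than the square root of n (here d = n^0.7). The statistic, data, and crude percentile interval are illustrative stand-ins, not the paper's censored regression quantile procedure:

```python
import numpy as np

def delete_d_jackknife(x, stat, d, n_rep, rng):
    """Replicates of a statistic on randomly chosen delete-d subsamples."""
    n = len(x)
    reps = np.empty(n_rep)
    for i in range(n_rep):
        drop = rng.choice(n, size=d, replace=False)
        reps[i] = stat(np.delete(x, drop))
    return reps

rng = np.random.RandomState(0)
x = rng.exponential(size=200)
d = int(200 ** 0.7)                        # larger order than sqrt(200) ~ 14
reps = delete_d_jackknife(x, lambda s: np.quantile(s, 0.9), d, 500, rng)
print(np.percentile(reps, [2.5, 97.5]))    # crude band for the 0.9 quantile
```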
37. Unit 11: Registration and Conflation
- Author
-
Unit 11, CCTP and Sutton, Paul
- Subjects
pixel ,ground control point ,transformation ,Ordinary Least Square Polynomial fitting ,resample ,raster ,vector ,overlay ,dissolve - Abstract
This unit discusses registration and conflation, two related procedures in which two or more geographic datasets are combined, compared or merged. It presents procedures for re-projecting or rubber-sheeting a spatially referenced dataset to another, and it discusses issues including data model conversions, scale aggregations and other GIS manipulation and analysis functions.
- Published
- 1998
38. Bootstrap Methods for Finite Populations.
- Author
-
Booth, James G., Butler, Ronald W., and Hall, Peter
- Subjects
- *
POPULATION research , *STATISTICAL bootstrapping , *CONFIDENCE intervals , *RESAMPLING (Statistics) , *STATISTICAL sampling , *STATISTICAL hypothesis testing , *SAMPLE size (Statistics) , *ARITHMETIC mean , *MEDIAN (Mathematics) - Abstract
We show that the familiar bootstrap plug-in rule of Efron has a natural analog in finite population settings. In our method, a characteristic of the population is estimated by the average value of the characteristic over a class of empirical populations constructed from the sample. Our method extends that of Gross to situations in which the stratum sizes are not integer multiples of their respective sample sizes. Moreover, we show that our method can be used to generate second-order correct confidence intervals for smooth functions of population means, a property that has not been established for other resampling methods suggested in the literature. A second resampling method is proposed that also leads to second-order correct confidence intervals and is less computationally intensive than our bootstrap, but a simulation study reveals that the second method can be quite unstable in some situations, whereas our bootstrap performs very well. [ABSTRACT FROM AUTHOR]
- Published
- 1994
- Full Text
- View/download PDF
39. Hexagonal scale invariant feature transform (H-SIFT) for facial feature extraction
- Author
-
Jamal Hussain Shah, Aisha Azeem, Mudassar Raza, and Muhammad Sharif
- Subjects
Pixel ,Coordinate system ,Feature extraction ,Scale-invariant feature transform (SIFT) ,Resample ,Pattern recognition ,Facial recognition system ,Hexagonal image ,Feature (computer vision) ,Face recognition ,Computer vision
Feature transformation and key-point identification underpin many local feature descriptors. One such descriptor is the scale-invariant feature transform (SIFT). This work designs a hexagonally sampled SIFT feature descriptor and applies it to face recognition tasks. Instead of applying SIFT to square image coordinates, the proposed method converts image pixels to a hexagonal coordinate system and performs the processing there. The reason for using hexagonal image coordinates is that they give sharp edge responses and highlight low-contrast regions on the face. This characteristic allows the SIFT descriptor to mark distinctive facial features that the original SIFT descriptor previously discarded. Furthermore, a discriminant procedure based on Fisher canonical correlation analysis is outlined to give more precise classification results. Experiments performed on renowned datasets revealed better performance in terms of feature extraction under robust conditions.
- Published
- 2015
- Full Text
- View/download PDF
40. Sampling Rate Converter on FPGA
- Subjects
resampler ,Farrow ,resample ,FPGA
An effective method for constructing and practically implementing a sampling rate conversion system using the Farrow structure, implemented on an Altera Cyclone FPGA, is presented.
- Published
- 2018
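The Farrow structure evaluates an interpolating polynomial in the fractional delay mu for every output instant, which is what makes arbitrary-ratio conversion cheap in hardware. Below is a NumPy model of a cubic-Lagrange Farrow resampler, a generic textbook form rather than the Altera Cyclone design from the entry above:

```python
import numpy as np

def farrow_cubic_resample(x, ratio):
    """Step through x at `ratio` input samples per output sample
    (ratio = fs_in / fs_out) with a cubic-Lagrange Farrow interpolator."""
    n_out = int((len(x) - 3) / ratio)
    t = 1.0 + np.arange(n_out) * ratio   # output instants in input-sample units
    b = t.astype(int)                    # base tap index
    mu = t - b                           # fractional delay in [0, 1)
    xm1, x0, x1, x2 = x[b - 1], x[b], x[b + 1], x[b + 2]
    return (xm1 * (-mu * (mu - 1) * (mu - 2) / 6)
            + x0 * ((mu + 1) * (mu - 1) * (mu - 2) / 2)
            + x1 * (-(mu + 1) * mu * (mu - 2) / 2)
            + x2 * ((mu + 1) * mu * (mu - 1) / 6))

sig = np.sin(2 * np.pi * 0.02 * np.arange(200))
out = farrow_cubic_resample(sig, 2 / 3)   # resample to a 1.5x higher rate
print(len(sig), "->", len(out))
```

In hardware, the four mu-polynomials become fixed filter branches combined by Horner's rule, so only mu changes per output sample.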
41. Evaluation of Deforestation in the Yarí Savannas from a Multitemporal Analysis of Landsat Satellite Images (2010 and 2017) through Digital Image Processing
- Author
-
Reyes González, Manuel Camilo and Riaño Pérez, Felipe
- Subjects
Thematic classification ,Vegetation cover ,Coverage ,Deforestation ,Multitemporal analysis ,Landsat satellites ,Resample ,Spatial analysis (statistics)
The purpose of this investigation is to carry out a multitemporal analysis using Landsat satellite images from 2010 and 2017 to determine the change in forest cover in the Yarí savannas due to the deforestation occurring in the area in recent years. To carry out the study, the two images were processed separately: first, a thematic classification and an accuracy evaluation were performed in order to identify the predominant land covers in the study area; then the images were co-registered and resampled so that they would be spatially comparable, allowing estimation of the changes in cover from one year to the other.
- Published
- 2017
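The comparison step above reduces to aligning the two classified rasters on a common grid and differencing them. A minimal sketch, assuming nearest-neighbour resampling with plain NumPy and hypothetical class codes (1 = forest, 0 = non-forest); production work would use a GIS library for the actual co-registration:

```python
import numpy as np

def resample_nearest(raster, out_shape):
    """Nearest-neighbour resample of a classified raster to out_shape."""
    rows = np.arange(out_shape[0]) * raster.shape[0] // out_shape[0]
    cols = np.arange(out_shape[1]) * raster.shape[1] // out_shape[1]
    return raster[np.ix_(rows, cols)]

# Hypothetical classifications: 1 = forest, 0 = non-forest.
class_2010 = np.random.randint(0, 2, (300, 300))
class_2017 = np.random.randint(0, 2, (200, 200))

# Bring the 2017 scene onto the 2010 grid before comparing.
class_2017_rs = resample_nearest(class_2017, class_2010.shape)

deforested = (class_2010 == 1) & (class_2017_rs == 0)
print("deforested fraction:", deforested.mean())
```

Nearest-neighbour is the usual choice here because bilinear or cubic resampling would blend class codes into meaningless intermediate values.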
42. Fault diagnosis using novel AdaBoost based discriminant locality preserving projection with resamples.
- Author
-
He, Yan-Lin, Zhao, Yang, Hu, Xiao, Yan, Xiao-Na, Zhu, Qun-Xiong, and Xu, Yuan
- Subjects
- *
FAULT diagnosis , *MANUFACTURING processes , *MATRIX decomposition - Abstract
Fault diagnosis plays a pivotal role in ensuring the safety of process industries. However, owing to the diversity of process faults and the high coupling of fault data, achieving high accuracy in the fault diagnosis of complex industrial processes is very difficult. To address this concern, this article proposes a novel AdaBoost-based discriminant locality preserving projection (DLPP) with resamples (A-DLPPR) model. The proposed model has two features: the bootstrap method is used to generate groups of resampled data, which addresses the matrix-decomposition problem in DLPP, and an AdaBoost-based classification technique is adopted to obtain high classification accuracy. An effective fault diagnosis model based on A-DLPPR can then be established. To validate its effectiveness, the Tennessee Eastman process (TEP) is selected, and case studies using different kinds of TEP faults are conducted. The simulation results indicate that the proposed A-DLPPR model achieves higher fault diagnosis accuracy than several other models, which verifies that in the field of complex industrial processes it can be used as an effective model for fault diagnosis. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
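The two ingredients of the A-DLPPR recipe, bootstrap resamples to stabilize a discriminant projection and a boosted classifier on the projected features, can be sketched with scikit-learn. DLPP itself is not available in scikit-learn, so LDA stands in for the projection here; this is an illustrative analogue under those substitutions, not the authors' algorithm:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

X, y = make_classification(n_samples=600, n_features=20,
                           n_informative=8, n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

votes = []
for b in range(10):
    # Fit the projection on a bootstrap resample; this mirrors the
    # role of resampling in stabilizing DLPP's matrix decomposition.
    Xb, yb = resample(X_tr, y_tr, random_state=b)
    proj = LinearDiscriminantAnalysis(n_components=2).fit(Xb, yb)
    clf = AdaBoostClassifier(n_estimators=50, random_state=b)
    clf.fit(proj.transform(X_tr), y_tr)
    votes.append(clf.predict(proj.transform(X_te)))

# Majority vote across the bootstrap ensemble.
votes = np.array(votes)
pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("accuracy:", (pred == y_te).mean())
```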
43. Modeling the Influence of River Cross-Section Data on a River Stage Using a Two-Dimensional/Three-Dimensional Hydrodynamic Model
- Author
-
Wei-Bo Chen and Wen-Cheng Liu
- Subjects
cross-section ,drag coefficient ,river bed bathymetry ,hydrodynamic model ,tidal river ,river stage ,resample ,2D/3D ,linear interpolation ,turbulence modelling ,geomorphology ,hydrology - Abstract
A large amount of accurate river cross-section data is indispensable for predicting river stages. However, measured cross-section data are usually sparse both in the transverse direction at each cross-section and in the longitudinal direction along the river channel. This study presents three algorithms for resampling the cross-section data points in both directions from the original data. A two-dimensional (2D) high-resolution unstructured-grid hydrodynamic model was used to assess the performance of the original and resampled cross-section data for simulating the river stage under low-flow and high-flow conditions. The simulated river stages are significantly improved using cross-section data resampled by linear interpolation in both the tidal and non-tidal river segments. The linearly interpolated cross-sections satisfactorily preserve the topographic and morphological features of the river channel, especially in the meandering segment. The performance of the 2D and three-dimensional (3D) models was also evaluated using the resampled cross-section data. The two models reproduce similar river stages in both tidal and non-tidal segments under the low-flow condition; under the high-flow condition, however, the 2D model overestimates the river stages relative to the 3D model. A sensitivity analysis based on the linearly resampled cross-sections was performed to investigate the influence of the bottom drag coefficient and the vertical eddy viscosity on the river stage in the 2D and 3D models. It reveals that changing the bottom drag coefficient has a minor impact on the river stage, and that the river stage is insensitive to changes in the vertical eddy viscosity.
- Published
- 2017
- Full Text
- View/download PDF
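Transverse resampling of a surveyed cross-section reduces to interpolating bed elevation over station distance at a uniform spacing; the longitudinal direction interpolates between successive sections in the same way. A minimal sketch of the linear-interpolation variant with made-up survey points (the paper's three algorithms are not reproduced here):

```python
import numpy as np

# Hypothetical surveyed cross-section: station (m) vs. bed elevation (m).
station   = np.array([0.0, 12.0, 30.0, 55.0, 80.0, 96.0])
elevation = np.array([5.0,  1.2, -2.5, -3.1,  0.8,  5.2])

# Resample to a uniform 2 m transverse spacing by linear interpolation.
station_uniform = np.arange(station[0], station[-1] + 2.0, 2.0)
elevation_uniform = np.interp(station_uniform, station, elevation)

# Longitudinal resampling: blend two neighbouring cross-sections at a
# fraction f of the distance between them (0 = upstream, 1 = downstream).
def between_sections(z_up, z_down, f):
    return (1.0 - f) * z_up + f * z_down

z_mid = between_sections(elevation_uniform, elevation_uniform + 0.5, 0.5)
print(len(station_uniform), z_mid[:3])
```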
44. Tests of Random Walk: A Comparison of Bootstrap Approaches
- Author
-
Lima, Eduardo José Araújo and Tabak, Benjamin Miranda
- Subjects
Variance ratio ,Resample ,Random walk ,Bootstrap - Abstract
This paper compares different versions of the multiple variance ratio test based on bootstrap techniques for the construction of empirical distributions. It also analyzes the crucial issue of selecting optimal block sizes when block bootstrap procedures are used. The comparison of the different approaches using Monte Carlo simulations leads to the conclusion that methodologies using block bootstrap methods present better performance for the construction of empirical distributions of the variance ratio test. Moreover, the results are highly sensitive to the methods employed to test the null hypothesis of random walk.
- Published
- 2009
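A variance ratio test with a moving-block bootstrap null distribution can be sketched in a few lines. This is a simplified single-horizon version with an assumed block length; the paper's multiple-horizon tests and block-size selection are not reproduced:

```python
import numpy as np

def variance_ratio(r, q):
    """VR(q): variance of overlapping q-period returns over q times
    the one-period variance; approximately 1 under the random walk null."""
    c = np.cumsum(r)
    rq = c[q:] - c[:-q]                 # overlapping q-period returns
    return np.var(rq, ddof=1) / (q * np.var(r, ddof=1))

def moving_block_bootstrap(r, block, rng):
    """Resample overlapping blocks of returns to preserve short-range
    dependence, then stitch them into a series of the original length."""
    n = len(r)
    starts = rng.integers(0, n - block + 1, size=int(np.ceil(n / block)))
    return np.concatenate([r[s:s + block] for s in starts])[:n]

rng = np.random.default_rng(0)
returns = rng.normal(0, 1, 500)          # stand-in return series
q, block, B = 5, 20, 999
vr_obs = variance_ratio(returns, q)
vr_boot = np.array([variance_ratio(moving_block_bootstrap(returns, block, rng), q)
                    for _ in range(B)])
# Two-sided bootstrap p-value for the random walk null.
p = np.mean(np.abs(vr_boot - 1) >= abs(vr_obs - 1))
print(f"VR({q}) = {vr_obs:.3f}, bootstrap p = {p:.3f}")
```

The block length is the crucial tuning parameter the paper studies: blocks that are too short destroy the dependence structure, while blocks that are too long leave too few distinct resamples.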
45. Seasonal flood frequency analysis based on disaggregated urban runoff data
- Author
-
Matheussen, B. V., Thorolfsson, S. T., and Lettenmaier, D. P.
- Subjects
modelling ,design criteria ,resample ,snow hydrology ,bootstrap ,Urban drainage - Abstract
Current design procedures for urban drainage systems are based on the assumption that short-duration summer rainfall produces the highest annual runoff peaks. Despite this, several researchers report that in cold climates urban flooding may also occur as a consequence of rain combined with snowmelt. We therefore question the assumption that rainfall is the only factor controlling urban flooding in cold climates such as that of Trondheim, Norway. Through a combination of hydrological modelling and a temporal disaggregation method, a time series of hourly runoff data (1946-2003) was generated for the Risvollan urban catchment in Trondheim. The hourly runoff data were then used to generate flood frequency curves (FFCs) for the whole year and for the summer period (May-Oct) only. The results showed that an FFC based on hourly urban runoff data for the whole year has a higher 10-year flood than an FFC generated from summer data alone. This implies that urban drainage design methods for cold climates similar to Trondheim's should consider all seasons in the design procedure.
- Published
- 2005
- Full Text
- View/download PDF
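The whole-year versus summer-only comparison amounts to extracting annual maxima from the hourly series under two windows and reading off empirical return periods. A minimal sketch with pandas on synthetic hourly runoff, using Weibull plotting positions; the paper's disaggregation model is assumed, not reproduced:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("1946-01-01", "2003-12-31 23:00", freq="h")
runoff = pd.Series(rng.gamma(2.0, 0.5, len(idx)), index=idx)  # synthetic

def return_periods(annual_max):
    """Empirical return periods from Weibull plotting positions:
    T = (n + 1) / rank of the descending-sorted annual maxima."""
    x = np.sort(annual_max.values)[::-1]
    T = (len(x) + 1) / np.arange(1, len(x) + 1)
    return pd.Series(x, index=T)

whole_year = runoff.resample("YS").max()
summer = runoff[runoff.index.month.isin(range(5, 11))].resample("YS").max()

T_all, T_sum = return_periods(whole_year), return_periods(summer)
# Compare the floods closest to the 10-year return period.
print("10-yr, all seasons:", T_all[T_all.index >= 10].iloc[-1])
print("10-yr, summer only:", T_sum[T_sum.index >= 10].iloc[-1])
```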
46. A Short Prehistory of the Bootstrap
- Author
-
Peter Hall
- Subjects
Statistics and Probability ,survey sampling ,prehistory ,block bootstrap ,half-sample ,resampling ,sub-sample ,econometrics ,spatial analysis ,Monte Carlo ,Mahalanobis ,moving block ,computer-intensive statistics ,sample survey ,confidence interval ,resample ,statistical experimentation ,jackknife ,permutation test - Abstract
The contemporary development of bootstrap methods, from the time of Efron's early articles to the present day, is well documented and widely appreciated. Likewise, the relationship of bootstrap techniques to certain early work on permutation testing, the jackknife and cross-validation is well understood. Less known, however, are the connections of the bootstrap to research on survey sampling for spatial data in the first half of the last century or to work from the 1940s to the 1970s on subsampling and resampling. In a selective way, some of these early linkages will be explored, giving emphasis to developments with which the statistics community tends to be less familiar. Particular attention will be paid to the work of P. C. Mahalanobis, whose development in the 1930s and 1940s of moving-block sampling methods for spatial data has a range of interesting features, and to contributions of other scientists who, during the next 40 years, developed half-sampling, subsampling and resampling methods.
- Published
- 2003
48. Web Based Particle Filters
- Author
-
Wang, Xingpu
- Subjects
- Resample, Bayes Inference, Branching, Particle filters
- Abstract
Abstract: In this thesis, we first introduce two basic filtering problems: nonlinear filtering and model selection. We show that both can be solved by the unnormalized filter approach. Several web-based particle filter algorithms are then discussed. We extend the resampled and branching particle systems from a single-computer platform to a web-based platform. The performance and execution time of these algorithms are compared on two simulation models. We define a parameter, called the "Bootstrap Factor", which provides a reasonable way to compare different particle filters. Using the Bootstrap Factor, we show that the web-based branching system performs much better than the double resampled system.
- Published
- 2016
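The resampling step these systems vary is the core of the bootstrap particle filter: propagate particles, weight them by the observation likelihood, and resample when the weights degenerate. A minimal sketch on a 1-D linear-Gaussian model, assuming multinomial resampling (the thesis's branching systems replace this step with a different mechanism):

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 50, 1000                      # time steps, particles

# Simulate a 1-D linear-Gaussian state-space model.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(0, 1)
ys = x_true + rng.normal(0, 0.5, T)

particles = rng.normal(0, 1, N)
weights = np.full(N, 1.0 / N)
estimates = []
for t in range(T):
    # Propagate through the state transition.
    particles = 0.9 * particles + rng.normal(0, 1, N)
    # Reweight by the Gaussian observation likelihood.
    weights *= np.exp(-0.5 * ((ys[t] - particles) / 0.5) ** 2)
    weights /= weights.sum()
    estimates.append(np.sum(weights * particles))
    # Multinomial resampling when the effective sample size drops.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles = particles[idx]
        weights = np.full(N, 1.0 / N)

print("RMSE:", np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))
```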
49. On General Resampling Algorithms and their Performance in Distribution Estimation
- Author
-
Peter Hall and Enno Mammen
- Subjects
Statistics and Probability ,wild bootstrap ,Cornish-Fisher expansions ,Edgeworth expansion ,bootstrap ,jackknife ,moment ,sampling distribution ,distribution estimation ,resample ,resampling ,62G15 ,62G05 ,cumulant - Abstract
Recent work of several authors has focussed on first-order properties (e.g., consistency) of general bootstrap algorithms, where the numbers of times that data values are resampled form an exchangeable sequence. In the present paper we develop second-order properties of such algorithms, in a very general setting. Performance is discussed in the context of distribution estimation, and formulae for higher-order moments and cumulants are developed. Arguing thus, necessary and sufficient conditions are given for general resampling algorithms to correctly capture second-order properties.
- Published
- 1994
- Full Text
- View/download PDF
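The generalized bootstrap covered by these results draws an exchangeable weight vector and uses it in place of Efron's multinomial resample counts. A minimal sketch contrasting the two for estimating the sampling distribution of the mean; the Dirichlet weights give the Bayesian bootstrap, one member of the class studied:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(1.0, 100)        # observed sample
B, n = 2000, len(x)

# Efron's bootstrap: exchangeable multinomial resample counts.
counts = rng.multinomial(n, np.full(n, 1.0 / n), size=B)
means_efron = counts @ x / n

# Bayesian bootstrap: exchangeable Dirichlet(1, ..., 1) weights.
w = rng.dirichlet(np.ones(n), size=B)
means_bayes = w @ x

# Both estimate the sampling distribution of the sample mean.
print("sd(mean), Efron   :", means_efron.std())
print("sd(mean), Bayesian:", means_bayes.std())
print("theory  (s/sqrt n):", x.std(ddof=1) / np.sqrt(n))
```

The second-order question the paper answers is which weight distributions make such schemes match the Edgeworth expansion of the true sampling distribution, i.e., agree beyond the leading normal term.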
50. Rastrová analýza pro GIS nástroj ArcGIS
- Author
-
Hrubý, Martin, Šilhavá, Jana, and Hupšil, Radim
- Abstract
This project studies the ArcGIS geographic information system, focusing on the possibilities of extending it with custom add-ons and on how such extensions are programmed. Some basic raster analysis tools are also explained. The project's main objective is to design and implement a custom ArcGIS extension that provides a set of raster analysis tools; the design is inspired by an existing extension, Spatial Analyst, developed by ESRI.
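A typical tool in such a raster-analysis set is focal statistics, a moving-window summary over the grid. A minimal sketch of a focal mean with SciPy, independent of ArcGIS (the extension itself is written against the ArcGIS object model, which is not shown here):

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Synthetic elevation raster standing in for an ArcGIS grid layer.
rng = np.random.default_rng(4)
dem = rng.normal(100.0, 10.0, (200, 200))

# Focal mean over a 5x5 neighbourhood, a classic Spatial Analyst-style tool.
focal_mean = uniform_filter(dem, size=5, mode="nearest")

# A simple derived layer: cells more than 5 m above their neighbourhood.
peaks = dem - focal_mean > 5.0
print("local high points:", int(peaks.sum()))
```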