934 results for "postprocessing"
Search Results
2. Automated Postprocessing of 3D-printed Multi-Material Polymer Parts Through a Robot-Based Modular System and Control Architecture
- Author
- Krischke, Nikolai, Zeidler, Simon, Baranowski, Michael, Mogge, Pia, Ertingshausen, Lea, Kaltenbacher, Mario, Selle, Benedikt, Bauhofer, Hamid, Schabel, Sebastian, and Fleischer, Jürgen
- Published
- 2024
3. Evaluation, Calibration, and Application of Probabilistic 100-m Wind Speed Forecasts Produced by the WRF Ensemble Prediction System in Taiwan.
- Author
- Chou, Shih-Chun, Chang, Hui-Ling, Toth, Zoltan, Lin, Pay-Liam, Hong, Jing-Shan, Wu, Yuan-Kang, Chen, Guan-Ru, Lu, Kuo-Chen, and Cheng, Chia-Ping
- Subjects
- ECONOMIC decision making, RENEWABLE energy sources, METEOROLOGICAL research, WEATHER forecasting, WIND forecasting, TYPHOONS
- Abstract
The use of renewable energy is on the rise; however, it brings more challenges for grid operations and unit scheduling due to the intermittent nature and limitations in predicting renewable power generation. High-quality meteorological forecasts play a critical role in the effective application and management of renewable energy resources. This study evaluates the quality and economic benefits of probabilistic forecasts for 100-m wind speeds from the Weather Research and Forecasting Model ensemble prediction system (WEPS). A multiple linear regression (MLR) method for calibration is also used to improve forecast quality. Additionally, this study explores the application of probabilistic hub-height wind speed forecasts in conjunction with an economic value analysis to determine whether wind turbines should be shut down during typhoon periods. The findings reveal that the forecast in the central offshore region of Taiwan exhibits the highest reliability and discrimination ability compared to those in the northern and southern regions. Additionally, employing the MLR calibration method significantly improved the reliability and discrimination ability of the probabilistic forecasts compared to the raw forecasts. Furthermore, during the typhoon seasons, almost all users, regardless of their cost–loss ratio, can benefit from basing their decisions on WEPS forecasts compared to using a single deterministic forecast. Importantly, decision-makers can benefit more from probabilistic forecasts than the ensemble mean forecasts when both are derived from the same ensemble prediction system, since the former incorporates forecast uncertainty. Significance Statement: High-quality meteorological forecasts significantly influence the efficient utilization and management of renewable energy resources. 
Therefore, this study uses a simple yet effective approach to correct systematic bias in probabilistic wind speed forecasts over Taiwan, aiming to provide decision-makers with more reliable forecasts for hub-height wind speeds. Given that strong winds during typhoon periods pose safety concerns for wind turbine operation, we use an economic value analysis to assist turbine operators in deciding whether to shut down wind turbines. Furthermore, our research demonstrates that ensemble probabilistic forecasts can yield greater economic benefits for decision-makers when compared to the commonly used ensemble mean forecasts. [ABSTRACT FROM AUTHOR]
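The calibration step described above can be illustrated with a toy sketch: a multiple linear regression fitted to synthetic wind speed data. All variable names and numbers here are invented for illustration and are not taken from the study:

```python
import numpy as np

# Synthetic example: calibrate biased raw ensemble-mean 100-m wind
# speed forecasts against observations via multiple linear regression.
rng = np.random.default_rng(0)
obs = rng.uniform(2.0, 18.0, size=200)               # "observed" wind speed (m/s)
raw_mean = obs + rng.normal(1.5, 1.0, size=200)      # raw forecast mean with +1.5 bias
raw_spread = np.abs(rng.normal(1.0, 0.3, size=200))  # ensemble spread predictor

# Fit obs ~ b0 + b1*raw_mean + b2*raw_spread by least squares.
X = np.column_stack([np.ones_like(raw_mean), raw_mean, raw_spread])
coef, *_ = np.linalg.lstsq(X, obs, rcond=None)
calibrated = X @ coef

# With an intercept in the model, the mean residual is driven to zero,
# i.e., the systematic bias is removed.
print(round(float(np.mean(raw_mean - obs)), 2))    # raw bias, near 1.5
print(round(float(np.mean(calibrated - obs)), 2))  # calibrated bias, 0.0
```

The same fitted coefficients would then be applied to independent verification data; here the in-sample fit only demonstrates the bias-removal mechanism.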
- Published
- 2025
4. Laser‐Based Closed‐Loop Controlled Heat Treatment for Residual Stress Relief of Additively Manufactured AlSi10Mg Components.
- Author
- Wenger, Robin, Hegele, Patrick, Hofele, Markus, Neuer, Johannes, and Riegel, Harald
- Subjects
- HEAT treatment, RESIDUAL stresses, DEFORMATIONS (Mechanics), HEATING control, LASER beams
- Abstract
In laser powder bed fusion of metals, the layer-wise, selective melting process deposits energy into small areas compared to the cooler, solid surrounding material, producing small melt pools with extremely short solidification times and steep temperature gradients toward the surrounding material. This causes high thermally induced residual stresses (RS), which can lead to immediate rejection of the component due to critical part deformation and inhomogeneous mechanical properties under load. To prevent this, furnace-based stress-relief heat treatments are commonly applied before cutting the part off the build platform. In this study, a novel closed-loop controlled laser-based heat treatment with temperature feedback from inline pyrometer measurement is investigated, enabling fast and highly efficient postprocess stress relief. To this end, a laser beam is guided by scanner optics in a meandering pattern over the top surface of AlSi10Mg cantilever specimens. The treatment decreases near-surface RS from 158 to 5 N mm⁻², reducing displacement by over 90% at material-affected depths up to 3 mm and area rates of 8 to 163 mm² s⁻¹. [ABSTRACT FROM AUTHOR]
- Published
- 2025
5. Enhancing NWP-Based Reference Evapotranspiration Forecasts: Role of ETo Approaches and Temperature Postprocessing.
- Author
- Saminathan, Sakila and Mitra, Subhasis
- Subjects
- WATER demand management, NUMERICAL weather forecasting, CLIMATIC zones, IRRIGATION water, SOLAR radiation
- Abstract
Reference evapotranspiration (ETo) forecasts are essential for estimating irrigation water demand and agricultural water management. However, studies have not examined numerical weather prediction (NWP)–based ETo forecast enhancement with respect to different ETo approaches and climate zones in the Indian subcontinent. In this study, we use two probabilistic postprocessing techniques (PPT), namely, nonhomogeneous Gaussian regression (NGR) and Bayesian model averaging (BMA), and assess their performance in enhancing NWP-based ETo forecasts at short- to medium-range time scales (1 to 7 days) over different climate zones in the Indian subcontinent. Weather variables from NWP model outputs are used to estimate the ETo forecasts. Two ETo approaches, namely, the Food and Agriculture Organization (FAO) Penman-Monteith (PM) and the temperature-based Hargreaves-Samani (HS) methods, are utilized for ETo estimation. The effectiveness of PPTs in enhancing the ETo forecasts using these approaches is also evaluated. Further, hydrologic forecasting studies have traditionally used postprocessed temperature forecasts for forecasting ETo in hydrologic models. However, the rationale of this approach is debatable. In this study, we also evaluate whether postprocessing of temperature forecasts produces ETo forecast performance comparable to postprocessing of the ETo forecasts themselves. Results revealed that raw ETo forecasts from both NWPs perform poorly, especially in the northern (polar zone) regions. Further, wind speed and solar radiation were found to be the dominant variables contributing to low ETo forecast skill over the region. The forecasts using the HS method were found to be less skillful than the forecasts from the PM approach. Postprocessing results indicate that both PPTs are able to considerably enhance ETo forecast skill across all the climate zones, and the NGR approach outperforms the BMA technique.
The postprocessing was especially able to enhance the skill of forecasts in northern (polar zone) regions where the raw ETo forecast skill was particularly low. The estimation of ETo forecasts using temperature postprocessed ETo forecasts (EToT) revealed that temperature postprocessing does not considerably improve the accuracy of the EToT forecasts. Outcomes of this study have implications for hydrologic forecasting, irrigation water management, and development of irrigation-based decision-making systems in the Indian subcontinent. [ABSTRACT FROM AUTHOR]
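As a rough illustration of the NGR idea used above (a predictive Gaussian whose mean and variance are affine functions of the ensemble mean and variance, fitted by maximum likelihood), here is a minimal sketch on synthetic data; the setup and all numbers are invented, not the study's:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Synthetic ensemble statistics and observations with a known +0.8 bias.
rng = np.random.default_rng(1)
ens_mean = rng.uniform(2.0, 8.0, 300)
ens_var = rng.uniform(0.2, 1.0, 300)
obs = ens_mean + 0.8 + rng.normal(0.0, np.sqrt(ens_var))

def nll(params):
    """Negative log likelihood of N(a + b*ens_mean, c + d*ens_var)."""
    a, b, log_c, log_d = params
    mu = a + b * ens_mean
    var = np.exp(log_c) + np.exp(log_d) * ens_var  # positivity via log params
    return -norm.logpdf(obs, mu, np.sqrt(var)).sum()

res = minimize(nll, x0=[0.0, 1.0, 0.0, 0.0], method="Nelder-Mead")
a, b = res.x[0], res.x[1]
# After fitting, the predictive mean absorbs the systematic bias.
print(round(float(np.mean(a + b * ens_mean - obs)), 2))
```

Operational NGR implementations typically minimize the CRPS rather than the log score, but the structure of the predictive distribution is the same.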
- Published
- 2025
6. Status of Research on Assisted Laser Cladding and Laser Cladding Posttreatment: A Review.
- Author
- Bai, Qiaofeng, Chen, Chao, Li, Qihang, Zhao, Chunjiang, and Zhang, Jian
- Subjects
- ENERGY density, MANUFACTURING defects, LASER deposition, SURFACE coatings, SURFACE preparation, SURFACES (Technology)
- Abstract
Laser cladding technology has become one of the most effective surface modification technologies to date due to its high energy density, small heat-affected zone, low dilution rate, and high cooling rate. However, with increasing demands on products, uneven microstructure distribution, cracks, pores, and other defects in the clad layer seriously limit its application and development; the performance and surface quality of the cladding layer need to be further improved. On this basis, the effects of different energy fields, such as electromagnetic fields, ultrasonic fields, and induction heating, on the microstructure and properties of coatings are outlined, and the mechanisms by which the various auxiliary technologies affect the microstructure, its properties, and its transformation are reviewed. In addition, different posttreatment processes (heat treatment, surface strengthening, etc.) for laser cladding coatings are reviewed, providing a reference for further improvements in the microstructure and properties of the cladding layer and discussing existing problems and future development directions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
7. Fit comparison of interim crowns manufactured with open and proprietary 3D printing modes versus milling technology: An in vitro study.
- Author
- Morón‐Conejo, Belén, Berrendero, Santiago, Bai, Shizhu, Martínez‐Rus, Francisco, and Pradies, Guillermo
- Subjects
- MATERIALS testing, IN vitro studies, CLEANING compounds, DATA analysis, DENTURES, DENTAL materials, KRUSKAL-Wallis Test, DENTAL crowns, DESCRIPTIVE statistics, GUMS & resins, IMMERSION in liquids, COMMERCIAL product evaluation, MANUFACTURING industries, STATISTICS, DENTAL technology, THREE-dimensional printing, COMPARATIVE studies, PROSTHESIS design & construction, EVALUATION
- Abstract
Objectives: This study aimed to assess the fit of interim crowns produced using DLP-based 3D printing with different manufacturing workflows (open and proprietary) versus milling technology. Methods: A total of 120 crowns were evaluated using the replica technique. The control group (Mill, n = 30) was manufactured via subtractive technology. Experimental groups were printed using a DLP printer (SprintRay Pro95). In the proprietary mode (SR100, n = 30), the manufacturer's resin was used with a 100-μm layer thickness (LT) and a splashing cleaning postprocessing step. In the open mode, a validated resin was used: group B100 (n = 30) had a 100-μm LT, and group B50 (n = 30) had a 50-μm LT, followed by postprocessing in an ultrasonic bath with full immersion in isopropyl alcohol. Kruskal-Wallis tests with Bonferroni correction were applied after normality testing (α = 0.05). Results: Group B50 exhibited the best overall fit (123.87 ± 67.42 μm), comparable to the gold-standard Mill group, which demonstrated the lowest marginal fit (p = 0.760). SR100 showed significantly poorer performance compared to Mill, B50, and B100 (p < 0.001). Conclusions: 3D-printed and milled interim crowns generally demonstrated clinically acceptable fit, with the exception of the SR100 group. Postprocessing notably influenced crown fit, with the open mode with total immersion in isopropyl alcohol being superior. Clinical Significance: The present study demonstrates that selecting an optimal manufacturing and postprocessing workflow results in superior fit for interim crowns, enabling dental professionals to evaluate protocols and ensure reliable clinical outcomes in interim crown fabrication. [ABSTRACT FROM AUTHOR]
- Published
- 2024
8. Enhancing the Mechanical Properties of Co-Cr Dental Alloys Fabricated by Laser Powder Bed Fusion: Evaluation of Quenching and Annealing as Heat Treatment Methods.
- Author
- Konieczny, Bartlomiej, Szczesio-Wlodarczyk, Agata, Andrearczyk, Artur, Januszewicz, Bartlomiej, Lipa, Sebastian, Zieliński, Rafał, and Sokolowski, Jerzy
- Subjects
- HEAT treatment, FACE centered cubic structure, RESIDUAL stresses, VICKERS hardness, TENSILE strength, DENTAL metallurgy
- Abstract
Residual stresses and anisotropic structures characterize laser powder bed fusion (L-PBF) products due to rapid thermal changes during fabrication, potentially leading to microcracking and lower strength. Post-heat treatments are crucial for enhancing mechanical properties. Numerous dental technology laboratories worldwide are adopting the new technologies but must invest considerable time and resources to refine them for specific requirements. Our research can assist researchers in identifying thermal processes that enhance the mechanical properties of dental Co-Cr alloys. In this study, high cooling rates (quenching) and annealing after quenching were evaluated for L-PBF Co-Cr dental alloys. Cast samples (standard manufacturing method) were tested as a second reference material. Tensile strength, Vickers hardness, microstructure characterization, and phase identification were performed. Significant differences were found among the L-PBF groups and the cast samples. The lowest tensile strength (707 MPa) and hardness (345 HV) were observed for cast Starbond COS. The highest mechanical properties (1389 MPa, 535 HV) were observed for the samples subjected to the water quenching and reheating methods. XRD analysis revealed that the face-centered cubic (FCC) and hexagonal close-packed (HCP) phases are influenced by the composition and heat treatment. Annealing after quenching improved the microstructure homogeneity and increased the HCP content. L-PBF techniques yielded superior mechanical properties compared to traditional casting methods, offering efficiency and precision. Future research should focus on fatigue properties. [ABSTRACT FROM AUTHOR]
- Published
- 2024
9. A Comparative Study of AI-Based Automated and Manual Postprocessing of Head and Neck CT Angiography: An Independent External Validation with Multi-Vendor and Multi-Center Data.
- Author
- Li, Kunhua, Yang, Yang, Niu, Shengwen, Yang, Yongwei, Tian, Bitong, Huan, Xinyue, and Guo, Dajing
- Subjects
- CRANIAL radiography, SCALE analysis (Psychology), ANEURYSMS, RESEARCH funding, ARTIFICIAL intelligence, BLOOD vessels, COMPUTED tomography, RETROSPECTIVE studies, TERTIARY care, DESCRIPTIVE statistics, MEDICAL records, ACQUISITION of data, RESEARCH methodology, COMPARATIVE studies, RADIATION doses, NECK radiography
- Abstract
Purpose: To externally validate the performance of automated postprocessing (AP) on head and neck CT Angiography (CTA) and compare it with manual postprocessing (MP). Methods: This retrospective study included head and neck CTA-exams of patients from three tertiary hospitals acquired on CT scanners from five manufacturers. AP was performed by CerebralDoc. The image quality was assessed using Likert scales, and the qualitative and quantitative diagnostic performance of arterial stenosis and aneurysm, postprocessing time, and scanning radiation dose were also evaluated. Results: A total of 250 patients were included. Among these, 55 patients exhibited significant stenosis (≥ 50%), and 33 patients had aneurysms, diagnosed using original CTA datasets and corresponding multiplanar reconstructions as the reference. While the scores of the V4 segment and the edge of the M1 segment on volume rendering (VR), as well as the C4 segment on maximum intensity projection (MIP), were significantly lower with AP compared to MP across vendors (all P < 0.05), most scores in AP demonstrated image quality that was either superior to or comparable with that of MP. Furthermore, the diagnostic performance of AP was either superior to or comparable with that of MP. Moreover, AP also exhibited advantages in terms of postprocessing time and radiation dose when compared to MP (P < 0.001). Conclusion: The AP of CerebralDoc presents clear advantages over MP and holds significant clinical value. However, further optimization is required in the image quality of the V4 and M1 segments on VR as well as the C4 segment on MIP. [ABSTRACT FROM AUTHOR]
- Published
- 2024
10. Neural network-based surrogate model in postprocessing of topology optimized structures
- Author
- Persia, Jude Thaddeus, Sung, Myung Kyun, Lee, Soobum, and Burns, Devin E.
- Published
- 2025
11. Leveraging Deterministic Weather Forecasts for In Situ Probabilistic Temperature Predictions via Deep Learning.
- Author
- Landry, David, Charantonis, Anastase, and Monteleoni, Claire
- Subjects
- ARTIFICIAL intelligence, LEAD time (Supply chain management), PREDICTION models, MACHINE learning, SURFACE temperature
- Abstract
We propose a neural network approach to produce probabilistic weather forecasts from a deterministic numerical weather prediction. Our approach is applied to operational surface temperature outputs from the Global Deterministic Prediction System up to 10-day lead times, targeting METAR observations in Canada and the United States. We show how postprocessing performance is improved by training a single model for multiple lead times. Multiple strategies to condition the network for the lead time are studied, including a supplementary predictor and an embedding. The proposed model is evaluated for accuracy, spread, distribution calibration, and its behavior under extremes. The neural network approach decreases the continuous ranked probability score (CRPS) by 15% and has improved distribution calibration compared to a naive probabilistic model based on past forecast errors. Our approach increases the value of a deterministic forecast by adding information about the uncertainty, without incurring the cost of simulating multiple trajectories. It applies to any gridded forecast including the recent machine learning–based weather prediction models. It requires no information regarding forecast spread and can be trained to generate probabilistic predictions from any deterministic forecast. Significance Statement: Weather is difficult to predict a long time in advance because we cannot measure the state of the atmosphere precisely enough. Consequently, it is common practice to run forecasts several times and look at the differences to evaluate how uncertain the prediction is. This process of running ensemble forecasts is expensive and consequently not always feasible. We propose a middle ground where we add uncertainty information to forecasts that were run only once, using artificial intelligence. Our method increases the value of these forecasts by adding information about the uncertainty without incurring the cost of multiple full simulations. [ABSTRACT FROM AUTHOR]
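For context on the headline metric, the CRPS of a Gaussian predictive distribution has a well-known closed form, which is what such models typically optimize or report. A small self-contained sketch (the numeric inputs are illustrative, not the paper's):

```python
import math

def crps_gaussian(y, mu, sigma):
    """Closed-form CRPS of a Gaussian N(mu, sigma^2) against observation y:
    sigma * (z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi)), with z = (y - mu)/sigma."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

# A well-centered forecast scores lower (better) than a biased one:
print(round(crps_gaussian(15.0, 15.0, 1.0), 3))  # → 0.234
print(round(crps_gaussian(15.0, 13.0, 1.0), 3))  # → 1.453
```

Averaging this score over many forecast-observation pairs gives the CRPS values that the 15% improvement above refers to.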
- Published
- 2024
12. Neighborhood Ensemble Copula Coupling: Smoother and Sharper Calibrated Ensembles.
- Author
- Trotta, Belinda
- Abstract
Ensemble copula coupling (ECC; Schefzik et al.) is a widely used method to produce a calibrated ensemble from a calibrated probabilistic forecast. This process improves the statistical accuracy of the ensemble; in other words, the distribution of the calibrated ensemble members at each grid point more closely approximates the true expected distribution. However, the trade-off is that the individual members are often less physically realistic than the original ensemble: there is noisy variation among neighboring grid points, and, depending on the calibration method, extremes in the original ensemble are sometimes muted. We introduce neighborhood ensemble copula coupling (N-ECC), a simple modification of ECC designed to mitigate these problems. We show that, when used with the calibrated forecasts produced by Flowerdew's reliability calibration, N-ECC improves both the visual plausibility and the statistical properties of the forecast. Significance Statement: Numerical weather prediction (NWP) uses physical models of the atmosphere to produce a set of scenarios (called an ensemble) describing possible weather outcomes. These forecasts are used in other models to produce weather forecasts and warnings of extreme events. For example, NWP forecasts of rainfall are used in hydrological models to predict the probability of flooding. However, the raw NWP forecasts require statistical postprocessing to ensure that the range of scenarios they describe accurately represents the true range of possible outcomes. This paper introduces a new method of processing NWP forecasts to produce physically realistic, well-calibrated ensembles. [ABSTRACT FROM AUTHOR]
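The basic ECC reordering step that N-ECC builds on is compact: calibrated quantiles are rearranged so they inherit the rank structure (and hence the spatial coherence) of the raw ensemble at each grid point. A toy sketch with invented sizes and synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_members, n_points = 5, 4
raw = rng.normal(20.0, 3.0, size=(n_members, n_points))  # raw ensemble
# Calibrated sample at each point, sorted so row i is the i-th quantile.
calibrated = np.sort(rng.normal(21.0, 2.0, size=(n_members, n_points)), axis=0)

# ranks[i, j] = rank of raw member i among all members at point j.
ranks = raw.argsort(axis=0).argsort(axis=0)
# Each member receives the calibrated quantile matching its raw rank.
ecc = np.take_along_axis(calibrated, ranks, axis=0)

# Per point, ECC is a permutation of the calibrated values...
assert np.array_equal(np.sort(ecc, axis=0), calibrated)
# ...ordered exactly like the raw ensemble.
assert np.array_equal(ecc.argsort(axis=0), raw.argsort(axis=0))
```

The neighborhood variant introduced in the paper modifies this template; the sketch shows only the standard ECC baseline it starts from.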
- Published
- 2024
13. An Overview of Postprocessing in Quantum Key Distribution.
- Author
- Luo, Yi, Cheng, Xi, Mao, Hao-Kun, and Li, Qiong
- Subjects
- QUANTUM mechanics, PARAMETER estimation, SECURITY systems, PRIVACY, PHOTONS
- Abstract
Quantum key distribution (QKD) technology is a frontier in the field of secure communication, leveraging the principles of quantum mechanics to offer information-theoretically secure keys. Postprocessing is an important part of a whole QKD system because it directly impacts the secure key rate and the security of the system. In particular, with the fast increase in the photon transmission frequency in a QKD system, the processing speed of postprocessing becomes an essential issue. Our study embarks on a comprehensive review of the development of postprocessing of QKD, including five subprotocols, namely, parameter estimation, sifting, information reconciliation, privacy amplification, and channel authentication. Furthermore, we emphasize the issues raised in the implementation of these subprotocols under practical scenarios, such as limited computation or storage resources and fluctuations in channel environments. Based on the composable security theory, we demonstrate how enhancements in each subprotocol influence the secure key rate and security parameters, which can provide meaningful insights for future advancements in QKD. [ABSTRACT FROM AUTHOR]
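To give a flavor of one of the subprotocols, privacy amplification is commonly realized with 2-universal hashing, for example multiplication by a random binary Toeplitz matrix. The sketch below uses invented toy sizes and is not tied to any particular QKD implementation:

```python
import numpy as np

# Toy privacy amplification: compress a partially secret bitstring with
# a random binary Toeplitz matrix (a 2-universal hash family).
rng = np.random.default_rng(3)
n, m = 16, 8                        # sifted key length -> final key length
key = rng.integers(0, 2, size=n)    # reconciled (partially secret) key bits
first_col = rng.integers(0, 2, size=m)
first_row = rng.integers(0, 2, size=n)

# Build the m x n Toeplitz matrix from its first column and first row:
# T[i, j] depends only on i - j.
T = np.empty((m, n), dtype=int)
for i in range(m):
    for j in range(n):
        T[i, j] = first_col[i - j] if i >= j else first_row[j - i]

final_key = T @ key % 2             # hashed, shorter, more secret key
print(final_key.shape)              # → (8,)
```

In practice the compression ratio m/n is chosen from the estimated information leakage, and the Toeplitz structure allows FFT-based hashing at high throughput; both aspects are omitted here.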
- Published
- 2024
14. New quadratic/serendipity finite volume element solutions on arbitrary triangular/quadrilateral meshes.
- Author
- Zhou, Yanhui
- Subjects
- QUADRILATERALS, SERENDIPITY, HEAT equation, LINEAR systems
- Abstract
By postprocessing quadratic and eight-node serendipity finite element solutions on arbitrary triangular and quadrilateral meshes, we obtain new quadratic/serendipity finite volume element solutions for solving anisotropic diffusion equations. The postprocessing procedure is implemented in each element independently, and we only need to solve a 6-by-6 (resp. 8-by-8) local linear algebraic system for each triangular (resp. quadrilateral) element. The novelty of this paper is that, by designing some new quadratic dual meshes and adding six/eight specially constructed element-wise bubble functions to the quadratic/serendipity finite element solutions, we prove that the postprocessed solutions satisfy a local conservation property on the new dual meshes. In particular, for any full anisotropic diffusion tensor and arbitrary triangular and quadrilateral meshes, we present a general framework to prove the existence and uniqueness of the new quadratic/serendipity finite volume element solutions, which improves on existing theoretical results; in particular, we extend the traditional rectangular assumption to arbitrary convex quadrilateral meshes. As a byproduct, we also prove that the new solutions converge to the exact solution with optimal convergence rates under the $H^1$ and $L^2$ norms on primal arbitrary triangular/quasi-parallelogram meshes. Finally, several numerical examples are carried out to validate the theoretical findings. [ABSTRACT FROM AUTHOR]
- Published
- 2024
15. Probabilistic Neural Networks for Ensemble Postprocessing.
- Author
- Liu, Pu, Dabernig, Markus, Atencia, Aitor, Wang, Yong, and Zhao, Yuchu
- Subjects
- ARTIFICIAL neural networks, ARTIFICIAL intelligence, NUMERICAL weather forecasting
- Abstract
Accurate temperature forecasts are critical for various industries and sectors. We propose a probabilistic neural network (PNN), an extension of the distributional regression network (DRN), for 2-m temperature forecasts, consisting of three variants with different inputs and target variables. The first variant, standardized anomaly probabilistic neural network (SAPNN), employs a two-step approach involving standardized anomalies and global PNN modeling to effectively capture underlying features and anomalies. The second variant, PNN with geographical predictors (PNNGE), incorporates raw and static geographical predictors to enhance predictive performance. The third variant, PNN with station one-hot encoding (PNNEN), utilizes raw with station one-hot encoding predictors to represent geographical information effectively. We compare three PNN variants with two benchmarks: 1) standardized anomaly model output statistics (SAMOS) and 2) three DRN variants identical to those applied to PNN. These evaluations utilize ECMWF data from 2019 to 2020 at 6-h intervals up to 72 h over Hebei, China. Results show that SAPNN and PNNGE are better than SAMOS, while PNNEN notably exhibits a significant 14% improvement in the continuous ranked probability skill score (CRPSS). Moreover, the PNN variants exhibit comparable or superior performance to DRN regarding forecast accuracy, CRPSS, and reliability, showcasing a better-calibrated spread–error relationship. This study highlights the value of the proposed PNN variants with a distribution output in capturing nonlinear relationships within different sources of predictors and improving temperature forecast skills. Significance Statement: This study aims to improve the accuracy, skills, and reliability of 2-m temperature forecasts, which are crucial in agriculture and energy management. 
To achieve this, we extend a popular artificial intelligence framework and explore four data sources with two schemes to systematically compare the predictive performances in making temperature forecasts. The findings of this research are vital as they offer novel ways to improve forecast skills. Imagine having a weather app that is significantly more accurate, enabling you to plan your day better. This study is about discovering innovative approaches to enhance forecast skills and reliability, which could benefit various aspects of our daily lives. One of the new methods even exhibits a 14% improvement in forecast skills. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Chapter Three: Advanced techniques for additive manufacturing of functional microdevices.
- Author
- Bernasconi, Roberto
- Abstract
In the context of Industry 4.0, which involves digitalization and smart production, additive manufacturing (AM) is playing a fundamental role in transforming the standard production routes of many goods. Thanks to its peculiar features, AM is highly effective at producing customized and complex products in a sustainable and efficient way. Consequently, it is being employed in several application fields such as biomedical engineering, aerospace, defense, and automotive. In some of its many variants, AM can also work at the microscale, allowing the manufacturing of miniaturized devices like sensors, microfluidic structures, electronic circuits, energy storage setups, or microrobots. For these applications in particular, and for all possible applications in general, the already remarkable features of AM can be further expanded by applying complementary techniques to the standard, well-established technologies. By doing this, it is possible to overcome the limitations of the most common AM techniques, boosting their applicability. The present chapter discusses some of the most relevant advanced techniques applicable to standard AM, analyzes how they enhance applicability, and presents some scientifically relevant applications. Such advanced techniques fall into three macro categories: approaches that use the same AM technology with many different materials (multimaterial AM), approaches that couple more than one AM technology with different materials (hybrid AM), and approaches that alter the properties of parts printed with standard AM techniques (postprocessing). The chapter also includes some final critical considerations and future outlooks on the techniques described. [ABSTRACT FROM AUTHOR]
- Published
- 2024
17. Strapdown Airborne Gravimetry Based on Aircrafts and UAVs: Postprocessing Algorithms and New Results
- Author
- Vyazmin, Vadim S., Golovan, Andrey A., Freymueller, Jeffrey T., Series Editor, and Sánchez, Laura, Assistant Editor
- Published
- 2024
18. Influence of Postprocessing on Microstructural and Tribological Behavior of HVOF-Sprayed HEA Coating
- Author
- Abhijith, N. V., Kumar, Deepak, Chaari, Fakher, Series Editor, Gherardini, Francesco, Series Editor, Ivanov, Vitalii, Series Editor, Haddar, Mohamed, Series Editor, Cavas-Martínez, Francisco, Editorial Board Member, di Mare, Francesca, Editorial Board Member, Kwon, Young W., Editorial Board Member, Trojanowska, Justyna, Editorial Board Member, Xu, Jinyang, Editorial Board Member, Sinha, Sujeet Kumar, editor, Kumar, Deepak, editor, Gosvami, Nitya Nand, editor, and Nalam, Prathima, editor
- Published
- 2024
19. Exploring the Impact of the NULL Class on In-the-Wild Human Activity Recognition.
- Author
- Cherian, Josh, Ray, Samantha, Taele, Paul, Koh, Jung In, and Hammond, Tracy
- Subjects
- HUMAN activity recognition, MACHINE learning, ACTIVITIES of daily living, HAND washing, COMBS, BASIC needs
- Abstract
Monitoring activities of daily living (ADLs) plays an important role in measuring and responding to a person's ability to manage their basic physical needs. Effective recognition systems for monitoring ADLs must successfully recognize naturalistic activities that also realistically occur at infrequent intervals. However, existing systems primarily focus on either recognizing more separable, controlled activity types or are trained on balanced datasets where activities occur more frequently. In our work, we investigate the challenges associated with applying machine learning to an imbalanced dataset collected from a fully in-the-wild environment. This analysis shows that the combination of preprocessing techniques to increase recall and postprocessing techniques to increase precision can result in more desirable models for tasks such as ADL monitoring. In a user-independent evaluation using in-the-wild data, these techniques resulted in a model that achieved an event-based F1-score of over 0.9 for brushing teeth, combing hair, walking, and washing hands. This work tackles fundamental challenges in machine learning that will need to be addressed in order for these systems to be deployed and reliably work in the real world. [ABSTRACT FROM AUTHOR]
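A common postprocessing step of the kind mentioned (raising precision at some cost in recall) is to relabel very short predicted activity runs as the NULL class. The following is a hypothetical sketch of that idea, not the authors' exact pipeline; the function name and parameters are invented:

```python
def suppress_short_events(labels, min_len=3, null=0):
    """Postprocessing sketch: relabel predicted activity runs shorter than
    min_len consecutive windows as the NULL class (0), removing spurious
    blips and thereby trading some recall for precision."""
    out = list(labels)
    i, n = 0, len(out)
    while i < n:
        j = i
        while j < n and out[j] == out[i]:  # find the end of the current run
            j += 1
        if out[i] != null and (j - i) < min_len:
            out[i:j] = [null] * (j - i)    # too short: treat as NULL
        i = j
    return out

# The isolated '1' and the two-window '3' run are suppressed;
# the three-window '2' run survives.
print(suppress_short_events([0, 1, 0, 2, 2, 2, 0, 3, 3]))
# → [0, 0, 0, 2, 2, 2, 0, 0, 0]
```

In an event-based F1 evaluation such as the one described, removing these short spurious runs directly reduces false-positive events.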
- Published
- 2024
20. A Retrospective Hydrological Uncertainty Analysis Using Precipitation Estimation Ensembles for a Poorly Gauged Basin in High Mountain Asia.
- Author
-
Reggiani, Paolo and Boyko, Oleksiy
- Abstract
We study the impact of uncertain precipitation estimates on simulated streamflows for the poorly gauged Yarlung Tsangpo basin (YTB), high mountain Asia (HMA). A process-based hydrological model at 0.5-km resolution is driven by an ensemble of precipitation estimation products (PEPs), including analyzed ground observations, high-resolution precipitation estimates, climate data records, and reanalyses over the 2008–15 control period. The model is then forced retrospectively from 1983 onward to obtain seamless discharge estimates till 2007, a period for which there is very sparse flow data coverage. Whereas temperature forcing is considered deterministic, precipitation is sampled from the predictive distribution, which is obtained through processing PEPs by means of a probabilistic processor of uncertainty. The employed Bayesian processor combines the PEPs and outputs the predictive densities of daily precipitation depth accumulation as well as the probability of precipitation occurrence, from which random precipitation fields for probabilistic model forcing are sampled. The predictive density of precipitation is conditional on the precipitation estimation predictors that are bias corrected and variance adjusted. For the selected HMA study site, discharges simulated from reanalysis and climate data records score lowest against observations at three flow gauging points, whereas high-resolution satellite estimates perform better, but are still outperformed by precipitation fields obtained from analyzed observed precipitation and merged products, which were corrected against ground observations. The applied methodology indicates how missing flows for poorly gauged sites can be retrieved and is further extendable to hydrological projections of climate. 
Significance Statement: We show how to use different precipitation estimates, like computer simulations of weather or satellite observations, in conjunction with all available ground measurements in regions with generally poor meteorological and flow measurement infrastructure. We demonstrate how it is possible to retrieve past unobserved river flows using these estimates in combination with a hydrological computer model for streamflow simulations. The method can help us to better understand the hydrology of poorly gauged regions that play an important role in the distribution of water resources and can be affected by future changes. We applied the method to a large transboundary river basin in China. This basin holds water needed by large, densely populated regions of India that may become water constrained by warmer climate. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. On the Post-Processing of Complex Additive Manufactured Metallic Parts: A Review.
- Author
-
Pourrahimi, Shamim and Hof, Lucas A.
- Subjects
SURFACE finishing ,SURFACE roughness ,ROUGH surfaces ,GRINDING & polishing ,ELECTROCHEMICAL cutting - Abstract
Additive manufacturing (AM) is gaining more attention due to its capability to produce customized and complex geometries. However, one significant drawback of AM is the rough surface finish of the as-built parts, necessitating post-processing to achieve the desired surface quality that meets application requirements. Post-processing of complex geometries, such as parts with internal holes, lattice structures, and free-form surfaces, poses unique challenges compared to other components. This review classifies various post-processing methods employed for complex AM parts, presenting the experimental conditions for each treatment alongside the resulting improvement in surface roughness as a success criterion. The post-processing methods are categorized into four groups: electrochemical polishing (ECP), chemical polishing (CP), mechanical polishing, and hybrid methods. Notably, mechanical methods exhibit the highest roughness improvement at 69.9%, followed by ECP (59.9%), CP (49.5%), and hybrid methods (47.4%). Nevertheless, mechanical post-processing techniques are less frequently utilized for lattice parts, making chemical or electrochemical methods more promising alternatives. In summary, all four categories of post-processing methods can improve the internal surface quality of AM holes. While mechanical methods offer the most substantial roughness improvement overall, chemical and electrochemical methods show particular potential for addressing the challenges associated with complex geometries. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. Digital postprocessing analysis of prostatic perfusion in neutered dogs.
- Author
-
Spada, Stefano, Arlt, Sebastian, De Felice, Daniela, England, Gary C. W., and Russo, Marco
- Abstract
B‐mode ultrasound is routinely performed to evaluate the prostate gland in neutered dogs, although the detection of malignancies may be challenging. Contrast‐enhanced ultrasound (CEUS) has been shown to be useful for the assessment of prostatic perfusion in normal and diseased dogs, although the interpretation of contrast ultrasonographic features may still be subjective. A quantitative tool for evaluating prostatic perfusion might improve the reliability of the results in terms of early detection of prostate neoplasia in neutered dogs. The present study aimed to evaluate the applicability of a postprocessing analysis tool to CEUS of the prostate in healthy neutered dogs, to provide quantitative measurements, and to study the influence of individual characteristics on prostatic regression. Twenty‐three neutered dogs underwent B‐mode and CEUS examination of the prostate to acquire data about prostatic morphology and microcirculation. The prostate was imaged using a 5–7.5 MHz linear transducer and contrast was administered intravenously. Videoclips were analyzed using Qontrast software and a postprocessing digital analysis tool (ImageJ) to measure perfusion peak intensity, time to peak, and vascularization ratio at the moment of the peak, which were then related to body weight, age, and time elapsed since orchiectomy. Correlation tests revealed higher vascularization in younger compared with older dogs (P <.05) and in smaller compared with larger dogs (P <.05). Time elapsed since orchiectomy did not affect prostatic perfusion (P >.05). Contrast‐enhanced ultrasound and the postprocessing analysis tool ImageJ allowed analysis of vascular perfusion in all dogs and have the potential to improve the diagnostic possibilities for andrological examination. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Processing of High Interstitial Austenitic Steel with Powder Bed Fusion‐Laser Beam/Metal: Evolution of Chemical Inhomogeneity and Microstructural Features during Postprocessing.
- Author
-
Kimm, Janis, Hanke, Stefanie, Weber, Sebastian, and Lentz, Jonathan
- Subjects
ATOM-probe tomography ,ISOSTATIC pressing ,RECRYSTALLIZATION (Metallurgy) ,TRANSMISSION electron microscopy ,AUSTENITIC steel ,HOT pressing ,METAL powders ,POWDERS - Abstract
Powder bed fusion‐laser beam/metal (PBF‐LB/M) additive manufacturing provides a high potential to overcome the poor machinability of nickel‐free high interstitial alloy austenitic (HIA) steels. Therefore, this study focuses on the PBF‐LB/M processability of HIA X40MnCrMoN21‐18‐2 and the effect of postprocessing on microstructure and chemical homogeneity. Samples are fabricated on a laboratory and industrial PBF‐LB/M machine and subsequently postprocessed by conventional solution annealing or hot isostatic pressing (HIP). The influence of the processing steps on the microstructure and on the chemical composition is evaluated by scanning electron microscopy, X‐ray diffraction, transmission electron microscopy, atom probe tomography, and electron backscatter diffraction. The commercially available HIA powder exerts good processability, both by optimized and predefined PBF‐LB/M parameters. Loss of Mn and N is detected after PBF‐LB/M processing. Chemical homogenization but no further change in composition occurs during postprocessing. The as‐built microstructure shows segregation of elements (N, Mo, Cr, Mn) in intercellular spaces. A thermodynamic calculation confirms that N approaches a para‐equilibrium state in the PBF‐LB/M as‐built condition, while C does not. Porosity can be reduced by thermomechanical posttreatment with HIP. At the same time, HIP partially recrystallizes the microstructure, while (Mn + Cr)2SiO4 type oxides delay recovery and recrystallization of the microstructure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Two-headed UNetEfficientNets for parallel execution of segmentation and classification of brain tumors: incorporating postprocessing techniques with connected component labelling.
- Author
-
Rai, Hari Mohan, Yoo, Joon, and Dashkevych, Serhii
- Abstract
Purpose: The purpose of this study is to develop accurate and automated detection and segmentation methods for brain tumors, given their significant fatality rates, with aggressive malignant tumors like Glioblastoma Multiforme (GBM) having a five-year survival rate as low as 5 to 10%. This underscores the urgent need to improve diagnosis and treatment outcomes through innovative approaches in medical imaging and deep learning techniques. Methods: In this work, we propose a novel approach utilizing the two-headed UNetEfficientNets model for simultaneous segmentation and classification of brain tumors from Magnetic Resonance Imaging (MRI) images. The model combines the strengths of EfficientNets and a modified two-headed Unet model. We utilized a publicly available dataset consisting of 3064 brain MR images classified into three tumor classes: Meningioma, Glioma, and Pituitary. To enhance the training process, we performed 12 types of data augmentation on the training dataset. We evaluated the methodology using six deep learning models, ranging from UNetEfficientNet-B0 to UNetEfficientNet-B5, optimizing the segmentation and classification heads using binary cross entropy (BCE) loss with Dice and BCE with focal loss, respectively. Post-processing techniques such as connected component labeling (CCL) and ensemble models were applied to improve segmentation outcomes. Results: The proposed UNetEfficientNet-B4 model achieved outstanding results, with an accuracy of 99.4% after postprocessing. Additionally, it obtained high scores for DICE (94.03%), precision (98.67%), and recall (99.00%) after post-processing. The ensemble technique further improved segmentation performance, with a global DICE score of 95.70% and Jaccard index of 91.20%. Conclusion: Our study demonstrates the high efficiency and accuracy of the proposed UNetEfficientNet-B4 model in the automatic and parallel detection and segmentation of brain tumors from MRI images. 
This approach holds promise for improving diagnosis and treatment planning for patients with brain tumors, potentially leading to better outcomes and prognosis. [ABSTRACT FROM AUTHOR]
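The connected component labelling (CCL) cleanup mentioned in this abstract can be sketched as follows: label the connected regions of a binary segmentation mask and keep only the largest one, suppressing small spurious detections. This is a minimal illustration using `scipy.ndimage`, not the authors' implementation; the function name is an assumption.

```python
import numpy as np
from scipy import ndimage

def keep_largest_component(mask: np.ndarray) -> np.ndarray:
    """Retain only the largest connected component of a binary mask,
    discarding small spurious detections (a common CCL postprocessing step)."""
    labeled, num = ndimage.label(mask)
    if num == 0:
        return mask
    # Pixel count of each component; labels run from 1 to num (0 = background).
    sizes = ndimage.sum(mask, labeled, index=range(1, num + 1))
    largest = int(np.argmax(sizes)) + 1
    return (labeled == largest).astype(mask.dtype)
```

In practice such a filter is applied to each predicted tumor mask before computing Dice scores, so that isolated false-positive pixels do not count against the segmentation.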
- Published
- 2024
- Full Text
- View/download PDF
25. Operational Aviation Icing Forecast Algorithm for the Korea Meteorological Administration.
- Author
-
Kim, Eun-Tae, Kim, Jung-Hoon, Kim, Soo-Hyun, and Morcrette, Cyril
- Subjects
- *
AIR travel , *NUMERICAL weather forecasting , *SUPERCOOLED liquids , *RESEARCH aircraft , *ALGORITHMS , *FORECASTING - Abstract
In this study, we developed and evaluated the Korean Forecast Icing Potential (K-FIP), an in-flight icing forecast system for the Korea Meteorological Administration (KMA) based on the simplified forecast icing potential (SFIP) algorithm. The SFIP is an algorithm used to postprocess numerical weather prediction (NWP) model forecasts for predicting potential areas of icing based on the fuzzy logic formulations of four membership functions: temperature, relative humidity, vertical velocity, and cloud liquid water content. In this study, we optimized the original version of the SFIP for the global NWP model of the KMA through three important updates using 34 months of pilot reports for icing as follows: using total cloud condensates, reconstructing membership functions, and determining the best weight combination for input variables. The use of all cloud condensates and the reconstruction of these membership functions resulted in a significant improvement in the algorithm compared with the original. The weight combinations for the KMA's global model were determined based on the performance scores. While several sets of weights performed equally well, this process identified the most effective weight combination for the KMA model, which is referred to as the K-FIP. The K-FIP demonstrated the ability to successfully predict icing over the Korean Peninsula using observations made by research aircraft from the National Institute of Meteorological Sciences of the KMA. Eventually, the K-FIP icing forecasts will provide better forecasts of icing potentials for safe and efficient aviation operations in South Korea. Significance Statement: In-flight aircraft icing has posed a threat to safe flights for decades. With advances in computing resources and an improvement in the spatiotemporal resolutions of numerical weather prediction (NWP) models, icing algorithms have been developed using NWP model outputs associated with supercooled liquid water. 
This study evaluated and optimized the simplified forecast icing potential, an NWP model–based icing algorithm, for the global model of the Korean Meteorological Administration (KMA) using a long-term observational dataset to improve its prediction skills. The improvements shown in this study and the SFIP implemented in the KMA will provide more informative predictions for safe and efficient air travel. [ABSTRACT FROM AUTHOR]
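The SFIP-style combination described above blends fuzzy membership functions of temperature, humidity, vertical velocity, and cloud condensate into a weighted icing potential. A minimal sketch of that fuzzy-logic blend follows; the triangular membership shapes and the weights here are illustrative assumptions, not the calibrated K-FIP values.

```python
def tri(x, lo, peak, hi):
    """Triangular membership function mapping x to [0, 1]."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def icing_potential(temp_c, rel_hum, w_ms, clw_gkg,
                    weights=(0.3, 0.3, 0.1, 0.3)):
    """Weighted fuzzy-logic blend of four memberships, returned in [0, 1].
    Membership shapes and weights are illustrative, not the K-FIP calibration."""
    m = (
        tri(temp_c, -25.0, -10.0, 0.0),    # supercooled temperature range
        tri(rel_hum, 50.0, 100.0, 101.0),  # moist layers
        tri(w_ms, -0.5, 0.5, 2.0),         # weak-to-moderate ascent
        tri(clw_gkg, 0.0, 0.2, 1.0),       # cloud liquid water content
    )
    return sum(w * mi for w, mi in zip(weights, m)) / sum(weights)
```

The weight-tuning step in the study amounts to searching over `weights` combinations and scoring each against icing pilot reports.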
- Published
- 2024
- Full Text
- View/download PDF
26. Comparison of Clustering Approaches in a Multimodel Ensemble for U.S. East Coast Cold Season Extratropical Cyclones.
- Author
-
Kiel, Benjamin M. and Colle, Brian A.
- Subjects
- *
CYCLONES , *SEA level , *NUMERICAL weather forecasting , *LEAD time (Supply chain management) , *HIERARCHICAL clustering (Cluster analysis) - Abstract
Several clustering approaches are evaluated for 1–9-day forecasts using a multimodel ensemble that includes the GEFS, ECMWF, and Canadian ensembles. Six clustering algorithms and three clustering spaces are evaluated using mean sea level pressure (MSLP) and 12-h accumulated precipitation (APCP) for cool-season extratropical cyclones across the Northeast United States. Using the MSLP cluster membership to obtain the APCP clusters is also evaluated, along with applying clustering determined at one lead time to cluster forecasts at a different lead time. Five scenarios from each clustering algorithm are evaluated using displacement and intensity/amount errors from the scenario nearest to the MSLP and 12-h APCP analyses in the NCEP GFS and ERA5, respectively. Most clustering strategies yield similar improvements over the full ensemble mean and are similar in probabilistic skill except that 1) intensity displacement space gives lower MSLP displacement and intensity errors; and 2) Euclidean space and agglomerative hierarchical clustering, when using either full or average linkage, struggle to produce reasonably sized clusters. Applying clusters derived from MSLP to 12-h APCP forecasts is not as skillful as clustering by 12-h APCP directly, especially if several members contain little precipitation. Use of the same cluster membership for one lead time to cluster the forecast at another lead time is less skillful than clustering independently at each forecast lead time. Finally, the number of members within each cluster does not necessarily correspond with the best forecast, especially at the longer lead times, when the probability of the smallest cluster being the best scenario was usually underestimated. Significance Statement: Numerical weather prediction ensembles are widely used, but more postprocessing tools are necessary to help forecasters interpret and communicate the possible outcomes. 
This study evaluates various clustering approaches, combining a large number of model forecasts with similar attributes together into a small number of scenarios. The 1–9-day forecasts of both sea level pressure and 12-h precipitation are used to evaluate the clustering approaches for a large number of U.S. East Coast winter cyclones, which is an important forecast problem for this region. [ABSTRACT FROM AUTHOR]
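The scenario-clustering idea above can be sketched as: flatten each ensemble member's field (e.g., MSLP) to a vector and group similar members with agglomerative hierarchical clustering. Ward linkage in Euclidean space is used here as one reasonable choice among the algorithms, linkages, and clustering spaces the study compares; the function name is an assumption.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def cluster_members(fields, n_scenarios=5):
    """Group ensemble members into forecast scenarios.

    fields: (n_members, ny, nx) array holding one field (e.g. MSLP) per member.
    Returns (labels, scenario_means), where labels run from 1 to n_scenarios
    and scenario_means[k] is the mean field of cluster k+1."""
    n_members = fields.shape[0]
    X = fields.reshape(n_members, -1)            # Euclidean clustering space
    Z = linkage(X, method="ward")                # agglomerative hierarchy
    labels = fcluster(Z, t=n_scenarios, criterion="maxclust")
    means = np.stack([X[labels == k].mean(axis=0).reshape(fields.shape[1:])
                      for k in range(1, n_scenarios + 1)])
    return labels, means
```

Each scenario mean then serves as one of the handful of forecast outcomes presented to forecasters, in place of the full ensemble.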
- Published
- 2024
- Full Text
- View/download PDF
27. Deep Learning for Postprocessing Global Probabilistic Forecasts on Subseasonal Time Scales.
- Author
-
Horat, Nina and Lerch, Sebastian
- Subjects
- *
DEEP learning , *NUMERICAL weather forecasting , *CONVOLUTIONAL neural networks , *GLOBAL method of teaching , *FORECASTING , *WEATHER forecasting - Abstract
Subseasonal weather forecasts are becoming increasingly important for a range of socioeconomic activities. However, the predictive ability of physical weather models is very limited on these time scales. We propose four postprocessing methods based on convolutional neural networks to improve subseasonal forecasts by correcting systematic errors of numerical weather prediction models. Our postprocessing models operate directly on spatial input fields and are therefore able to retain spatial relationships and to generate spatially homogeneous predictions. They produce global probabilistic tercile forecasts for biweekly aggregates of temperature and precipitation for weeks 3–4 and 5–6. In a case study based on a public forecasting challenge organized by the World Meteorological Organization, our postprocessing models outperform the bias-corrected forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF), and achieve improvements over climatological forecasts for all considered variables and lead times. We compare several model architectures and training modes and demonstrate that all approaches lead to skillful and well-calibrated probabilistic forecasts. The good calibration of the postprocessed forecasts emphasizes that our postprocessing models reliably quantify the forecast uncertainty based on deterministic input information in the form of ECMWF ensemble mean forecast fields only. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Bias Correction of Tropical Cyclone Intensity for Ensemble Forecasts Using the XGBoost Method.
- Author
-
Feng, Songjiang, Tan, Yan, Kang, Junfeng, Zhong, Quanjia, Li, Yanjie, and Ding, Ruiqiang
- Subjects
- *
TROPICAL cyclones , *WIND speed , *SEA level , *TYPHOONS , *BOOSTING algorithms , *FORECASTING - Abstract
In this study, the extreme gradient boosting (XGBoost) algorithm is used to correct tropical cyclone (TC) intensity in ensemble forecast data from the Typhoon Ensemble Data Assimilation and Prediction System (TEDAPS) at the Shanghai Typhoon Institute (STI), China Meteorological Administration (CMA). Results show that the forecast accuracy of TC intensity may be improved substantially using the XGBoost algorithm, especially when compared with a simple ensemble average of all members in the ensemble forecast [as depicted by the ensemble average (EnsAve) algorithm in this study]. The forecast errors for maximum wind speed (MWS) and minimum sea level pressure (MSLP) have been reduced by a significant margin, ranging from 6.3% to 18.4% for MWS and from 4% to 14.9% for MSLP, respectively. The performance of the XGBoost algorithm is overall better than that of the EnsAve algorithm, although there are a few samples when it is worse. The bias analysis shows that TEDAPS underpredicts the MWS and overpredicts the MSLP, meaning that the TEDAPS underestimates TC intensity. However, the XGBoost algorithm can reduce the bias to improve the forecast accuracy of TC intensity. Specifically, it achieves a reduction of over 20% in forecast errors for both the MWS and MSLP of typhoons compared to the EnsAve algorithm, indicating the XGBoost algorithm's particular advantage in forecasting intense TCs. These results indicate that the TC intensity forecast can be substantially improved using the XGBoost algorithm, relative to the EnsAve algorithm. [ABSTRACT FROM AUTHOR]
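The correction idea can be illustrated on synthetic data: learn a mapping from ensemble-derived features to the observed intensity, then compare against the plain ensemble average. Scikit-learn's `GradientBoostingRegressor` is used here as a stand-in for XGBoost (the same family of boosting algorithm); the data, feature choices, and bias magnitudes are illustrative assumptions, not the TEDAPS setup.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Synthetic stand-in: 500 forecast cases, 20 ensemble members of max wind speed (kt).
truth = rng.uniform(40.0, 120.0, size=500)
# Members systematically underpredict stronger storms (the low bias noted above).
members = (truth[:, None]
           - 0.25 * (truth[:, None] - 40.0)
           + rng.normal(0.0, 8.0, size=(500, 20)))

ens_mean = members.mean(axis=1)                      # the "EnsAve" baseline
X = np.column_stack([ens_mean, members.std(axis=1),
                     members.min(axis=1), members.max(axis=1)])

train, test = slice(0, 400), slice(400, 500)
model = GradientBoostingRegressor(random_state=0).fit(X[train], truth[train])
corrected = model.predict(X[test])

mae_raw = np.abs(ens_mean[test] - truth[test]).mean()   # ensemble-average error
mae_fix = np.abs(corrected - truth[test]).mean()        # corrected error
```

On this synthetic systematic bias the learned correction lowers the mean absolute error relative to the ensemble average, mirroring the study's gains over the EnsAve baseline.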
- Published
- 2024
- Full Text
- View/download PDF
29. The role of controlled voids shape on the flexural properties of 3D printed food: an approach for tailoring their mechanical properties.
- Author
-
Maldonado-Rosas, Rubén, Pérez-Castillo, José Luis, Cuan-Urquizo, Enrique, and Tejada-Ortigoza, Viridiana
- Subjects
- *
FINITE element method , *TRIANGLES , *FOOD consumption , *IMAGE analysis , *BEND testing - Abstract
The study of the mechanical properties of 3D printed food is crucial for food personalisation. Texture impacts the sensory experience during oral food consumption. This work aimed to investigate the mechanical properties of post-processed printed food samples with different porosity topologies (triangle-shaped/square-shaped). Image analysis showed changes in the structure before and after post-processing. The mechanical properties, characterised via 3-point bending tests, revealed that triangle-shaped topologies presented lower strength (20% less) and lower flexural stiffness (-20%) when compared to the square-shaped topologies. A qualitative comparison of the porosity topologies and their role in the stiffness of structures under tensile and bending loads was performed via finite element models. The flexural stiffness varied between the triangle-shaped designs (13-35%) but remained almost constant for square-shaped designs (40-43%). The results presented in this work showed that the mechanical properties of 3D-printed food can be modified by the selection of porosity topology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Reconfigurable Hexapartite Cluster States by Four‐Wave Mixing Processes with a Spatially Structured Pump.
- Author
-
Zhang, Kai, Guo, Yu, and Jing, Jietai
- Subjects
FOUR-wave mixing ,QUANTUM computing ,QUANTUM states ,HUMAN information processing ,INFORMATION processing ,SCALABILITY - Abstract
Universal quantum computation provides a new paradigm for information processing. One feasible approach is measurement‐based one‐way quantum computation utilizing cluster states. Generally, generating cluster states with different structures for on‐demand quantum computation requires a different experimental setup for each structure, which limits scalability. Here, reconfigurable hexapartite cluster states created by postprocessing the quadrature information of a hexapartite entangled state are demonstrated. Without altering the experimental layout, nine quantum correlated states with different structures, including three cluster states, are implemented. In particular, such a method can effectively reduce the excess noise introduced by creating cluster states under limited squeezing resources. This approach provides an avenue for realizing large‐scale reconfigurable cluster states without changing the experimental architecture. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. From bytes to bites: Advancing the food industry with three‐dimensional food printing.
- Author
-
Hamilton, Allyson N., Mirmahdi, Razieh S., Ubeyitogullari, Ali, Romana, Chetanjot K., Baum, Jamie I., and Gibson, Kristen E.
- Abstract
The rapid advancement of three‐dimensional (3D) printing (i.e., a type of additive manufacturing) technology has brought about significant advances in various industries, including the food industry. Among its many potential benefits, 3D food printing offers a promising solution to deliver products meeting the unique nutritional needs of diverse populations while also promoting sustainability within the food system. However, this is an emerging field, and there are several aspects to consider when planning for use of 3D food printing for large‐scale food production. This comprehensive review explores the importance of food safety when using 3D printing to produce food products, including pathogens of concern, machine hygiene, and cleanability, as well as the role of macronutrients and storage conditions in microbial risks. Furthermore, postprocessing factors such as packaging, transportation, and dispensing of 3D‐printed foods are discussed. Finally, this review delves into barriers of implementation of 3D food printers and presents both the limitations and opportunities of 3D food printing technology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Improving magnetic resonance spectroscopy in the brainstem periaqueductal gray using spectral registration.
- Author
-
Sirucek, Laura, Zoelch, Niklaus, and Schweinhardt, Petra
- Subjects
NUCLEAR magnetic resonance spectroscopy ,BRAIN stem ,RECORDING & registration ,MEDIAN (Mathematics) - Abstract
Purpose: Functional understanding of the periaqueductal gray (PAG), a clinically relevant brainstem region, can be advanced using 1H‐MRS. However, the PAG's small size and high levels of physiological noise are methodologically challenging. This study aimed to (1) improve 1H‐MRS quality in the PAG using spectral registration for frequency and phase error correction; (2) investigate whether spectral registration is particularly useful in cases of greater head motion; and (3) examine metabolite quantification using literature‐based or individual‐based water relaxation times. Methods: Spectra were acquired in 33 healthy volunteers (50.1 years, SD = 17.19, 18 females) on a 3 T Philips MR system using a point‐resolved spectroscopy (PRESS) sequence optimized with very selective saturation pulses (OVERPRESS) and voxel‐based flip angle calibration (effective volume of interest size: 8.8 × 10.2 × 12.2 mm3). Spectra were fitted using LCModel, and SNR, NAA peak linewidths, and Cramér‐Rao lower bounds (CRLBs) were measured after spectral registration and after minimal frequency alignment. Results: Spectral registration improved SNR by 5% (p = 0.026, median value post‐correction: 18.0) and spectral linewidth by 23% (p < 0.001, 4.3 Hz), and reduced the metabolites' CRLBs by 1% to 15% (p < 0.026). Correlational analyses revealed smaller SNR improvements with greater head motion (p = 0.010) recorded using a markerless motion tracking system. Higher metabolite concentrations were detected using individual‐based compared to literature‐based water relaxation times (p < 0.001). Conclusion: This study demonstrates high‐quality 1H‐MRS acquisition in the PAG using spectral registration. This shows promise for future 1H‐MRS studies in the PAG and possibly other clinically relevant brain regions with similar methodological challenges. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Undersampling and cumulative class re-decision methods to improve detection of agitation in people with dementia.
- Author
-
Meng, Zhidong, Iaboni, Andrea, Ye, Bing, Newman, Kristine, Mihailidis, Alex, Deng, Zhihong, and Khan, Shehroz S.
- Abstract
Agitation is one of the most prevalent symptoms in people with dementia (PwD) and can put both their own and their caregivers' safety at risk. Developing objective agitation detection approaches is important to support the health and safety of PwD living in a residential setting. In a previous study, we collected multimodal wearable sensor data from 17 participants over 600 days and developed machine learning models for detecting agitation in 1-min windows. However, the dataset has significant limitations, such as class imbalance and potentially imprecise labels, because agitation occurs much more rarely than normal behaviours. In this paper, we first implemented different undersampling methods to eliminate the imbalance problem and concluded that only 20% of the normal-behaviour data were adequate to train a competitive agitation detection model. Then, we designed a weighted undersampling method to evaluate the manual labeling mechanism given the ambiguous time-interval assumption. After that, the postprocessing method of cumulative class re-decision (CCR) was proposed based on the historical sequential information and the temporal continuity of agitation, improving decision-making performance for a potential agitation detection system. The results showed that a combination of undersampling and CCR improved the F1-score and other metrics to varying degrees with less training time and data. [ABSTRACT FROM AUTHOR]
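The two ingredients described above can be sketched briefly: random undersampling that keeps all minority-class (agitation) windows plus a fraction of majority-class (normal) windows, and a re-decision pass that smooths per-minute predictions using their recent history. The trailing-window majority vote below is an illustrative simplification of the paper's CCR rule, and the function names are assumptions.

```python
import numpy as np

def undersample(X, y, keep_frac=0.2, seed=0):
    """Keep all minority (agitation, y == 1) windows and a random
    keep_frac of majority (normal, y == 0) windows."""
    rng = np.random.default_rng(seed)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    neg_kept = rng.choice(neg, size=int(len(neg) * keep_frac), replace=False)
    idx = np.sort(np.concatenate([pos, neg_kept]))
    return X[idx], y[idx]

def cumulative_redecision(preds, window=5):
    """Re-decide each 1-min prediction by majority vote over a trailing
    window, exploiting the temporal continuity of agitation episodes.
    (Illustrative simplification of the paper's CCR rule.)"""
    preds = np.asarray(preds)
    out = preds.copy()
    for t in range(len(preds)):
        hist = preds[max(0, t - window + 1): t + 1]
        out[t] = 1 if hist.mean() >= 0.5 else 0
    return out
```

The re-decision pass removes isolated one-minute spikes while leaving sustained episodes intact, which is the continuity property the abstract appeals to.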
- Published
- 2024
- Full Text
- View/download PDF
34. Postprocessing
- Author
-
Steiner, Stephen A., III, Griffin, Justin S., Nelson, Ryan T., Hurwitz, Frances I., Worsley, Marcus A., Merkle, Dieter, Managing Editor, Aegerter, Michel A., editor, Leventis, Nicholas, editor, Koebel, Matthias, editor, and Steiner III, Stephen A., editor
- Published
- 2023
- Full Text
- View/download PDF
35. The Novel Multi Source Method for the Randomness Extraction
- Author
-
Iavich, Maksim, Kuchukhidze, Tamari, Xhafa, Fatos, Series Editor, Hu, Zhengbing, editor, Wang, Yong, editor, and He, Matthew, editor
- Published
- 2023
- Full Text
- View/download PDF
36. Statistical refinement of the North American Multi-Model Ensemble precipitation forecasts over Karoon basin, Iran
- Author
-
Farhad Yazdandoost, Mina Zakipour, and Ardalan Izadi
- Subjects
copula ,extreme value ,nmme ,postprocessing ,precipitation forecast ,topsis ,Environmental technology. Sanitary engineering ,TD1-1066 ,Environmental sciences ,GE1-350 - Abstract
An effective postprocessing approach has been examined to improve the skill of North American Multi-Model Ensemble (NMME) precipitation forecasts in the Karoon basin, Iran. The Copula–Bayesian approach was used along with the Normal Kernel Density marginal distribution and the Kernel Copula function. This process creates more than one postprocessed precipitation value as candidate results (first pass). A similar process is used for a second pass to obtain preprocessed values based on the candidate inputs, which helps identify the most suitable postprocessed value. Applying the technique for order preference by similarity to ideal solution (TOPSIS) method to the conditional probability distribution functions of the first and second passes yields the final improved forecast data from among the existing candidates. To validate the results, data from 1982–2010 and 2011–2018 were used for the calibration and forecast periods, respectively. The results show that while the GFDL and CFS2 models tend to overestimate precipitation, most other NMME models underestimate it. Postprocessing improves the accuracy of forecasts for most models by 20%–40%. Overall, the proposed Copula–Bayesian postprocessing approach could provide more reliable forecasts with higher spatial and temporal consistency, better detection of extreme precipitation values, and a significant reduction in uncertainties. HIGHLIGHTS: The precipitation forecasts of the Karoon river watershed in southwest Iran, a flood-prone area, are investigated. A new postprocessing approach is presented for North American Multi-Model Ensemble (NMME) precipitation estimates. The proposed method is based on the Copula–Bayesian approach. The method is well suited to detecting extreme precipitation values. Significant increases in the forecast skill of the improved NMME data are achieved.
- Published
- 2023
- Full Text
- View/download PDF
37. Practical guidance to identify and troubleshoot suboptimal DSC-MRI results
- Author
-
Melissa A. Prah and Kathleen M. Schmainda
- Subjects
DSC-MRI/Dynamic susceptibility contrast ,rCBV/rCBF ,perfusion ,guide ,postprocessing ,issues/troubleshooting ,Medical physics. Medical radiology. Nuclear medicine ,R895-920 - Abstract
Relative cerebral blood volume (rCBV) derived from dynamic susceptibility contrast (DSC) perfusion MR imaging (pMRI) has been shown to be a robust marker of neuroradiological tumor burden. Recent consensus recommendations on pMRI acquisition strategies have provided a pathway for pMRI inclusion in diverse patient care centers, regardless of size or experience. However, even with proper implementation and execution of the DSC-MRI protocol, issues will arise that many centers may not easily recognize or be aware of. Furthermore, missed pMRI issues are not always apparent in the resulting rCBV images, potentially leading to inaccurate or missed radiological diagnoses. Therefore, we gathered, from our database of DSC-MRI datasets, true-to-life examples showcasing breakdowns in acquisition, postprocessing, and interpretation, along with appropriate mitigation strategies where possible. The pMRI issues addressed include those related to image acquisition and postprocessing, with a focus on contrast agent administration, timing, and rate; signal-to-noise quality; and susceptibility artifact. The goal of this work is to provide guidance to minimize and recognize pMRI issues to ensure that only quality data are interpreted.
- Published
- 2024
- Full Text
- View/download PDF
38. Deterministic Rapid Intensity Forecast Guidance for the Joint Typhoon Warning Center's Area of Responsibility.
- Author
-
Sampson, C. R., Knaff, J. A., Slocum, C. J., Onderlinde, M. J., Brammer, A., Frost, M., and Strahl, B.
- Subjects
- *
TYPHOONS , *TROPICAL cyclones , *FORECASTING , *FALSE alarms , *FUTUROLOGISTS - Abstract
Intensity consensus forecasts can provide skillful overall guidance for intensity forecasting at the Joint Typhoon Warning Center, as they provide among the lowest mean absolute errors; however, these forecasts are far less useful for periods of rapid intensification (RI), as the guidance provided is generally biased low. One way to address this issue is to construct a consensus that also includes deterministic RI forecast guidance in order to increase intensification rates during RI. While this approach increases skill and eliminates some bias, consensus forecasts from this approach generally remain biased low during RI events. Another approach is to construct a consensus forecast using an equally weighted average of deterministic RI forecasts. This yields a forecast that is generally among the top-performing RI guidance but suffers from false alarms and a high bias due to those false alarms. Neither approach described here is a prescription for forecast success, but both have qualities that merit consideration for operational centers charged with the difficult task of RI prediction. Significance Statement: Forecasters at the Joint Typhoon Warning Center are required to make intensity forecasts every watch. Skillful guidance is available to make these forecasts, yielding lower mean absolute errors and biases; however, errors are higher for tropical cyclones either undergoing rapid intensification or with the potential to do so. This effort is an attempt to mitigate the higher errors associated with rapid intensification forecasts using existing guidance and consensus techniques. The resultant rapid intensification guidance can be used to reduce operational intensity forecast errors and provide advanced warning to customers for these difficult cases. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
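The two consensus strategies contrasted in the abstract above can be sketched in a few lines. This is an illustrative toy, not the operational JTWC configuration; all member values are invented.

```python
def equal_weight_consensus(forecasts):
    """Equally weighted average of deterministic intensity forecasts (kt)."""
    return sum(forecasts) / len(forecasts)

def ri_augmented_consensus(base_members, ri_members):
    """Consensus that folds deterministic RI guidance into the member pool,
    nudging intensification rates upward during RI events."""
    members = list(base_members) + list(ri_members)
    return sum(members) / len(members)

# Toy case: conventional guidance under-forecasts a rapid intensifier.
base = [85.0, 90.0, 95.0]   # conventional consensus members (kt)
ri = [115.0, 120.0]         # deterministic RI guidance (kt)
print(equal_weight_consensus(base))       # low biased during RI events
print(ri_augmented_consensus(base, ri))   # higher, but still below RI-only mean
print(equal_weight_consensus(ri))         # RI-only consensus: high bias if RI fails
```

The trade-off in the abstract is visible even in this toy: mixing RI guidance into the pool raises the consensus, while an RI-only consensus is the most aggressive and therefore the most false-alarm-prone.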
39. Predictability of Rainfall over Equatorial East Africa in the ECMWF Ensemble Reforecasts on Short- to Medium-Range Time Scales.
- Author
-
Ageet, Simon, Fink, Andreas H., Maranan, Marlon, and Schulz, Benedikt
- Subjects
- *
ISOTONIC regression , *PRECIPITATION forecasting , *LEAD time (Supply chain management) , *RAIN gauges , *CLIMATOLOGY , *RAINFALL - Abstract
Despite the enormous potential of precipitation forecasts to save lives and property in Africa, low skill has limited their uptake. To assess the skill and improve the performance of the forecasts, validation and postprocessing should be carried out continuously. Here, we evaluate the quality of reforecasts from the European Centre for Medium-Range Weather Forecasts over equatorial East Africa (EEA) against satellite and rain gauge observations for the period 2001–18. The 24-h rainfall accumulations are analyzed on short- to medium-range time scales; 48- and 120-h rainfall accumulations were also assessed. Skill was assessed against an extended probabilistic climatology (EPC) derived from the observations. Results show that the reforecasts overestimate rainfall, especially during the rainy seasons and over high-altitude areas. Nonetheless, the raw forecasts show potential skill up to a lead time of 14 days. There is an improvement of up to 30% in the Brier score/continuous ranked probability score relative to EPC in most areas, especially the higher-altitude regions, decreasing with lead time. Aggregating the reforecasts enhances the skill further, likely due to a reduction in timing mismatches. However, for some regions of the study domain, the predictive performance is worse than EPC, mainly due to biases. Postprocessing the reforecasts using isotonic distributional regression considerably improves skill, increasing the number of grid points with positive Brier skill score (continuous ranked probability skill score) by an average of 81% (91%) for lead times 1–14 days ahead. Overall, the study highlights the potential of the reforecasts, the spatiotemporal variation in skill, and the benefit of postprocessing in EEA. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
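The Brier skill score relative to a reference climatology, used above to quantify the postprocessing gains, is straightforward to compute; a minimal sketch (the probabilities and outcomes are illustrative, not from the study):

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and the
    binary (0/1) observed outcome."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, ref_probs, outcomes):
    """BSS > 0 means the forecast beats the reference (e.g. an extended
    probabilistic climatology); BSS = 1 is a perfect forecast."""
    bs = brier_score(probs, outcomes)
    bs_ref = brier_score(ref_probs, outcomes)
    return 1.0 - bs / bs_ref

# Sharp, well-calibrated forecast vs. a flat 50% climatological reference.
fc = [0.9, 0.8, 0.2, 0.1]
clim = [0.5, 0.5, 0.5, 0.5]
obs = [1, 1, 0, 0]
print(brier_skill_score(fc, clim, obs))
```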
40. A Multi-Object Tracking Approach Combining Contextual Features and Trajectory Prediction.
- Author
-
Zhang, Peng, Jing, Qingyang, Zhao, Xinlei, Dong, Lijia, Lei, Weimin, Zhang, Wei, and Lin, Zhaonan
- Subjects
OBJECT tracking (Computer vision) ,FORECASTING ,PROBLEM solving - Abstract
To address the problem of identity switching among objects with similar appearances in real scenarios, a multi-object tracking approach combining contextual features and trajectory prediction is proposed. This approach integrates the motion and appearance features of objects. The motion features are mainly used for trajectory prediction, and the appearance features are divided into contextual features and individual features, which are mainly used for trajectory matching. To accurately distinguish the identities of objects with similar appearances, a context graph is constructed by taking the specified object as the master node and its neighboring objects as the branch nodes. A preprocessing module excludes unnecessary connections in the graph model based on the speed of the object's historical trajectory and helps distinguish the features of objects with similar appearances. Feature matching is performed using the Hungarian algorithm on the similarity matrix obtained from the features. Post-processing is performed for the temporarily unmatched frames to obtain the final object matching results. The experimental results show that the proposed approach achieves the highest MOTA among the compared methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
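Track-to-detection matching of the kind described above (Hungarian algorithm on a similarity matrix) can be sketched with a brute-force minimum-cost assignment; in practice one would use an O(n³) Hungarian implementation such as `scipy.optimize.linear_sum_assignment`. Costs here are 1 − similarity and the values are invented.

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive minimum-cost matching of tracks (rows) to detections
    (columns). The paper uses the Hungarian algorithm; brute force finds
    the same optimum and keeps this sketch dependency-free."""
    n = len(cost)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_perm, best_total

# Costs are 1 - appearance/context similarity (values invented).
cost = [[0.1, 0.9],
        [0.8, 0.2]]
match, total = best_assignment(cost)  # track 0 -> detection 0, track 1 -> detection 1
```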
41. Improved Visualization and Quantification of Net Water Uptake in Recent Small Subcortical Infarcts in the Thalamus Using Computed Tomography.
- Author
-
Schön, Felix, Wahl, Hannes, Grey, Arne, Krukowski, Pawel, Müller, Angela, Puetz, Volker, Linn, Jennifer, and Kaiser, Daniel P. O.
- Subjects
- *
DIFFUSION magnetic resonance imaging , *THALAMUS , *LACUNAR stroke , *COMPUTED tomography , *DATA visualization - Abstract
Diagnosing recent small subcortical infarcts (RSSIs) via early computed tomography (CT) remains challenging. This study aimed to assess CT attenuation values (Hounsfield Units (HU)) and net water uptake (NWU) in RSSI and explore a postprocessing algorithm's potential to enhance thalamic RSSI detection. We examined non-contrast CT (NCCT) data from patients with confirmed thalamic RSSI on diffusion-weighted magnetic resonance imaging (DW-MRI) between January 2010 and October 2017. Co-registered DW-MRI and NCCT images enabled HU and NWU quantification in the infarct area compared to unaffected contralateral tissue. Results were categorized based on symptom onset to NCCT timing. Postprocessing using window optimization and frequency-selective non-linear blending (FSNLB) was applied, with interpretations by three blinded Neuroradiologists. The study included 34 patients (median age 70 years [IQR 63–76], 14 women). RSSI exhibited significantly reduced mean CT attenuation compared to unaffected thalamus (29.6 HU (±3.1) vs. 33.3 HU (±2.6); p < 0.01). Mean NWU in the infarct area increased from 6.4% (±7.2) at 0–6 h to 16.6% (±8.7) at 24–36 h post-symptom onset. Postprocessed NCCT using these HU values improved sensitivity for RSSI detection from 32% in unprocessed CT to 41% in FSNLB-optimized CT, with specificities ranging from 86% to 95%. In conclusion, CT attenuation values and NWU are discernible in thalamic RSSI up to 36 h post-symptom onset. Postprocessing techniques, particularly window optimization and FSNLB, moderately enhance RSSI detection. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
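Net water uptake as used above follows the standard densitometric formula from the ischemic-stroke CT literature (the study's exact implementation may differ): the relative attenuation drop of the infarct versus the unaffected contralateral tissue.

```python
def net_water_uptake(hu_infarct, hu_contralateral):
    """Percent NWU from CT attenuation (Hounsfield units): relative
    density drop of the infarct vs. mirrored healthy tissue."""
    return (1.0 - hu_infarct / hu_contralateral) * 100.0

# With the mean attenuations reported in the abstract (29.6 vs. 33.3 HU):
print(round(net_water_uptake(29.6, 33.3), 1))  # → 11.1
```

The 11.1% figure sits, as expected, between the 0–6 h (6.4%) and 24–36 h (16.6%) cohort means reported above.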
42. Long-Range Atmospheric Transport of Black Carbon from Severe Forest Fires in Siberia to the Arctic Basin.
- Author
-
Bardin, M. Yu.
- Subjects
- *
SEA ice , *ATMOSPHERIC transport , *CARBON-black , *FOREST fires , *ARCTIC climate , *ATMOSPHERIC circulation - Abstract
This work is part of a study on the impact of black carbon (BC) transfer from various sources to the Arctic on climate change in the region. The main objectives are to develop software for analyzing the Lagrangian transport of air particles; to assess the deposition of aerosol particles by precipitation and the concentration of particles in the atmosphere; and to obtain, for specific conditions of atmospheric circulation during severe fires in the years of maximum reduction in the Arctic sea ice area, estimates of the relative residence time over the Arctic Basin (AB) of air particles emitted by these fires, as well as the proportion of BC from fires deposited in the AB. This software package contains a module for calculating Lagrangian trajectories from a 4-dimensional wind array (u, v, ω, t), which contains horizontal wind components and an analog of vertical speed available from reanalysis, as well as modules for postprocessing of the found trajectories, which allow us to obtain, for a given area, residence-time estimates, the 3-dimensional BC concentration, and BC deposition on the surface, also using reanalysis data and some empirical constants. Since the main decrease in the Arctic sea ice area occurred in two years, 2007 and 2012, we initially intended to analyze the fires of both years; however, in 2007 there were no great fires, and in 2012 one fire (K-217, March–June) was much larger than the others. This fire was chosen for the experiments: several sets of trajectories were obtained for it, corresponding to various options for choosing the initial conditions, and estimates were obtained for the fraction of trajectories that passed over the Arctic Basin, the time spent there, and the fraction of BC deposited in the AB. Together, these estimates led to the conclusion that Siberian fires can hardly be the leading cause of the accelerated melting of Arctic sea ice. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
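The trajectory module described above integrates particle positions through a 4-dimensional reanalysis wind field; the core of any such integrator is the horizontal advection step, sketched here with forward Euler and constant winds (the real code would interpolate (u, v, ω) in space, pressure, and time, and this simple spherical geometry breaks down near the poles).

```python
import math

R_EARTH = 6.371e6  # mean Earth radius, m

def advect(lon_deg, lat_deg, u_ms, v_ms, dt_s):
    """One forward-Euler step of a horizontal Lagrangian trajectory.
    Winds in m/s, positions in degrees."""
    dlat = math.degrees(v_ms * dt_s / R_EARTH)
    dlon = math.degrees(u_ms * dt_s / (R_EARTH * math.cos(math.radians(lat_deg))))
    return lon_deg + dlon, lat_deg + dlat

# A 10 m/s westerly over one hour moves an equatorial particle ~0.32 deg east.
lon, lat = advect(100.0, 0.0, 10.0, 0.0, 3600.0)
```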
43. Superconvergent postprocessing of the C1-conforming finite element method for fourth-order boundary value problems.
- Author
-
Zha, Yuanyuan, Li, Zhe, and Yi, Lijun
- Subjects
- *
BOUNDARY element methods , *BOUNDARY value problems , *FINITE element method , *JACOBI polynomials - Abstract
We develop a very simple but efficient postprocessing technique for enhancing the accuracy of the C1-conforming finite element method for fourth-order boundary value problems. The key idea of the postprocessing technique is to add certain generalized Jacobi polynomials of degree larger than k to the Galerkin approximation of degree k. We prove that the postprocessing improves the order of convergence of the Galerkin approximation in the L2- and H2-norms. Numerical experiments are provided to illustrate the theoretical results. We also present computational results for a steady-state problem to demonstrate the effectiveness of the postprocessing. • We propose a novel postprocessing technique for enhancing the accuracy of the C1-conforming FEM for fourth-order BVPs. • Convergence orders of the L2- and H2-errors of the Galerkin approximation are improved from O(h^{k+1}) and O(h^{k−1}) to O(h^{min{2k−2, k+3}}) and O(h^{k+1}), respectively. • The postprocessing is local and can be done independently on each element. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
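In symbols (notation assumed for illustration: m extra modes, with coefficients c_j determined by the scheme), the postprocessed approximation and the improved rates stated in the abstract read:

```latex
u_h^{\ast} = u_h + \sum_{j=k+1}^{k+m} c_j\, J_j, \qquad
\|u - u_h^{\ast}\|_{L^2} = O\big(h^{\min\{2k-2,\,k+3\}}\big), \qquad
\|u - u_h^{\ast}\|_{H^2} = O\big(h^{k+1}\big).
```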
44. IMU and GNSS Postprocessing for High-Resolution Strapdown Airborne Gravimetry †.
- Author
-
Vyazmin, Vadim and Golovan, Andrey
- Subjects
GLOBAL Positioning System ,ARTIFICIAL satellites in navigation ,GRAVIMETRY ,ELECTRONIC data processing ,WAVELENGTHS - Abstract
Strapdown airborne gravimeters based on a navigation-grade inertial measurement unit (IMU) are highly sensitive to perturbations during aircraft flight, especially in the case of flights in draped mode (at a constant altitude above the terrain) or drone-based flights. This implies the crucial importance of postprocessing, including determination of the IMU and GNSS navigation solutions, IMU/GNSS integration, and gravity estimation. In the paper, we briefly outline the key aspects of the developed postprocessing methodology. By processing raw data from two surveys (one is based on a small aircraft and the other on a drone), we investigate the best achievable spatial resolution of strapdown airborne gravimetry. We show that high-accuracy gravity estimates (at sub-mGal level) at a half-wavelength spatial resolution of 1 km can be obtained in the considered surveys. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
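A quick back-of-envelope for the half-wavelength resolution quoted above: for low-pass-filtered airborne gravity, it is roughly the along-track distance flown in half a filter cutoff period. The speed and cutoff below are illustrative assumptions, not the surveys' actual parameters.

```python
def half_wavelength_resolution_km(speed_ms, cutoff_period_s):
    """Half-wavelength spatial resolution of low-pass-filtered airborne
    gravity: along-track distance covered in half a cutoff period, in km."""
    return speed_ms * cutoff_period_s / 2.0 / 1000.0

# E.g. a slow platform at 50 m/s with a 40 s filter cutoff resolves ~1 km.
print(half_wavelength_resolution_km(50.0, 40.0))  # → 1.0
```

This is why slow drone-based flights can reach a finer spatial resolution than fast fixed-wing surveys for the same filter settings.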
45. A spatial postprocessing method of precipitation forecast based on ECMWF ensemble prediction system and application effect evaluation
- Author
-
Yixiao FU, Yu WANG, Jiajia HUA, Xuejiao CHEN, Shu LIU, and Tianxiang GAO
- Subjects
precipitation forecasting ,postprocessing ,targets matching ,error correction ,Meteorology. Climatology ,QC851-999 - Abstract
Model postprocessing methods can improve the accuracy of quantitative precipitation forecasts. At present, postprocessing methods for precipitation based on statistical analysis are mainly used to correct precipitation rates or to estimate precipitation probability. They usually ignore the spatial displacement errors of the forecast precipitation area, resulting in low forecast scores. In this study, a new spatial postprocessing method based on rain-cluster matching is developed to correct the spatial errors of the forecast precipitation area and thereby improve forecasting accuracy. After identifying and separating rain clusters, this method applies a Bayesian multi-objective tracking approach to compare the model-forecast and observed rain clusters in the current time window, obtaining the displacement and intensity errors between the model forecasts and the observations. These discrepancies are then used to correct the model output in the coming time window. With the proposed method, precipitation forecasts from the ECMWF ensemble prediction system for summer precipitation processes during 2018–2019 in North China are corrected and tested. Using the CMPAS hourly precipitation analysis dataset as observations, the test results show that, after correction, the mean TS score of the precipitation forecasts in the coming time window increases from 0.333 to 0.369, the correlation coefficient increases from 0.260 to 0.327, and the mean absolute error decreases from 2.788 mm to 2.541 mm. We suggest that the proposed method can effectively improve the accuracy of precipitation forecasts.
- Published
- 2023
- Full Text
- View/download PDF
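The TS (threat score) gains reported above come from the standard 2×2 forecast contingency table; a minimal sketch with invented counts:

```python
def threat_score(hits, misses, false_alarms):
    """TS (critical success index): correct rain forecasts divided by
    all events that were observed and/or forecast. Correct negatives
    do not enter the score."""
    return hits / (hits + misses + false_alarms)

# Invented counts: 37 hits, 32 misses, 31 false alarms.
print(threat_score(37, 32, 31))  # → 0.37
```

Reducing displacement errors converts paired miss/false-alarm cells into hits, which is exactly how a spatial correction raises TS without changing the total forecast rain area much.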
46. A comprehensive review on surface quality improvement methods for additively manufactured parts
- Author
-
Hashmi, Abdul Wahab, Mali, Harlal Singh, and Meena, Anoj
- Published
- 2023
- Full Text
- View/download PDF
47. Asymptotic Errors in the Superconvergence of Discontinuous Galerkin Methods Based on Upwind-Biased Fluxes for 1D Linear Hyperbolic Equations
- Author
-
Lu, Tianshi and Rahmati, Sirvan
- Published
- 2024
- Full Text
- View/download PDF
48. Study of early flood warning based on postprocessed predicted precipitation and Xinanjiang model
- Author
-
Xiaolei Jiang, Liping Zhang, Zhongmin Liang, Xiaolei Fu, Jun Wang, Jiaxin Xu, Yuchen Zhang, and Qi Zhong
- Subjects
GEFS ,Early flood warning ,Xinanjiang model ,Flood forecasting ,Postprocessing ,Meteorology. Climatology ,QC851-999 - Abstract
Precipitation is the most common cause of floods. Accurate precipitation prediction is therefore important for flood forecasting and can be a key factor in increasing the lead time and accuracy of early flood warning. In this study, reforecast precipitation from the Global Ensemble Forecast System (GEFS) was postprocessed using the CSG-EMOS (censored and shifted gamma distribution-based ensemble model output statistics) method to improve its reliability, and then used as forcing data for the Xinanjiang model to increase the lead time of flood forecasts, in order to provide a more effective early flood warning based on an empirical water level–discharge curve at the Wangjiaba section, Huaihe River basin, China. Three scenarios were set up to demonstrate the importance of precipitation prediction in flood forecasting. The results showed that the predicted precipitation became more reliable after postprocessing, and this improvement increased as lead time expanded. The postprocessed predicted precipitation also improved flood forecasting and, in turn, early flood warning. However, this improvement becomes less significant as lead time increases and fades away when lead time reaches 7 d. In addition, the results of flood forecasting and early warning in the predicted-precipitation scenario were not as good as those in the observed-precipitation scenario, indicating that substituting predicted rainfall for observations requires further refinement in the future.
- Published
- 2023
- Full Text
- View/download PDF
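The water level–discharge (rating) curve used for warning above is empirical; a common functional form is a power law. Everything below, coefficients and warning stage alike, is a made-up illustration, not the Wangjiaba calibration.

```python
def stage_to_discharge(h_m, a=50.0, h0_m=20.0, b=1.8):
    """Illustrative power-law rating curve: Q = a * (h - h0)^b in m^3/s,
    clamped to zero below the datum h0 (all parameters hypothetical)."""
    return a * max(h_m - h0_m, 0.0) ** b

def flood_warning(h_m, warning_stage_m=27.5):
    """Trigger a warning when the forecast stage reaches the warning level."""
    return h_m >= warning_stage_m
```

In the study's setup, the Xinanjiang model would supply forecast discharge, which is inverted through such a curve to a stage that is compared against the warning level; longer precipitation lead time directly extends the warning lead time.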
49. Conditional Ensemble Model Output Statistics for Postprocessing of Ensemble Precipitation Forecasting.
- Author
-
Ji, Yan, Zhi, Xiefei, Ji, Luying, and Peng, Ting
- Subjects
- *
PRECIPITATION forecasting , *FORECASTING , *GAMMA distributions , *STATISTICAL ensembles , *STATISTICS , *PRECIPITATION (Chemistry) - Abstract
Forecasts produced by ensemble prediction systems (EPSs) provide the potential state of the future atmosphere and quantify uncertainty. However, the raw ensemble forecasts from a single EPS are typically characterized by underdispersive predictions, especially for precipitation, which follows a right-skewed gamma distribution. In this study, censored and shifted gamma distribution ensemble model output statistics (CSG-EMOS) is applied as one of the state-of-the-art methods for probabilistic precipitation postprocessing across China. Ensemble forecasts from multiple EPSs, including the European Centre for Medium-Range Weather Forecasts, the National Centers for Environmental Prediction, and the Met Office, are collected as raw ensembles. A conditional CSG-EMOS (Cond-CSG-EMOS) model is further proposed to calibrate the ensemble forecasts for heavy-precipitation events, where the standard CSG-EMOS is insufficient. The precipitation samples from the training period are divided into two categories, light- and heavy-precipitation events, according to a given precipitation threshold and the prior ensemble forecast, and individual models are then optimized separately for adequate parameter estimation. The results demonstrate that Cond-CSG-EMOS is superior to the raw EPSs and the standard CSG-EMOS, especially for the calibration of heavy-precipitation events. The spatial distribution of forecast skill shows that Cond-CSG-EMOS outperforms the others over most of the study region, particularly in North and Central China. A sensitivity test on the precipitation threshold shows that a higher threshold leads to better outcomes for regions that have more heavy-precipitation events, i.e., South China. Our results indicate that the proposed Cond-CSG-EMOS model is a promising approach for the statistical postprocessing of ensemble precipitation forecasts. Significance Statement: Heavy-precipitation events are of high socioeconomic relevance.
However, it remains a great challenge to obtain high-quality probabilistic quantitative precipitation forecasting (PQPF) from operational ensemble prediction systems (EPSs). Statistical postprocessing is commonly used to calibrate the systematic errors of the raw EPS forecasts, but the non-Gaussian nature of precipitation and the imbalance between the sizes of light- and heavy-precipitation samples add to the challenge. This study proposes a conditional postprocessing method to improve the PQPF of heavy precipitation by performing calibration separately for light and heavy precipitation, in contrast to some previous studies. Our results indicate that the conditional model mitigates the underestimation of heavy precipitation and also yields better calibration for light and moderate precipitation. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
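The key mechanism of the conditional model described above is a threshold split of the training pairs before fitting separate EMOS models; a schematic of the partition step (threshold and data are invented, and the actual CSG-EMOS fits are omitted):

```python
def split_by_threshold(samples, threshold):
    """Partition (ensemble_mean_mm, observed_mm) training pairs into
    light- and heavy-precipitation subsets using the prior ensemble
    forecast; a separate CSG-EMOS fit would then be run on each subset."""
    light = [s for s in samples if s[0] < threshold]
    heavy = [s for s in samples if s[0] >= threshold]
    return light, heavy

# Invented training pairs and a 10 mm threshold:
pairs = [(1.2, 0.0), (12.5, 15.0), (3.1, 2.4), (25.0, 30.2)]
light, heavy = split_by_threshold(pairs, threshold=10.0)
```

The abstract's sensitivity result maps directly onto this step: raising `threshold` shrinks the heavy subset, which only pays off where heavy events are frequent enough to estimate that model's parameters reliably.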
50. AN ADAPTIVE SUPERCONVERGENT MIXED FINITE ELEMENT METHOD BASED ON LOCAL RESIDUAL MINIMIZATION.
- Author
-
MUGA, IGNACIO, ROJAS, SERGIO, and VEGA, PATRICK
- Subjects
- *
FINITE element method , *PARTIAL differential equations , *A posteriori error analysis , *HEAT equation - Abstract
We introduce an adaptive superconvergent finite element method for a class of mixed formulations to solve partial differential equations involving a diffusion term. It combines a superconvergent postprocessing technique for the primal variable with an adaptive finite element method via residual minimization. Such a residual minimization procedure is performed on a local postprocessing scheme, commonly used in the context of mixed finite element methods. Given the local nature of that approach, the underlying saddle point problems associated with residual minimizations can be solved with minimal computational effort. We propose and study a posteriori error estimators, including the built-in residual representative associated with residual minimization schemes; and an improved estimator which adds, on the one hand, a residual term quantifying the mismatch between discrete fluxes and, on the other hand, the interelement jumps of the postprocessed solution. We present numerical experiments in two dimensions using Brezzi--Douglas--Marini elements as input for our methodology. The experiments perfectly fit our key theoretical findings and suggest that our estimates are sharp. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
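Schematically, the improved estimator described above sums three contributions (the notation is assumed for illustration, not taken from the paper): the built-in residual representative r_h, the mismatch between the mixed flux σ_h and the locally postprocessed flux, and the interelement jumps of the postprocessed scalar u_h^*:

```latex
\eta^2 \,=\, \|r_h\|^2
\,+\, \big\|\sigma_h - \widetilde{\sigma}_h\big\|^2
\,+\, \sum_{e \in \mathcal{E}_h} \big\| [\![ u_h^{\ast} ]\!] \big\|_{e}^{2}.
```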