Search Results (32 results)
2. Concept Maps for Formative Assessment: Creation and Implementation of an Automatic and Intelligent Evaluation Method
- Author
- Tom Bleckmann and Gunnar Friege
- Abstract
Formative assessment is about providing and using feedback and diagnostic information. On this basis, further learning or further teaching should be adaptive and, in the best case, optimized. However, this aspect is difficult to implement in reality, as teachers work with a large number of students and the whole process of formative assessment, especially the evaluation of student performance, takes a lot of time. To address this problem, this paper presents an approach in which student performance is collected through a concept map and quickly evaluated using Machine Learning techniques. For this purpose, a concept map on the topic of mechanics was developed and used in 14 physics classes in Germany. After the student maps were analysed by two human raters on the basis of a four-level feedback scheme, a supervised Machine Learning algorithm was trained on the data. The results show a very good agreement between the human and Machine Learning evaluation. Based on these results, an integration into everyday school life is conceivable, especially as support for teachers. In this way, the teacher can use and interpret the automatic evaluation and use it in the classroom.
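Agreement between human raters and an automatic scorer of this kind is commonly quantified with a chance-corrected statistic such as Cohen's kappa. The sketch below illustrates the idea on invented four-level ratings (0-3); the data, and the choice of kappa as the statistic, are illustrative and not taken from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(counts_a) | set(counts_b)
    # agreement expected by chance from each rater's marginal label frequencies
    expected = sum(counts_a[l] * counts_b[l] for l in labels) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative four-level feedback ratings, not the study's data
human   = [0, 1, 2, 3, 2, 1, 0, 3, 2, 1]
machine = [0, 1, 2, 3, 1, 1, 0, 3, 2, 2]
kappa = cohens_kappa(human, machine)
```

Values above roughly 0.8 are conventionally read as "very good" agreement of the kind the abstract reports.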
- Published
- 2023
3. [Artificial intelligence and secure use of health data in the KI-FDZ project: anonymization, synthetization, and secure processing of real-world data].
- Author
- Prasser F, Riedel N, Wolter S, Corr D, and Ludwig M
- Subjects
- Humans, Germany, Delivery of Health Care, Artificial Intelligence, Algorithms
- Abstract
The increasing digitization of the healthcare system is leading to a growing volume of health data. Leveraging this data beyond its initial collection purpose for secondary use can provide valuable insights into diagnostics, treatment processes, and the quality of care. The Health Data Lab (HDL) will provide infrastructure for this purpose. Both the protection of patient privacy and optimal analytical capabilities are of central importance in this context, and artificial intelligence (AI) provides two opportunities. First, it enables the analysis of large volumes of data with flexible models, which means that hidden correlations and patterns can be discovered. Second, synthetic - that is, artificial - data generated by AI can protect privacy. This paper describes the KI-FDZ project, which aims to investigate innovative technologies that can support the secure provision of health data for secondary research purposes. A multi-layered approach is investigated in which data-level measures can be combined in different ways with processing in secure environments. To this end, anonymization and synthetization methods, among others, are evaluated based on two concrete application examples. Moreover, it is examined how the creation of machine learning pipelines and the execution of AI algorithms can be supported in secure processing environments. Preliminary results indicate that this approach can achieve a high level of protection while maintaining data validity. The approach investigated in the project can be an important building block in the secure secondary use of health data. (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
4. Evidence of an Indirect Effect of Generativity on Fear of Death Through Ego-Integrity Considering Social Desirability.
- Author
- Busch, Holger
- Subjects
- DEATH & psychology, EGO (Psychology), CONFIDENCE intervals, SELF-evaluation, FEAR, HEALTH status indicators, REGRESSION analysis, PSYCHOLOGICAL tests, CRONBACH'S alpha, QUESTIONNAIRES, SCALE analysis (Psychology), DESCRIPTIVE statistics, FACTOR analysis, SOCIAL skills, REACTION time, PSYCHOLOGY & religion, STATISTICAL correlation, ATTITUDES toward death, ALGORITHMS
- Abstract
Recent research has shown an indirect effect of generativity on fear of death through ego-integrity in older adults. The present paper aims at demonstrating that the indirect effect is valid even when controlling for social desirability. For that purpose, participants (N = 260 German adults) in study 1 provided self-reports on generativity, ego-integrity, fear of death, and social desirability. Analyses confirmed the indirect effect when the tendency for socially desirable responding was statistically controlled. In study 2, participants (N = 133 German adults) also reported on their generativity and ego-integrity. Fear of death, however, was assessed with a reaction time-based measure (i.e., the Implicit Associations Test). Again, the indirect effect could be confirmed. Taken together, the studies lend further credibility to the extant findings on the indirect effect of generativity on fear of death through ego-integrity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Clinical measures of communication limitations in dysarthria assessed through crowdsourcing: specificity, sensitivity, and retest-reliability.
- Author
- Lehner, Katharina and Ziegler, Wolfram
- Subjects
- DYSARTHRIA, RESEARCH evaluation, STATISTICAL reliability, CONFIDENCE intervals, RESEARCH methodology evaluation, RESEARCH methodology, INTELLIGIBILITY of speech, SPEECH evaluation, COMMUNICATIVE disorders, PSYCHOMETRICS, T-test (Statistics), RESEARCH funding, DESCRIPTIVE statistics, INTRACLASS correlation, CROWDSOURCING, SENSITIVITY & specificity (Statistics), RECEIVER operating characteristic curves, ALGORITHMS
- Abstract
Assessing the impact of dysarthria on a patient's ability to communicate should be an integral part of patient management. However, due to the high demands on reliable quantification of communication limitations, hardly any formal clinical tests with approved psychometric properties have been developed so far. This study investigates a web-based assessment of communication impairment in dysarthria, named KommPaS. The test comprises measures of intelligibility, naturalness, perceived listener effort and communication efficiency, as well as a total score that integrates these parameters. The approach is characterized by a quasi-random access to a large inventory of test materials and to a large group of naïve listeners, recruited via crowdsourcing. As part of a larger research program to establish the clinical applicability of this new approach, the present paper focuses on two psychometric issues, namely specificity and sensitivity (study 1) and retest-reliability (study 2). Study 1: KommPaS was administered to 54 healthy adults and 100 adult persons with dysarthria (PWD). Non-parametric criterion-based norms (specificity: 0.95) were used to derive a standard metric for each of the four component variables, and corresponding sensitivity values for the presence of dysarthria were identified. Overall classification accuracy of the total score was determined using a ROC analysis. The resulting cutscores showed a high accuracy in the separation of PWD from healthy speakers for the naturalness and the total score. Study 2: A sub-group of 20 PWD enrolled in study 1 were administered a second KommPaS examination. ICC analyses revealed good to excellent retest reliabilities for all parameters. [ABSTRACT FROM AUTHOR]
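Study 1 fixes specificity at 0.95 and reads off the cutscore and the corresponding sensitivity. A minimal sketch of that non-parametric logic, with invented impairment scores (higher score = more impaired) rather than KommPaS values:

```python
import math

def cutscore_at_specificity(control_scores, target_specificity=0.95):
    """Smallest score threshold at which at least `target_specificity`
    of the healthy controls fall below the cut."""
    ordered = sorted(control_scores)
    n = len(ordered)
    k = math.ceil(target_specificity * n)  # controls that must stay below the cut
    return ordered[min(k, n - 1)]

def sensitivity(patient_scores, cut):
    """Share of patients detected, i.e. scoring at or above the cutscore."""
    return sum(s >= cut for s in patient_scores) / len(patient_scores)

# Illustrative scores, not KommPaS data
controls = [float(i) for i in range(20)]   # 20 healthy speakers
patients = [18.0, 25.0, 30.0, 12.0]        # 4 speakers with dysarthria
cut = cutscore_at_specificity(controls)    # specificity fixed at 0.95
sens = sensitivity(patients, cut)
```

In the study this is done per component variable, and the total score's overall cutoff is chosen via a ROC analysis instead of a fixed specificity.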
- Published
- 2022
- Full Text
- View/download PDF
6. Personalized refutation texts best stimulate teachers' conceptual change about multimedia learning.
- Author
- Dersch, Anna-Sophia, Renkl, Alexander, and Eitel, Alexander
- Subjects
- TEACHER education, ONLINE education, RESEARCH, PROFESSIONS, MULTIMEDIA systems, COMPUTER assisted instruction, INTERNET, GUILT (Psychology), PRE-tests & post-tests, RESEARCH funding, QUESTIONNAIRES, CHI-squared test, FACTOR analysis, SHAME, ALGORITHMS
- Abstract
Background: Previous research has shown that teachers hold misconceptions about multimedia learning (e.g., multimedia instruction needs to be adapted to students' learning styles), which may be at odds with evidence‐based teaching. Objectives: Refutation texts are a classical method to reduce misconceptions and thus to stimulate conceptual change. We wanted to know whether making use of a computer algorithm to personalize refutation texts would best initiate teachers' conceptual change. Methods: We designed an online experiment, in which N = 129 in‐service teachers read either (1) expository texts (without direct refutation), (2) common refutation texts, or (3) personalized refutation texts. The teachers filled in a misconception questionnaire pre and post to assess their conceptual change. Results and Conclusions: Statistical analyses revealed that personalized refutation texts initiated the strongest conceptual change, which was driven by increased feelings of guilt and shame. Common refutation texts did not foster teachers' conceptual change as compared to expository texts. These findings indicate that refutation texts should be personalized for experienced practitioners such as teachers. Takeaways: Personalized refutation seems to be promising in the context of online teacher training programs. Further research should test to which extent the present findings also apply to other groups of experienced learners or practitioners. 
Lay Description: What is already known about this topic?: Teachers hold misconceptions about multimedia learning (e.g., learning materials should be adapted to students' individual learning styles, such as visualizers or verbalizers). Refutation texts, which name a commonly held misconception, disprove it, and introduce a scientific explanation, are a common means to reduce misconceptions. Personalization fosters learning by drawing the learner's attention toward the discrepancy between their own beliefs and the learning material, further creating an impasse experience. Said impasse experience may trigger teachers' conceptual change, as a certain degree of discomfort is required for it. Yet anger, caused by lecturing teachers on their own topic, may provoke resistance and hamper learning. What this paper adds?: With a computer algorithm, we can efficiently personalize refutation texts by automatically matching them to teachers' answers in a pre-test. Such personalized refutation instruction may especially foster conceptual change. Within a randomized experiment, the personalized refutation instruction worked best compared to common refutation texts and expository texts. Feelings of guilt and shame moderated the effect of personalized refutation, as teachers felt more directly addressed in their misconceptions and thus experienced the required impasse. Feelings of anger did not play an important role in our experiment. The implications of study findings for practitioners: Computer algorithms enable efficient personalization of instruction to better deal with heterogeneous groups of learners (e.g., with big differences in prior knowledge or experience, as in the case of in-service teachers). Refutation texts work better for teachers when they are personalized. Common refutation texts do not work better than expository texts. An advantage of digital instruction is the use of algorithms to efficiently personalize instruction even for larger groups. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
7. Exploring novel algorithms for atrial fibrillation detection by driving graduate level education in medical machine learning.
- Author
- Rohr, Maurice, Reich, Christoph, Höhl, Andreas, Lilienthal, Timm, Dege, Tizian, Plesinger, Filip, Bulkova, Veronika, Clifford, Gari, Reyna, Matthew, and Hoog Antink, Christoph
- Subjects
- ATRIAL fibrillation, GRADUATE medical education, MACHINE learning, ALGORITHMS
- Abstract
During the lockdown of universities in the COVID-19 pandemic, most students were restricted to their homes. Novel and engaging teaching methods were required to improve the learning experience, and recent implementations of the annual PhysioNet/Computing in Cardiology (CinC) Challenges served as a reference. For over 20 years, the challenges have repeatedly proven to be of immense educational value, besides leading to technological advances for specific problems. In this paper, we report results from the class 'Artificial Intelligence in Medicine Challenge', which was implemented as an online project seminar at Technical University Darmstadt, Germany, and which was heavily inspired by the PhysioNet/CinC Challenge 2017 'AF Classification from a Short Single Lead ECG Recording'. Atrial fibrillation is a common cardiac disease and often remains undetected. Therefore, we selected the two most promising models of the course and give an insight into the Transformer-based DualNet architecture as well as into the CNN-LSTM-based model, followed by a detailed analysis of both. In particular, we show the model performance results of our internal scoring process for all submitted models and the near state-of-the-art model performance for the two named models on the official 2017 challenge test set. Several teams were able to achieve F1 scores above or close to 90% on a hidden test set of Holter recordings. We highlight themes commonly observed among participants and report the results from the self-assessed student evaluation. Finally, the self-assessment of the students reported a notable increase in machine learning knowledge. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
8. LiDAR Positioning Algorithm Based on ICP and Artificial Landmarks Assistance.
- Author
- Zeng Q, Kan Y, Tao X, and Hu Y
- Subjects
- Germany, Algorithms
- Abstract
As one of the automated guided vehicle (AGV) positioning methods, the LiDAR positioning method based on artificial landmarks has been widely used in warehousing and logistics industries in recent years. However, the traditional LiDAR positioning method based on artificial landmarks mainly depends on the three-point positioning method, the performance of which is limited by the landmarks' layout and detection requirements. This paper proposes a LiDAR positioning algorithm based on the iterative closest point (ICP) method with artificial landmark assistance, improving upon the traditional ICP algorithm. The position estimate provided by the landmarks is used as the initial value for the ICP iteration. The combination of the ICP algorithm and landmarks enables the positioning algorithm to maintain a certain positioning precision even when landmark detection is disturbed. By comparing the proposed algorithm with the positioning scheme developed by SICK in Germany, we show that the combination of the ICP algorithm and landmarks can effectively improve robustness while maintaining precision.
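The core idea, using the landmark fix as the initial estimate for ICP, can be sketched in 2D. This is a generic point-to-point ICP with a closed-form rigid alignment step, not the authors' implementation; poses are (theta, tx, ty) tuples:

```python
import math

def best_rigid_2d(src, dst):
    """Closed-form least-squares 2D rigid transform (rotation + translation)
    mapping paired points src -> dst."""
    n = len(src)
    cxs = sum(p[0] for p in src) / n
    cys = sum(p[1] for p in src) / n
    cxd = sum(p[0] for p in dst) / n
    cyd = sum(p[1] for p in dst) / n
    s = c = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs, ys, xd, yd = xs - cxs, ys - cys, xd - cxd, yd - cyd
        s += xs * yd - ys * xd   # "cross" term
        c += xs * xd + ys * yd   # "dot" term
    theta = math.atan2(s, c)
    ct, st = math.cos(theta), math.sin(theta)
    tx = cxd - (ct * cxs - st * cys)
    ty = cyd - (st * cxs + ct * cys)
    return theta, tx, ty

def apply_pose(pose, pts):
    theta, tx, ty = pose
    ct, st = math.cos(theta), math.sin(theta)
    return [(ct * x - st * y + tx, st * x + ct * y + ty) for x, y in pts]

def icp(scan, ref, init_pose, iters=10):
    """Point-to-point ICP; init_pose plays the role of the landmark-based
    position fix that seeds the iteration."""
    pts = apply_pose(init_pose, scan)
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences (fine for a sketch)
        pairs = [min(ref, key=lambda r, p=p: (r[0] - p[0]) ** 2 + (r[1] - p[1]) ** 2)
                 for p in pts]
        pts = apply_pose(best_rigid_2d(pts, pairs), pts)
    return pts
```

A good landmark-based initial pose keeps the nearest-neighbour association from locking onto a wrong local minimum, which is exactly the robustness benefit the abstract describes.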
- Published
- 2021
- Full Text
- View/download PDF
9. Classification of airborne 3D point clouds regarding separation of vegetation in complex environments.
- Author
- Bulatov D, Stütz D, Hacker J, and Weinmann M
- Subjects
- Archaeology, Construction Materials, Datasets as Topic, Geography, Germany, Imaging, Three-Dimensional methods, Italy, Lasers, Photogrammetry, Queensland, Soil Erosion, Algorithms, Geographic Mapping, Geological Phenomena, Plants, Remote Sensing Technology
- Abstract
Classification of outdoor point clouds is an intensely studied topic, particularly with respect to the separation of vegetation from the terrain and manmade structures. In the presence of many overhanging and vertical structures, the (relative) height is no longer a reliable criterion for such a separation. An alternative would be to apply supervised classification; however, thousands of examples are typically required for appropriate training. In this paper, an unsupervised and rotation-invariant method is presented and evaluated for three datasets with very different characteristics. The method allows us to detect planar patches by filtering and clustering so-called superpoints, whereby the well-known but suitably modified random sampling and consensus (RANSAC) approach plays a key role for plane estimation in outlier-rich data. The performance of our method is compared with that of supervised classifiers common in remote sensing settings: a random forest learner combined with feature sets for point cloud processing, such as covariance-based features or point descriptors. It is shown that for point clouds resulting from airborne laser scans, the detection accuracy of the proposed method is over 96% and, as such, higher than that of standard supervised classification approaches. Because of artifacts caused by interpolation during 3D stereo matching, the overall accuracy was lower for photogrammetric point clouds (74-77%). However, using additional salient features, such as the normalized green-red difference index, the results became more accurate and less dependent on the data source.
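The RANSAC plane-estimation step at the core of such pipelines can be sketched as follows. This is textbook RANSAC on raw 3D points, not the authors' modified superpoint-based variant:

```python
import random

def plane_from_points(p1, p2, p3):
    """Plane (unit normal n, offset d) through three points, n . x = d."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],      # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:                        # degenerate (collinear) sample
        return None
    n = [c / norm for c in n]
    return n, sum(n[i] * p1[i] for i in range(3))

def ransac_plane(points, dist_thresh=0.05, iters=200, seed=0):
    """Fit a plane to a random 3-point sample repeatedly; keep the model
    supported by the most inliers (points within dist_thresh of the plane)."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) - d) < dist_thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers
```

The "outlier-rich" setting the abstract mentions is precisely where this sample-and-vote scheme beats a direct least-squares fit, since outliers never outvote the dominant plane.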
- Published
- 2021
- Full Text
- View/download PDF
10. Algorithmic literacy in medical students – results of a knowledge test conducted in Germany.
- Author
- Kampa, Philipp and Balzer, Felix
- Subjects
- MEDICAL students, HEALTH occupations students, QUANTITATIVE research, INFORMATION literacy, QUESTIONNAIRES, SCALE analysis (Psychology), DESCRIPTIVE statistics, DATA analysis software, MEDICAL education, COMPUTER literacy, ALGORITHMS, TELEMEDICINE
- Abstract
The impact of algorithms on everyday life is ever increasing. Medicine and public health are not excluded from this development – algorithms in medicine do not only challenge, change and inform research (methods) but also clinical situations. Given this development, questions arise concerning the competency level of prospective physicians, that is, medical students, on algorithm-related topics. This paper, based on a master's thesis in library and information science written at Humboldt-Universität zu Berlin, gives an insight into this topic by presenting and analysing the results of a knowledge test conducted among medical students in Germany. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
11. "All mimsy were the borogoves" – a discriminative learning model of morphological knowledge in pseudo-word inflection.
- Author
- Nieder, Jessica, van de Vijver, Ruben, and Tomaschek, Fabian
- Subjects
- COMPUTER simulation, COMPARATIVE grammar, LEARNING strategies, LANGUAGE acquisition, INTELLECT, PHONETICS, RESEARCH funding, VOCABULARY, ARTIFICIAL neural networks, ALGORITHMS
- Abstract
Grammatical knowledge has often been investigated in wug tests, in which participants inflect pseudo-words. It was argued that in inflecting these pseudo-words, speakers apply their knowledge of word formation. However, it remains unclear what exactly this knowledge is and how it is learned. According to one theory, the knowledge is best characterised as abstractions that specify how units are combined. Another theory maintains that it is best characterised by memory-based analogy. In both cases the knowledge is learned by association based on positive evidence alone. In this paper, we model the classification of pseudo-words to Maltese plurals using a shallow neural network trained with an error-driven learning algorithm. We demonstrate that the classifications mirror those of Maltese speakers in a wug test. Our results indicate that speakers rely on gradient knowledge of a relation between the phonetics of whole words and plural classes, which is learned in an error-driven way. [ABSTRACT FROM AUTHOR]
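The error-driven learning rule behind such shallow networks is the delta rule (equivalently, the Rescorla-Wagner update). A minimal sketch with invented cue and class names (the real model maps phonetic cues of whole Maltese words to plural classes):

```python
def train_delta_rule(events, alpha=0.1, epochs=50):
    """Error-driven (delta-rule / Rescorla-Wagner) learning of cue->outcome
    association weights from (cues, outcome) learning events."""
    weights = {}                                   # (cue, outcome) -> strength
    outcomes = sorted({o for _, o in events})
    for _ in range(epochs):
        for cues, outcome in events:
            for o in outcomes:
                target = 1.0 if o == outcome else 0.0
                predicted = sum(weights.get((c, o), 0.0) for c in cues)
                error = target - predicted         # learning is driven by error
                for c in cues:
                    weights[(c, o)] = weights.get((c, o), 0.0) + alpha * error
    return weights

def classify(weights, cues, outcomes):
    """Pick the outcome (e.g. plural class) with the highest summed activation."""
    return max(outcomes, key=lambda o: sum(weights.get((c, o), 0.0) for c in cues))

# Hypothetical phonetic cues and plural classes, purely for illustration
events = [(("ends_a",), "sound_plural"),
          (("ends_a", "long_vowel"), "sound_plural"),
          (("ends_t",), "broken_plural"),
          (("ends_t", "cluster"), "broken_plural")]
w = train_delta_rule(events)
```

Because weights are updated only in proportion to prediction error, cues that already predict an outcome well stop changing, yielding the gradient, competition-shaped knowledge the abstract describes.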
- Published
- 2023
- Full Text
- View/download PDF
12. Comparison of cardiac output estimates obtained from the Antares oscillometric pulse wave analysis algorithm and from Doppler transthoracic echocardiography.
- Author
- Stäuber, Alexander, Hoppe, Matthias Wilhelm, Lapp, Harald, Richter, Stefan, Ohlow, Marc-Alexander, Dörr, Marcus, Piper, Cornelia, Eckert, Siegfried, Coll-Barroso, Michael Thomas, Stäuber, Franziska, Abanador-Kamper, Nadine, and Baulmann, Johannes
- Subjects
- PULSE wave analysis, CARDIAC output, DOPPLER echocardiography, BLAND-Altman plot, PATIENTS, PEARSON correlation (Statistics), ALGORITHMS
- Abstract
Background: In cardiology, cardiac output (CO) is an important parameter for assessing cardiac function. While invasive thermodilution procedures are the gold standard for CO assessment, transthoracic Doppler echocardiography (TTE) has become the established method for routine CO assessment in daily clinical practice. However, a demand persists for non-invasive approaches, including oscillometric pulse wave analysis (PWA), to enhance the accuracy of CO estimation, reduce complications associated with invasive procedures, and facilitate its application in non-intensive care settings. Here, we aimed to compare the TTE and oscillometric PWA algorithm Antares for a non-invasive estimation of CO. Methods: Non-invasive CO data obtained by two-dimensional TTE were compared with those from an oscillometric blood pressure device (custo med GmbH, Ottobrunn, Germany) using the integrated algorithm Antares (Redwave Medical GmbH, Jena, Germany). In total, 59 patients undergoing elective cardiac catheterization for clinical reasons (71±10 years old, 76% males) were included. Agreement between both CO measures was assessed by Bland-Altman analysis, Student's t-test, and Pearson correlations. Results: The mean difference in CO was 0.04 ± 1.03 l/min (95% confidence interval for the mean difference: -0.23 to 0.30 l/min) for the overall group, with lower and upper limits of agreement at -1.98 and 2.05 l/min, respectively. There was no statistically significant difference in means between both CO measures (P = 0.785). Statistically significant correlations between TTE and Antares CO were observed in the entire cohort (r = 0.705, P<0.001) as well as in female (r = 0.802, P<0.001) and male patients (r = 0.669, P<0.001). Conclusions: The oscillometric PWA algorithm Antares and established TTE for a non-invasive estimation of CO are highly correlated in male and female patients, with no statistically significant difference between both approaches.
Future validation studies of the Antares CO are necessary before a clinical application can be considered. [ABSTRACT FROM AUTHOR]
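The Bland-Altman statistics reported above reduce to a bias (mean difference) and 95% limits of agreement. A minimal sketch with invented cardiac output values, not the study data:

```python
import math

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods applied to the same subjects."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    # sample standard deviation of the paired differences
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented CO values in l/min (TTE vs. oscillometric PWA), for illustration only
tte = [5.0, 4.0, 6.0, 5.5]
pwa = [4.8, 4.2, 5.9, 5.6]
bias, lower, upper = bland_altman(tte, pwa)
```

The study's reported 0.04 ± 1.03 l/min with limits of agreement at -1.98 and 2.05 l/min is exactly this computation applied to the 59 patient pairs.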
- Published
- 2024
- Full Text
- View/download PDF
13. Study protocol for the development, trial, and evaluation of a strategy for the implementation of qualification-oriented work organization in nursing homes.
- Author
- Burfeindt, Corinna, Darmann-Finck, Ingrid, Stammann, Carina, Stegbauer, Constance, Stolle-Wahl, Claudia, Zündel, Matthias, and Rothgang, Heinz
- Subjects
- NURSING care facility laws, HUMAN services programs, PROFESSIONAL practice, LONG-term health care, WORKING hours, ACADEMIC achievement, ORGANIZATIONAL change, MEDICAL research, EVIDENCE-based medicine, ALGORITHMS
- Abstract
Background: Staffing ratios in nursing homes vary among the federal states of Germany, but there are no rational grounds for these variations. In a previous study, a new instrument for the standardized calculation of staffing requirements in nursing homes was developed (Algorithm1.0). The development was based on a new empirical data collection method that derives actual and target values for the time and number of care interventions provided. Algorithm1.0 found an increased requirement of 36% of staff in German nursing homes. Based on these results, the German legislature has commissioned a model program to trial and evaluate a complex intervention comprising increased staffing combined with strategies for organizational development. Methods: The mixed-methods study consists of (i) developing a concept for restructuring the work organization, (ii) the application of this concept combined with increased staffing in 10 nursing homes (complex intervention), and the further development of the concept using a participatory and iterative formal evaluation process. The intervention consists of (a) quantitative measures of increased staffing based on a calculation using Algorithm1.0 and (b) qualitative measures regarding organizational development. The intervention will be conducted over one year. The effects of the intervention on job satisfaction and quality of care will be evaluated in (iii) a comprehensive prospective, controlled summative evaluation. The results will be compared with ten matched nursing homes as a control group. Finally, (iv) prototypical concepts for qualification-oriented work organization, a strategy for the national rollout, and the further development of Algorithm1.0 into Algorithm 2.0 will be derived. Discussion: In Germany, there is an ongoing dynamic legislation process regarding further developing the long-term care sector. 
The study, which is the subject of the study protocol presented here, generates an evidence-based strategy for determining staffing requirements for nursing homes. Ethics and dissemination: This study was approved by the Ethics Committee of the German Association of Nursing Science (Deutsche Gesellschaft für Pflegewissenschaft) on 02.08.2023 (amended on 20.09.2023). Research findings are disseminated through presentations at national and international conferences and publications in peer-reviewed scientific journals. Trial registration number: German Clinical Trials Register DRKS00031773 (date of registration: 09.11.2023). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Traffic Sign Detection and Recognition Using YOLO Object Detection Algorithm: A Systematic Review.
- Author
- Flores-Calero, Marco, Astudillo, César A., Guevara, Diego, Maza, Jessica, Lita, Bryan S., Defaz, Bryan, Ante, Juan S., Zabala-Blanco, David, and Armingol Moreno, José María
- Subjects
- TRAFFIC monitoring, TRAFFIC signs & signals, ARTIFICIAL neural networks, INTELLIGENT transportation systems, ALGORITHMS, OBJECT recognition (Computer vision), MOBILE operating systems, IRIS recognition
- Abstract
Context: YOLO (You Only Look Once) is an algorithm based on deep neural networks with real-time object detection capabilities. This state-of-the-art technology is widely available, mainly due to its speed and precision. Since its conception, YOLO has been applied to detect and recognize traffic signs, pedestrians, traffic lights, vehicles, and so on. Objective: The goal of this research is to systematically analyze the YOLO object detection algorithm, applied to traffic sign detection and recognition systems, from five relevant aspects of this technology: applications, datasets, metrics, hardware, and challenges. Method: This study performs a systematic literature review (SLR) of studies on traffic sign detection and recognition using YOLO published in the years 2016–2022. Results: The search found 115 primary studies relevant to the goal of this research. After analyzing these investigations, the following relevant results were obtained. The most common applications of YOLO in this field are vehicular security and intelligent and autonomous vehicles. The majority of the sign datasets used to train, test, and validate YOLO-based systems are publicly available, with an emphasis on datasets from Germany and China. It has also been discovered that most works present sophisticated detection, classification, and processing speed metrics for traffic sign detection and recognition systems by using the different versions of YOLO. In addition, the most popular desktop data-processing hardware platforms are the Nvidia RTX 2080 and Titan Tesla V100 and, in the case of embedded or mobile GPU platforms, the Jetson Xavier NX. Finally, seven relevant challenges that these systems face when operating in real road conditions have been identified. With this in mind, research has been reclassified to address these challenges in each case.
Conclusions: This SLR is the most relevant and current work in the field of technology development applied to the detection and recognition of traffic signs using YOLO. In addition, insights are provided about future work that could be conducted to improve the field. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Comparative Analysis of Algorithms to Cleanse Soil Micro-Relief Point Clouds.
- Author
- Ott, Simone, Burkhard, Benjamin, Harmening, Corinna, Paffenholz, Jens-André, and Steinhoff-Knopp, Bastian
- Subjects
- ARTIFICIAL neural networks, POINT cloud, DEEP learning, ALGORITHMS, MACHINE learning
- Abstract
Detecting changes in soil micro-relief in farmland helps to understand degradation processes like sheet erosion. Using the high-resolution technique of terrestrial laser scanning (TLS), we generated point clouds of three 2 × 3 m plots on a weekly basis from May to mid-June in 2022 on cultivated farmland in Germany. Three well-known applications for eliminating vegetation points in the generated point cloud were tested: Cloth Simulation Filter (CSF) as a filtering method, three variants of CANUPO as a machine learning method, and ArcGIS PointCNN as a deep learning method, a sub-category of machine learning using deep neural networks. We assessed the methods with hard criteria such as F1 score, balanced accuracy, height differences and their standard deviations relative to the reference surface, the resulting data gaps, and robustness, and with soft criteria such as time-saving capacity, accessibility, and required user knowledge. All algorithms showed a low performance at the initial measurement epoch, increasing with later epochs. While most of the results demonstrate a better performance of ArcGIS PointCNN, this algorithm revealed an exceptionally low performance in plot 1, which is attributable to the generalization gap. Although the CANUPO variants created the highest amount of data gaps, we recommend CANUPO including colour values in combination with CSF. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
16. Your Boss Is an Algorithm: Artificial Intelligence, Platform Work and Labour.
- Author
- BERNICCHIA-FREEMAN, ZOÉ
- Subjects
- ARTIFICIAL intelligence, ROBOTS, AUTOMATION, ALGORITHMS
- Abstract
In March 1964, the cover page of Der Spiegel, a popular German weekly magazine, painted a frightening picture: An anthropomorphic robot with six mechanical arms commands an assembly line while a displaced human worker floats aimlessly in the foreground. Ejected from his station, the worker throws up his hands in despair next to a headline that reads, "Automation in Germany, the arrival of robots." Over fifty years later, a cover page from the same magazine evoked similar themes: A giant robot arm yanks an office worker away from his computer under the headline, "You're fired! How computers and robots steal our jobs - and which jobs will be safe." The more things change, the more they stay the same. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
17. Validierung innerklinischer Sichtungsalgorithmen für den Massenanfall von Verletzten – eine simulationsbasierte Studie – deutsche Version [Validation of in-hospital triage algorithms for mass casualty incidents: a simulation-based study, German version].
- Author
- Heller, Axel R., Neidel, Tobias, Klotz, Patrick J., Solarek, André, Kowalzik, Barbara, Juncken, Kathleen, and Kleber, Christan
- Subjects
- STATISTICS, MEDICAL triage, DISASTERS, SIMULATION methods in education, MONITOR alarms (Medicine), EMERGENCY management, RESEARCH funding, CASE studies, MASS casualties, COMPUTER-aided diagnosis, SENSITIVITY & specificity (Statistics), DATA analysis, ALGORITHMS
- Abstract
- Published
- 2023
- Full Text
- View/download PDF
18. Self-potential data inversion utilizing the Bat optimizing algorithm (BOA) with various application cases.
- Author
-
Essa, Khalid S., Diab, Zein E., and Mehanee, Salah A.
- Subjects
BAT behavior ,GEOMETRIC shapes ,BATS ,BREWSTER'S angle ,ALGORITHMS ,METAHEURISTIC algorithms ,ANGLES - Abstract
The Bat optimization algorithm (BOA) is a metaheuristic algorithm applied here to interpret self-potential (SP) data. The BOA is based on bat echolocation behavior for global optimization, in which the global optimum solution is reached at the minimum value of the objective function. The best interpretive source parameters for the subsurface structures occur at the minimal value of the objective function (the global best solution). The BOA is applied to 2D SP anomaly data to estimate the characteristic source parameters (i.e., the depth to the center, amplitude coefficient, origin location, geometric shape factor, and polarization and inclination angle of the causative buried structure). The BOA can be applied to single and multiple source structures in the restricted class of simple geometric shapes; these bodies help in the validation of subsurface ore and mineral targets. The stability and efficiency of the proposed BOA have been examined with several synthetic examples. In addition, three different real field examples from Germany and Indonesia have been successfully applied to ore and mineral investigation and geological structure studies. In general, the achieved results are in good agreement with the available borehole data and results reported in the literature. [ABSTRACT FROM AUTHOR]
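The echolocation-inspired search loop described in this abstract can be sketched in a few lines. Below is a generic toy implementation of the bat algorithm (after Yang, 2010), not the study's code: the misfit function, parameter bounds, and "true" source values (depth 10, amplitude coefficient -3) are all hypothetical stand-ins for the actual SP forward model.

```python
import random

def bat_minimize(objective, bounds, n_bats=20, iters=300,
                 f_min=0.0, f_max=2.0, loudness=0.9, pulse_rate=0.5, seed=1):
    """Minimal sketch of the bat algorithm: each 'bat' updates a velocity
    toward the current best solution at a random frequency, with
    occasional local random walks around the best position."""
    rng = random.Random(seed)
    dim = len(bounds)

    def clamp(x, d):
        return min(max(x, bounds[d][0]), bounds[d][1])

    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    fit = [objective(p) for p in pos]
    best_i = min(range(n_bats), key=fit.__getitem__)
    best_pos, best_fit = pos[best_i][:], fit[best_i]
    step = [0.05 * (hi - lo) for lo, hi in bounds]  # local-walk scale
    for _ in range(iters):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()
            cand = []
            for d in range(dim):
                vel[i][d] += (pos[i][d] - best_pos[d]) * freq
                cand.append(clamp(pos[i][d] + vel[i][d], d))
            if rng.random() > pulse_rate:  # local walk around the best bat
                cand = [clamp(best_pos[d] + step[d] * rng.gauss(0, 1), d)
                        for d in range(dim)]
            f_cand = objective(cand)
            if f_cand < fit[i] and rng.random() < loudness:
                pos[i], fit[i] = cand, f_cand
            if f_cand < best_fit:
                best_pos, best_fit = cand[:], f_cand
    return best_pos, best_fit

# Toy least-squares misfit standing in for the SP objective function.
def misfit(p):
    depth, k = p
    return (depth - 10.0) ** 2 + (k + 3.0) ** 2

params, err = bat_minimize(misfit, bounds=[(1.0, 50.0), (-10.0, 10.0)])
```

In a real SP inversion, `misfit` would compare a forward-modeled anomaly curve against the observed SP profile.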
- Published
- 2023
19. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression ,COMPUTER software ,COVID-19 ,CONFIDENCE intervals ,TIME ,CONVALESCENCE ,WORLD health ,EPIDEMICS ,TIME series analysis ,DESCRIPTIVE statistics ,SENSITIVITY & specificity (Statistics) ,PREDICTION models ,COVID-19 pandemic ,ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries), which were fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched-case calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when the adjusted unmatched cases were taken into account as well. The proposed method used limited data and produced results in the same range as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed, calculated assumptions, can provide a meaningful average time-to-recovery figure, which can be used as an evidence-based estimate to support containment and mitigation policy decisions, even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR]
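The case-matching idea can be sketched with a first-in-first-out queue: each recovery is paired with the oldest still-open confirmed case. FIFO pairing is a simplifying assumption rather than the study's exact procedure, and the time series below are hypothetical.

```python
from collections import deque

def mean_time_to_recovery(daily_cases, daily_recoveries):
    """Match recoveries to confirmed cases first-in-first-out and
    return the mean lag in days."""
    queue = deque()  # confirmation day of each still-open case
    lags = []
    for day, (n_cases, n_rec) in enumerate(zip(daily_cases, daily_recoveries)):
        queue.extend([day] * n_cases)
        for _ in range(n_rec):
            if queue:
                lags.append(day - queue.popleft())
    return sum(lags) / len(lags) if lags else float("nan")

# Hypothetical toy series: cases confirmed on days 0-2, each recovering
# exactly 18 days later.
cases      = [5, 3, 2] + [0] * 18
recoveries = [0] * 18 + [5, 3, 2]
print(mean_time_to_recovery(cases, recoveries))  # 18.0
```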
- Published
- 2023
20. A Radar-Based Quantitative Precipitation Estimation Algorithm to Overcome the Impact of Vertical Gradients of Warm-Rain Precipitation: The Flood in Western Germany on 14 July 2021.
- Author
-
Chen, Ju-Yu, Reinoso-Rondinel, Ricardo, Trömel, Silke, Simmer, Clemens, and Ryzhkov, Alexander
- Subjects
RAINDROP size ,METEOROLOGICAL services ,RAINFALL ,HYDROLOGIC models ,RAIN gauges ,ALGORITHMS ,FLOODS - Abstract
The demand for accurate, near-real-time radar-based quantitative precipitation estimation (QPE), which is key to feeding hydrological models and enabling reliable flash-flood predictions, was highlighted again by the disastrous floods that followed an intense stratiform precipitation field passing over western Germany on 14 July 2021. Three state-of-the-art rainfall algorithms based on reflectivity Z, specific differential phase KDP, and specific attenuation A were applied to observations of four polarimetric C-band radars operated by the German Meteorological Service [DWD (Deutscher Wetterdienst)]. Due to the large vertical gradients of precipitation below the melting layer, suggesting warm-rain processes, all QPE products significantly underestimate surface precipitation. We propose two mitigation approaches: (i) vertical profile (VP) corrections for Z and KDP and (ii) gap filling using observations of a local X-band radar, JuXPol. We also derive rainfall retrievals from vertically pointing Micro Rain Radar (MRR) profiles, which indirectly take precipitation gradients in the lowest few hundred meters into account. When evaluated against DWD rain gauge measurements, those retrievals yield pronounced improvements, especially for the A-based retrieval, partly due to its lower sensitivity to the variability of raindrop size distributions. The VP correction further improves QPE by reducing the normalized root-mean-square error by 23% and the normalized mean bias by 20%. With the use of gap-filling JuXPol data, the A-based retrieval gives the lowest errors, followed by the Z-based retrievals in combination with VP corrections. The presented algorithms demonstrate the added value of radar-based QPE for warm-rain events and for related flash-flood warnings. [ABSTRACT FROM AUTHOR]
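A Z-based rainfall retrieval is, in its simplest form, the inversion of a power law Z = a·R^b. The sketch below uses the classic Marshall-Palmer coefficients (a = 200, b = 1.6) purely for illustration; they are not the coefficients tuned in this study, and the KDP- and A-based retrievals follow analogous power laws with different inputs.

```python
def rain_rate_from_z(z_dbz, a=200.0, b=1.6):
    """Invert the power law Z = a * R**b, with Z in linear units
    (mm^6 m^-3) converted from dBZ; returns rain rate R in mm/h.
    a, b are illustrative Marshall-Palmer values."""
    z_lin = 10.0 ** (z_dbz / 10.0)   # dBZ -> linear reflectivity
    return (z_lin / a) ** (1.0 / b)

for dbz in (20, 35, 50):
    print(dbz, "dBZ ->", round(rain_rate_from_z(dbz), 2), "mm/h")
```

At 50 dBZ this yields roughly 48.6 mm/h, which is the familiar heavy-rain regime for the Marshall-Palmer relation.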
- Published
- 2023
21. 3D involute gear evaluation - part II: deviations - basic algorithms for modern software validation.
- Author
-
Stein, Martin and Härtig, Frank
- Subjects
SPUR gearing ,SOFTWARE validation ,HELICAL gears ,ALGORITHMS ,MEASURING instruments - Abstract
The fundamental equations for the evaluation of cylindrical involute gear measurements on 3D gear measuring instruments are provided. The computations are based on the principles of gear kinematics and use the system of involute gear coordinates introduced in a previous work by the authors. This holistic approach focuses on significant error sources that only appear when 3D measurement technology is used and that have gone largely unrecognized until today. The proposed algorithms are beneficial for the description of gear deviations as they allow the use of simple formulas covering profile, helix and pitch evaluation for internal or external and spur or helical gears. The presented equations contain the key fundamentals to complement existing standards. They will become part of the reference algorithms used by the Physikalisch-Technische Bundesanstalt, the national metrology institute of Germany, to certify gear evaluation software. [ABSTRACT FROM AUTHOR]
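For reference, the standard parametric involute of a base circle, which underlies any system of involute gear coordinates, can be written down directly. This is textbook geometry with a hypothetical base radius, not the paper's full coordinate system.

```python
import math

def involute_point(r_b, t):
    """Point on the involute of a base circle of radius r_b at roll
    angle t:  x = r_b (cos t + t sin t),  y = r_b (sin t - t cos t)."""
    return (r_b * (math.cos(t) + t * math.sin(t)),
            r_b * (math.sin(t) - t * math.cos(t)))

# The radial distance from the gear centre grows as r_b * sqrt(1 + t^2),
# a handy identity for checking measured profile points.
r_b, t = 30.0, 0.5
x, y = involute_point(r_b, t)
assert abs(math.hypot(x, y) - r_b * math.sqrt(1 + t * t)) < 1e-9
```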
- Published
- 2022
22. Proposal of an algorithm for the management of rectally inserted foreign bodies: a surgical single-center experience with review of the literature.
- Author
-
Fritz, Stefan, Killguss, Hansjörg, Schaudt, André, Sommer, Christof M., Richter, Götz M., Belle, Sebastian, Reissfelder, Christoph, Loff, Steffan, and Köninger, Jörg
- Subjects
INTESTINAL perforation ,FOREIGN bodies ,LITERATURE reviews ,ABDOMINOPERINEAL resection ,SURGICAL complications ,ABDOMINAL surgery ,EMERGENCY physicians ,ALGORITHMS - Abstract
Background: Retained rectal foreign bodies (RFBs) are uncommon clinical findings. Although the management of RFBs is rarely reported in the literature, clinicians regularly face this issue. To date, there is no standardized management of RFBs. The aim of the present study was to evaluate our own data and subsequently develop a treatment algorithm. Methods: All consecutive patients who presented between January 2006 and December 2019 with rectally inserted RFBs at the emergency department of the Klinikum Stuttgart, Germany, were retrospectively identified. Clinicopathologic features, management, complications, and outcomes were assessed. Based on this experience, a treatment algorithm was developed. Results: A total of 69 presentations with rectally inserted RFBs were documented in 57 patients. In 23/69 cases (33.3%), the RFB was removed transanally by the emergency physician, either digitally (n = 14) or with the help of a rigid rectoscope (n = 8) or a colonoscope (n = 1). In 46/69 cases (66.7%), the RFB was removed in the operating theater under general anesthesia with muscle relaxation. Among these, 11/46 patients (23.9%) underwent abdominal surgery, either for manual extraction of the RFB (n = 9) or to exclude a bowel perforation (n = 2). Surgical complications occurred in 3/11 patients. One patient with rectal perforation developed pelvic sepsis and underwent abdominoperineal extirpation in the further clinical course. Conclusion: The management of RFBs can be challenging and includes a wide range of options, from removal without further intervention to abdominoperineal extirpation in cases of pelvic sepsis. Whenever possible, RFBs should be managed in specialized colorectal centers following a clear treatment algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2022
23. Neither timeless, nor placeless: Control of food delivery gig work via place-based working time regimes.
- Author
-
Heiland, Heiner
- Subjects
ELECTRONIC commerce ,RESEARCH methodology ,INTERVIEWING ,QUESTIONNAIRES ,LABOR market ,WORKING hours ,TIME management ,EMPIRICAL research ,FOOD service ,PERSONNEL management ,ALGORITHMS - Abstract
Working time regimes in platform labour have so far either been ignored as a topic in research on gig work or framed as a merely allocative instrument. This article argues that working time regimes instead have both a coordinating and a controlling effect. Adopting the analytical framework of labour process theory, the article hence focuses on the interrelation of working time and control regimes. The empirical material presented stems from research on platform-based food courier work in Germany and is based on a mixed-methods research design consisting of interviews, multi-sited ethnography and a survey. The findings show that platforms implement hybrid control regimes that rest not only on the much-analysed algorithmic management, but also on complementary control through working time regimes: temporal control. Platforms organise intra-platform markets where workers compete for shifts by means of performance. Thus, platforms are able to ensure an efficient and simultaneously reliable use of an autonomous and spatially distributed workforce. Furthermore, it is shown that platform labour is not placeless, either. The effects of its control regime vary according to different local conditions. As a result, platforms must be analysed not only as techno-cultural ecosystems, but also as local-specific socio-economic structures. [ABSTRACT FROM AUTHOR]
- Published
- 2022
24. Preference Discovery in University Admissions: The Case for Dynamic Multioffer Mechanisms.
- Author
-
Grenet, Julien, He, YingHua, and Kübler, Dorothea
- Subjects
UNIVERSITY & college admission ,ALGORITHMS - Abstract
We document quasi-experimental evidence against the common assumption in the matching literature that agents have full information on their own preferences. In Germany's university admissions, the first stages of the Gale-Shapley algorithm are implemented in real time, allowing for multiple offers per student. We demonstrate that nonexploding early offers are accepted more often than later offers, despite not being more desirable. These results, together with survey evidence and a theoretical model, are consistent with students' costly discovery of preferences. A novel dynamic multioffer mechanism that batches early offers improves matching efficiency by informing students of offer availability before preference discovery. [ABSTRACT FROM AUTHOR]
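The student-proposing Gale-Shapley (deferred acceptance) procedure that underlies these admissions can be sketched as follows. The student names, university names, and capacities below are hypothetical, and this sequential sketch ignores the real-time, multi-offer aspects the paper studies.

```python
def deferred_acceptance(student_prefs, uni_prefs, capacities):
    """Student-proposing Gale-Shapley. Preference values are lists of
    names in descending preference; capacities maps university -> seats.
    Universities tentatively hold their best proposers and reject the
    rest; rejected students propose to their next choice."""
    rank = {u: {s: i for i, s in enumerate(p)} for u, p in uni_prefs.items()}
    next_choice = {s: 0 for s in student_prefs}
    held = {u: [] for u in uni_prefs}   # tentatively admitted students
    free = list(student_prefs)
    while free:
        s = free.pop()
        prefs = student_prefs[s]
        if next_choice[s] >= len(prefs):
            continue                     # s has exhausted the list
        u = prefs[next_choice[s]]
        next_choice[s] += 1
        held[u].append(s)
        held[u].sort(key=lambda x: rank[u][x])   # best-ranked first
        if len(held[u]) > capacities[u]:
            free.append(held[u].pop())   # reject the worst held student
    return {u: sorted(ss) for u, ss in held.items()}

# Hypothetical instance with two universities and three students.
students = {"ana": ["TUB", "LMU"], "ben": ["TUB", "LMU"], "cem": ["LMU", "TUB"]}
unis = {"TUB": ["ben", "ana", "cem"], "LMU": ["ana", "cem", "ben"]}
match = deferred_acceptance(students, unis, {"TUB": 1, "LMU": 2})
print(match)  # {'TUB': ['ben'], 'LMU': ['ana', 'cem']}
```

Running the first rounds of this loop in real time, as in the German system, is what produces the early offers the paper analyses.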
- Published
- 2022
25. Estimating dry biomass and plant nitrogen concentration in pre-Alpine grasslands with low-cost UAS-borne multispectral data – a comparison of sensors, algorithms, and predictor sets.
- Author
-
Schucknecht, Anne, Seo, Bumsuk, Krämer, Alexander, Asam, Sarah, Atzberger, Clement, and Kiese, Ralf
- Subjects
PLANT biomass ,GRASSLANDS ,PLANT drying ,DETECTORS ,ALGORITHMS ,RANDOM forest algorithms ,WILDLIFE management areas - Abstract
Grasslands are an important part of pre-Alpine and Alpine landscapes. Despite the economic value and the significant role of grasslands in carbon and nitrogen (N) cycling, spatially explicit information on grassland biomass and quality is rarely available. Remotely sensed data from unmanned aircraft systems (UASs) and satellites might be an option to overcome this gap. Our study aims to investigate the potential of low-cost UAS-based multispectral sensors for estimating above-ground biomass (dry matter, DM) and plant N concentration. In our analysis, we compared two different sensors (Parrot Sequoia, SEQ; MicaSense RedEdge-M, REM), three statistical models (linear model; random forests, RFs; gradient-boosting machines, GBMs), and six predictor sets (i.e. different combinations of raw reflectance, vegetation indices, and canopy height). Canopy height information can be derived from UAS sensors but was not available in our study. Therefore, we tested the added value of this structural information with in situ measured bulk canopy height data. A combined field sampling and flight campaign was conducted in April 2018 at different grassland sites in southern Germany to obtain in situ and the corresponding spectral data. The hyper-parameters of the two machine learning (ML) approaches (RF, GBM) were optimized, and all model setups were run with a 6-fold cross-validation. Linear models were characterized by very low statistical performance measures and thus were not suitable to estimate DM and plant N concentration using UAS data. The non-linear ML algorithms showed an acceptable regression performance for all sensor-predictor set combinations, with average cross-validated (cv) R²_cv,avg of 0.48, RMSE_cv,avg of 53.0 g m⁻², and relative rRMSE_cv,avg of 15.9 % for DM, and with R²_cv,avg of 0.40, RMSE_cv,avg of 0.48 wt %, and rRMSE_cv,avg of 15.2 % for plant N concentration estimation.
The optimal combination of sensors, ML algorithms, and predictor sets notably improved the model performance. The best model performance for the estimation of DM (R²_cv = 0.67, RMSE_cv = 41.9 g m⁻², rRMSE_cv = 12.6 %) was achieved with an RF model that utilizes all possible predictors and REM sensor data. The best model for plant N concentration was a combination of an RF model with all predictors and SEQ sensor data (R²_cv = 0.47, RMSE_cv = 0.45 wt %, rRMSE_cv = 14.2 %). DM models with the spectral input of REM performed significantly better than those with SEQ data, while for N concentration models, it was the other way round. The choice of predictors was most influential on model performance, while the effect of the chosen ML algorithm was generally lower. The addition of canopy height to the spectral data in the predictor set significantly improved the DM models. In our study, calibrating the ML algorithm improved the model performance substantially, which shows the importance of this step. [ABSTRACT FROM AUTHOR]
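The cross-validated scores reported in this abstract (R², RMSE, and rRMSE relative to the observation mean) can be computed as follows. The observation and prediction vectors here are hypothetical, chosen only to exercise the formulas.

```python
import math

def regression_scores(y_true, y_pred):
    """Return R^2, RMSE, and relative RMSE in percent
    (RMSE divided by the mean of the observations)."""
    n = len(y_true)
    mean_y = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    rmse = math.sqrt(ss_res / n)
    return 1 - ss_res / ss_tot, rmse, 100.0 * rmse / mean_y

y_obs = [300.0, 340.0, 410.0, 280.0, 360.0]   # hypothetical DM in g/m^2
y_mod = [310.0, 330.0, 390.0, 300.0, 350.0]
r2, rmse, rrmse = regression_scores(y_obs, y_mod)
```

In a k-fold setting, these scores would be computed on the held-out fold of each split and then averaged, which is how the cv-averaged figures above arise.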
- Published
- 2022
26. Phasing out support schemes for renewables in neighbouring countries: An agent-based model with investment preferences.
- Author
-
Melliger, Marc and Chappin, Emile
- Subjects
- *
ELECTRICITY pricing , *RENEWABLE energy sources , *ALGORITHMS , *COUNTRIES , *PHOTOVOLTAIC power generation - Abstract
• We simulate renewables pathways under different support phase-out scenarios.
• We extend an investment algorithm by preferences and calibrated returns.
• Our improved algorithm incorporates more heterogeneity resulting in stronger effects.
• Further auction support is necessary for most capacity targets in the case countries.
• Countries should coordinate policy changes due to cross-border effects.
Support schemes have been central to the expansion of renewable electricity globally and in the European Union. As technologies mature, individual member states may decide to phase out these policies. While previous research has shown that such policy changes affect investors' decisions, we investigate how they affect pathways and electricity prices by simulating investment decisions in an agent-based model in two case countries. This paper contributes and applies an adapted investment decision algorithm that incorporates empirically observed technology and return preferences and is calibrated by return observations. The new algorithm yields more refined and stronger effects compared to its predecessor. Results show that the phase-out of auctions in Germany and the Netherlands slows down their deployment of renewable capacity by up to ∼60% and ∼35%, respectively. With the exception of photovoltaics and onshore wind projects in the Netherlands, the targeted capacities can only be reached by continuing support in both countries. Furthermore, ending support in a large country like Germany leads to higher electricity prices and fosters a market-driven but insufficient capacity expansion in smaller neighbours like the Netherlands. As the electricity grids in many countries are strongly interconnected, such cross-border effects are of international relevance. Our findings suggest that continued auctions may be necessary and that countries should coordinate policy changes to stay on track for meeting their renewables targets. [ABSTRACT FROM AUTHOR]
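A preference-augmented investment rule can be caricatured as "expected return plus a technology-preference bonus". The toy sketch below uses hypothetical numbers and is far simpler than the paper's calibrated agent-based algorithm; the project list, preference weights, and returns are all invented for illustration.

```python
def choose_project(projects, prefs, weight=0.3):
    """Pick the project with the highest score, where
    score = expected return + weight * technology preference.
    A toy stand-in for an investment decision algorithm with
    empirically observed technology preferences."""
    def score(p):
        return p["expected_return"] + weight * prefs.get(p["tech"], 0.0)
    return max(projects, key=score)

projects = [
    {"tech": "solar",    "expected_return": 0.06},
    {"tech": "onshore",  "expected_return": 0.07},
    {"tech": "offshore", "expected_return": 0.05},
]
prefs = {"solar": 0.05, "onshore": 0.0, "offshore": 0.02}  # hypothetical
best = choose_project(projects, prefs)
print(best["tech"])  # solar: 0.06 + 0.3*0.05 = 0.075 beats onshore's 0.07
```

The point of such a rule is that preferences can flip decisions away from the purely return-maximizing choice, which is what produces the stronger heterogeneity effects described above.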
- Published
- 2022
27. Influence of atmospheric environmental conditions on the life cycle of convective cells in real-time forecasting.
- Author
-
Wilhelm, Jannik
- Subjects
ACADEMIC dissertations ,MESOSCALE convective complexes ,RADAR ,WEATHER forecasting ,ALGORITHMS ,METEOROLOGICAL services ,CELLS ,LIFE cycles (Biology) ,DATA ,VERTICAL wind shear ,THUNDERSTORMS ,HAILSTORMS - Abstract
Copyright of Wissenschaftliche Berichte des Instituts für Meteorologie und Klimaforschung des Karlsruher Instituts für Technologie is the property of KIT Scientific Publishing. Users should refer to the original published version of the material for the full abstract.
- Published
- 2022
28. An algorithm to assess the heating strategy of buildings in cold climates: a case study of Germany.
- Author
-
Mazhar, Abdur Rehman, Zou, Yuliang, Zeng, Cheng, Shen, Yongliang, and Liu, Shuli
- Subjects
RESIDENTIAL heating systems ,HOME energy use ,WINTER ,THERMAL comfort ,HEATING load ,PEAK load ,ALGORITHMS - Abstract
Two-thirds of the final energy consumption of the EU residential sector goes towards space heating of buildings, yet a huge portion of the population still suffers from energy poverty. Identifying optimum heating strategies for current buildings would be a solution to this crisis, which is the main aim of the algorithm developed in this research. The algorithm incorporates a modified version of the simple hourly method from the ISO 13790 standard to determine the hourly heating load and indoor temperatures of buildings under any heating strategy. Flexibility in the input of building and weather data makes this tool versatile and practical for building users and policymakers. With this algorithm, a case study evaluating three commonly used domestic heating strategies was established for nine different residential buildings under typical cold winter conditions in Germany. Most EU households heat their buildings either continuously throughout the day at fixed temperatures, sporadically at fixed times, or at peak loads during the evening. The continuous heating strategy is rated the best, consuming minimal energy with consistent temperatures within optimal thermal comfort ranges. The sporadic and peak-load heating strategies produce fluctuating indoor temperatures with high standard deviations of up to 8.70°C while consuming cumulative energy similar to the continuous strategy. Additionally, both of these strategies aggravate energy poverty and promote indoor mould formation on the building envelope caused by water vapor condensation. Consequently, this algorithm can be applied to any building type in any region. [ABSTRACT FROM AUTHOR]
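A crude stand-in for this kind of hourly building model is a single thermal-node RC model with a thermostat. This is not the ISO 13790 formulation itself, and every parameter below (conductance H, capacitance C, heater size, start temperature) is hypothetical.

```python
def simulate_heating(t_out, setpoint=20.0, H=200.0, C=1.0e7,
                     q_max=5000.0, t0=15.0):
    """One-node building model:  C dT/dt = q_heat - H (T_in - T_out).
    Hourly explicit-Euler steps; a thermostat supplies the steady-state
    heating demand (capped at q_max W) while T_in is at or below the
    setpoint. Returns hourly indoor temperatures and heating energy in J.
    H: heat-loss coefficient (W/K), C: thermal capacitance (J/K)."""
    dt = 3600.0                     # one hour in seconds
    t_in, temps, energy = t0, [], 0.0
    for to in t_out:
        q_need = H * (setpoint - to)            # steady-state demand (W)
        q = min(max(q_need, 0.0), q_max) if t_in <= setpoint else 0.0
        t_in += dt / C * (q - H * (t_in - to))  # explicit Euler step
        temps.append(t_in)
        energy += q * dt
    return temps, energy

# A continuous-heating day at a constant 0 degC outside (hypothetical).
temps, e = simulate_heating([0.0] * 24)
```

With continuous heating the indoor temperature climbs smoothly toward the setpoint; feeding in a sporadic on/off schedule instead would reproduce the fluctuating-temperature behaviour the study criticizes.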
- Published
- 2022
29. Simulation‐based training improves process times in acute stroke care (STREAM).
- Author
-
Bohmann, Ferdinand O., Gruber, Katharina, Kurka, Natalia, Willems, Laurent M., Herrmann, Eva, du Mesnil de Rochemont, Richard, Scholz, Peter, Rai, Heike, Zickler, Philipp, Ertl, Michael, Berlis, Ansgar, Poli, Sven, Mengel, Annerose, Ringleb, Peter, Nagel, Simon, Pfaff, Johannes, Wollenweber, Frank A., Kellert, Lars, Herzberg, Moriz, and Koehler, Luzie
- Subjects
ENDOVASCULAR surgery ,THROMBOLYTIC therapy ,ALGORITHMS ,MEDICAL simulation ,SECONDARY analysis ,TERTIARY care - Abstract
Background: The objective of the STREAM Trial was to evaluate the effect of simulation training on process times in acute stroke care. Methods: The multicenter prospective interventional STREAM Trial was conducted between 10/2017 and 04/2019 at seven tertiary care neurocenters in Germany with a pre‐ and post‐interventional observation phase. We recorded patient characteristics, acute stroke care process times, stroke team composition and simulation experience for consecutive direct‐to‐center patients receiving intravenous thrombolysis (IVT) and/or endovascular therapy (EVT). The intervention consisted of a composite intervention centered around stroke‐specific in situ simulation training. Primary outcome measure was the 'door‐to‐needle' time (DTN) for IVT. Secondary outcome measures included process times of EVT and measures taken to streamline the pre‐existing treatment algorithm. Results: The effect of the STREAM intervention on the process times of all acute stroke operations was neutral. However, secondary analyses showed a DTN reduction of 5 min from 38 min pre‐intervention (interquartile range [IQR] 25–43 min) to 33 min (IQR 23–39 min, p = 0.03) post‐intervention achieved by simulation‐experienced stroke teams. Concerning EVT, we found significantly shorter door‐to‐groin times in patients who were treated by teams with simulation experience as compared to simulation‐naive teams in the post‐interventional phase (−21 min, simulation‐naive: 95 min, IQR 69–111 vs. simulation‐experienced: 74 min, IQR 51–92, p = 0.04). Conclusion: An intervention combining workflow refinement and simulation‐based stroke team training has the potential to improve process times in acute stroke care. [ABSTRACT FROM AUTHOR]
- Published
- 2022
30. Beyond data protection concerns – the European passenger name record system.
- Author
-
Olsen, Henrik Palmer and Wiesener, Cornelius
- Subjects
DATA protection ,PASSENGERS ,RISK assessment ,CRIME statistics ,TERRORISM - Abstract
In this article, we examine the European framework of collecting and analysing flight passenger name record (PNR) data for the purpose of combating terrorism and serious crime. The focus is mainly on the EU PNR Directive of 2016, but we also consider the specific legislative framework in Germany and Denmark. In light of the recent review of the Directive, the article aims at exploring the policy-related, legal and technological challenges. In doing so, it goes beyond established data protection concerns. In particular, we debunk the popular claim that PNR analysis in and of itself entails the risk of discrimination of certain groups – a claim commonly levelled against algorithmic analysis. We also provide useful insights into the specific legal safeguards vis-à-vis automated profiling and decision-making through human review. [ABSTRACT FROM AUTHOR]
- Published
- 2021
31. l2-Penalized temporal logit-mixed models for the estimation of regional obesity prevalence over time.
- Author
-
Burgard, Jan P, Krause, Joscha, Münnich, Ralf, and Morales, Domingo
- Subjects
OBESITY ,PARAMETER estimation ,THERAPEUTICS ,ALGORITHMS ,MODERN society ,LOGITS ,STATISTICAL bootstrapping - Abstract
Obesity is considered to be one of the primary health risks in modern industrialized societies. Estimating the evolution of its prevalence over time is an essential element of public health reporting. This requires the application of suitable statistical methods on epidemiologic data with substantial local detail. Generalized linear-mixed models with medical treatment records as covariates mark a powerful combination for this purpose. However, the task is methodologically challenging. Disease frequencies are subject to both regional and temporal heterogeneity. Medical treatment records often show strong internal correlation due to diagnosis-related grouping. This frequently causes excessive variance in model parameter estimation due to rank-deficiency problems. Further, generalized linear-mixed models are often estimated via approximate inference methods as their likelihood functions do not have closed forms. These problems combined lead to unacceptable uncertainty in prevalence estimates over time. We propose an l2-penalized temporal logit-mixed model to solve these issues. We derive empirical best predictors and present a parametric bootstrap to estimate their mean-squared errors. A novel penalized maximum approximate likelihood algorithm for model parameter estimation is stated. With this new methodology, the regional obesity prevalence in Germany from 2009 to 2012 is estimated. We find that the national prevalence ranges between 15 and 16%, with significant regional clustering in eastern Germany. [ABSTRACT FROM AUTHOR]
- Published
- 2021
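The core of the l2 penalization in entry 31 can be illustrated with a plain penalized logistic regression: the penalty shrinks the slope and stabilizes estimation when the data would otherwise push coefficients to extremes. This sketch has no mixed effects or temporal structure, and the data are hypothetical.

```python
import math

def fit_l2_logit(xs, ys, lam=0.1, lr=0.1, iters=2000):
    """Gradient descent on the l2-penalized mean logistic log-loss:
    L(w, b) = (1/n) sum[-y log p - (1-y) log(1-p)] + lam * w**2,
    with p = sigmoid(w*x + b) and one scalar predictor."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(iters):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += p - y
        w -= lr * (gw / n + 2 * lam * w)   # penalty shrinks w toward 0
        b -= lr * gb / n
    return w, b

# Hypothetical data: outcome probability rises with a standardized covariate.
xs = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1]
w, b = fit_l2_logit(xs, ys)
```

Because these toy data are perfectly separable, the unpenalized slope would diverge; a larger `lam` gives a smaller, more stable slope, which is the variance-reduction effect the paper exploits at a far larger scale.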
32. A new combined statistical method for bias adjustment and downscaling making use of multi-variate bias adjustment and PCA-driven rescaling.
- Author
-
KRÄHENMANN, STEFAN, HALLER, MICHAEL, and WALTER, ANDREAS
- Subjects
STATISTICAL bias ,DOWNSCALING (Climatology) ,ATMOSPHERIC models ,SUPPLY & demand ,ALGORITHMS ,EXTREME value theory ,PARETO distribution ,RANDOM forest algorithms - Abstract
One major concern of climate modeling is the projection of future extreme events, as they have the most severe impact on society and the environment. This is a challenging task for modeling due to the low occurrence rate of extreme values. Furthermore, the local-scale characteristics of extreme events demand high-resolution model data. In the framework of the EURO-CORDEX initiative, climate model ensemble data on a 0.11° grid resolution are produced. In order to provide climate data at a higher resolution in an efficient and reliable way, a statistical downscaling method has been developed that combines bias adjustment and downscaling. With this method, an ensemble of climate model data on a target resolution of 5 km has been built and established as a reference ensemble for Germany. The ensemble consists of the three scenarios RCP 2.6, 4.5 and 8.5, 44 members in total. The method is comprehensible and of minimum complexity. It involves objective predictor selection and can be applied to different areas, horizontal resolutions, target variables and predictor data sets, thus providing high flexibility. While the methodology imposes refined structures onto modeled data, it does not affect the model's data range and therefore allows for extrapolation beyond observed values. The raw model data show a rather large spread and bias for threshold-based indices, which was greatly reduced in the bias adjustment step. Downscaling is challenging, as local terrain features can introduce unpredictable residual variation without localized information from, e.g., observations. In particular, temperature-based extreme values were well captured by the downscaling algorithm, as temperature is strongly elevation dependent. [ABSTRACT FROM AUTHOR]
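Bias adjustment of climate model output is often some form of quantile mapping: find a value's quantile in the model climatology and replace it with the observed value at the same quantile. The univariate empirical sketch below illustrates only that step (the paper's method is multivariate and adds PCA-driven rescaling), and the temperature series are hypothetical.

```python
import bisect

def quantile_map(value, model_sorted, obs_sorted):
    """Empirical quantile mapping: locate 'value' in the sorted model
    climatology and return the observed value at the same quantile."""
    q = bisect.bisect_left(model_sorted, value) / len(model_sorted)
    idx = min(int(q * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[idx]

# Hypothetical example: the model runs a uniform 2 degrees too cold.
model = sorted(t - 2.0 for t in range(0, 30))
obs   = sorted(float(t) for t in range(0, 30))
print(quantile_map(10.0, model, obs))  # 12.0: the cold bias is removed
```

Practical methods use parametric fits or interpolation between quantiles rather than this nearest-index lookup, precisely so that values beyond the observed range can still be extrapolated, as the abstract emphasizes.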
- Published
- 2021