Search Results (77 results)
2. Concept Maps for Formative Assessment: Creation and Implementation of an Automatic and Intelligent Evaluation Method
- Author
- Tom Bleckmann and Gunnar Friege
- Abstract
Formative assessment is about providing and using feedback and diagnostic information. On this basis, further learning or further teaching should be adaptive and, in the best case, optimized. However, this aspect is difficult to implement in reality, as teachers work with a large number of students and the whole process of formative assessment, especially the evaluation of student performance, takes a lot of time. To address this problem, this paper presents an approach in which student performance is collected through a concept map and quickly evaluated using Machine Learning techniques. For this purpose, a concept map on the topic of mechanics was developed and used in 14 physics classes in Germany. After the student maps were analysed by two human raters on the basis of a four-level feedback scheme, a supervised Machine Learning algorithm was trained on the data. The results show a very good agreement between the human and Machine Learning evaluation. Based on these results, an embedding in everyday school life is conceivable, especially as support for teachers. In this way, teachers can interpret the automatic evaluation and put it to use in the classroom.
- Published
- 2023
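A minimal sketch of the evaluation pipeline this abstract describes: a supervised classifier trained on features extracted from student concept maps, checked against human ratings with an agreement statistic. Everything below (feature names, data, model choice) is invented for illustration; the paper does not publish its implementation.

```python
# Hypothetical sketch: score concept maps with a supervised classifier
# and measure agreement with human raters. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Toy per-map features, e.g. counts of correct propositions, links,
# and cross-links (an invented feature set).
X = rng.integers(0, 20, size=(200, 3)).astype(float)
y = rng.integers(0, 4, size=200)              # four-level feedback scheme

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Agreement between machine scores and (here: simulated) human labels
print("Cohen's kappa:", cohen_kappa_score(y_te, clf.predict(X_te)))
```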
3. [Artificial intelligence and secure use of health data in the KI-FDZ project: anonymization, synthetization, and secure processing of real-world data].
- Author
- Prasser F, Riedel N, Wolter S, Corr D, and Ludwig M
- Subjects
- Humans, Germany, Delivery of Health Care, Artificial Intelligence, Algorithms
- Abstract
The increasing digitization of the healthcare system is leading to a growing volume of health data. Leveraging this data beyond its initial collection purpose for secondary use can provide valuable insights into diagnostics, treatment processes, and the quality of care. The Health Data Lab (HDL) will provide infrastructure for this purpose. Both the protection of patient privacy and optimal analytical capabilities are of central importance in this context, and artificial intelligence (AI) provides two opportunities. First, it enables the analysis of large volumes of data with flexible models, which means that hidden correlations and patterns can be discovered. Second, synthetic - that is, artificial - data generated by AI can protect privacy. This paper describes the KI-FDZ project, which aims to investigate innovative technologies that can support the secure provision of health data for secondary research purposes. A multi-layered approach is investigated in which data-level measures can be combined in different ways with processing in secure environments. To this end, anonymization and synthetization methods, among others, are evaluated based on two concrete application examples. Moreover, it is examined how the creation of machine learning pipelines and the execution of AI algorithms can be supported in secure processing environments. Preliminary results indicate that this approach can achieve a high level of protection while maintaining data validity. The approach investigated in the project can be an important building block in the secure secondary use of health data. (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
4. Evidence of an Indirect Effect of Generativity on Fear of Death Through Ego-Integrity Considering Social Desirability.
- Author
- Busch, Holger
- Subjects
- DEATH & psychology, EGO (Psychology), CONFIDENCE intervals, SELF-evaluation, FEAR, HEALTH status indicators, REGRESSION analysis, PSYCHOLOGICAL tests, CRONBACH'S alpha, QUESTIONNAIRES, SCALE analysis (Psychology), DESCRIPTIVE statistics, FACTOR analysis, SOCIAL skills, REACTION time, PSYCHOLOGY & religion, STATISTICAL correlation, ATTITUDES toward death, ALGORITHMS
- Abstract
Recent research has shown an indirect effect of generativity on fear of death through ego-integrity in older adults. The present paper aims at demonstrating that the indirect effect is valid even when controlling for social desirability. For that purpose, participants (N = 260 German adults) in study 1 provided self-reports on generativity, ego-integrity, fear of death, and social desirability. Analyses confirmed the indirect effect when the tendency for socially desirable responding was statistically controlled. In study 2, participants (N = 133 German adults) also reported on their generativity and ego-integrity. Fear of death, however, was assessed with a reaction time-based measure (i.e., the Implicit Associations Test). Again, the indirect effect could be confirmed. Taken together, the studies lend further credibility to the extant findings on the indirect effect of generativity on fear of death through ego-integrity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Clinical measures of communication limitations in dysarthria assessed through crowdsourcing: specificity, sensitivity, and retest-reliability.
- Author
- Lehner, Katharina and Ziegler, Wolfram
- Subjects
- DYSARTHRIA, RESEARCH evaluation, STATISTICAL reliability, CONFIDENCE intervals, RESEARCH methodology evaluation, RESEARCH methodology, INTELLIGIBILITY of speech, SPEECH evaluation, COMMUNICATIVE disorders, PSYCHOMETRICS, T-test (Statistics), RESEARCH funding, DESCRIPTIVE statistics, INTRACLASS correlation, CROWDSOURCING, SENSITIVITY & specificity (Statistics), RECEIVER operating characteristic curves, ALGORITHMS
- Abstract
Assessing the impact of dysarthria on a patient's ability to communicate should be an integral part of patient management. However, due to the high demands on reliable quantification of communication limitations, hardly any formal clinical tests with approved psychometric properties have been developed so far. This study investigates a web-based assessment of communication impairment in dysarthria, named KommPaS. The test comprises measures of intelligibility, naturalness, perceived listener effort and communication efficiency, as well as a total score that integrates these parameters. The approach is characterized by a quasi-random access to a large inventory of test materials and to a large group of naïve listeners, recruited via crowdsourcing. As part of a larger research program to establish the clinical applicability of this new approach, the present paper focuses on two psychometric issues, namely specificity and sensitivity (study 1) and retest-reliability (study 2). Study 1: KommPaS was administered to 54 healthy adults and 100 adult persons with dysarthria (PWD). Non-parametric criterion-based norms (specificity: 0.95) were used to derive a standard metric for each of the four component variables, and corresponding sensitivity values for the presence of dysarthria were identified. Overall classification accuracy of the total score was determined using a ROC analysis. The resulting cutscores showed a high accuracy in the separation of PWD from healthy speakers for the naturalness and the total score. Study 2: A sub-group of 20 PWD enrolled in study 1 were administered a second KommPaS examination. ICC analyses revealed good to excellent retest reliabilities for all parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
6. Personalized refutation texts best stimulate teachers' conceptual change about multimedia learning.
- Author
- Dersch, Anna‐Sophia, Renkl, Alexander, and Eitel, Alexander
- Subjects
- TEACHER education, ONLINE education, RESEARCH, PROFESSIONS, MULTIMEDIA systems, COMPUTER assisted instruction, INTERNET, GUILT (Psychology), PRE-tests & post-tests, RESEARCH funding, QUESTIONNAIRES, CHI-squared test, FACTOR analysis, SHAME, ALGORITHMS
- Abstract
Background: Previous research has shown that teachers hold misconceptions about multimedia learning (e.g., multimedia instruction needs to be adapted to students' learning styles), which may be at odds with evidence‐based teaching. Objectives: Refutation texts are a classical method to reduce misconceptions and thus to stimulate conceptual change. We wanted to know whether making use of a computer algorithm to personalize refutation texts would best initiate teachers' conceptual change. Methods: We designed an online experiment, in which N = 129 in‐service teachers read either (1) expository texts (without direct refutation), (2) common refutation texts, or (3) personalized refutation texts. The teachers filled in a misconception questionnaire pre and post to assess their conceptual change. Results and Conclusions: Statistical analyses revealed that personalized refutation texts initiated the strongest conceptual change, which was driven by increased feelings of guilt and shame. Common refutation texts did not foster teachers' conceptual change as compared to expository texts. These findings indicate that refutation texts should be personalized for experienced practitioners such as teachers. Takeaways: Personalized refutation seems to be promising in the context of online teacher training programs. Further research should test to which extent the present findings also apply to other groups of experienced learners or practitioners. Lay Description: What is already known about this topic?: Teachers hold misconceptions about multimedia learning (e.g., learning materials should be adapted to students' individual learning styles, such as visualizers or verbalizers). Refutation texts, naming a commonly held misconception, disproving it and introducing a scientific explanation, are a common means to reduce misconceptions. Personalization fosters learning by drawing the learner's attention toward the discrepancy between their own beliefs and the learning material, further creating an impasse experience. Said impasse experience may trigger teachers' conceptual change, as, for teachers' conceptual change, a certain degree of discomfort is required. Yet, anger caused by lecturing teachers on their topic may cause repulsion and hamper learning. What this paper adds?: With a computer algorithm, we can efficiently personalize refutation texts by automatically matching them to teachers' answers in a pre‐test (see the sketch after this entry). Such a personalized refutation instruction may especially foster conceptual change. Within a randomized experiment, the personalized refutation instruction worked best compared to common refutation texts and expository texts. Feelings of guilt and shame moderated the effect of a personalized refutation, as teachers felt more addressed in their misconceptions and thus experienced the required impasse experience. Feelings of anger did not play an important role within our experiment. The implications of study findings for practitioners: Computer algorithms enable efficient personalization of instruction to better deal with heterogeneous groups of learners (e.g., with big differences in prior knowledge or experience, such as in the case of in‐service teachers). Refutation texts work better for teachers when they are personalized. Common refutation texts do not work better than expository texts. An advantage of digital instruction is the use of algorithms to efficiently personalize instructions even for larger groups. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
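The personalization step described above, matching refutation texts to a teacher's pre-test answers, can be as simple as a lookup keyed by endorsed misconceptions. A hypothetical sketch; the item keys, texts, and rating scale are invented, not taken from the study:

```python
# Hypothetical personalization: return a refutation text for each
# misconception the teacher endorsed in the pre-test questionnaire.
REFUTATIONS = {
    "learning_styles": "Refutation: matching media to learning styles ...",
    "redundancy": "Refutation: identical spoken and on-screen text ...",
}

def personalize(pretest_answers, threshold=4):
    """Select texts for items rated >= threshold (i.e. agreement)."""
    return [REFUTATIONS[item] for item, rating in pretest_answers.items()
            if rating >= threshold and item in REFUTATIONS]

print(personalize({"learning_styles": 5, "redundancy": 2}))
```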
7. Exploring novel algorithms for atrial fibrillation detection by driving graduate level education in medical machine learning.
- Author
- Rohr, Maurice, Reich, Christoph, Höhl, Andreas, Lilienthal, Timm, Dege, Tizian, Plesinger, Filip, Bulkova, Veronika, Clifford, Gari, Reyna, Matthew, and Hoog Antink, Christoph
- Subjects
- ATRIAL fibrillation, GRADUATE medical education, MACHINE learning, ALGORITHMS
- Abstract
During the lockdown of universities and the COVID pandemic, most students were restricted to their homes. Novel and instigating teaching methods were required to improve the learning experience, and recent implementations of the annual PhysioNet/Computing in Cardiology (CinC) Challenges served as a reference. For over 20 years, the challenges have repeatedly proven to be of immense educational value, besides leading to technological advances for specific problems. In this paper, we report results from the class 'Artificial Intelligence in Medicine Challenge', which was implemented as an online project seminar at Technical University Darmstadt, Germany, and which was heavily inspired by the PhysioNet/CinC Challenge 2017 'AF Classification from a Short Single Lead ECG Recording'. Atrial fibrillation is a common cardiac disease and often remains undetected. Therefore, we selected the two most promising models of the course and give an insight into the Transformer-based DualNet architecture as well as into the CNN-LSTM-based model, and finally a detailed analysis of both. In particular, we show the model performance results of our internal scoring process for all submitted models and the near state-of-the-art model performance for the two named models on the official 2017 challenge test set. Several teams were able to achieve F1 scores above/close to 90% on a hidden test set of Holter recordings. We highlight themes commonly observed among participants, and report the results from the self-assessed student evaluation. Finally, the self-assessment of the students reported a notable increase in machine learning knowledge. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
8. LiDAR Positioning Algorithm Based on ICP and Artificial Landmarks Assistance.
- Author
- Zeng Q, Kan Y, Tao X, and Hu Y
- Subjects
- Germany, Algorithms
- Abstract
As one of the automated guided vehicle (AGV) positioning methods, the LiDAR positioning method based on artificial landmarks has been widely used in warehousing logistics industries in recent years. However, the traditional LiDAR positioning method based on artificial landmarks mainly depends on the three-point positioning method, the performance of which is limited due to landmarks' layout and detection requirements. This paper proposes a LiDAR positioning algorithm based on iterative closest point (ICP) and artificial landmarks assistance. It provides improvements based on the traditional ICP algorithm. The positioning result provided by the landmarks is used as the initial value for the ICP iteration. The combination of the ICP algorithm and landmarks enables the positioning algorithm to maintain a certain positioning precision when landmark detection is disturbed. By comparing the proposed algorithm with the positioning scheme developed by SICK in Germany, we prove that the combination of the ICP algorithm and landmarks can effectively improve the robustness under the premise of ensuring precision.
- Published
- 2021
- Full Text
- View/download PDF
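The core idea of the abstract, seeding ICP with a landmark-derived pose so registration stays usable when landmark detection is disturbed, can be illustrated with a minimal 2D point-to-point ICP. This is a sketch under that reading, not the authors' implementation:

```python
# Minimal 2D point-to-point ICP accepting an initial pose, mirroring the
# idea of seeding ICP with a landmark-based position estimate.
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(source, target, init_R, init_t, iters=20):
    R, t = init_R.copy(), init_t.copy()       # start from landmark pose
    tree = cKDTree(target)
    for _ in range(iters):
        moved = source @ R.T + t
        _, idx = tree.query(moved)            # nearest-neighbour matches
        matched = target[idx]
        ms, mt = moved.mean(0), matched.mean(0)
        H = (moved - ms).T @ (matched - mt)   # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        dR = Vt.T @ U.T                       # Kabsch rotation
        if np.linalg.det(dR) < 0:             # guard against reflections
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        dt = mt - dR @ ms
        R, t = dR @ R, dR @ t + dt            # compose incremental update
    return R, t

# Usage: seed with the landmark pose; identity shown here for brevity.
# R, t = icp_2d(scan_points, map_points, np.eye(2), np.zeros(2))
```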
9. Classification of airborne 3D point clouds regarding separation of vegetation in complex environments.
- Author
- Bulatov D, Stütz D, Hacker J, and Weinmann M
- Subjects
- Archaeology, Construction Materials, Datasets as Topic, Geography, Germany, Imaging, Three-Dimensional methods, Italy, Lasers, Photogrammetry, Queensland, Soil Erosion, Algorithms, Geographic Mapping, Geological Phenomena, Plants, Remote Sensing Technology
- Abstract
Classification of outdoor point clouds is an intensely studied topic, particularly with respect to the separation of vegetation from the terrain and manmade structures. In the presence of many overhanging and vertical structures, the (relative) height is no longer a reliable criterion for such a separation. An alternative would be to apply supervised classification; however, thousands of examples are typically required for appropriate training. In this paper, an unsupervised and rotation-invariant method is presented and evaluated for three datasets with very different characteristics. The method allows us to detect planar patches by filtering and clustering so-called superpoints, whereby the well-known but suitably modified random sample consensus (RANSAC) approach plays a key role for plane estimation in outlier-rich data. The performance of our method is compared to that produced by supervised classifiers common for remote sensing settings: random forest as learner and feature sets for point cloud processing, like covariance-based features or point descriptors. It is shown that for point clouds resulting from airborne laser scans, the detection accuracy of the proposed method is over 96% and, as such, higher than that of standard supervised classification approaches. Because of artifacts caused by interpolation during 3D stereo matching, the overall accuracy was lower for photogrammetric point clouds (74-77%). However, using additional salient features, such as the normalized green-red difference index, the results became more accurate and less dependent on the data source.
- Published
- 2021
- Full Text
- View/download PDF
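A plain RANSAC plane fit, the building block the abstract says was suitably modified for plane estimation in outlier-rich data. Parameters and thresholds below are arbitrary illustration values:

```python
# Illustrative RANSAC plane fit on a 3D point cloud (rows are points).
import numpy as np

def ransac_plane(points, n_iters=500, tol=0.05, seed=0):
    rng = np.random.default_rng(seed)
    best_inliers, best_model = 0, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)        # candidate plane normal
        norm = np.linalg.norm(n)
        if norm < 1e-12:                      # degenerate (collinear) sample
            continue
        n = n / norm
        d = np.abs((points - p0) @ n)         # point-to-plane distances
        inliers = np.count_nonzero(d < tol)
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (n, p0)
    return best_model, best_inliers

# e.g. (normal, point_on_plane), count = ransac_plane(cloud_xyz)
```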
10. GA-based implicit stochastic optimization and RNN-based simulation for deriving multi-objective reservoir hedging rules.
- Author
- Khadr M and Schlenkhoff A
- Subjects
- Computer Simulation, Germany, Uncertainty, Algorithms, Neural Networks, Computer
- Abstract
Management of reservoir systems is a complicated process involving many uncertainties regarding future events and the diversity of purposes these reservoirs serve; therefore, an effective management of these systems could help improve resource utilization and avoid stakeholder disputes. The aim of this paper was to build an optimization-simulation framework based on implicit stochastic optimization (ISO), genetic algorithms (GA), and recurrent neural networks (RNN) for addressing the issue of reservoir operation. Inflow scenarios were generated synthetically on a monthly scale to be used as an input to a multi-objective genetic programming model to construct an optimal operating rules database. This database was subsequently used, simultaneously with the output of the inflow forecasting model, to simulate monthly reservoir hedging rules using an RNN. Our results demonstrate the effectiveness of the GA-ISO-RNN model for simulating and predicting optimal reservoir release with consistent accuracy. Results from both the training and testing phases clearly proved the usefulness of the RNN in predicting optimal reservoir release, with relatively high values of the Nash-Sutcliffe model efficiency coefficient and correlation coefficient, and low values of root mean squared error and mean absolute deviation. Furthermore, by comparing the historical releases and the output of the proposed model, the results show that the proposed model was less vulnerable than standard operating rules. The proposed methodology was applied to the Bigge reservoir in Germany, as it features an extensive management infrastructure, but this methodology can also be easily adopted in other similar cases.
- Published
- 2021
- Full Text
- View/download PDF
11. Algorithmic literacy in medical students – results of a knowledge test conducted in Germany.
- Author
- Kampa, Philipp and Balzer, Felix
- Subjects
- MEDICAL students, HEALTH occupations students, QUANTITATIVE research, INFORMATION literacy, QUESTIONNAIRES, SCALE analysis (Psychology), DESCRIPTIVE statistics, DATA analysis software, MEDICAL education, COMPUTER literacy, ALGORITHMS, TELEMEDICINE
- Abstract
The impact of algorithms on everyday life is ever increasing. Medicine and public health are not excluded from this development - algorithms in medicine do not only challenge, change and inform research (methods) but also clinical situations. Given this development, questions arise concerning the competency level of prospective physicians, thus medical students, on algorithm-related topics. This paper, based on a master's thesis in library and information science written at Humboldt‐Universität zu Berlin, gives an insight into this topic by presenting and analysing the results of a knowledge test conducted among medical students in Germany. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
12. "All mimsy were the borogoves" – a discriminative learning model of morphological knowledge in pseudo-word inflection.
- Author
- Nieder, Jessica, van de Vijver, Ruben, and Tomaschek, Fabian
- Subjects
- *COMPUTER simulation, *COMPARATIVE grammar, *LEARNING strategies, *LANGUAGE acquisition, *INTELLECT, *PHONETICS, *RESEARCH funding, *VOCABULARY, *ARTIFICIAL neural networks, *ALGORITHMS
- Abstract
Grammatical knowledge has often been investigated in wug tests, in which participants inflect pseudo-words. It was argued that in inflecting these pseudo-words, speakers apply their knowledge of word formation. However, it remains unclear what exactly this knowledge is and how it is learned. According to one theory, the knowledge is best characterised as abstractions that specify how units are combined. Another theory maintains that it is best characterised by memory-based analogy. In both cases the knowledge is learned by association based on positive evidence alone. In this paper, we model the classification of pseudo-words to Maltese plurals using a shallow neural network trained with an error-driven learning algorithm. We demonstrate that the classifications mirror those of Maltese speakers in a wug test. Our results indicate that speakers rely on gradient knowledge of a relation between the phonetics of whole words and plural classes, which is learned in an error-driven way. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
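The "shallow neural network trained with an error-driven learning algorithm" maps naturally onto a one-layer network updated with the delta (Rescorla-Wagner) rule. A toy sketch under that reading, with letter bigrams standing in for whole-word phonetics and invented words and class labels (not the study's Maltese dataset):

```python
# One-layer network trained with the delta rule: cues (bigrams) map onto
# plural classes; weights are updated in proportion to prediction error.
import numpy as np

def bigrams(word):
    w = f"#{word}#"                            # word boundary markers
    return [w[i:i + 2] for i in range(len(w) - 1)]

words = {"ktieb": 0, "kotba": 0, "mara": 1}    # toy word -> class data
cues = sorted({b for w in words for b in bigrams(w)})
cue_ix = {c: i for i, c in enumerate(cues)}
W = np.zeros((len(cues), 2))                   # cue-to-class weights

eta = 0.1
for _ in range(100):                           # error-driven updates
    for word, cls in words.items():
        x = np.zeros(len(cues))
        x[[cue_ix[b] for b in bigrams(word)]] = 1.0
        error = np.eye(2)[cls] - x @ W         # target minus prediction
        W += eta * np.outer(x, error)          # Rescorla-Wagner step

def activate(word):
    x = np.zeros(len(cues))
    x[[cue_ix[b] for b in bigrams(word) if b in cue_ix]] = 1.0
    return x @ W

print(activate("ktieb"))                       # class activations
```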
13. Algorithm Directed Troop Medical Care Manual Application for Desktop and Smartphone.
- Author
- Chrosniak, James, Olsen, Christian, and Galdi, Andrew
- Subjects
- *PSYCHOLOGY of the sick, *MOBILE apps, *MEDICAL care, *CRITICAL thinking, *PERSONAL computers, *RESEARCH, *RESEARCH methodology, *MEDICAL cooperation, *EVALUATION research, *COMPARATIVE studies, *MILITARY personnel, *ALGORITHMS
- Abstract
Introduction: U.S. Army medics are often the first responders in the care of sick and injured soldiers on the battlefield, at sick call in a Role 2 aid station, and in garrison clinics. Sick call medics are required to utilize the Algorithm Directed Troop Medical Care (ADTMC) manual to care for and then render a disposition for these soldiers. The current ADTMC manual is a thick, heavy paper manual. A desktop and smartphone application has been developed that contains the entire ADTMC manual's algorithm-based content. Our goal is to enhance the medics' clinical learning and critical thinking skills while improving their evaluation, disposition, and documentation during patient encounters. Materials and Methods: The application was field-tested with the 173rd IBCT (A) while attending a field exercise at Grafenwoehr, Germany. At the unit's Role 2 tent setup, use of the paper manual was compared with utilization of the same material via the ADTMC application by observing the medics' workflows directly while they were caring for ill and injured soldiers. Results: Medics examining patients demonstrated 50% faster exam times using the application compared to medics using only the manual. Moreover, a 50% decrease in document processing time as a result of digitization of the paper documentation process was confirmed. The application ("app") enabled the medics to continue their screening assessments and patient disposition duties on a desktop computer or a smartphone without the need to refer to the paper manual. Conclusion: Medics, by adopting this tool, will become quicker and more efficient, and will develop critical thinking skills; in other words, the ability to objectively evaluate patients in order to form a proper disposition of sick and injured soldiers during training, in the field, as well as in garrison. When utilized properly, the ADTMC application ensures that soldiers reporting to sick call are expeditiously routed to the appropriate level of care, and it is a vehicle for further training of medics in the care of soldiers. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
14. [Management of acute low back pain without trauma - an algorithm].
- Author
- Melcher C, Wegener B, Jansson V, Mutschler W, Kanz KG, and Birkenmaier C
- Subjects
- Acute Pain diagnosis, Acute Pain etiology, Diagnosis, Differential, Germany, Humans, Low Back Pain diagnosis, Low Back Pain etiology, Outcome and Process Assessment, Health Care, Pain Management methods, Acute Pain therapy, Algorithms, Low Back Pain therapy
- Abstract
Background: Low back pain is a common problem for primary care providers, outpatient clinics and A&E departments. The predominant symptoms are those of so-called "unspecific back pain", but serious pathologies can be concealed by the clinical signs. Especially less experienced colleagues have problems in treating these patients, as - despite the multitude of recommendations and guidelines - there is no generally accepted algorithm. Methods: After a literature search (Medline/Cochrane), 158 articles were selected from 15,000 papers and classified according to their level of evidence. These were attuned to the clinical guidelines of the orthopaedic and pain-physician associations in Europe, North America and overseas and the experience of specialists at LMU Munich, in order to achieve consistency with literature recommendations as well as feasibility in everyday clinical work, and optimised for practical relevance. Results: An algorithm was formed to provide the crucial differential diagnosis of lumbar back pain according to its clinical relevance and to provide a plan of action offering reasonable diagnostic and therapeutic steps. As a consequence of distinct binary decisions, low back patients should be treated at any given time according to the guidelines, with emergencies detected, unnecessary diagnostic testing and interventions averted, and reasonable treatment initiated pursuant to the underlying pathology. Conclusion: In the context of the available evidence, a clinical algorithm has been developed that translates the complex diagnostic testing of acute low back pain into a transparent, structured and systematic guideline. Competing Interests: The authors declare that no conflict of interest exists. (Georg Thieme Verlag KG Stuttgart · New York.)
- Published
- 2018
- Full Text
- View/download PDF
15. Deriving health utilities from the MacNew Heart Disease Quality of Life Questionnaire.
- Author
- Chen, Gang, McKie, John, Khan, Munir A., and Richardson, Jeff R.
- Subjects
- CORONARY heart disease treatment, QUALITY of life, ALGORITHMS, STATISTICAL correlation, GOODNESS-of-fit tests, NONPARAMETRIC statistics, QUESTIONNAIRES, REGRESSION analysis, RESEARCH funding, STATISTICAL sampling, STATISTICS, SURVEYS, VISUAL analog scale, INTER-observer reliability, DATA analysis software, MANN Whitney U Test, KRUSKAL-Wallis Test
- Abstract
Introduction: Quality of life is included in the economic evaluation of health services by measuring the preference for health states, i.e. health state utilities. However, most intervention studies include a disease-specific, not a utility, instrument. Consequently, there has been increasing use of statistical mapping algorithms which permit utilities to be estimated from a disease-specific instrument. The present paper provides such algorithms mapping the MacNew Heart Disease Quality of Life Questionnaire (MacNew) instrument onto six multi-attribute utility (MAU) instruments: the EuroQol (EQ-5D), the Short Form 6D (SF-6D), the Health Utilities Index (HUI) 3, the Quality of Wellbeing (QWB), the 15D (15 Dimension) and the Assessment of Quality of Life (AQoL-8D). Methods: Heart disease patients and members of the healthy public were recruited from six countries. Non-parametric rank tests were used to compare subgroup utilities and MacNew scores. Mapping algorithms were estimated using three separate statistical techniques. Results: Mapping algorithms achieved a high degree of precision. Based on the mean absolute error and the intra-class correlation, the preferred mapping is MacNew into SF-6D or 15D. Using the R-squared statistic, the preferred mapping is MacNew into AQoL-8D. Implications for research: The algorithms reported in this paper enable MacNew data to be mapped into utilities predicted from any of six instruments. This permits studies which have included the MacNew to be used in cost-utility analyses, which, in turn, allows the comparison of services with interventions across the health system. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
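A mapping algorithm of the kind reported here is, at its simplest, a regression from disease-specific scores to a utility index, judged by mean absolute error. The data below are simulated stand-ins and the coefficients are not those of the published models:

```python
# Sketch of a statistical "mapping algorithm": OLS from (invented)
# MacNew subscale scores to a utility index, scored with MAE.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
macnew = rng.uniform(1, 7, size=(300, 3))    # emotional/physical/social
utility = np.clip(0.3 + 0.08 * macnew.mean(1) + rng.normal(0, 0.05, 300),
                  0.0, 1.0)                  # simulated utility target

model = LinearRegression().fit(macnew, utility)
print("MAE:", mean_absolute_error(utility, model.predict(macnew)))
```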
16. Comparison of cardiac output estimates obtained from the Antares oscillometric pulse wave analysis algorithm and from Doppler transthoracic echocardiography.
- Author
- Stäuber, Alexander, Hoppe, Matthias Wilhelm, Lapp, Harald, Richter, Stefan, Ohlow, Marc-Alexander, Dörr, Marcus, Piper, Cornelia, Eckert, Siegfried, Coll-Barroso, Michael Thomas, Stäuber, Franziska, Abanador-Kamper, Nadine, and Baulmann, Johannes
- Subjects
- PULSE wave analysis, CARDIAC output, DOPPLER echocardiography, BLAND-Altman plot, PATIENTS, PEARSON correlation (Statistics), ALGORITHMS
- Abstract
Background: In cardiology, cardiac output (CO) is an important parameter for assessing cardiac function. While invasive thermodilution procedures are the gold standard for CO assessment, transthoracic Doppler echocardiography (TTE) has become the established method for routine CO assessment in daily clinical practice. However, a demand persists for non-invasive approaches, including oscillometric pulse wave analysis (PWA), to enhance the accuracy of CO estimation, reduce complications associated with invasive procedures, and facilitate its application in non-intensive care settings. Here, we aimed to compare the TTE and oscillometric PWA algorithm Antares for a non-invasive estimation of CO. Methods: Non-invasive CO data obtained by two-dimensional TTE were compared with those from an oscillometric blood pressure device (custo med GmbH, Ottobrunn, Germany) using the integrated algorithm Antares (Redwave Medical GmbH, Jena, Germany). In total, 59 patients undergoing elective cardiac catheterization for clinical reasons (71±10 years old, 76% males) were included. Agreement between both CO measures were assessed by Bland-Altman analysis, Student's t-test, and Pearson correlations. Results: The mean difference in CO was 0.04 ± 1.03 l/min (95% confidence interval for the mean difference: -0.23 to 0.30 l/min) for the overall group, with lower and upper limits of agreement at -1.98 and 2.05 l/min, respectively. There was no statistically significant difference in means between both CO measures (P = 0.785). Statistically significant correlations between TTE and Antares CO were observed in the entire cohort (r = 0.705, P<0.001) as well as in female (r = 0.802, P<0.001) and male patients (r = 0.669, P<0.001). Conclusions: The oscillometric PWA algorithm Antares and established TTE for a non-invasive estimation of CO are highly correlated in male and female patients, with no statistically significant difference between both approaches. Future validation studies of the Antares CO are necessary before a clinical application can be considered. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
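The agreement analysis used above, Bland-Altman bias and limits of agreement, is straightforward to reproduce. The inputs below are simulated stand-ins for the TTE and Antares measurements, not study data:

```python
# Bland-Altman agreement statistics on simulated cardiac output pairs.
import numpy as np

def bland_altman(a, b):
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)  # limits of agreement

rng = np.random.default_rng(2)
tte = rng.normal(5.0, 1.0, 59)                # CO in l/min, n = 59
antares = tte + rng.normal(0.04, 1.0, 59)     # small offset, like the paper
bias, (lo, hi) = bland_altman(antares, tte)
print(f"bias {bias:.2f} l/min, limits of agreement {lo:.2f} to {hi:.2f}")
```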
17. Algorithm for activation of coagulation support treatment in multiple injured patients--cohort study.
- Author
- Brilej, D., Stropnik, D., Lefering, R., and Komadina, R.
- Subjects
- ALGORITHMS, BLOOD coagulation disorders, REPORTING of diseases, LONGITUDINAL method, EVALUATION of medical care, WOUNDS & injuries
- Abstract
Background: Early recognition and management of trauma-related coagulopathy improves the outcome. Trauma facilities should implement an algorithm to identify the bleeding trauma patient with coagulopathy. Objective: The scope of the paper is to identify the indicators of early coagulopathy and to optimize the indications for thromboelastometry and coagulation support. Design: Cohort study based on data from a trauma registry. Setting: Data of 493 major trauma patients treated in GH Celje from 2006 to 2014 were included in the TraumaRegister DGU (TR-DGU). Patients: Patients were selected for inclusion in the TR-DGU according to the following criteria: polytraumatized patients with Injury Severity Score (ISS) ≥ 18, patients with injuries to a single region with AIS 5, and patients with major injuries to a single region and abnormal vital signs. All patients who were dead on arrival at hospital, patients presenting to hospital >24 h after the injury, and head injuries that occurred with a low-energy mechanism in patients on anticoagulation drugs were excluded. Measurements: Two groups were formed (with or without coagulopathy). Mortality, morbidity, length of mechanical ventilation, and ICU and hospital stay were used as outcomes and compared between the groups. A coagulopathy prediction model (CPM) was developed to identify the patients who were at high risk of coagulopathy. Results: Coagulopathy was present in 51% of patients. Severe injuries to the torso and limbs, infusion of >1000 ml of fluids in the prehospital setting, and hypotension were included in the CPM. If all three criteria were present, the sensitivity of the model to predict coagulopathy was 93%. By adding the blood gas analysis (BE ≤ −5), the specificity increased to 81.7%. Limitations: Shortcomings of our analysis are mainly related to the quality of data in the registry, which may not be comparable to a clinical trial where data are collected specifically to address a given issue. Conclusions: The criteria for activation of coagulation support treatment remain centre dependent. In our setting, the CPM is the tool to select patients for ROTEM analysis. By adding data from blood gas analysis, treatment of coagulopathy is justifiable before complete test results are available. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
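The CPM reduces to a conjunction of binary criteria, optionally tightened with base excess. A hypothetical sketch of that rule; the three criteria and the BE ≤ −5 cutoff come from the abstract, everything else (function name, argument encoding) is invented:

```python
# Rule-based screen in the spirit of the coagulopathy prediction model:
# flag a patient when all three clinical criteria are present, and
# optionally require base excess <= -5 for higher specificity.
def cpm_flag(severe_torso_limb_injury, prehospital_fluids_ml,
             hypotensive, base_excess=None, use_be=False):
    core = (severe_torso_limb_injury
            and prehospital_fluids_ml > 1000
            and hypotensive)
    if use_be:
        return core and base_excess is not None and base_excess <= -5
    return core

print(cpm_flag(True, 1500, True))                               # True
print(cpm_flag(True, 1500, True, base_excess=-3, use_be=True))  # False
```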
18. Study protocol for the development, trial, and evaluation of a strategy for the implementation of qualification-oriented work organization in nursing homes.
- Author
- Burfeindt, Corinna, Darmann-Finck, Ingrid, Stammann, Carina, Stegbauer, Constance, Stolle-Wahl, Claudia, Zündel, Matthias, and Rothgang, Heinz
- Subjects
- NURSING care facility laws, HUMAN services programs, PROFESSIONAL practice, LONG-term health care, WORKING hours, ACADEMIC achievement, ORGANIZATIONAL change, MEDICAL research, EVIDENCE-based medicine, ALGORITHMS
- Abstract
Background: Staffing ratios in nursing homes vary among the federal states of Germany, but there are no rational grounds for these variations. In a previous study, a new instrument for the standardized calculation of staffing requirements in nursing homes was developed (Algorithm 1.0). The development was based on a new empirical data collection method that derives actual and target values for the time and number of care interventions provided. Algorithm 1.0 found an increased requirement of 36% of staff in German nursing homes. Based on these results, the German legislature has commissioned a model program to trial and evaluate a complex intervention comprising increased staffing combined with strategies for organizational development. Methods: The mixed-methods study consists of (i) developing a concept for restructuring the work organization, (ii) the application of this concept combined with increased staffing in 10 nursing homes (complex intervention), and the further development of the concept using a participatory and iterative formal evaluation process. The intervention consists of (a) quantitative measures of increased staffing based on a calculation using Algorithm 1.0 and (b) qualitative measures regarding organizational development. The intervention will be conducted over one year. The effects of the intervention on job satisfaction and quality of care will be evaluated in (iii) a comprehensive prospective, controlled summative evaluation. The results will be compared with ten matched nursing homes as a control group. Finally, (iv) prototypical concepts for qualification-oriented work organization, a strategy for the national rollout, and the further development of Algorithm 1.0 into Algorithm 2.0 will be derived. Discussion: In Germany, there is an ongoing dynamic legislation process regarding the further development of the long-term care sector. The study, which is the subject of the study protocol presented here, generates an evidence-based strategy for the staffing requirements for nursing homes. Ethics and dissemination: This study was approved by the Ethics Committee of the German Association of Nursing Science (Deutsche Gesellschaft für Pflegewissenschaft) on 02.08.2023 (amended on 20.09.2023). Research findings are disseminated through presentations at national and international conferences and publications in peer-reviewed scientific journals. Trial registration number: German Clinical Trials Register DRKS00031773 (date of registration 09.11.2023). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. [Conservative calibration of a clearance monitor system for waste material from nuclear medicine].
- Author
- Wanke C and Geworski L
- Subjects
- Calibration, Germany, Medical Waste prevention & control, Radiation Dosage, Radioactive Waste prevention & control, Reproducibility of Results, Sensitivity and Specificity, Algorithms, Medical Waste analysis, Nuclear Medicine standards, Practice Guidelines as Topic, Radiation Monitoring instrumentation, Radiation Monitoring standards, Radioactive Waste analysis
- Abstract
Clearance monitor systems are used for gross gamma measurements of waste potentially contaminated with radioactivity. These measurements are intended to ensure that legal requirements, e.g. clearance criteria according to the German radiation protection ordinance, are met. This means that measurement results may overestimate, but must not underestimate, the true values. This paper describes a pragmatic way of using a calibrated Cs-137 point source to generate a conservative calibration for the clearance monitor system used in the Medizinische Hochschule Hannover (MHH). The most important nuclides used in nuclear medicine are considered. The measurement result reliably overestimates the true value of the activity present in the waste. The calibration is compliant with the demands for conservativity and traceability to national standards. (Copyright © 2014. Published by Elsevier GmbH.)
- Published
- 2014
- Full Text
- View/download PDF
20. Traffic Sign Detection and Recognition Using YOLO Object Detection Algorithm: A Systematic Review.
- Author
- Flores-Calero, Marco, Astudillo, César A., Guevara, Diego, Maza, Jessica, Lita, Bryan S., Defaz, Bryan, Ante, Juan S., Zabala-Blanco, David, and Armingol Moreno, José María
- Subjects
- TRAFFIC monitoring, TRAFFIC signs & signals, ARTIFICIAL neural networks, INTELLIGENT transportation systems, ALGORITHMS, OBJECT recognition (Computer vision), MOBILE operating systems, IRIS recognition
- Abstract
Context: YOLO (You Only Look Once) is an algorithm based on deep neural networks with real-time object detection capabilities. This state-of-the-art technology is widely available, mainly due to its speed and precision. Since its conception, YOLO has been applied to detect and recognize traffic signs, pedestrians, traffic lights, vehicles, and so on. Objective: The goal of this research is to systematically analyze the YOLO object detection algorithm, applied to traffic sign detection and recognition systems, from five relevant aspects of this technology: applications, datasets, metrics, hardware, and challenges. Method: This study performs a systematic literature review (SLR) of studies on traffic sign detection and recognition using YOLO published in the years 2016–2022. Results: The search found 115 primary studies relevant to the goal of this research. After analyzing these investigations, the following relevant results were obtained. The most common applications of YOLO in this field are vehicular security and intelligent and autonomous vehicles. The majority of the sign datasets used to train, test, and validate YOLO-based systems are publicly available, with an emphasis on datasets from Germany and China. It has also been discovered that most works present sophisticated detection, classification, and processing speed metrics for traffic sign detection and recognition systems by using the different versions of YOLO. In addition, the most popular desktop data processing hardware platforms are the Nvidia RTX 2080 and Titan Tesla V100 and, in the case of embedded or mobile GPU platforms, the Jetson Xavier NX. Finally, seven relevant challenges that these systems face when operating in real road conditions have been identified. With this in mind, research has been reclassified to address these challenges in each case. Conclusions: This SLR is the most relevant and current work in the field of technology development applied to the detection and recognition of traffic signs using YOLO. In addition, insights are provided about future work that could be conducted to improve the field. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
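For readers wanting to try the surveyed approach, a minimal YOLO inference sketch using the ultralytics package (one of several YOLO implementations covered by such reviews). The weights are generic COCO-pretrained ones and the image path is a placeholder; a traffic-sign system would be fine-tuned on a sign dataset such as the German benchmarks the review mentions:

```python
# Minimal YOLO inference: load a pretrained model, run it on an image,
# and print class names with confidence scores.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")             # small COCO-pretrained model
results = model("road_scene.jpg")      # placeholder image path
for box in results[0].boxes:           # one entry per detected object
    print(model.names[int(box.cls)], float(box.conf))
```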
21. Comparative Analysis of Algorithms to Cleanse Soil Micro-Relief Point Clouds.
- Author
- Ott, Simone, Burkhard, Benjamin, Harmening, Corinna, Paffenholz, Jens-André, and Steinhoff-Knopp, Bastian
- Subjects
- ARTIFICIAL neural networks, POINT cloud, DEEP learning, ALGORITHMS, MACHINE learning
- Abstract
Detecting changes in soil micro-relief in farmland helps to understand degradation processes like sheet erosion. Using the high-resolution technique of terrestrial laser scanning (TLS), we generated point clouds of three 2 × 3 m plots on a weekly basis from May to mid-June in 2022 on cultivated farmland in Germany. Three well-known applications for eliminating vegetation points in the generated point cloud were tested: Cloth Simulation Filter (CSF) as a filtering method, three variants of CANUPO as a machine learning method, and ArcGIS PointCNN as a deep learning method, a sub-category of machine learning using deep neural networks. We assessed the methods with hard criteria such as F1 score, balanced accuracy, height differences and their standard deviations to the reference surface, resulting data gaps, and robustness, and with soft criteria such as time-saving capacity, accessibility, and user knowledge. All algorithms showed a low performance at the initial measurement epoch, increasing with later epochs. While most of the results demonstrate a better performance of ArcGIS PointCNN, this algorithm revealed an exceptionally low performance in plot 1, which can be explained by the generalization gap. Although the CANUPO variants created the highest number of data gaps, we recommend CANUPO including colour values in combination with CSF. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
22. Routine outcome measures in Germany.
- Author
- Puschner, Bernd, Becker, Thomas, and Bauer, Stephanie
- Subjects
- ALGORITHMS, CLINICAL medicine, COMPUTER software, DOCUMENTATION, MEDICAL quality control, MENTAL health services, HEALTH outcome assessment, PROFESSIONAL peer review, POLICY sciences, QUALITY assurance, KEY performance indicators (Management), HUMAN services programs, PATIENT-centered care
- Abstract
The German healthcare system offers comprehensive coverage for people with mental illness including inpatient, day hospital and outpatient services. These services are primarily financed through the statutory health and pension insurances. According to legal regulations, providers are required to base their services on current scientific evidence and to continuously assure the quality of their services. This paper gives an overview of recent initiatives to develop, evaluate and disseminate routine outcome measurement (ROM) in service settings in Germany. A large number of projects have shown outcome monitoring to be feasible, and that feedback of outcome may enhance routine care through an improved allocation of treatment resources. However, none of these initiatives have been integrated into routine care on a nationwide or trans-sectoral level, and their sustainability has been limited. This is due to various barriers in a fragmented mental health service system and to the lack of coordinated national or state-level service planning. The time is ripe for a concerted effort including policy-makers to pick up on these initiatives and move them towards widespread implementation in routine care, accompanied by practice-oriented research including service user involvement. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
23. Your Boss Is an Algorithm: Artificial Intelligence, Platform Work and Labour.
- Author
- BERNICCHIA-FREEMAN, ZOÉ
- Subjects
- ARTIFICIAL intelligence, ROBOTS, AUTOMATION, ALGORITHMS
- Abstract
In March 1964, the cover page of a popular German weekly magazine entitled Der Spiegel painted a frightening picture: An anthropomorphic robot with six mechanical arms commands an assembly line while a displaced human worker floats aimlessly in the foreground. Ejected from his station, the worker throws up his hands in despair next to a headline that reads, "Automation in Germany, the arrival of robots." Over fifty years later, a cover page from the same magazine evoked similar themes: A giant robot arm yanks an office worker away from his computer under the headline, "You're fired! How computers and robots steal our jobs - and which jobs will be safe." The more things change, the more they stay the same. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
24. Comparing artificial intelligence algorithms to 157 German dermatologists: the melanoma classification benchmark.
- Author
- Brinker, Titus J., Hekler, Achim, Hauschild, Axel, Berking, Carola, Schilling, Bastian, Enk, Alexander H., Haferkamp, Sebastian, Karoglan, Ante, von Kalle, Christof, Weichenthal, Michael, Sattler, Elke, Schadendorf, Dirk, Gaiser, Maria R., Klode, Joachim, and Utikal, Jochen S.
- Subjects
- *ACADEMIC medical centers, *ALGORITHMS, *ARTIFICIAL intelligence, *BENCHMARKING (Management), *DERMATOLOGISTS, *DIAGNOSTIC imaging, *COMPUTERS in medicine, *MELANOMA, *ARTIFICIAL neural networks, *QUESTIONNAIRES, *RECEIVER operating characteristic curves
- Abstract
Background: Several recent publications have demonstrated the use of convolutional neural networks to classify images of melanoma on par with board-certified dermatologists. However, the non-availability of a public human benchmark restricts the comparability of the performance of these algorithms and thereby the technical progress in this field. Methods: An electronic questionnaire was sent to dermatologists at 12 German university hospitals. Each questionnaire comprised 100 dermoscopic and 100 clinical images (80 nevi images and 20 biopsy-verified melanoma images, each), all open-source. The questionnaire recorded factors such as the years of experience in dermatology, performed skin checks, age, sex and the rank within the university hospital or the status as resident physician. For each image, the dermatologists were asked to provide a management decision (treat/biopsy lesion or reassure the patient). Main outcome measures were sensitivity, specificity and the receiver operating characteristics (ROC). Results: In total, 157 dermatologists assessed all 100 dermoscopic images with an overall sensitivity of 74.1%, specificity of 60.0% and an ROC of 0.67 (range = 0.538–0.769); 145 dermatologists assessed all 100 clinical images with an overall sensitivity of 89.4%, specificity of 64.4% and an ROC of 0.769 (range = 0.613–0.9). Results between test sets were significantly different (P < 0.05), confirming the need for a standardised benchmark. Conclusions: We present the first public melanoma classification benchmark for both non-dermoscopic and dermoscopic images for comparing artificial intelligence algorithms with the diagnostic performance of 145 or 157 dermatologists. The Melanoma Classification Benchmark should be considered as a reference standard for white-skinned Western populations in the field of binary algorithmic melanoma classification. Highlights: • This paper provides the first open access melanoma classification benchmark for both non-dermoscopic and dermoscopic images. • Algorithms can now be easily compared to the performance of dermatologists in terms of sensitivity, specificity and ROC. • The melanoma benchmark allows comparability between algorithms of different publications and provides a new reference standard. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
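The reader-study metrics reported above (sensitivity, specificity, ROC AUC) can be computed as follows. The labels and scores here are toy values; a real comparison would substitute an algorithm's scores and the benchmark's ground truth:

```python
# Sensitivity, specificity, and ROC AUC on synthetic melanoma labels.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(3)
y_true = rng.integers(0, 2, 100)                 # 1 = melanoma, 0 = nevus
scores = np.clip(y_true * 0.3 + rng.uniform(0, 0.7, 100), 0, 1)
y_pred = scores > 0.5                            # management decision cutoff

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("ROC AUC:", roc_auc_score(y_true, scores))
```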
25. [Validation of in-hospital triage algorithms for mass casualty incidents - a simulation-based study - German version].
- Author
- Heller, Axel R., Neidel, Tobias, Klotz, Patrick J., Solarek, André, Kowalzik, Barbara, Juncken, Kathleen, and Kleber, Christian
- Subjects
- STATISTICS, MEDICAL triage, DISASTERS, SIMULATION methods in education, MONITOR alarms (Medicine), EMERGENCY management, RESEARCH funding, CASE studies, MASS casualties, COMPUTER-aided diagnosis, SENSITIVITY & specificity (Statistics), DATA analysis, ALGORITHMS
- Published
- 2023
- Full Text
- View/download PDF
26. Introducing risk adjustment and free health plan choice in employer-based health insurance: Evidence from Germany.
- Author
- Pilny, Adam, Wübker, Ansgar, and Ziebarth, Nicolas R.
- Subjects
- *HEALTH planning, *HEALTH insurance, *MEDICAL economics, *CONSUMERS, *MEDICAL care costs, *RISK assessment -- Law & legislation, *GOVERNMENT aid laws, *ALGORITHMS, *DATABASES, *DECISION making, *DISCRIMINATION in insurance, *EMPLOYER-sponsored health insurance, HEALTH insurance & economics
- Abstract
To equalize differences in health plan premiums due to differences in risk pools, the German legislature introduced a simple Risk Adjustment Scheme (RAS) based on age, gender and disability status in 1994. In addition, effective 1996, consumers gained the freedom to choose among hundreds of existing health plans, across employers and state-borders. This paper (a) estimates RAS pass-through rates on premiums, financial reserves, and expenditures and assesses the overall RAS impact on market price dispersion. Moreover, it (b) characterizes health plan switchers and investigates their annual and cumulative switching rates over time. Our main findings are based on representative enrollee panel data linked to administrative RAS and health plan data. We show that sickness funds with bad risk pools and high pre-RAS premiums lowered their total premiums by 42 cents per additional euro allocated by the RAS. Consequently, post-RAS, health plan prices converged but not fully. Because switchers are more likely to be white collar, young and healthy, the new consumer choice resulted in more risk segregation and the amount of money redistributed by the RAS increased over time. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
27. Diagnosis and treatment of chronic wounds: current standards of Germany's Initiative for Chronic Wounds e. V.
- Author
- Dissemond, J. and Bültemann, A.
- Subjects
- ACRONYMS, ALGORITHMS, TERMS & phrases, WOUND care, CHRONIC wounds & injuries, DIAGNOSIS, THERAPEUTICS
- Abstract
The diagnosis and treatment of patients with chronic wounds is an enormous challenge in various disciplines of medicine. These very complex processes usually involve several experts of different medical specialties with varying educational backgrounds. A necessary basis for consistent communication and documentation is the use of unambiguous nomenclature. Therefore, the board of the German wound association, Initiative for Chronic Wounds (ICW) e.V., has started to define various terms and procedures. An easy-to-remember algorithm, in the form of the ABCDE rule, has been developed for the structured diagnosis of chronic wounds. The successful therapy of chronic wounds is then based on the causal treatment of the underlying, pathophysiologically relevant diseases. M.O.I.S.T. is a concept that helps health professionals take a systematic approach to the local treatment of patients with chronic wounds, in line with the most up-to-date scientific knowledge. By using consistent definitions and standards in wound care, it is possible to optimise current diagnostic and treatment strategies as well as to make them more easily understandable. Declaration of interest: The authors have no conflicts of interest with regard to this paper. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
28. Optimal policy identification: Insights from the German electricity market.
- Author
- Herrmann, J.K. and Savin, I.
- Subjects
- ELECTRIC industries, RENEWABLE energy sources, ENERGY policy, ELECTRIC utilities, ALGORITHMS
- Abstract
The diffusion of renewable electricity technologies is widely considered as crucial for establishing a sustainable energy system in the future. However, the required transition is unlikely to be achieved by market forces alone. For this reason, many countries implement various policy instruments to support this process, also by re-distributing related costs among all electricity consumers. This paper presents a novel history-friendly agent-based study aiming to explore the efficiency of different mixes of policy instruments by means of a Differential Evolution algorithm. Special emphasis of the model is devoted to the possibility of small scale renewable electricity generation, but also to the storage of this electricity using small scale facilities being actively developed over the last decade. Both combined pose an important instrument for electricity consumers to achieve partial or full autarky from the electricity grid, particularly after accounting for decreasing costs and increasing efficiency of both due to continuous innovation. Among other things, we find that the historical policy mix of Germany introduced too strong and inflexible demand-side instruments (like feed-in tariff) too early, thereby creating strong path-dependency for future policy makers and reducing their ability to react to technological but also economic shocks without further increases of the budget. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
29. Self-potential data inversion utilizing the Bat optimizing algorithm (BOA) with various application cases.
- Author
- Essa, Khalid S., Diab, Zein E., and Mehanee, Salah A.
- Subjects
- BAT behavior, GEOMETRIC shapes, BATS, BREWSTER'S angle, ALGORITHMS, METAHEURISTIC algorithms, ANGLES
- Abstract
The Bat optimizing algorithm (BOA) is one of the metaheuristic algorithms and is applied here to interpret self-potential (SP) data. The BOA relies on bat echolocation behavior for global optimization, in which the global optimum solution is reached at the suggested minimum value of the objective function. The best interpretive source parameters for the subsurface structures occur at the minimal objective function value (global best solution). The BOA is applied to 2D SP anomaly data to estimate the characteristic source parameters (i.e., the depth to center, amplitude coefficient, origin location, geometric shape factor, and polarization and inclination angle of the causative buried structure). The BOA can be applied to single and multiple source structures in the restricted class of simple geometric shapes; these bodies help in the validation of the subsurface ore and mineral targets. The stability and efficiency of the proposed BOA have been examined by several synthetic examples. In addition, three different real field examples from Germany and Indonesia have been successfully applied to ore and mineral investigation and geological structure studies. In general, the achieved results are in good agreement with the available borehole data and results mentioned in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
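A compact bat-algorithm sketch (after Yang, 2010), the global-optimization scheme the paper applies to the SP misfit. Here the sphere function stands in for the objective, and all hyperparameters, bounds, and the fixed loudness/pulse-rate treatment are illustrative simplifications:

```python
# Minimal bat algorithm: frequency-tuned velocity updates toward the
# best solution, occasional local walks, and loudness-gated acceptance.
import numpy as np

def bat_optimize(f, dim, n_bats=20, iters=200, fmin=0.0, fmax=2.0,
                 loudness=0.5, pulse_rate=0.5, seed=4):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_bats, dim))       # bat positions
    v = np.zeros((n_bats, dim))                 # bat velocities
    fit = np.apply_along_axis(f, 1, x)
    best = x[fit.argmin()].copy()
    for _ in range(iters):
        freq = fmin + (fmax - fmin) * rng.random(n_bats)
        v += (x - best) * freq[:, None]
        cand = x + v
        walk = rng.random(n_bats) > pulse_rate  # local walk around best
        cand[walk] = best + 0.01 * rng.normal(size=(walk.sum(), dim))
        cand_fit = np.apply_along_axis(f, 1, cand)
        accept = (cand_fit < fit) & (rng.random(n_bats) < loudness)
        x[accept], fit[accept] = cand[accept], cand_fit[accept]
        best = x[fit.argmin()].copy()
    return best, fit.min()

best, val = bat_optimize(lambda p: float(np.sum(p**2)), dim=3)
print(best, val)
```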
30. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression ,COMPUTER software ,COVID-19 ,CONFIDENCE intervals ,TIME ,CONVALESCENCE ,WORLD health ,EPIDEMICS ,TIME series analysis ,DESCRIPTIVE statistics ,SENSITIVITY & specificity (Statistics) ,PREDICTION models ,COVID-19 pandemic ,ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated from data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries), which were fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched-cases calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when the adjusted unmatched cases were taken into consideration as well. The proposed method used limited data and provided experimental results in the same region as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed calculated assumptions, could provide a meaningful calculated average time-to-recovery figure, which can be used as an evidence-based estimation to support containment and mitigation policy decisions, even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR] (A schematic case-matching sketch follows this record.)
- Published
- 2023
- Full Text
- View/download PDF
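Record 30 describes matching confirmed cases with later deaths and recoveries. The authors' exact matching rule is not given in the abstract; the sketch below shows one plausible first-in-first-out reading with invented toy series, purely to make the idea concrete.

```python
# Schematic FIFO-style matching of daily recoveries to earlier confirmed cases,
# yielding per-case time-to-recovery. Toy data, not real case counts.
from collections import deque

confirmed = [5, 8, 12, 20, 15, 10, 6]   # new confirmed cases per day (toy)
recovered = [0, 0, 3, 6, 10, 14, 12]    # new recoveries per day (toy)

queue = deque()   # [confirmation_day, count] still awaiting an outcome
durations = []    # days from confirmation to recovery, one entry per matched case

for day, (c, r) in enumerate(zip(confirmed, recovered)):
    if c:
        queue.append([day, c])
    while r and queue:
        d0, n = queue[0]
        take = min(n, r)                 # match oldest open cases first
        durations += [day - d0] * take
        r -= take
        queue[0][1] -= take
        if queue[0][1] == 0:
            queue.popleft()

mean_ttr = sum(durations) / len(durations)
print(f"matched cases: {len(durations)}, mean time-to-recovery: {mean_ttr:.2f} days")
```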
31. A Radar-Based Quantitative Precipitation Estimation Algorithm to Overcome the Impact of Vertical Gradients of Warm-Rain Precipitation: The Flood in Western Germany on 14 July 2021.
- Author
-
Chen, Ju-Yu, Reinoso-Rondinel, Ricardo, Trömel, Silke, Simmer, Clemens, and Ryzhkov, Alexander
- Subjects
RAINDROP size ,METEOROLOGICAL services ,RAINFALL ,HYDROLOGIC models ,RAIN gauges ,ALGORITHMS ,FLOODS - Abstract
The demand for accurate, near-real-time radar-based quantitative precipitation estimation (QPE), which is key to feeding hydrological models and enabling reliable flash flood predictions, was highlighted again by the disastrous floods that followed an intense stratiform precipitation field passing over western Germany on 14 July 2021. Three state-of-the-art rainfall algorithms based on reflectivity Z, specific differential phase KDP, and specific attenuation A were applied to observations of four polarimetric C-band radars operated by the German Meteorological Service [DWD (Deutscher Wetterdienst)]. Due to the large vertical gradients of precipitation below the melting layer, which suggest warm-rain processes, all QPE products significantly underestimate surface precipitation. We propose two mitigation approaches: (i) vertical profile (VP) corrections for Z and KDP and (ii) gap filling using observations of a local X-band radar, JuXPol. We also derive rainfall retrievals from vertically pointing Micro Rain Radar (MRR) profiles, which indirectly take precipitation gradients in the lowest few hundred meters into account. When evaluated with DWD rain gauge measurements, those retrievals show pronounced improvements, especially for the A-based retrieval, partly due to its lower sensitivity to the variability of raindrop size distributions. The VP correction further improves QPE, reducing the normalized root-mean-square error by 23% and the normalized mean bias by 20%. With the use of gap-filling JuXPol data, the A-based retrieval gives the lowest errors, followed by the Z-based retrievals in combination with VP corrections. The presented algorithms demonstrate the added value of radar-based QPE for warm-rain events and the related potential flash-flooding warnings. [ABSTRACT FROM AUTHOR] (An illustrative power-law retrieval sketch follows this record.)
- Published
- 2023
- Full Text
- View/download PDF
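The R(Z) and R(A) retrievals named in record 31 are power laws fitted to local drop size distributions. As a hedged illustration only, the sketch below uses textbook-style coefficients (Marshall-Palmer for R(Z); example values for R(A)); the coefficients tuned for the DWD C-band radars in the paper will differ.

```python
# Illustrative power-law rainfall retrievals; coefficients are examples, not
# the paper's tuned values.
import numpy as np

def rain_rate_from_z(Z_dbz):
    """R(Z) via Z = 200 R^1.6 (Marshall-Palmer); Z in dBZ -> R in mm/h."""
    z_lin = 10.0 ** (Z_dbz / 10.0)          # dBZ -> linear mm^6 m^-3
    return (z_lin / 200.0) ** (1.0 / 1.6)

def rain_rate_from_a(A, c=294.0, d=0.89):
    """R(A) = c * A^d, with A the specific attenuation in dB/km (example values)."""
    return c * np.asarray(A) ** d

print(rain_rate_from_z(np.array([20.0, 35.0, 50.0])))
print(rain_rate_from_a([0.01, 0.05, 0.2]))
```

The abstract's remark about R(A)'s lower sensitivity to drop-size-distribution variability corresponds to the exponent d being close to 1, which makes the retrieval nearly linear in A.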
32. 3D involute gear evaluation - part II: deviations - basic algorithms for modern software validation.
- Author
-
Stein, Martin and Härtig, Frank
- Subjects
SPUR gearing ,SOFTWARE validation ,HELICAL gears ,ALGORITHMS ,MEASURING instruments - Abstract
The fundamental equations for the evaluation of cylindrical involute gear measurements on 3D gear measuring instruments are provided. The computations are based on the principles of gear kinematics and use the system of involute gear coordinates introduced in a previous work of the authors. This holistic approach focuses on significant error sources that appear only because 3D measurement technology is used and that have remained largely unrecognized until today. The proposed algorithms are beneficial for the description of gear deviations, as they allow the use of simple formulas covering profile, helix and pitch evaluation for internal or external and spur or helical gears. The presented equations contain the key fundamentals to complement existing standards. They will become part of the reference algorithms used by the Physikalisch-Technische Bundesanstalt, the national metrology institute of Germany, to certify gear evaluation software. [ABSTRACT FROM AUTHOR] (Standard involute relations are sketched after this record.)
- Published
- 2022
- Full Text
- View/download PDF
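Record 32's own equation set is not reproduced in the abstract. For orientation, these are the standard involute relations that profile evaluation builds on; this is textbook geometry, not the authors' formulas.

```latex
% Standard involute geometry (textbook relations, not the authors' equation set):
% flank coordinates generated by unrolling the base circle of radius r_b with
% roll angle \varphi, and the involute function of the pressure angle \alpha.
\begin{align}
  x(\varphi) &= r_b \left( \cos\varphi + \varphi \sin\varphi \right),\\
  y(\varphi) &= r_b \left( \sin\varphi - \varphi \cos\varphi \right),\\
  \operatorname{inv}\alpha &= \tan\alpha - \alpha.
\end{align}
% Profile deviations are conventionally plotted against the roll length
% s = r_b \varphi along the unrolled base-circle tangent.
```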
33. Proposal of an algorithm for the management of rectally inserted foreign bodies: a surgical single-center experience with review of the literature.
- Author
-
Fritz, Stefan, Killguss, Hansjörg, Schaudt, André, Sommer, Christof M., Richter, Götz M., Belle, Sebastian, Reissfelder, Christoph, Loff, Steffan, and Köninger, Jörg
- Subjects
INTESTINAL perforation ,FOREIGN bodies ,LITERATURE reviews ,ABDOMINOPERINEAL resection ,SURGICAL complications ,ABDOMINAL surgery ,EMERGENCY physicians ,ALGORITHMS - Abstract
Background: Retained rectal foreign bodies (RFBs) are uncommon clinical findings. Although the management of RFBs is rarely reported in the literature, clinicians regularly face this issue. To date, there is no standardized management of RFBs. The aim of the present study was to evaluate our own data and subsequently develop a treatment algorithm. Methods: All consecutive patients who presented between January 2006 and December 2019 with rectally inserted RFBs at the emergency department of the Klinikum Stuttgart, Germany, were retrospectively identified. Clinicopathologic features, management, complications, and outcomes were assessed. Based on this experience, a treatment algorithm was developed. Results: A total of 69 presentations with rectally inserted RFBs were documented in 57 patients. In 23/69 cases (33.3%), the RFB was removed transanally by the emergency physician either digitally (n = 14) or with the help of a rigid rectoscope (n = 8) or a colonoscope (n = 1). In 46/69 cases (66.7%), the RFB was removed in the operating theater under general anesthesia with muscle relaxation. Among these, 11/46 patients (23.9%) underwent abdominal surgery, either for manual extraction of the RFB (n = 9) or to exclude a bowel perforation (n = 2). Surgical complications occurred in 3/11 patients. One patient with rectal perforation developed pelvic sepsis and underwent abdominoperineal extirpation in the further clinical course. Conclusion: The management of RFBs can be challenging and includes a wide range of options, from removal without further intervention to abdominoperineal extirpation in cases of pelvic sepsis. Whenever possible, RFBs should be managed in specialized colorectal centers following a clear treatment algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
34. Neither timeless, nor placeless: Control of food delivery gig work via place-based working time regimes.
- Author
-
Heiland, Heiner
- Subjects
ELECTRONIC commerce ,RESEARCH methodology ,INTERVIEWING ,QUESTIONNAIRES ,LABOR market ,WORKING hours ,TIME management ,EMPIRICAL research ,FOOD service ,PERSONNEL management ,ALGORITHMS - Abstract
Working time regimes in platform labour have so far either been ignored as a topic in research on gig work or been framed as a purely allocative instrument. This article argues that working time regimes instead have both a coordinating and a controlling effect. Adopting the analytical framework of labour process theory, the article hence focuses on the interrelation of working time and control regimes. The empirical material presented stems from research on platform-based food courier work in Germany and is based on a mixed-methods research design consisting of interviews, multi-sited ethnography and a survey. The findings show that platforms implement hybrid control regimes that rest not only on the already well-analysed algorithmic management, but also on complementary control through working time regimes: temporal control. Platforms organise intra-platform markets in which workers compete for shifts by means of performance. Thus, platforms are able to ensure an efficient and simultaneously reliable use of an autonomous and spatially distributed workforce. Furthermore, it is shown that platform labour is not placeless either. The effects of its control regime vary according to different local conditions. As a result, platforms cannot be analysed only as techno-cultural ecosystems, but also as locally specific socio-economic structures. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
35. Preference Discovery in University Admissions: The Case for Dynamic Multioffer Mechanisms.
- Author
-
Grenet, Julien, He, YingHua, and Kübler, Dorothea
- Subjects
UNIVERSITY & college admission ,ALGORITHMS - Abstract
We document quasi-experimental evidence against the common assumption in the matching literature that agents have full information on their own preferences. In Germany's university admissions, the first stages of the Gale-Shapley algorithm are implemented in real time, allowing for multiple offers per student. We demonstrate that nonexploding early offers are accepted more often than later offers, despite not being more desirable. These results, together with survey evidence and a theoretical model, are consistent with students' costly discovery of preferences. A novel dynamic multioffer mechanism that batches early offers improves matching efficiency by informing students of offer availability before preference discovery. [ABSTRACT FROM AUTHOR] (A minimal deferred-acceptance sketch follows this record.)
- Published
- 2022
- Full Text
- View/download PDF
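Record 35 refers to the Gale-Shapley (deferred acceptance) algorithm. Below is a minimal student-proposing sketch with single-seat universities; names and preference lists are invented. The German procedure runs the first rounds of this logic in real time with multi-seat quotas, which this toy omits.

```python
# Student-proposing deferred acceptance with single-seat universities.
def deferred_acceptance(student_prefs, uni_prefs):
    """student_prefs: {student: [universities, best first]};
       uni_prefs: {university: [students, best first]}. Returns {university: student}."""
    rank = {u: {s: i for i, s in enumerate(p)} for u, p in uni_prefs.items()}
    nxt = {s: 0 for s in student_prefs}   # next university each student proposes to
    held = {}                             # university -> tentatively held student
    free = list(student_prefs)
    while free:
        s = free.pop()
        while nxt[s] < len(student_prefs[s]):
            u = student_prefs[s][nxt[s]]
            nxt[s] += 1
            if u not in held:
                held[u] = s               # offer tentatively held, not exploded
                break
            if rank[u][s] < rank[u][held[u]]:   # u prefers s: displace holder
                free.append(held[u])
                held[u] = s
                break
        # else: s has exhausted all options and stays unmatched
    return held

students = {"ana": ["TU", "LMU"], "ben": ["TU", "LMU"], "cem": ["LMU", "TU"]}
unis = {"TU": ["ben", "ana", "cem"], "LMU": ["ana", "cem", "ben"]}
print(deferred_acceptance(students, unis))   # LMU ends up with ana, TU with ben
```

A tentatively held offer here corresponds to the "nonexploding early offer" the study examines: it is withdrawn only if a preferred student later displaces it.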
36. Estimating dry biomass and plant nitrogen concentration in pre-Alpine grasslands with low-cost UAS-borne multispectral data – a comparison of sensors, algorithms, and predictor sets.
- Author
-
Schucknecht, Anne, Seo, Bumsuk, Krämer, Alexander, Asam, Sarah, Atzberger, Clement, and Kiese, Ralf
- Subjects
PLANT biomass ,GRASSLANDS ,PLANT drying ,DETECTORS ,ALGORITHMS ,RANDOM forest algorithms ,WILDLIFE management areas - Abstract
Grasslands are an important part of pre-Alpine and Alpine landscapes. Despite the economic value and the significant role of grasslands in carbon and nitrogen (N) cycling, spatially explicit information on grassland biomass and quality is rarely available. Remotely sensed data from unmanned aircraft systems (UASs) and satellites might be an option to overcome this gap. Our study aims to investigate the potential of low-cost UAS-based multispectral sensors for estimating above-ground biomass (dry matter, DM) and plant N concentration. In our analysis, we compared two different sensors (Parrot Sequoia, SEQ; MicaSense RedEdge-M, REM), three statistical models (linear model; random forests, RFs; gradient-boosting machines, GBMs), and six predictor sets (i.e. different combinations of raw reflectance, vegetation indices, and canopy height). Canopy height information can be derived from UAS sensors but was not available in our study. Therefore, we tested the added value of this structural information with in situ measured bulk canopy height data. A combined field sampling and flight campaign was conducted in April 2018 at different grassland sites in southern Germany to obtain in situ and the corresponding spectral data. The hyper-parameters of the two machine learning (ML) approaches (RF, GBM) were optimized, and all model setups were run with a 6-fold cross-validation (cv). Linear models were characterized by very low statistical performance measures and thus were not suitable to estimate DM and plant N concentration using UAS data. The non-linear ML algorithms showed an acceptable regression performance for all sensor-predictor set combinations, with average cross-validated R²_cv of 0.48, RMSE_cv,avg of 53.0 g m⁻², and relative rRMSE_cv,avg of 15.9 % for DM, and with R²_cv,avg of 0.40, RMSE_cv,avg of 0.48 wt %, and rRMSE_cv,avg of 15.2 % for plant N concentration estimation. The optimal combination of sensors, ML algorithms, and predictor sets notably improved the model performance. The best model performance for the estimation of DM (R²_cv = 0.67, RMSE_cv = 41.9 g m⁻², rRMSE_cv = 12.6 %) was achieved with an RF model that utilizes all possible predictors and REM sensor data. The best model for plant N concentration was a combination of an RF model with all predictors and SEQ sensor data (R²_cv = 0.47, RMSE_cv = 0.45 wt %, rRMSE_cv = 14.2 %). DM models with the spectral input of REM performed significantly better than those with SEQ data, while for N concentration models it was the other way round. The choice of predictors was most influential on model performance, while the effect of the chosen ML algorithm was generally lower. The addition of canopy height to the spectral data in the predictor set significantly improved the DM models. In our study, calibrating the ML algorithm improved the model performance substantially, which shows the importance of this step. [ABSTRACT FROM AUTHOR] (A hedged cross-validation sketch follows this record.)
- Published
- 2022
- Full Text
- View/download PDF
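A hedged sketch of the model-comparison loop in record 36: random forest versus gradient boosting under 6-fold cross-validation. The synthetic features stand in for the reflectance, vegetation-index, and canopy-height predictors; none of the data or scores correspond to the study.

```python
# RF vs. GBM comparison with 6-fold cross-validation on synthetic data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import KFold, cross_val_score

X, y = make_regression(n_samples=120, n_features=12, noise=10.0, random_state=1)

cv = KFold(n_splits=6, shuffle=True, random_state=1)
for name, model in [("RF", RandomForestRegressor(n_estimators=500, random_state=1)),
                    ("GBM", GradientBoostingRegressor(random_state=1))]:
    r2 = cross_val_score(model, X, y, cv=cv, scoring="r2")
    rmse = -cross_val_score(model, X, y, cv=cv, scoring="neg_root_mean_squared_error")
    print(f"{name}: R2_cv = {r2.mean():.2f}, RMSE_cv = {rmse.mean():.1f}")
```

In the study, hyper-parameters were additionally tuned before this evaluation step, which the sketch omits for brevity.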
37. Phasing out support schemes for renewables in neighbouring countries: An agent-based model with investment preferences.
- Author
-
Melliger, Marc and Chappin, Emile
- Subjects
*ELECTRICITY pricing, *RENEWABLE energy sources, *ALGORITHMS, *COUNTRIES, *PHOTOVOLTAIC power generation - Abstract
• We simulate renewables pathways under different support phase-out scenarios.
• We extend an investment algorithm by preferences and calibrated returns.
• Our improved algorithm incorporates more heterogeneity, resulting in stronger effects.
• Further auction support is necessary for most capacity targets in the case countries.
• Countries should coordinate policy changes due to cross-border effects.
Support schemes have been central to the expansion of renewable electricity globally and in the European Union. As technologies mature, individual member states may decide to phase out these policies. While previous research has shown that such policy changes affect investors' decisions, we investigate how they affect pathways and electricity prices by simulating investment decisions in an agent-based model in two case countries. This paper contributes and applies an adapted investment decision algorithm that incorporates empirically observed technology and return preferences and is calibrated against return observations. The new algorithm yields more refined and stronger effects compared with its predecessor. Results show that the phase-out of auctions in Germany and the Netherlands slows down their deployment of renewable capacity by up to ∼60% and ∼35%, respectively. With the exception of photovoltaics and onshore wind projects in the Netherlands, the targeted capacities can only be reached by continuing support in both countries. Furthermore, ending support in a large country like Germany leads to higher electricity prices and fosters a market-driven but insufficient capacity expansion in smaller neighbours like the Netherlands. As the electricity grids in many countries are strongly interconnected, such cross-border effects are of international relevance. Our findings suggest that continued auctions may be necessary and that countries should coordinate policy changes to stay on track for meeting their renewables targets. [ABSTRACT FROM AUTHOR] (A toy preference-weighted investment sketch follows this record.)
- Published
- 2022
- Full Text
- View/download PDF
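Record 37's adapted investment algorithm incorporates technology and return preferences. The toy below shows only the general idea of preference-weighted technology choice; the scoring rule, numbers, and agent types are invented and are not the authors' specification.

```python
# Toy preference-augmented investment choice: agents score technologies by
# expected return plus an agent-specific preference bonus, then pick the best.
import numpy as np

rng = np.random.default_rng(7)

techs = ["pv", "onshore_wind", "gas"]
expected_return = {"pv": 0.06, "onshore_wind": 0.07, "gas": 0.05}  # toy "calibrated" returns

def choose_investment(pref_weights, noise=0.01):
    """pref_weights: technology preference bonus per agent, in return units."""
    scores = {t: expected_return[t] + pref_weights.get(t, 0.0)
                 + noise * rng.standard_normal()
              for t in techs}
    return max(scores, key=scores.get)

# Heterogeneous agents: one green-preferring, one purely return-driven.
print(choose_investment({"pv": 0.03, "onshore_wind": 0.02}))
print(choose_investment({}))
```

Heterogeneous preference weights across agents are what lets such a model produce a spread of technology choices instead of all agents piling into the single highest-return option.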
38. [Influence of atmospheric environmental conditions on the life cycle of convective cells in real-time forecasting] (orig.: Einfluss atmosphärischer Umgebungsbedingungen auf den Lebenszyklus konvektiver Zellen in der Echtzeit-Vorhersage).
- Author
-
Wilhelm, Jannik
- Subjects
ACADEMIC dissertations ,MESOSCALE convective complexes ,RADAR ,WEATHER forecasting ,ALGORITHMS ,METEOROLOGICAL services ,CELLS ,LIFE cycles (Biology) ,DATA ,VERTICAL wind shear ,THUNDERSTORMS ,HAILSTORMS - Abstract
Copyright of Wissenschaftliche Berichte des Instituts für Meteorologie und Klimaforschung des Karlsruher Instituts für Technologie is the property of KIT Scientific Publishing and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
39. An algorithm to assess the heating strategy of buildings in cold climates: a case study of Germany.
- Author
-
Mazhar, Abdur Rehman, Zou, Yuliang, Zeng, Cheng, Shen, Yongliang, and Liu, Shuli
- Subjects
RESIDENTIAL heating systems ,HOME energy use ,WINTER ,THERMAL comfort ,HEATING load ,PEAK load ,ALGORITHMS - Abstract
Two-thirds of the final energy consumption of the EU residential sector goes towards space heating of buildings, yet a huge portion of the population still suffers from energy poverty. Identifying optimal heating strategies for current buildings would be a solution to this crisis, and that is the main aim of the algorithm developed in this research. The algorithm incorporates a modified version of the simple hourly method from the ISO 13790 standard to determine the hourly heating load and indoor temperatures of buildings under any heating strategy. Flexibility in the input of building and weather data makes this tool versatile and practical for building users and policymakers. With this algorithm, a case study evaluating three commonly used domestic heating strategies was established for nine different residential buildings in typical cold winter conditions in Germany. Most EU households heat their buildings either continuously throughout the day at fixed temperatures, sporadically at fixed times, or at peak loads during the evening. The continuous heating strategy is rated the best, consuming minimal energy with consistent temperatures and optimal thermal comfort ranges. The sporadic and peak-load heating strategies produce fluctuating indoor temperatures with high standard deviations of up to 8.70°C while consuming cumulative energy similar to the continuous heating strategy. Additionally, both of these strategies aggravate energy poverty and promote indoor mould formation on the building envelope caused by water vapor condensation. Consequently, the algorithm is applicable to any building type in any region. [ABSTRACT FROM AUTHOR] (A much-simplified hourly heating sketch follows this record.)
- Published
- 2022
- Full Text
- View/download PDF
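Record 39 builds on the ISO 13790 simple hourly (5R1C) method. The sketch below is a much cruder single-node (1R1C) hourly balance, shown only to illustrate the idea of stepping indoor temperature and heating power forward under a chosen strategy; all parameters are invented.

```python
# Toy 1R1C hourly heating model: one thermal node, explicit Euler stepping.
H = 250.0      # heat transfer coefficient, W/K (illustrative)
C = 20e6       # effective thermal capacitance, J/K (illustrative)
dt = 3600.0    # time step, s (one hour)
P_max = 8000.0 # heating system capacity, W

def simulate(T_out_series, setpoint_series, T0=20.0):
    T, results = T0, []
    for T_out, T_set in zip(T_out_series, setpoint_series):
        losses = H * (T - T_out)                  # envelope losses, W
        demand = max(0.0, H * (T_set - T_out))    # power needed to hold setpoint
        P_heat = min(P_max, demand) if T < T_set else 0.0
        T += dt * (P_heat - losses) / C           # update indoor temperature
        results.append((T, P_heat))
    return results

# Continuous strategy: constant 20 degC setpoint over a cold day (-5 degC outside).
out = [-5.0] * 24
print(simulate(out, [20.0] * 24)[-1])
```

Swapping the setpoint series (e.g. lowering it outside fixed hours) reproduces the sporadic and peak-load strategies the study compares.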
40. Simulation‐based training improves process times in acute stroke care (STREAM).
- Author
-
Bohmann, Ferdinand O., Gruber, Katharina, Kurka, Natalia, Willems, Laurent M., Herrmann, Eva, du Mesnil de Rochemont, Richard, Scholz, Peter, Rai, Heike, Zickler, Philipp, Ertl, Michael, Berlis, Ansgar, Poli, Sven, Mengel, Annerose, Ringleb, Peter, Nagel, Simon, Pfaff, Johannes, Wollenweber, Frank A., Kellert, Lars, Herzberg, Moriz, and Koehler, Luzie
- Subjects
ENDOVASCULAR surgery ,THROMBOLYTIC therapy ,ALGORITHMS ,MEDICAL simulation ,SECONDARY analysis ,TERTIARY care - Abstract
Background: The objective of the STREAM Trial was to evaluate the effect of simulation training on process times in acute stroke care. Methods: The multicenter prospective interventional STREAM Trial was conducted between 10/2017 and 04/2019 at seven tertiary care neurocenters in Germany with a pre‐ and post‐interventional observation phase. We recorded patient characteristics, acute stroke care process times, stroke team composition and simulation experience for consecutive direct‐to‐center patients receiving intravenous thrombolysis (IVT) and/or endovascular therapy (EVT). The intervention consisted of a composite intervention centered around stroke‐specific in situ simulation training. Primary outcome measure was the 'door‐to‐needle' time (DTN) for IVT. Secondary outcome measures included process times of EVT and measures taken to streamline the pre‐existing treatment algorithm. Results: The effect of the STREAM intervention on the process times of all acute stroke operations was neutral. However, secondary analyses showed a DTN reduction of 5 min from 38 min pre‐intervention (interquartile range [IQR] 25–43 min) to 33 min (IQR 23–39 min, p = 0.03) post‐intervention achieved by simulation‐experienced stroke teams. Concerning EVT, we found significantly shorter door‐to‐groin times in patients who were treated by teams with simulation experience as compared to simulation‐naive teams in the post‐interventional phase (−21 min, simulation‐naive: 95 min, IQR 69–111 vs. simulation‐experienced: 74 min, IQR 51–92, p = 0.04). Conclusion: An intervention combining workflow refinement and simulation‐based stroke team training has the potential to improve process times in acute stroke care. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
41. Beyond data protection concerns – the European passenger name record system.
- Author
-
Olsen, Henrik Palmer and Wiesener, Cornelius
- Subjects
DATA protection ,PASSENGERS ,RISK assessment ,CRIME statistics ,TERRORISM - Abstract
In this article, we examine the European framework of collecting and analysing flight passenger name record (PNR) data for the purpose of combating terrorism and serious crime. The focus is mainly on the EU PNR Directive of 2016, but we also consider the specific legislative framework in Germany and Denmark. In light of the recent review of the Directive, the article aims at exploring the policy-related, legal and technological challenges. In doing so, it goes beyond established data protection concerns. In particular, we debunk the popular claim that PNR analysis in and of itself entails the risk of discrimination of certain groups – a claim commonly levelled against algorithmic analysis. We also provide useful insights into the specific legal safeguards vis-à-vis automated profiling and decision-making through human review. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
42. ℓ2-Penalized temporal logit-mixed models for the estimation of regional obesity prevalence over time.
- Author
-
Burgard, Jan P, Krause, Joscha, Münnich, Ralf, and Morales, Domingo
- Subjects
OBESITY ,PARAMETER estimation ,THERAPEUTICS ,ALGORITHMS ,MODERN society ,LOGITS ,STATISTICAL bootstrapping - Abstract
Obesity is considered to be one of the primary health risks in modern industrialized societies. Estimating the evolution of its prevalence over time is an essential element of public health reporting. This requires the application of suitable statistical methods on epidemiologic data with substantial local detail. Generalized linear-mixed models with medical treatment records as covariates mark a powerful combination for this purpose. However, the task is methodologically challenging. Disease frequencies are subject to both regional and temporal heterogeneity. Medical treatment records often show strong internal correlation due to diagnosis-related grouping. This frequently causes excessive variance in model parameter estimation due to rank-deficiency problems. Further, generalized linear-mixed models are often estimated via approximate inference methods as their likelihood functions do not have closed forms. These problems combined lead to unacceptable uncertainty in prevalence estimates over time. We propose an ℓ2-penalized temporal logit-mixed model to solve these issues. We derive empirical best predictors and present a parametric bootstrap to estimate their mean-squared errors. A novel penalized maximum approximate likelihood algorithm for model parameter estimation is stated. With this new methodology, the regional obesity prevalence in Germany from 2009 to 2012 is estimated. We find that the national prevalence ranges between 15 and 16%, with significant regional clustering in eastern Germany. [ABSTRACT FROM AUTHOR] (A schematic form of the model family follows this record.)
- Published
- 2021
- Full Text
- View/download PDF
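A schematic of the model family in record 42, written in our own notation; the authors' exact specification is not given in the abstract. The idea: a logit link with regional and temporal random effects, estimated by a penalized approximate likelihood with a ridge term.

```latex
% Schematic l2-penalized logit-mixed model (our notation, not necessarily the
% authors'): regions d, periods t, covariates x_{dt}, random effects u_d, v_t.
\begin{align}
  \Pr(y_{idt} = 1 \mid x_{dt}, u_d, v_t)
    &= \frac{\exp\!\left(x_{dt}^{\top}\beta + u_d + v_t\right)}
            {1 + \exp\!\left(x_{dt}^{\top}\beta + u_d + v_t\right)}, \\
  \hat{\beta}
    &= \arg\max_{\beta}\; \ell_{\mathrm{approx}}(\beta, u, v)
       \;-\; \lambda \,\lVert \beta \rVert_2^2 .
\end{align}
% The ridge penalty counteracts the near rank deficiency caused by strongly
% correlated treatment-record covariates, stabilizing the parameter estimates.
```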
43. Laparoscopic approaches to the retropubic space: three alternatives with anatomical considerations.
- Author
-
Anapolski, Michael, Alkatout, Ibrahim, Wedel, Thilo, Panayotopoulos, Dimitrios, Soltesz, Stefan, Schiermeier, Sven, Papathemelis, Thomas, and Noé, Günter K.
- Subjects
PELVIC anatomy ,LAPAROSCOPY ,UROLOGICAL surgery ,ALGORITHMS - Abstract
Many urogynecological and surgical laparoscopic interventions require access to the retropubic space, also known as the space of Retzius. Especially in patients with a history of previous surgery in this area or in general in the lower abdomen, the preparation may be complicated by adhesions and scar tissue. The necessity to combine several laparoscopic procedures in one surgical session may require consideration of the most appropriate way to approach the retropubic space. We describe and discuss three different options to access the space of Retzius via laparoscopy: the medial transperitoneal, the extraperitoneal and the lateral transperitoneal approach. For all approaches, we used one umbilical trocar and two trocars in the lower abdomen. An algorithm was developed to select the most appropriate access route to the retropubic space, depending on the history of previous surgeries and accompanying interventions. The knowledge of different access routes to the retropubic space offers the possibility of adjusting the surgical procedure to the individual constellation of the patient. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
44. A DETERMINISTIC ALMOST-TIGHT DISTRIBUTED ALGORITHM FOR APPROXIMATING SINGLE-SOURCE SHORTEST PATHS.
- Author
-
HENZINGER, MONIKA, KRINNINGER, SEBASTIAN, and NANONGKAI, DANUPON
- Subjects
ALGORITHMS ,DETERMINISTIC algorithms ,DISTRIBUTED algorithms ,DETERMINISTIC processes ,MATHEMATICS - Abstract
We present a deterministic (1+o(1))-approximation O(n^(1/2+o(1)) + D^(1+o(1)))-time algorithm for solving the single-source shortest paths problem on distributed weighted networks (the CONGEST model); here n is the number of nodes in the network and D is its (hop) diameter. This is the first non-trivial deterministic algorithm for this problem. It also improves (i) the running time of the randomized (1+o(1))-approximation Õ(n^(1/2) D^(1/4) + D)-time algorithm of Nanongkai [STOC 2014] by a factor of as large as n^(1/8), and (ii) the O(ε^(-1) log ε^(-1))-approximation factor of Lenzen and Patt-Shamir's Õ(n^(1/2+ε) + D)-time algorithm [STOC 2013, pp. 381-390] within the same running time. (Throughout, we use Õ(·) to hide polylogarithmic factors in n.) Our running time matches the known time lower bound of Ω(√n/log n + D) [M. Elkin, SIAM J. Comput., 36 (2006), pp. 433-456], thus essentially settling the status of this problem, which was raised at least a decade ago [M. Elkin, SIGACT News, 35 (2004), pp. 40-57]. It also implies a (2+o(1))-approximation O(n^(1/2+o(1)) + D^(1+o(1)))-time algorithm for approximating a network's weighted diameter, which almost matches the lower bound by Holzer and Pinsker [in Proceedings of OPODIS, 2015, Schloss Dagstuhl. Leibniz-Zent. Inform., Wadern, Germany, 2016, 6]. In achieving this result, we develop two techniques which might be of independent interest and useful in other settings: (i) a deterministic process that replaces the "hitting set argument" commonly used for shortest paths computation in various settings, and (ii) a simple, deterministic construction of an (n^(o(1)), o(1))-hop set of size n^(1+o(1)). We combine these techniques with many distributed algorithmic techniques, some of which are from problems that are not directly related to shortest paths, e.g., ruling sets [A. V. Goldberg, S. A. Plotkin, and G. E. Shannon, SIAM J. Discrete Math., 1 (1988), pp. 434-446], source detection [C. Lenzen and D. Peleg, in Proceedings of PODC, 2013, pp. 375-382], and partial distance estimation [C. Lenzen and B. Patt-Shamir, in Proceedings of PODC, 2015, pp. 153-162]. Our hop set construction also leads to single-source shortest paths algorithms in two other settings: (i) a (1+o(1))-approximation n^(o(1))-time algorithm on congested cliques, and (ii) a (1+o(1))-approximation n^(o(1))-pass n^(1+o(1))-space streaming algorithm. The first result answers an open problem in [D. Nanongkai, in Proceedings of STOC, 2014, pp. 565-573]. The second result partially answers an open problem raised by McGregor in 2006 [List of Open Problems in Sublinear Algorithms: Problem 14]. [ABSTRACT FROM AUTHOR] (A standard hop-set definition follows this record.)
- Published
- 2021
- Full Text
- View/download PDF
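For record 44, the hop-set object in technique (ii) has a standard definition, reproduced here for orientation; this is the textbook notion, not the paper's construction itself.

```latex
% Standard definition of an (h, eps)-hop set: F is a set of weighted shortcut
% edges such that distances in G' = (V, E \cup F) are nearly preserved while
% using at most h hops.
\[
  d^{(h)}_{G \cup F}(u, v) \;\le\; (1 + \varepsilon)\, d_G(u, v)
  \qquad \text{for all } u, v \in V,
\]
% where d^{(h)} restricts paths to at most h edges. The abstract's
% (n^{o(1)}, o(1))-hop set has h = n^{o(1)}, eps = o(1), and size n^{1+o(1)}.
```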
45. A new combined statistical method for bias adjustment and downscaling making use of multi-variate bias adjustment and PCA-driven rescaling.
- Author
-
KRÄHENMANN, STEFAN, HALLER, MICHAEL, and WALTER, ANDREAS
- Subjects
STATISTICAL bias ,DOWNSCALING (Climatology) ,ATMOSPHERIC models ,SUPPLY & demand ,ALGORITHMS ,EXTREME value theory ,PARETO distribution ,RANDOM forest algorithms - Abstract
One major concern of climate modeling is the projection of future extreme events, as they have the most severe impact on society and environment. This is a challenging task for modeling, not least due to the low occurrence rate of extreme values. Furthermore, the local-scale characteristics of extreme events demand high-resolution model data. In the framework of the EURO-CORDEX initiative, climate model ensemble data on 0.11° grid resolution are produced. In order to provide climate data on a higher resolution in an efficient and reliable way, a statistical downscaling method has been developed which combines bias adjustment and downscaling. With this method, an ensemble of climate model data on a target resolution of 5 km has been built, and it was established as a reference ensemble for Germany. The ensemble consists of the three scenarios RCP 2.6, 4.5 and 8.5, with 44 members in total. The method is comprehensible and of minimum complexity. It involves objective predictor selection and can be applied to different areas, horizontal resolutions, target variables and predictor data sets, thus providing high flexibility. While the methodology imposes refined structures onto modeled data, it does not affect the model's data range and therefore allows for extrapolation beyond observed values. For threshold-based indices, the raw model data show a rather large spread and bias, which were tremendously improved in the bias adjustment step. Downscaling is challenging, as local terrain features can introduce unpredictable residual variation without localized information from, e.g., observations. In particular, temperature-based extreme values were well captured by the downscaling algorithm, as temperature is strongly elevation dependent. [ABSTRACT FROM AUTHOR] (A generic quantile-mapping sketch follows this record.)
- Published
- 2021
- Full Text
- View/download PDF
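Record 45 combines bias adjustment and downscaling. As one generic building block of statistical bias adjustment, the sketch below shows empirical quantile mapping on synthetic data; the paper's multi-variate method and PCA-driven rescaling are more elaborate and are not reproduced here.

```python
# Empirical quantile mapping: map model values through the model->obs quantile
# relation learned on a historical reference period. Synthetic data only.
import numpy as np

rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 2.0, 5000)           # "observed" climate at a station
model_hist = rng.gamma(2.0, 2.5, 5000)    # biased model climate, historical
model_fut = rng.gamma(2.2, 2.5, 5000)     # model climate, scenario period

def quantile_map(x, ref_model, ref_obs, n_q=100):
    """Map values x through the empirical model->obs quantile relation."""
    q = np.linspace(0.005, 0.995, n_q)
    mq = np.quantile(ref_model, q)
    oq = np.quantile(ref_obs, q)
    return np.interp(x, mq, oq)           # linear interpolation between quantiles

adjusted_fut = quantile_map(model_fut, model_hist, obs)
print(f"raw future mean {model_fut.mean():.2f} -> adjusted {adjusted_fut.mean():.2f}, "
      f"obs mean {obs.mean():.2f}")
```

Plain quantile mapping clips values outside the calibration range; the abstract's point that the method "allows for extrapolation beyond observed values" is exactly where the paper's approach goes further than this sketch.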
46. Bored Techies Being Casually Racist: Race as Algorithm.
- Author
-
Amrute, Sareeta
- Subjects
ALGORITHMS ,RACE ,DIVISION of labor ,HORSE racetracks ,SOFTWARE engineers ,INTERNATIONAL competition - Abstract
Connecting corporate software work in the United States and Germany, this essay tracks the racialization of mostly male Indian software engineers through the casualization of their labor. In doing so, I show the connections between overt, anti-immigrant violence today and the ongoing use of race to sediment divisions of labor in the industry as a whole. To explain racialization in the tech industry, I develop the concept of race-as-algorithm as a device to unpack how race is made productive within digital economies and to show the flexibility of race as it works to create orders of classification that are sensitive to context. Using evidence collected through observation in tech offices and through interviews with programmers over five years, I track race as an essential but continually disavowed variable within the construction of global tech economies. The historical racialization of casual labor in plantation economies illuminates how casualness marks laborers whose rights can be muted and allows corporations to deny their culpability in promoting discrimination within and outside of the tech industry. These denials occur across a political field that divides "good" from "bad" migrants. Using the ethnographic symptoms that Indian tech workers identify in their environments, this essay reads these signs as an antidote to these continued denials. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
47. Effects of electrical anisotropy on long-offset transient electromagnetic data.
- Author
-
Liu, Yajun, Yogeshwar, Pritam, Hu, Xiangyun, Peng, Ronghua, Tezkan, Bülent, Mörbe, Wiebke, and Li, Jianhui
- Subjects
ANISOTROPY ,BLACK shales ,GEOLOGICAL modeling ,ELECTRIC fields ,ALGORITHMS ,ELECTROMAGNETIC theory - Abstract
Electrical anisotropy of formations has long been recognized from field and laboratory evidence. However, most interpretations of long-offset transient electromagnetic (LOTEM) data are based on the assumption of an electrically isotropic earth. Neglecting the electrical anisotropy of formations may cause severely misleading interpretations in regions with strong electrical anisotropy. During a large-scale LOTEM survey in a former mining area in Eastern Germany, data were acquired over black shale formations. These black shales are expected to produce a pronounced bulk anisotropy. Here, we investigate the effects of electrical anisotropy on LOTEM responses through numerical simulation using a finite-volume time-domain (FVTD) algorithm. On the basis of isotropic models obtained from LOTEM field data, various anisotropic models are developed and analysed. Numerical results demonstrate that the presence of electrical anisotropy has a significant influence on LOTEM responses. Based on the numerical modelling results, an isolated deep conductive anomaly present in the 2-D isotropic inversion of the LOTEM electric field data is identified as a possible artifact introduced by using an isotropic inversion scheme. Trial-and-error forward modelling of the LOTEM electric field data using an anisotropic conductivity model can explain the data and results in a reasonable quantitative data fit. The derived anisotropic 2-D model is consistent with the prior geological information. [ABSTRACT FROM AUTHOR] (The standard anisotropic conductivity parameterization is sketched after this record.)
- Published
- 2020
- Full Text
- View/download PDF
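The anisotropy discussed in record 47 is commonly parameterized as transverse isotropy. The block below gives the textbook form; the paper's exact parameterization may differ.

```latex
% Textbook transversely isotropic (VTI) conductivity tensor; the paper's
% parameterization may differ.
\[
  \boldsymbol{\sigma} =
  \begin{pmatrix}
    \sigma_h & 0 & 0\\
    0 & \sigma_h & 0\\
    0 & 0 & \sigma_v
  \end{pmatrix},
  \qquad
  \lambda = \sqrt{\rho_v / \rho_h} = \sqrt{\sigma_h / \sigma_v},
\]
% with coefficient of anisotropy \lambda; \lambda = 1 recovers the isotropic
% earth assumed in conventional LOTEM interpretation.
```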
48. Clinical application of dynamic contrast enhanced ultrasound in monitoring the treatment response of chemoradiotherapy of pancreatic ductal adenocarcinoma.
- Author
-
Zhang, Qi, Wu, Lili, Yang, Daohui, Qiu, Yijie, Yu, Lingyun, Dong, Yi, and Wang, Wen-Ping
- Subjects
CONTRAST-enhanced ultrasound ,RECTAL cancer ,IMAGING systems ,ALGORITHMS ,PERFUSION ,THERAPEUTICS - Abstract
OBJECTIVES: To investigate the value of dynamic contrast enhanced ultrasound (D-CEUS) in monitoring the therapeutic response of locally advanced pancreatic ductal adenocarcinoma (LAPC) to chemoradiotherapy (CRT). PATIENTS AND METHODS: From October 2017 to December 2018, 11 patients diagnosed with LAPC were included (7 men, 4 women; mean age: 61.1±8.6 years). The CRT protocol was as follows: a radiotherapy dose of 50.4 Gy/28Fx with S-1 40 mg taken orally twice daily (bid) on radiotherapy days. Conventional ultrasound scans and CEUS were performed before and 4 weeks after CRT. All ultrasound examinations were performed with an ACUSON Oxana 2 ultrasound system (Siemens Medical Solutions, Germany) with a C 6-1 convex array transducer (1–6 MHz). Time intensity curves (TICs) were generated for regions of interest (ROIs) both in LAPC lesions and in the surrounding pancreas parenchyma with SonoLiver software (TOMTEC Imaging Systems). Quantitative perfusion parameters including maximum intensity (MI), rise time (RT), mean transit time (mTT) and time to peak (TTP) were analyzed and compared before and after CRT. RESULTS: No significant difference could be found by conventional B-mode ultrasound after CRT. TICs of CEUS showed lower ascending and descending slope rates after CRT. Among all quantitative perfusion parameters, MI decreased significantly after CRT (42.1±18.8% vs 27.8±17.2%, P < 0.05). CONCLUSIONS: Given its advantages of being radiation-free, effective and convenient, D-CEUS analysis and its quantitative parameters, particularly MI, have potential application value in following up the CRT treatment response in LAPC patients. [ABSTRACT FROM AUTHOR] (A toy TIC-parameter extraction sketch follows this record.)
- Published
- 2020
- Full Text
- View/download PDF
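The perfusion parameters in record 48 (MI, RT, TTP, mTT) can be extracted from a time-intensity curve. The sketch below uses a synthetic bolus-shaped curve and common CEUS definitions; the SonoLiver implementation may define RT and mTT differently.

```python
# Toy extraction of TIC perfusion parameters from a synthetic curve.
import numpy as np

t = np.linspace(0, 60, 601)                    # seconds
tic = 40 * (t / 12) * np.exp(1 - t / 12)       # synthetic bolus-shaped intensity

mi_idx = tic.argmax()
MI = tic[mi_idx]                               # maximum intensity
TTP = t[mi_idx]                                # time to peak

rise = tic[: mi_idx + 1]
t10 = t[np.argmax(rise >= 0.1 * MI)]           # first crossing of 10 % of MI
t90 = t[np.argmax(rise >= 0.9 * MI)]           # first crossing of 90 % of MI
RT = t90 - t10                                 # rise time (10-90 % definition)

mTT = np.trapz(tic * t, t) / np.trapz(tic, t)  # intensity-weighted mean transit time

print(f"MI={MI:.1f}, TTP={TTP:.1f}s, RT={RT:.1f}s, mTT={mTT:.1f}s")
```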
49. Soft Presentation of Hard News? A Content Analysis of Political Facebook Posts.
- Author
-
Steiner, Miriam
- Subjects
CONTENT analysis ,ATTRIBUTION of news ,QUALITY of service ,ALGORITHMS ,MUNICIPAL services ,SOCIAL media ,ELECTRONIC newspapers - Abstract
The current media environment is primarily characterised by a large amount of information and, in contrast, rather fragmented audience attention. This is especially true for social media, particularly Facebook, which have become important news sources for many people. Journalists cannot help but publish content on Facebook if they want to reach the part of their audience that mainly--or even only--consumes news there. On Facebook, journalists are at the mercy of the algorithm that determines the visibility of their content. Because user engagement is a crucial factor in the algorithm, concerns have been raised that journalists are abandoning their normative quality standards to make the news as attractive as possible to the audience--at the expense of media performance. A softened presentation of the news, particularly in Facebook posts, may help achieve this aim, but research on this subject is lacking. The present study analyses this practice of softening the news in four German media outlets' (BILD, FAZ, Der Spiegel, Tagesschau) political Facebook posts. The results show that the overall level of news softening is low to medium. Furthermore, comparing them to website teasers reveals that news softening is only slightly higher on Facebook (mainly BILD and Der Spiegel), and that there are no converging trends between quality or public service media and tabloid media. Exaggerated fears about news softening are therefore unnecessary. Continued analysis of news softening, as well as ongoing adaption of the concept according to dynamic developments, is nevertheless important. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
50. Concordance-analysis and evaluation of different diagnostic algorithms used in first trimester screening for late-onset preeclampsia.
- Author
-
Schaller, Sabrina, Knippel, Alexander Johannes, Verde, Pablo Emilio, and Kozlowski, Peter
- Subjects
PREECLAMPSIA ,CROSS-sectional method ,ALGORITHMS ,LOGISTIC regression analysis ,PREECLAMPSIA diagnosis ,FIRST trimester of pregnancy ,RETROSPECTIVE studies ,RISK assessment - Abstract
Objective: Concordance analysis and evaluation of existing algorithms detecting late-onset preeclampsia during first trimester screening. Methods: Retrospective cohort study investigating risk algorithms for late-onset preeclampsia during first trimester screening in a German prenatal center. Three previously developed algorithms including anamnestic factors (Apriori) and biophysical markers (BioM) were investigated using detection rates (DR) with a fixed FPR of 10% and a fixed cutoff of >1:100. Furthermore, we set up a concordance analysis of test results in late-onset preeclampsia cases to examine the effect of influencing factors and to detect potential weaknesses of the algorithms. To this end, we modeled the probability of discordances as a function of the influencing factors based on a logistic regression, which was fitted using a Bayesian approach. Results: 6,113 pregnancies were considered, of which 700 were excluded, leaving 5,413 pregnancies for analysis. 98 (1.8%) patients developed preeclampsia (79 late-onsets, 19 early-onsets). The Apriori algorithm reaches a DR of 34.2%; by adding BioM (MAP and UtA-PI), the DR improves to 57.0% (FPR of 10%). In the concordance analysis of the Apriori algorithm and the Apriori+BioM algorithms, the influencing factor BMI<25 increases the chance of discordances significantly. Additionally, in the subgroup of late-onset preeclampsias with BMI<25, the DR is higher for the Apriori+BioM algorithms than for the Apriori algorithm alone. If both compared algorithms include BioM, the influencing factor MAP decreases the chance of discordances significantly. All other tested influencing factors do not have a statistically significant effect on discordances. Conclusion: Normal-weight patients benefit more from the integration of MAP and UtA-PI than overweight/obese patients. [ABSTRACT FROM AUTHOR] (A simulated detection-rate-at-fixed-FPR sketch follows this record.)
- Published
- 2020
- Full Text
- View/download PDF
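Record 50 reports detection rates at a fixed 10% false-positive rate. The sketch below shows how such a DR is read off from risk scores; the scores and labels are simulated, not patient data.

```python
# Detection rate (sensitivity) at a fixed 10 % FPR from simulated risk scores.
import numpy as np

rng = np.random.default_rng(5)
n_neg, n_pos = 5000, 80
scores = np.concatenate([rng.normal(0.0, 1.0, n_neg),    # unaffected pregnancies
                         rng.normal(1.2, 1.0, n_pos)])   # late-onset PE cases
labels = np.concatenate([np.zeros(n_neg), np.ones(n_pos)])

cutoff = np.quantile(scores[labels == 0], 0.90)  # threshold giving FPR = 10 %
dr = (scores[labels == 1] > cutoff).mean()       # detection rate at that threshold
print(f"cutoff={cutoff:.2f}, DR at 10% FPR = {dr:.1%}")
```

Fixing the FPR and comparing DRs, as the study does, makes algorithms comparable even when their raw risk scores live on different scales.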